Controlling Your Computer Using Gestures & Capacitive Touch Inputs: Piksey Atto Demo – DIY #32


Hey everyone, it’s Frenoy, and in this video I am going to show you how I built this project, which allows me to control my computer using capacitive touch inputs and gestures. I will be using the Piksey Atto for this, and if you are a maker, then you should definitely check out the Kickstarter campaign using the link in the description. So let’s dive in. This video will give you an overview of how it all comes together, and there will be links in the description to written posts which you can use to build this yourself.

The project has three main sections. First, we have the touch IC, which is the TTP224.
This IC simplifies the entire process of adding touch inputs to your projects, and there are several modules available that use it. Next, we have the gesture module, which uses the APDS-9960 sensor. This is actually a proximity, ambient light, RGB and gesture sensor, though we will only be using it to detect gestures. The gesture sensor was actually an afterthought, and that is why I am using an external module. I decided to add this after the Pixel 4 was launched. If you are not aware, the Pixel 4 used a radar sensor to detect gestures, which lets you change songs, silence incoming calls and so on, but this doesn’t seem to work very well. Finally, we have the Piksey Atto that communicates
with the other modules and also acts like a USB keyboard which allows us to control
the computer. The Atto is based on the ATmega32U4 microcontroller and you can also use other
boards like the Leonardo that make use of the same chip. Those boards are much bigger
and do not have castellated holes which means that you will not be able to create an elegant
solution like this.

Here’s how the touch module works. The sensor can operate on a voltage ranging from 2.4V to 5.5V, and it has 4 input and 4 output pins. At each input, we have a fixed 22pF capacitor and a touch pad on the PCB. Normally the output is LOW, but when you place a finger on the touch pad, you change the total capacitance at the input pin. This is partly because you are grounded and some charge can escape through your body. The touch IC can detect this change and trigger the output to go HIGH. This is the default logic operation. The IC also has a pin which allows you to control the logic levels, and you can reverse it such that the output is HIGH when no touch input is detected and LOW when a touch has been detected. Similarly, you can also set it to toggle HIGH and LOW with each alternate touch. The outputs are simply connected to 4 individual pins on the Atto – pins 14, 15, 16 and 17 – and these are read as digital inputs in the sketch.
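As a rough idea of what that looks like, here is a minimal sketch that just reads the four TTP224 outputs and reports them over serial. The pin numbers are the ones used in this project; everything else is only for illustration.

const int touchPins[4] = {14, 15, 16, 17};   // TTP224 outputs wired to these Atto pins

void setup() {
  Serial.begin(9600);
  for (int i = 0; i < 4; i++) {
    pinMode(touchPins[i], INPUT);            // the TTP224 drives these pins directly
  }
}

void loop() {
  for (int i = 0; i < 4; i++) {
    // In the default logic mode the output goes HIGH while the pad is touched
    if (digitalRead(touchPins[i]) == HIGH) {
      Serial.print("Pad ");
      Serial.print(i + 1);
      Serial.println(" touched");
    }
  }
  delay(100);
}
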
The APDS-9960 gesture sensor is a bit more complicated, but in summary it has an IR LED and four photodiodes that are directionally
sensitive. This is a simplified representation of the sensor. The light from the IR LED cannot fall directly on the photodiodes due to a light barrier between them. The photodiodes can only pick up the light being reflected from any object that is in front of the sensor. When you swipe right, for instance, the left photodiode will be triggered first, followed by the right one, and this sequence is detected internally by the sensor. It can detect other types of gestures in a similar manner.
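To give a feel for how little the sketch itself has to do, here is a minimal example of reading a detected gesture. It assumes the Arduino_APDS9960 library, which matches the 1-to-100 sensitivity setting described later in this video; if you use a different APDS-9960 library, the function names will differ.

#include <Arduino_APDS9960.h>   // assumed library; others expose a similar API

void setup() {
  Serial.begin(9600);
  if (!APDS.begin()) {
    Serial.println("Error initializing APDS-9960 sensor");
    while (true);               // halt if the sensor is not responding
  }
}

void loop() {
  // The sensor works out the swipe direction from the order in which the
  // photodiodes see the reflected IR light; we simply read the result.
  if (APDS.gestureAvailable()) {
    int gesture = APDS.readGesture();   // GESTURE_UP, GESTURE_DOWN, GESTURE_LEFT or GESTURE_RIGHT
    Serial.println(gesture);
  }
}
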
The sensor works at a maximum voltage of 3.8V, though you can also obtain modules that have a built-in voltage regulator and level shifters to take care of the level conversion. The one I am using does not have either of these, so I added a tiny 3.3V voltage regulator to operate the module at 3.3V. It uses I2C to communicate with the microcontroller.

Finally, we have the Atto which, as mentioned
before, acts like a keyboard. The ATmega32U4 has a built-in USB controller, which makes this possible. Other microcontrollers used in the Arduino Uno, Nano and even the Piksey Pico do not support USB directly, and they need an external USB-to-serial converter for programming and communication. The Atto is programmed to be a human interface device, or HID, and the computer thinks it is simply a USB keyboard. We can then send all sorts of keystrokes to control the computer the way we like. The sketch uses the Arduino Keyboard library to take care of this.

The first sketch only deals with using the touch inputs, while the second one
adds the gesture capability.

Here we have the first sketch. We simply define the pins which are connected to the outputs of the TTP224 IC. We then configure these pins as inputs and initialize the keyboard. We then read the inputs and send the appropriate keystrokes. Each of the keys on your keyboard can be represented by something called an ASCII code. These codes are used as identifiers by the computer, and you can also directly send these ASCII codes instead of typing out the characters. There are two main keyboard functions that are used in this sketch. The write function is similar to typing a key and releasing it quickly, whereas the press function is similar to typing a key and holding it down for a bit, mainly so that you can use a key combination by pressing multiple keys. When you use the press function, you have to call a function to release the keys as required. The releaseAll function will release all the keys that have been pressed, though you can also release individual keys by using the release function.
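Here is a short snippet showing the difference between the two functions. It is only an illustration, not the exact code from the project; the key values are examples.

#include <Keyboard.h>

void setup() {
  Keyboard.begin();
  delay(2000);                    // give the computer a moment to detect the keyboard

  // write(): press and release a key in a single call
  Keyboard.write('a');            // types a single 'a'
  Keyboard.write(0x20);           // 0x20 is the ASCII code for the space key

  // press(): hold keys down, useful for combinations such as Windows + L
  Keyboard.press(KEY_LEFT_GUI);   // the Windows / Command key
  Keyboard.press('l');
  Keyboard.releaseAll();          // release everything that is still held down
  // Keyboard.release('l');       // or release individual keys instead
}

void loop() {
}
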
There are probably better ways to do this, but I felt this approach was the simplest to understand. When the first button is touched, the sketch simply locks the computer by pressing the Windows and L keys together and then releasing them. I’ve added a 2-second delay to each shortcut to prevent the commands from being sent repeatedly. If needed, you can adjust this to make the inputs more responsive. Similarly, programs can be launched by hitting the Windows key and then typing the program name. Button 2 launches Firefox, button 3 launches VLC and button 4 launches Windows Media Player. 0x20 is the hexadecimal ASCII code for the space key.
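Put together, the handling for the first two buttons looks roughly like this. The pin numbers and the 2-second delay match the project; the rest is a sketch of the approach rather than the exact code from the written post.

#include <Keyboard.h>

const int lockPin = 14;       // button 1: lock the computer
const int firefoxPin = 15;    // button 2: launch Firefox

void setup() {
  pinMode(lockPin, INPUT);
  pinMode(firefoxPin, INPUT);
  Keyboard.begin();
}

void loop() {
  if (digitalRead(lockPin) == HIGH) {
    Keyboard.press(KEY_LEFT_GUI);    // Windows + L locks the computer
    Keyboard.press('l');
    Keyboard.releaseAll();
    delay(2000);                     // stop the shortcut from being sent repeatedly
  }

  if (digitalRead(firefoxPin) == HIGH) {
    Keyboard.write(KEY_LEFT_GUI);    // tap the Windows key to open the Start menu
    delay(500);                      // give the menu time to appear
    Keyboard.print("firefox");       // type the program name
    Keyboard.write(KEY_RETURN);      // and launch it
    delay(2000);
  }
}
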
Here is a demo for launching Windows Media Player, and another one for locking the computer.

This second sketch is written for a Mac, and it uses the gesture module. In order to use the module, we first need to install the APDS-9960 library. Once that is done, you will be able to view the example sketches, which
should help you get started.

The final sketch is similar to the previous one, and we start by including the module’s library. It then has to be initialized, and you have to set the sensitivity using a number between 1 and 100. Using higher values can lead to false triggers, so I decided to use 50. For the touch inputs, I decided to lock the computer, open Firefox, open Final Cut Pro and play a song using iTunes. If you are using Windows, then you can update these as we did in the last sketch.
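If you are wondering what the Mac equivalents look like, here is a rough sketch of two helper functions. It assumes the macOS defaults of Control + Command + Q to lock the screen and Command + Space for Spotlight, which may not be exactly what the project sketch uses.

#include <Keyboard.h>

// Control + Command + Q locks the screen on recent versions of macOS
void lockMac() {
  Keyboard.press(KEY_LEFT_CTRL);
  Keyboard.press(KEY_LEFT_GUI);    // KEY_LEFT_GUI acts as the Command key on a Mac
  Keyboard.press('q');
  Keyboard.releaseAll();
}

// Open an application by typing its name into Spotlight (Command + Space)
void launchApp(const char *name) {
  Keyboard.press(KEY_LEFT_GUI);
  Keyboard.press(' ');
  Keyboard.releaseAll();
  delay(500);                      // give Spotlight time to open
  Keyboard.print(name);
  Keyboard.write(KEY_RETURN);
}

void setup() {
  Keyboard.begin();
}

void loop() {
  // Call lockMac() or launchApp("Firefox") from the touch input checks,
  // just like in the first sketch.
}
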
The gesture inputs are taken care of in this final IF statement. When a gesture is detected, the sketch simply reads it and uses a switch case to call the appropriate code section. Keep a note of the case values shown here.

Before we look at what the gestures do, we
need to talk about the orientation. This is the default orientation of the sensor module
that is used in the library. When I move my hand this way, it will detect it as an UP
gesture, the opposite direction will be DOWN, this will be RIGHT and this will be LEFT. I decided to rotate the sensor a bit just
to make it easier to assemble. This is the final position, as can be seen here. When I move my hand from left to right, it will be detected as an UP gesture, and I will use it to skip to the next track. When I move my hand from right to left, it will be detected as a DOWN gesture, and I will use it to go back to the previous track. Similarly, moving my hand upwards will trigger the LEFT gesture, which will be used to increase the volume, while the RIGHT gesture will be used to decrease the volume. This can be seen in the sketch. Since changing the volume level by one step is not very noticeable, I decided to add multiple statements like these to change the volume a couple of times. You can update this as per your requirement.
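For reference, the gesture handling ends up looking something like the snippet below. It assumes the Arduino_APDS9960 library and the iTunes menu shortcuts (Command plus the arrow keys for track and volume control); treat it as a sketch of the idea rather than the exact project code.

#include <Arduino_APDS9960.h>
#include <Keyboard.h>

// Helper: press Command together with another key, then release everything
void commandKey(uint8_t key) {
  Keyboard.press(KEY_LEFT_GUI);       // Command key
  Keyboard.press(key);
  Keyboard.releaseAll();
}

void setup() {
  Keyboard.begin();
  APDS.begin();
  APDS.setGestureSensitivity(50);     // higher values led to false triggers
}

void loop() {
  if (APDS.gestureAvailable()) {
    switch (APDS.readGesture()) {
      case GESTURE_UP:                // physical left-to-right swipe (sensor is rotated)
        commandKey(KEY_RIGHT_ARROW);  // next track
        break;
      case GESTURE_DOWN:              // physical right-to-left swipe
        commandKey(KEY_LEFT_ARROW);   // previous track
        break;
      case GESTURE_LEFT:              // physical upward swipe
        commandKey(KEY_UP_ARROW);     // volume up, sent twice for a bigger step
        commandKey(KEY_UP_ARROW);
        break;
      case GESTURE_RIGHT:             // physical downward swipe
        commandKey(KEY_DOWN_ARROW);   // volume down
        commandKey(KEY_DOWN_ARROW);
        break;
      default:
        break;
    }
    delay(500);                       // small pause so one swipe is not handled twice
  }
}
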
The keyboard shortcuts were obtained from the controls menu, and almost all applications or programs have keyboard shortcuts that can
be used with this sketch.

Now that we know how the sketch works, it is time to assemble the board. For this, we need the Atto, the TTP224 IC, four 22pF capacitors in the 0603 package and also a 10K resistor, which is optional. The 10K resistor is used as a pull-up along with a solder jumper, which can be used to toggle the output state. I first soldered one pin of the IC to hold it in place and then soldered the remaining pins. A similar method was used for the resistor and the capacitors. It was then time to solder the Atto, and I
used a similar method of using one pin to position it in place and then soldering the
remaining pins. Once everything was completed, I decided to carry out a quick visual check
with a USB microscope. It was then time to give the PCB a good clean.

I added a tiny 3.3V voltage regulator to the
APDS-9960 module and soldered the wires in place for the power and I2C connections, using the pinout card.

Finally, I uploaded the sketch to the board by selecting Arduino Leonardo as the board, choosing the correct COM port and then hitting the upload button. The Atto uses the same microcontroller and the same bootloader as the Arduino Leonardo. This means that you can program it by simply using Arduino Leonardo as the board in the Arduino IDE. There will be an update to the Piksey boards package which will add support for the Atto. You will then be able to select Piksey Atto from the menu, but as can be seen, this is totally optional.

Here’s a demo of the final board in action.
The 4th button will open up iTunes and play a song. I can then swipe right to skip a track, swipe left to go back to the previous track or restart the current one, swipe down to lower the volume and, finally, swipe up to increase the volume.

And that is how easy it is to add some touch
and gesture control to your computer. There are links in the description to the sketches and design files for you to use. This was a demo video for
the Piksey Atto and we will now get back to our regular DIY videos. Please do consider
subscribing to this channel if you like projects like these. Thank you for watching and I will
see you in the next one.
