
LipSync

How would you use a touchscreen mobile device if you couldn’t use your hands? With support from Google.org, the Neil Squire Society will release the LipSync, a mouth-controlled input device that enables people with little or no hand movement to operate a touchscreen device.

The LipSync is a mouth-operated joystick that allows a person to control a computer cursor with minimal head and neck movement. All the electronics are housed in the ‘head’ of the device, so there are no additional control boxes, making the LipSync a good candidate for portable, wheelchair-mounted applications. The mouthpiece is attached to a precision miniature joystick sensor that requires only very slight pressure on the shaft to move the cursor on the screen. The mouthpiece is hollow, allowing a person to perform left and right mouse clicks by alternately puffing or sipping into the tube.
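As a rough illustration of how that sip-and-puff scheme can work, the sketch below reads two joystick axes and a pressure sensor and turns them into cursor movement and button presses. It is a minimal sketch assuming an ATmega32U4-based Arduino and the standard Arduino Mouse library; the pin assignments, thresholds, and scaling are hypothetical, not the LipSync’s actual firmware.

    // A minimal, hypothetical sketch of a LipSync-style control loop for an
    // ATmega32U4-based Arduino (e.g. Leonardo or Micro) using the standard
    // Mouse library. Pin assignments, thresholds, and scaling below are
    // illustrative assumptions, not the project's actual firmware values.
    #include <Mouse.h>

    const int X_PIN = A0;         // joystick X-axis sensor
    const int Y_PIN = A1;         // joystick Y-axis sensor
    const int PRESSURE_PIN = A2;  // sip/puff pressure sensor

    const int CENTER = 512;          // 10-bit ADC midpoint at rest
    const int DEADZONE = 20;         // ignore tiny readings around center
    const int PUFF_THRESHOLD = 700;  // pressure above ambient = puff
    const int SIP_THRESHOLD = 300;   // pressure below ambient = sip

    bool leftDown = false;
    bool rightDown = false;

    void setup() {
      Mouse.begin();
    }

    void loop() {
      // Map joystick deflection to small cursor steps; the deadzone keeps
      // the cursor still when no pressure is applied to the mouthpiece.
      int dx = analogRead(X_PIN) - CENTER;
      int dy = analogRead(Y_PIN) - CENTER;
      if (abs(dx) < DEADZONE) dx = 0;
      if (abs(dy) < DEADZONE) dy = 0;
      Mouse.move(dx / 64, dy / 64, 0);  // scale to a few pixels per update

      // A puff holds the left button and a sip holds the right button, so a
      // sustained puff can drag; releasing the tube releases the button.
      int pressure = analogRead(PRESSURE_PIN);
      if (pressure > PUFF_THRESHOLD && !leftDown) {
        Mouse.press(MOUSE_LEFT);
        leftDown = true;
      } else if (pressure <= PUFF_THRESHOLD && leftDown) {
        Mouse.release(MOUSE_LEFT);
        leftDown = false;
      }
      if (pressure < SIP_THRESHOLD && !rightDown) {
        Mouse.press(MOUSE_RIGHT);
        rightDown = true;
      } else if (pressure >= SIP_THRESHOLD && rightDown) {
        Mouse.release(MOUSE_RIGHT);
        rightDown = false;
      }

      delay(10);  // roughly 100 Hz update rate
    }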

An estimated 1,000,000 people in Canada and the United States have limited or no use of their arms—meaning they are unable to use touchscreen devices that could provide access to helpful apps and services.

While solutions exist for desktop computers, they can cost up to $1,500 USD and do not work well on mobile devices.

We are releasing the project open source after a little more user testing, so it can be affordably made at the community level by makers, engineers, tinkerers, and hobbyists. The parts, which include 3D-printed pieces, an Arduino board, a Bluetooth module, and other components, will cost less than $300 to source, and the device can be assembled as a weekend project. For more information about the LipSync or to download the resources, please visit this site.

Get Involved

Get involved with the LipSync Project.