Wearable computers are drumming up lots of attention and sales, but innovators have yet to settle on the best interface. Google Glass has touch and voice; others have one of the two, and many smartwatches do little more than project what’s on the user’s smartphone display.
It has proven difficult to design a rich, easy-to-use interface for devices whose screens are only a few finger widths across.
But for every problem there’s a startup, and this one’s no exception. A fledgling company, Chirp Microsystems, is developing a gesture-based operating system that works with a new chip using sound, rather than vision, to track the user’s movements. The company builds on advances made at two University of California labs.
Inspired by medical technology, the system uses ultrasound, rather than light, to detect hand gestures within a range of about a meter. The system can sense gestures that don’t occur directly in front of its display, and it uses far less battery power than existing camera-based gesture interfaces: it runs up to 30 hours continuously on a tiny battery.
Chirp aims to provide a technology that works across a variety of wearable devices, rather than sell a single device of its own. Its founders hope it will improve the user experience on smartwatches and other wearable computers, such as Google Glass.
“With Glass, for example, you have the display screen that hovers in front of your eyeball, you can talk to it and that’s pretty useful but you can’t touch it. The idea here for Glass would be if you’ve already got this screen in front of your eyeball, if you just hold your hand up, then you can interact with the menu. That could be very powerful,” Chirp co-founder Richard Przybyla told Singularity Hub.
So how does it work? An array of resonators sends ultrasound pulses outward to echo off the objects in their path: the user’s hand, for instance. The echoes return to the resonators and a connected chip measures the elapsed time. With resonators arrayed along vertical and horizontal axes, the time measurements can be used to detect a range of hand movements.
“It’s like sonar that’s been around for many years. The main difference with ours is that we have a small chip that we can put more than one ultrasound transmitter on, so we can tell which direction the sound is coming back in because, as it comes back, it will arrive at different transducers at different times,” Przybyla said.
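To make the arithmetic behind that concrete, here is a minimal sketch of the general pulse-echo sonar principle the quote describes, not Chirp’s actual firmware or chosen parameters: the round-trip time of an echo gives the range to the hand, and the difference in arrival times at two transducers a known distance apart gives its direction. The speed of sound, the transducer spacing, and the example timings below are all assumed values for illustration.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at room temperature (assumed)

def echo_distance(round_trip_s: float) -> float:
    """Distance to the reflecting object from a pulse-echo round-trip time."""
    return SPEED_OF_SOUND * round_trip_s / 2.0

def arrival_angle(delta_t_s: float, spacing_m: float) -> float:
    """Approximate angle of arrival (radians) from the difference in echo
    arrival times at two transducers a known distance apart.

    Uses the far-field approximation sin(theta) = c * delta_t / spacing.
    """
    s = max(-1.0, min(1.0, SPEED_OF_SOUND * delta_t_s / spacing_m))
    return math.asin(s)

# Example (hypothetical numbers): an echo returning after ~2.9 ms puts a hand
# about 0.5 m away, and a 2-microsecond arrival difference across a 4 mm
# transducer spacing places it roughly 10 degrees off axis.
if __name__ == "__main__":
    print(f"distance: {echo_distance(2.9e-3):.2f} m")
    print(f"angle: {math.degrees(arrival_angle(2e-6, 4e-3)):.1f} deg")
```

With transducers arranged along both the horizontal and vertical axes, the same time-difference calculation yields a direction in two dimensions, which is what lets the chip follow a moving hand.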
Przybyla and his colleagues are still fine-tuning the gestures Chirp will recognize, but a demo shows a user manipulating an imaginary joystick and swiping through pages without touching the screen.
Images: Robnroll via Shutterstock.com, Richard Przybyla