New Chip to Detect Gestures in Front of Tiny Wearable Screens


Wearable computers are drumming up lots of attention and sales, but innovators have yet to settle on the best interface. Google Glass has touch and voice; others have one of the two, and many smartwatches do little more than project what’s on the user’s smartphone display.

It has proven difficult to design a rich, easy-to-use interface for devices whose screens are only a few finger widths across.

But for every problem there’s a startup, and this one’s no exception. A fledgling company, Chirp Microsystems, is developing a gesture-based operating system to work with a new chip that uses sound, rather than vision, to track the user’s movements. The company builds on advances made at two University of California labs.

Inspired by medical technology, the system uses ultrasound, rather than light, to detect hand gestures within a range of about a meter. The system can sense gestures that don’t occur directly in front of its display, and it uses far less battery power than existing camera-based gesture interfaces: It runs up to 30 hours continuously on a tiny battery.

Chirp is vying to provide a technology that works within a variety of wearable devices, rather than to sell a single device of its own. Its founders hope it will improve the user experience on smartwatches and other wearable computers, such as Google Glass.

“With Glass, for example, you have the display screen that hovers in front of your eyeball, you can talk to it and that’s pretty useful but you can’t touch it. The idea here for Glass would be if you’ve already got this screen in front of your eyeball, if you just hold your hand up, then you can interact with the menu. That could be very powerful,” Chirp co-founder Richard Przybyla told Singularity Hub.

So how does it work? An array of resonators sends ultrasound pulses outward to echo off the objects in their path: the user’s hand, for instance. The echoes return to the resonators and a connected chip measures the elapsed time. With resonators arrayed along vertical and horizontal axes, the time measurements can be used to detect a range of hand movements.
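To make the time-of-flight idea concrete, here is a minimal Python sketch of the underlying arithmetic: converting an echo’s round-trip time into a distance. The speed-of-sound constant and function name are illustrative assumptions, not details of Chirp’s actual firmware.

```python
# Illustrative sketch: estimate the distance to a reflector (e.g., a hand)
# from an ultrasonic pulse's round-trip time. Values are assumptions,
# not Chirp's actual implementation.

SPEED_OF_SOUND_M_PER_S = 343.0  # speed of sound in air at roughly 20 °C


def distance_from_echo(round_trip_time_s: float) -> float:
    """Convert a pulse's round-trip time into the distance to the reflector."""
    # The pulse travels out to the hand and back, so halve the path length.
    return SPEED_OF_SOUND_M_PER_S * round_trip_time_s / 2.0


# Example: an echo returning after ~3 ms implies a hand about half a meter
# away, consistent with the roughly one-meter range described above.
print(distance_from_echo(0.003))  # ≈ 0.51 m
```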

“It’s like sonar that’s been around for many years. The main difference with ours is that we have a small chip that we can put more than one ultrasound transmitter on, so we can tell which direction the sound is coming back in because, as it comes back, it will arrive at different transducers at different times,” Przybyla said.
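The direction-finding step Przybyla describes can be sketched the same way: when one echo arrives at two transducers a known distance apart, the arrival-time difference maps to an angle of arrival. The spacing and timing values below are hypothetical, chosen only to illustrate the geometry.

```python
# Illustrative sketch: estimate an echo's angle of arrival on one axis from
# the time difference between two transducers. The 4 mm spacing and 5 µs
# delay are assumed example values, not Chirp's specifications.

import math

SPEED_OF_SOUND_M_PER_S = 343.0


def angle_of_arrival(time_diff_s: float, transducer_spacing_m: float) -> float:
    """Estimate the echo's angle (radians from broadside) along one axis."""
    # Far-field approximation: path difference = spacing * sin(angle).
    path_diff_m = SPEED_OF_SOUND_M_PER_S * time_diff_s
    # Clamp to the valid arcsine range to guard against timing noise.
    ratio = max(-1.0, min(1.0, path_diff_m / transducer_spacing_m))
    return math.asin(ratio)


# Example: a 5 µs arrival-time difference across a 4 mm spacing
# corresponds to an angle of roughly 25 degrees off center.
print(math.degrees(angle_of_arrival(5e-6, 0.004)))  # ≈ 25.4°
```

Combining one such angle from the horizontal axis, one from the vertical axis, and the range estimate gives a rough 3D position for the hand, which higher-level software can track over time to recognize gestures.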

Przybyla and his colleagues are still fine-tuning the gestures Chirp will recognize, but a demo shows a user manipulating an imaginary joystick and swiping through pages without touching the screen.

Images: Robnroll via Shutterstock.com, Richard Przybyla

Discussion — 2 Responses

  • Michael Lovett November 14, 2013 on 2:14 pm

    I still do not understand why everyone is so gaga over these supposed “smart watches”. I owned and used a mobile phone WATCH (yes, it looked pretty much like what Dick Tracy had) that I bought out of Hong Kong in 2004. I lived in mainland China from 2001 to 2006, and I was using a 2G network in 2004, doing video chats, phone to phone, on a mobile phone watch.
    People in the USA, Australia, Britain, and most of Europe don’t get the “toys of Asia” until well after two, three, sometimes five years. I am so sick of mobile phones now, that I gave mine up over a year ago and now only use Google Voice on my laptop. I come home, log in, check for voicemail. If I need to call someone, then I look for a payphone. No payphone? No worries–I am 52, and I certainly didn’t need to have a phone with me 24/7 growing up, or in my 20s or 30s and I certainly don’t need one now. Everyone has been brainwashed into thinking they MUST have a phone with them at all times. Welcome to 1984 people. George Orwell predicted you.

  • truejay November 14, 2013 on 5:49 pm

    2014 is the new start of wearable technology.