Leap Motion’s Gesture Control Finds Niche Uses in Medicine, Art and Augmented Reality



We first became acquainted with Leap Motion back in 2012. The company makes a small device about the size and shape of a pack of gum that uses infrared sensors and clever software to model objects inside its field of view—most obviously, these objects might be a pair of hands controlling a computer using gestures.

Though Leap’s early inspiration was to make 3D modeling more intuitive, comparisons to gesture-controlled sci-fi holographic displays led some to surmise the Leap controller could be an heir apparent to the touch screen and mouse.

It was, perhaps, a little too much hype for any startup to live up to out of the gate. After a round of unenthusiastic reviews and slow sales, they’ve had to reposition themselves for the longer haul. However, though Leap might be a little hung over from 2013’s euphoric highs, the firm is still here, and their device is still cool.

Like many emerging technologies that inflate our imagination of the possible early on, Leap may prove its worth less in ubiquity and more in specialized uses.

Last year, Elon Musk and his engineers at SpaceX used a Leap Motion controller to help them 3D model rocket parts. And while some folks suggest we use gloves, body suits, or Kinect to model our bodies for virtual worlds, a Leap Motion controller might serve as well or better for close, detailed wireless modeling (e.g., our hands).

It’s in such niche applications that Leap may grow and mature its offering. And they know it. When I spoke with Leap’s executives last year, the conversation revolved around developers. They’ve got the hardware, but they need the developer community to figure out how to integrate it.

To that end, Leap and two of its financial backers (Founders Fund and SOS Ventures) created the AXLR8R program to support developers working on gesture-based products and keep the product fresh. At a recent demo day in San Francisco, a group of AXLR8R startups pitched ideas on how they plan to incorporate Leap.

One company, MotionSavvy, is working on a Leap-equipped tablet for the deaf. The tablet would read and model users’ hands and fingers as they communicate with sign language, and an app would translate their signs into speech. It would work in reverse too, translating speech to text.

Another offering, Cori, would augment voice-controlled systems like Siri with gesture controls—combining the two could improve usability. For example, swipe in front of your tablet to hit snooze in the morning, then tell it to turn up the lights. Or ask for the news, then swipe through articles as you brush your teeth.
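To make the idea concrete, here's a toy sketch (not the Leap SDK's actual API—the function and threshold are invented for illustration) of how a gesture layer might classify a horizontal swipe from a series of sampled palm positions:

```python
# Toy sketch: classify a left/right swipe from sampled palm x-positions.
# This is an illustration only, not the Leap Motion SDK's API.

def classify_swipe(x_positions, threshold=80.0):
    """Return 'swipe-right', 'swipe-left', or None given palm x-coordinates
    (in millimeters) sampled over the course of one gesture."""
    if len(x_positions) < 2:
        return None
    delta = x_positions[-1] - x_positions[0]
    if delta > threshold:
        return "swipe-right"
    if delta < -threshold:
        return "swipe-left"
    return None  # hand didn't travel far enough to count as a swipe

# A hand drifting ~120 mm to the right registers as a swipe-right:
print(classify_swipe([-60.0, -20.0, 25.0, 60.0]))  # swipe-right
```

An app like Cori would then map "swipe-left" or "swipe-right" to an action—snooze the alarm, advance to the next article—while the voice layer handles everything that needs words.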

Ethereal pairs Photoshop with a Leap controller so users can paint onscreen with their fingers. Whereas we simply take brush to canvas in the real world, doing something similar on a computer requires all kinds of specialized tools and commands. Using our hands to interact with digital design programs might make them more intuitive.

GetVu is a head-mounted augmented reality display (about the size of an Oculus Rift) that overlays virtual objects on the real world. It has a Leap controller embedded in the device, allowing people to use their hands to interact with the display.

For example, the device might overlay an astronomy app on the night sky, and when you point at stars or planets, it reveals interesting facts and figures. Or, vice versa, the device gives a tour of the sky, highlighting and explaining the objects in your field of view.

Another company, Drift Coast, wants to make minimally-invasive medical procedures easier. Such procedures snake a catheter through arteries instead of making direct incisions. But navigating tight turns in arteries can be awkward. Drift Coast is making a touchless interface for MRI-guided catheters using a Leap controller.

“Our ‘plug-and-play’ system can control the tip of the MARC catheter remotely to make difficult turns with a gentle movement of doctors' hands, without touching anything.”

And what’s a Singularity Hub post without robots?

Mirror Training uses a Leap device to control a robotic arm. With enough precision, the company hopes such an interface might make bomb disposal robots easier to control. And we can imagine all kinds of mechanical equipment being directed by gesture in non-military environments too, like on a factory floor or in a construction zone.

Will all of these ideas go big? Probably not. But they show the variety of applications for touchless interfaces. And there are still undoubtedly more to come.

Leap Motion may not unseat the touch screen and mouse, but it very well might, along with other technologies (like voice control), be integrated into systems alongside them. There won’t be one way to control our computers in the future—rather, it’ll be a combination of a variety of tools from touch to touchless.

Image Credit: Leap Motion

Jason Dorrier

Jason is managing editor of Singularity Hub. He cut his teeth doing research and writing about finance and economics before moving on to science, technology, and the future. He is curious about pretty much everything, and sad he'll only ever know a tiny fraction of it all.

Discussion — 2 Responses

  • Kristof June 3, 2014 on 12:47 am

    If a Leap Motion the size of a pack of gum can read your hands, what would a Leap Motion a foot long or a yard long be able to read? Is it scalable? Wouldn’t it be able to scan an entire body if it were larger?

  • Jeff Kang June 7, 2014 on 8:56 pm

    >It was, perhaps, a little too much hype for any startup to live up to out of the gate. After a round of unenthusiastic reviews and slow sales, they’ve had to reposition themselves for the longer haul.

    >Another offering, Cori, would augment voice-controlled systems like Siri with gesture-controls—the combination of the two could improve usability.

    Leap Motion + eye tracker > Leap Motion alone – Leap Motion Table mode (interact with a surface)

    Leap Motion and the eye-tracking companies should partner up, and build some more basic and practical applications. E.g. look at an interface element to highlight it, and then do a Leap Motion “click-where-I’m-looking-at” gesture.

    > “Uwyn’s GameWAVE, available for Mac and Windows, recently debuted in Airspace. It allows you to completely control keyboard and mouse-based video games, as well as your OS.
    > 20 distinct swipe and circle gestures, 6 continuous movement”.

    20!? With an eye tracker ($99, which isn’t that far from the Leap Motion), you probably don’t need as many gestures. E.g., having just a few gestures for single-clicking, double-clicking, and dragging could allow you to do quite a bit.

    If you plan to have 20 actions, you could just have one Leap Motion gesture to bring up a menu of 20 on-screen buttons, and then use a “select-what-I’m-looking-at” gesture to activate one of the virtual buttons.
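The menu-plus-gaze scheme described above boils down to a hit test: the eye tracker supplies a gaze point, and a single confirm gesture fires whichever on-screen button the user is looking at. Here's a minimal sketch; the button layout, labels, and coordinates are all hypothetical:

```python
# Hypothetical sketch of gaze + gesture selection: an eye tracker supplies
# a gaze point, and one Leap gesture confirms whichever on-screen button
# the user is currently looking at.

def button_at(gaze_x, gaze_y, buttons):
    """Return the label of the button containing the gaze point, if any.
    Each button is (label, x, y, width, height) in screen pixels."""
    for label, x, y, w, h in buttons:
        if x <= gaze_x < x + w and y <= gaze_y < y + h:
            return label
    return None  # gaze point isn't over any button

# A 4x5 grid of 20 macro buttons, 100x60 px each:
menu = [(f"macro-{r * 5 + c}", c * 100, r * 60, 100, 60)
        for r in range(4) for c in range(5)]

# User gazes at (230, 70) and makes the confirm gesture:
print(button_at(230, 70, menu))  # macro-7
```

With this scheme, one memorized gesture covers all 20 actions, and the buttons themselves document what each action does.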

    Another advantage is that visual interfaces, like on-screen buttons, would require less learning and memorization. Attaching macros to a growing list of unique air gestures, or to keyboard shortcuts like Ctrl + Alt + Shift combinations that you have to memorize, can be more difficult to maintain. With on-screen buttons and controls, you can pretty much design them to look however you want. Button labels can be descriptive and self-documenting, so it will be easy for people to understand the sample interfaces that get shared with one another. If they’re edited, tweaked, and re-shared, people will be able to see the changes more immediately and learn them much faster.

    Leap Motion also needs to hurry up and bring “Table Mode,” where you can use any surface as a touch interface. There is a device called Haptix that transforms any flat surface into a 3D multitouch surface, and it completed its Kickstarter campaign last fall. It’s kind of like the Leap Motion, but it looks like it doesn’t involve as many air gestures. Leap Motion has said that Table Mode is on the roadmap, but that’s it. Doing air gestures all day is not ergonomically friendly.

    A user was able to get Table Mode with Leap Motion by hacking together some transparent glass so that the Leap Motion could distinguish the hands. Here’s a video of the user playing Reflex.te, a reaction game that’s usually meant for the mouse: youtube.com/watch?v=cgGkwGJcB1c.