Ladies and gentlemen, we are one step closer to having a fully functional holodeck. Thanks to the Shinoda Lab at the University of Tokyo, you can now touch holograms. Concentrated blasts of ultrasound are used in conjunction with traditional holography to give you the impression of feeling the objects you see. It’s an amazing concept and will allow an entirely new way to interact in virtual reality. Marvel at the video from the Shinoda Lab after the break (sorry, no sound).
Called the Airborne Ultrasound Tactile Display (AUTD), the specially calibrated ultrasound emitter gives you the impression of physical pressure at the location of a holographic object. Because you aren’t actually touching the hologram, there’s no decrease in the quality of the image. Unlike a traditional speaker, the ultrasound can be focused at a particular location, so you only feel pressure at a precise point. This precision allows the AUTD to let you feel individual drops of virtual rain, a bouncing ball, or even a tiny animal running across your palm.
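That focusing trick is the same one phased arrays use: drive many small transducers with staggered timing so their waves all arrive in phase at one point in space, where the acoustic pressure peaks. Here's a minimal sketch of that delay calculation; the grid size, pitch, and focal point are illustrative assumptions, not the lab's actual parameters:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly room temperature

def focus_delays(emitter_positions, focal_point):
    """Per-emitter firing delays so every wavefront arrives in phase at focal_point.

    The farthest emitter fires first (zero delay); closer emitters wait
    just long enough for their shorter travel time to cancel out.
    """
    dists = [math.dist(p, focal_point) for p in emitter_positions]
    longest = max(dists)
    return [(longest - d) / SPEED_OF_SOUND for d in dists]

# A toy 3x3 emitter grid in the z=0 plane with 1 cm pitch (illustrative only)
grid = [(x * 0.01, y * 0.01, 0.0) for x in range(3) for y in range(3)]

# Focus 20 cm above the center of the grid
delays = focus_delays(grid, (0.01, 0.01, 0.20))
```

The center emitter, being nearest the focal point, gets the longest delay, while the corner emitters (all equidistant from it) fire immediately. Steering the focal point in real time is then just recomputing this delay list as the tracked hand moves.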
I think it is hilarious and awesome that a key component of the Shinoda Lab setup is a remote control from the Nintendo Wii. Two wiimotes serve as IR cameras that track the movement of your hand in the hologram space. In order for them to see your hand, you have to wear a tiny marker on your finger that is highly reflective to IR light. As far as hand-tracking goes, it’s a remarkably simple setup. There are some obvious limitations you can see in the video: the size of the hand isn’t well preserved in the virtual space, probably because only one point on the hand is tracked in IR. Still, I’m sure Nintendo is overjoyed with the inventiveness of the Shinoda Lab, especially since Shinoda just exhibited the AUTD at the SIGGRAPH conference in New Orleans.
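Since each wiimote reports the 2D pixel position of the bright IR dot it sees, two of them can recover the marker's 3D position by standard stereo triangulation. A toy sketch of the depth step for two parallel cameras, using a pinhole model; the focal length and baseline below are made-up numbers, not the Wiimote's real intrinsics or the lab's actual math:

```python
def triangulate_depth(x_left, x_right, focal_px, baseline_m):
    """Depth of a point from its horizontal disparity between two parallel cameras.

    x_left / x_right: the marker's horizontal pixel coordinate in each camera.
    focal_px: camera focal length in pixels (illustrative value below).
    baseline_m: distance between the two cameras in meters.
    """
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("marker must appear shifted left in the right image")
    return focal_px * baseline_m / disparity

# Marker seen 520 px apart between views; ~1300 px focal, 20 cm baseline
depth = triangulate_depth(700.0, 180.0, 1300.0, 0.20)  # -> 0.5 m
```

The nearer the hand, the larger the disparity between the two views, which is why a single tracked point gives good depth but says nothing about the hand's size or pose.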
The team at the University of Tokyo is really knocking it out of the park these days. First they made super-fast robot hands play baseball, now they could let us play catch with a virtual team. I applaud their innovation and am excited by how many different ways it can be applied. I mean, the AUTD seems custom-fit for integration into augmented reality, haptics, and human-computer interfaces. How versatile can you get?
Interacting with virtual objects is going to be a big development in the fields of computing, gaming, and art. Just looking at the AUTD in the video, you can see how it could easily be adapted into a multi-directional 3D setup. When that happens, the AUTD will go from a novelty to a tool that brings about full-immersion virtual reality. In the meantime, I look forward to seeing more demonstrations of the University of Tokyo’s current setup, including its limitations and what it sounds like. Damn, this stuff is cool. I’m going to go get some wiimotes, lasers, and a subwoofer and see if I can’t battle some Klingons.