
A student at Florida International University (FIU) dons a sensor-laden vest, a pair of sensor gloves, and an Oculus Rift virtual reality headset. He lifts his arm, makes a fist—and across the room a robot awakens and mimics his movements.

Using a potent cocktail of new technologies and $20,000 from private contributor Jeremy Robins, a team of FIU researchers and students says it has engineered a telepresence robot suitable for law enforcement—a real telepresence RoboCop.

The team began work in 2012 with two robots on loan from the Institute for Human and Machine Cognition (IHMC). The current prototype, a TeleBot nicknamed "Hutch," is six feet tall and weighs 75 pounds. Such telepresence robots might help get disabled cops back into action.

Hutch is outfitted with a pair of stereoscopic camera eyes that relay a video feed to the Oculus Rift, giving the wearer a 3D view of the robot’s immediate surroundings. The vest communicates arm movements to the robot, and the gloves operate its hands. Users pilot the robot with a joystick.
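At its core, this kind of teleoperation boils down to reading human joint angles from the wearable sensors and clamping them into commands the robot's motors can safely follow. A minimal sketch of that mapping, with invented joint names and limits (the FIU team's actual software and interfaces are not public in this article):

```python
# Hypothetical glove/vest-to-robot mapping. The field names, joint names,
# and angle limits are all assumptions for illustration, not from the
# FIU TeleBot project.
from dataclasses import dataclass

@dataclass
class SensorFrame:
    shoulder_deg: float   # vest-measured shoulder pitch, in degrees
    elbow_deg: float      # vest-measured elbow flexion, in degrees
    grip: float           # glove grip, 0.0 (open hand) to 1.0 (fist)

def to_joint_commands(frame: SensorFrame) -> dict:
    """Clamp human joint readings into the robot's safe motion range."""
    def clamp(x, lo, hi):
        return max(lo, min(hi, x))
    return {
        "shoulder": clamp(frame.shoulder_deg, -90.0, 90.0),
        "elbow": clamp(frame.elbow_deg, 0.0, 135.0),
        "gripper": clamp(frame.grip, 0.0, 1.0),
    }
```

Clamping matters because a human arm can reach poses a robot arm cannot; passing raw sensor angles straight to the motors risks damaging the hardware.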

Sensored gloves control the robot's hands.

“With telebots, a disabled police officer will be capable of performing many, if not most, of the functions of a normal patrol officer—interacting with the community, patrolling, responding to 911 calls, issuing citations,” said Robins.

Of course, TeleBot isn’t perfect and won’t be doing much real police work just yet.

The hands, for example, can manipulate objects but, presumably, don't provide sensory feedback to the wearer. Such feedback, known as haptics, is one of the hardest problems in robotics—and without it, fine manipulation is difficult.
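The difference is easy to see in a toy model. Without force feedback, a gripper closes to a commanded position no matter what it's squeezing; with feedback, it can stop as soon as fingertip sensors report enough force to hold the object. A minimal sketch, with an entirely invented sensor model and thresholds:

```python
# Illustrative only: a toy closed-loop grip controller. The force model,
# threshold, and step size are made-up numbers, not TeleBot specifics.
def close_gripper(force_sensor, step=0.05, hold_force=2.0, max_pos=1.0):
    """Tighten until the fingertip sensor reports enough force to hold
    the object, then stop -- instead of squeezing to a fixed position."""
    pos = 0.0
    while pos < max_pos and force_sensor(pos) < hold_force:
        pos += step
    return round(pos, 2)

# Toy object: a soft cherry that starts resisting at 60% closure,
# gaining ~0.5 units of force per extra 1% of travel.
soft_cherry = lambda pos: max(0.0, (pos - 0.6) * 50.0)
```

With feedback, the controller halts just past first contact; an open-loop controller commanded to close fully would plow straight through the same object, which is essentially what happens in the cherry video below when the haptics are switched off.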

For example, watch this video of a blindfolded man plucking stems from cherries with an amazing Case Western prosthetic hand. With the hand's sensory feedback mechanism turned off, he crushes most of the cherries. With the haptics turned on, however, he's far more dexterous.

Another potential problem? The robot’s head turns more slowly than the human driver’s, so the Rift’s view lags behind the user’s head movement. Such sensory mismatches can disorient the user; ideally, head motion and video feed stay tightly in sync. (Indeed, minimal lag between head movements and the virtual world is one reason the Rift is so immersive.)
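The gap is easy to appreciate with a back-of-the-envelope latency budget. VR comfort discussions commonly target roughly 20 milliseconds of motion-to-photon latency for the headset alone; a mechanical neck plus a network hop blows well past that. Every number below is invented for illustration:

```python
# Hypothetical motion-to-photon budget for a telepresence head-tracking
# loop. All stage timings are made up; none come from the FIU project.
def total_lag_ms(stages):
    """Sum per-stage delays (ms) from head movement to updated display."""
    return sum(stages.values())

def within_comfort(stages, budget_ms=20):
    """Rough VR comfort check against a ~20 ms motion-to-photon target."""
    return total_lag_ms(stages) <= budget_ms

telebot_head = {
    "imu_sample": 2,           # read the Rift's head orientation
    "network": 15,             # send the command to the robot
    "servo_travel": 180,       # neck motors physically catching up
    "video_encode_decode": 60, # compress and decompress the camera feed
    "display": 10,             # render the frame in the headset
}
```

In this toy budget the mechanical neck dominates, which suggests why purely virtual worlds can feel seamless while a physical telepresence head struggles to keep up.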

Still, TeleBot is a great example of a growing trend: telepresence robots are entering a number of industries. In medicine, doctors can visit and deliver care to patients hundreds or thousands of miles away. Businesspeople can attend meetings without hopping a plane. Remote workers can visit the home office and connect with coworkers and collaborators.

We’ve written about telepresence robots used in hospitals (InTouch Health's RP-Vita), offices (Cisco and iRobot), and pretty much anywhere else (Double). They range from outrageously expensive ($95,000) to nearly affordable ($2,500).

But no telepresence bot we've seen vows to uphold the law and punish scofflaws, and none combines immersive visuals and controls like TeleBot—which is why it’s cool.

Image Credit: Florida International University/YouTube

Jason is managing editor of Singularity Hub. He cut his teeth doing research and writing about finance and economics before moving on to science, technology, and the future. He is curious about pretty much everything, and sad he'll only ever know a tiny fraction of it all.