Touchscreens and swarm robotics go together like geeks and video games.

Mark Micire’s PhD dissertation puts robotic control at his fingertips. The UMass Lowell student developed a command and control program for the Microsoft Surface touchscreen so that swarm robots can be easily guided. Watching Micire’s program in action makes it look like he’s playing StarCraft, only with real robots. Teams of bots can be color coded, groups can be selected by circling them with a finger, and robots can be commanded to move either individually or en masse. You can even manually drive a robot with a special pop-up interface. What’s the use for this multitouch control system? There are military applications, but Micire has a strong background in search and rescue robotics. Touchscreen swarms could be the next innovation in disaster relief. Watch Micire demo his system in the video below.
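Micire’s actual implementation isn’t shown here, but the “circle a group with your finger to select it” gesture can be approximated with a standard geometric trick: treat the finger stroke as a closed polygon and run a point-in-polygon test on each robot’s position. A minimal sketch (robot names and data layout are illustrative, not from the dissertation):

```python
# Hypothetical sketch of circle-to-select: the finger stroke is a closed
# polygon, and each robot is tested against it with the classic
# ray-casting point-in-polygon algorithm.

def point_in_polygon(x, y, polygon):
    """Return True if (x, y) lies inside the closed polygon (vertex list)."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Count crossings of a horizontal ray cast rightward from (x, y).
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def select_robots(stroke, robots):
    """Select every robot whose position falls inside the finger stroke."""
    return [r for r in robots if point_in_polygon(r["x"], r["y"], stroke)]

# A rough circular stroke around the origin, and three robots.
stroke = [(2, 0), (0, 2), (-2, 0), (0, -2)]
robots = [{"id": "red-1", "x": 0.5, "y": 0.5},
          {"id": "red-2", "x": 3.0, "y": 0.0},
          {"id": "blue-1", "x": -0.5, "y": -0.5}]
print([r["id"] for r in select_robots(stroke, robots)])  # → ['red-1', 'blue-1']
```

In practice a real touchscreen stroke arrives as dozens of sampled points rather than four vertices, but the same test applies unchanged.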

Swarm robots come in a huge variety of shapes, sizes, and capabilities, but they generally follow one strategy: many hands make light work. Bots are used together to divide and conquer a problem, overcoming challenges with a large number of workers. Many of the swarms we’ve seen are autonomous. In critical missions, however, robot AI may not yet be sufficient to find a solution to a problem. That’s why human guidance is still very important. Micire’s touchscreen program gives the operator various levels of control. This would allow humans to adjust their involvement as the case warrants. If robot autonomy can handle a situation, commands could simply instruct bots in a general way – go there, look for human bodies, report back. The system also allows for more direct commands – follow this route, etc. When direct human control is needed, operators can manually drive bots using the DREAM interface (Dynamically Resizing Ergonomic And Multitouch), as you can see at 2:23 in the video below.
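The sliding scale of involvement described above — general goals, explicit routes, and direct driving — can be sketched as a simple command hierarchy. This is an illustrative toy model, not Micire’s code; the class and method names are assumptions:

```python
# Illustrative sketch of three levels of operator control over one robot:
# high-autonomy goals, operator-supplied routes, and manual teleoperation.

from dataclasses import dataclass

@dataclass
class SwarmRobot:
    name: str
    mode: str = "idle"
    detail: str = ""

    def command_goal(self, goal):
        # Highest autonomy: the robot plans its own path and behavior.
        self.mode, self.detail = "autonomous", f"goal: {goal}"

    def command_route(self, waypoints):
        # Medium autonomy: the operator dictates the exact route to follow.
        self.mode, self.detail = "waypoint", f"route: {waypoints}"

    def command_teleop(self, linear, angular):
        # Lowest autonomy: direct drive commands, as with a pop-up
        # teleoperation widget like the DREAM interface.
        self.mode, self.detail = "teleop", f"v={linear} m/s, w={angular} rad/s"

bot = SwarmRobot("red-1")
bot.command_goal("search sector B, report back")
print(bot.mode, "-", bot.detail)   # → autonomous - goal: search sector B, report back
bot.command_teleop(0.3, 0.1)
print(bot.mode, "-", bot.detail)   # → teleop - v=0.3 m/s, w=0.1 rad/s
```

The point of the hierarchy is that the same robot accepts all three command types, so the operator can drop down a level the moment autonomy falls short and hand control back when it recovers.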

Unfortunately, the demo above doesn’t show the system controlling physical robots. The bots on the screen are simulated. But Micire can control real world robots using the touchscreen, as you can see in the video below:

Micire’s work seems immediately applicable to robotic fieldwork. That’s really no surprise considering his experience in search and rescue operations. It’s important to remember that robots already form a valuable part of S&R teams, allowing humans to explore areas that are too hazardous or difficult to reach. Case in point: here’s Micire’s video of work he did in Mississippi after Hurricane Katrina back in 2005.

Of course, there are many applications for guided swarms besides disaster relief. Autonomous and guided drones are valuable assets in modern warfare, and advanced control systems like Micire’s could improve their usability. We’ve already seen how a similar technology (telestrators) is being developed by the US Air Force. Additionally, touchscreen controls could help swarms tackle industrial maintenance, exploration, or even surgery.

Swarm robotics holds a lot of potential, but most of the focus we’ve seen has been on the robots themselves. Micire’s work shows an interesting way in which humans can be readily inserted into the robots’ decision-making process. In the near term that will allow these swarms to perform better, as human intelligence still exceeds AI. Once autonomy outpaces human decision making, we may still use such command systems as a means of providing overarching control of our robots. It will be interesting to see if the potential of Micire’s program attracts any real-world applications in the years ahead.

…Maybe someone needs help fending off a zergling rush?

Video of Mark Micire’s complete PhD defense can be found here.
[screen capture and video credits: Mark Micire]
[source: UMass Lowell, Mark Micire]