Xsens Full Body Suit – Motion Capture for the Mad Genius (video)


Models are pretty... I'm talking about computer-generated models using kinetic tracking sensors, of course. What did you think?

The real world just sucker punched virtual reality. Motion capture has long been used to make digital characters more believable, even when they were completely insane – I still have fond memories of the original Mortal Kombat. Anyone who’s watched a behind-the-scenes documentary on modern movie making knows that actors in funny body stockings covered in ping pong balls are a staple of Hollywood’s approach to CGI. Today’s motion capture, however, is going beyond traditional camera techniques and green screens to unleash some of the true potential of the concept. Xsens’ MVN system is a lycra suit studded with 17 lightweight kinetic tracking devices that communicate wirelessly with a computer to capture every twist and turn of your body. MVN doesn’t watch you move, it feels you move. Not only that, but it does so without a tether and from up to 150 meters (~500 feet) away. That freedom lets MVN capture motion in ridiculous places – like falling out of a goddamn plane. Check out Xsens’ stunning showreel for MVN in the videos below. Not only is kinetic tracking changing the face of digital entertainment, it’s giving scientists a new way to control robots and understand the human body.

Xsens’ latest showreel for their MVN motion capture system highlights its growing popularity. It was a key instrument in movies like Iron Man 2, Alice in Wonderland (Tim Burton’s recent remake), and the alien comedy Paul. It’s been used in dozens of video games across a variety of genres: sports, horror, dancing, and more. If you want to make a virtual character move like a real human being, chances are MVN can do the job.

While not nearly as high quality as the 2011 showreel, I had to include this older demo of MVN’s prowess below. Why? Because it opens with the suit being tested during skydiving. That’s just awesome. The figure skating that follows is considerably less badass, but still pretty impressive from a technical point of view.

While visual tracking has proven itself as a great tool for motion capture in the past, it’s outclassed by modern kinetic techniques. MVN uses tracking modules that snap onto its lycra suit, with each module recording data about its own movement in 3D space. Those trackers don’t have to be seen by an external camera in order to register, meaning you can get a full-body model of the actor from any angle. Xsens even shows you a real-time visualization of the virtual skeleton you’re tracking, letting you fine tune your performance. This lends itself to more realistic motion capture and better entertainment.
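The core trick behind camera-free tracking like this is dead reckoning: each module integrates its own angular-velocity (gyroscope) readings over time to keep track of where it’s pointing, no line of sight required. Here’s a minimal single-axis sketch of that idea – the function name and the 100 Hz sample rate are my own illustrative assumptions, not anything from Xsens:

```python
import math

def integrate_yaw(gyro_samples, dt):
    """Dead-reckon one rotation axis by summing angular-velocity
    samples (rad/s) over fixed time steps -- the basic principle
    behind inertial (camera-free) motion tracking."""
    angle = 0.0
    for omega in gyro_samples:
        angle += omega * dt  # simple Euler integration
    return angle

# A sensor spinning at pi/2 rad/s for 2 seconds (200 samples at
# 100 Hz) ends up rotated by pi radians -- a half turn.
samples = [math.pi / 2] * 200
print(integrate_yaw(samples, dt=0.01))  # ~3.14159
```

A real suit does this in 3D with quaternions and fuses in accelerometer and magnetometer data to cancel the drift that naive integration like this accumulates, but the principle is the same.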

Of course, many people have been pushing the limits of digital performances in the past few years (James Cameron with Avatar, for instance). Kinetic motion capture, however, could transform other industries as well. MVN BioMech lets scientists record detailed information about the way people walk and move in the real world – lending insight into how a striker can score a winning goal, or how muscular dystrophy changes a patient’s stance. Research like this could also help those (rather creepy) projects that look to enhance security surveillance by correlating your gait and body language with your emotional status. Turns out you can tell who’s a terrorist by the way they walk (seriously).
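To make the gait-analysis angle concrete: once a suit logs timestamped footstrikes, even a trivial script can pull out clinically useful numbers. This is my own toy illustration (the function and the sample timestamps are hypothetical, not from MVN BioMech):

```python
def cadence(step_times):
    """Steps per minute from a list of footstrike timestamps in
    seconds -- one of the simplest gait metrics a motion capture
    suit makes easy to log."""
    if len(step_times) < 2:
        return 0.0
    duration = step_times[-1] - step_times[0]
    steps = len(step_times) - 1
    return steps / duration * 60.0

# Five footstrikes, one every half second: 2 steps/s = 120 steps/min.
print(cadence([0.0, 0.5, 1.0, 1.5, 2.0]))  # 120.0
```

Asymmetries in metrics like this (say, left steps consistently slower than right) are exactly the kind of signal that would flag a changed stance in a dystrophy patient.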

Xsens’ style of motion capture might also help us control and train robots. We’ve already seen the technology used to teleoperate the humanoid robot Mahru. In the following video, an engineer uses Xsens’ MTx system to control a robot arm. Imagine using a similar method to record the movements you want a robot to repeat a million times on its own. You could ‘program’ an entire assembly line in minutes.
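That ‘program by demonstration’ idea boils down to record-and-replay: capture a stream of poses once, then send them back to the robot as many times as you like. A minimal sketch, with entirely hypothetical names and joint angles standing in for a real robot interface:

```python
def record(pose_stream):
    """Capture a demonstrated trajectory as a list of poses
    (here just single joint angles, for simplicity)."""
    return [pose for pose in pose_stream]

def replay(trajectory, times, send):
    """Send the recorded poses back to the robot, `times` times over.
    `send` stands in for whatever command interface the robot exposes."""
    for _ in range(times):
        for pose in trajectory:
            send(pose)

# Hypothetical demo: "record" three arm poses, then replay them twice.
demo = record(iter([10.0, 45.0, 90.0]))
out = []
replay(demo, times=2, send=out.append)
print(out)  # [10.0, 45.0, 90.0, 10.0, 45.0, 90.0]
```

Real systems add retargeting (mapping human joints onto a differently proportioned robot) and smoothing, but the assembly-line fantasy in the paragraph above really is this loop at heart.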

Real-time recordings of the human body via wireless 3D tracking modules – I really love this concept, and I think its recent success points to it being a valuable method of connecting the physical and digital worlds. I mean, add in some haptics and some augmented reality and you’ve got all you need to make the virtual and real worlds overlap perfectly. Xsens isn’t the only name in the industry, and we’ve certainly seen plenty of teams that specialize in entertainment, biomechanical, and robot operation applications. Still, I have to applaud the MVN, and the company’s sense of style. Throwing an employee out of a plane to prove how well your product works? That’s pure genius. Evil…but genius.

[image and video credits: Xsens]
[source: Xsens]

Discussion — 3 Responses

  • digi_owl May 26, 2011 on 11:16 am

    So, how long before we see movies done using motion capture, scans of dead actors and voices done via some kind of vocaloid system?

    If the studios build up a large library of motion capture, they can probably create whole movies this way without ever involving more than a couple of computer operators.

    • Joe Nickence digi_owl May 26, 2011 on 5:30 pm

      It’s achievable today. I’m sure Hollywood is simply waiting out the legalities of last wills and testaments of the deceased, and the possible objections of descendants. Most actors, if they had their wits about them, made arrangements for any future screenings of their image, with proceeds going to support their estate, or at least their descendants.

  • Joe Nickence May 26, 2011 on 5:24 pm

    In the third video, I’m reminded of the movie “Saturn 3”, where Harvey Keitel is programming the robot. While he is distracted by Farrah Fawcett, he drums his fingers on the table, causing the bot to do exactly the same.

    Good stuff. We need to be teaching our future overlords how to walk and maintain balance with their avatars now, and avoid all kinds of embarrassing falls during their rampage later. :-)