Scientists 3D Print a Complex Robotic Hand With Bones, Tendons, and Ligaments

We don’t think twice about using our hands throughout the day for tasks that still thwart sophisticated robots—pouring coffee without spilling when half-awake, folding laundry without ripping delicate fabrics.

The complexity of our hands is partly to thank. They are wonders of biological engineering: A hard skeleton maintains their shape and integrity and lets fingers bear weight. Soft tissues, such as muscles and ligaments, give them dexterity. Thanks to evolution, all these “biomaterials” self-assemble.

Recreating them artificially is another matter.

Scientists have tried to use additive manufacturing—better known as 3D printing—to recreate complex structures from hands to hearts. But the technology stumbles when integrating multiple materials into one printing process. 3D printing a robotic hand, for example, requires multiple printers—one to make the skeleton, another for soft tissue materials—and the assembly of parts. These multiple steps increase manufacturing time and complexity.

Scientists have long sought to combine different materials into a single 3D printing process. A team from the soft robotics lab at ETH Zurich has found a way.

The team equipped a 3D inkjet printer—based on the same technology as ordinary office printers—with machine vision, allowing it to rapidly adapt to different materials. The approach, called vision-controlled jetting, continuously gathers information about a structure’s shape during printing to fine-tune how it prints the next layer, regardless of the type of material.

In a test, the team 3D printed a synthetic hand in one go. Complete with skeleton, ligaments, and tendons, the hand can grasp different objects when it “feels” pressure at its fingertips.

They also 3D printed a structure like a human heart, complete with chambers, one-way valves, and the ability to pump fluid at roughly 40 percent the rate of an adult human heart.

The study is “very impressive,” Dr. Yong Lin Kong at the University of Utah, who was not involved in the work but wrote an accompanying commentary, told Nature. 3D inkjet printing is already a mature technology, he added, but this study shows machine vision makes it possible to expand the technology’s capabilities to more complex structures and multiple materials.

The Problem With 3D Inkjet Printing

Recreating a structure using conventional methods is tedious and error-prone. Engineers cast a mold to form the desired shape—say, the skeleton of a hand—then combine the initial structure with other materials.

It’s a mind-numbing process requiring careful calibration. As with installing a cabinet door, any error leaves the result lopsided. For something as complex as a robot hand, the results can be rather Frankenstein.

Traditional methods also make it difficult to incorporate materials with different properties, and they tend to lack the fine details required in something as complex as a synthetic hand. All these limitations kneecap what a robotic hand—and other functional structures—can do.

Then 3D inkjet printing came along. Common versions of these printers squeeze a liquid resin material through hundreds of thousands of individually controlled nozzles—like an office printer printing a photo at high resolution. Once a layer is printed, a UV light “sets” the resin, turning it from liquid to solid. Then the printer gets to work on the next layer. In this way, the printer builds a 3D object, layer by layer, at the microscopic level.

Although incredibly quick and precise, the technology has its problems. It isn’t great at binding different materials together, for instance. To 3D print a functional robot, engineers must either print parts with multiple printers and assemble them afterward, or print an initial structure, cast around the part, and add other materials with the desired properties.

One main drawback is that the thickness of each layer isn’t always the same. Differences in ink flow, interference between nozzles, and shrinkage during the “setting” process can all cause tiny variations. These inconsistencies add up over many layers, resulting in malfunctioning objects and outright printing failures.

Engineers tackle this problem by adding a blade or roller. Like flattening newly laid concrete during roadwork, this step levels each layer before the next one starts. The solution, unfortunately, comes with other headaches. Because the rollers are only compatible with some materials—others gunk up the scraper—they limit the range of materials that can be used.

What if we don’t need this step at all?

Eyes on the Prize

The team’s solution is machine vision. Rather than scraping away extra material, scanning each layer as it’s printed helps the system detect and compensate for small mistakes in real time.

The machine vision system uses four cameras and two lasers to scan the entire printing surface at microscopic resolution.

This process helps the printer self-correct, explained the team. By understanding where there’s too much or too little material, the printer can change the amount of ink deposited in the next layer, essentially filling previous “potholes.” The result is a powerful 3D printing system in which extra material doesn’t need to be scraped off.
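The closed-loop idea above can be illustrated with a minimal sketch: compare the scanned surface to the target surface and adjust how much ink goes down at each point of the next layer. This is purely illustrative—the function names, the one-dimensional height maps, and the millimeter values are assumptions, not the authors' actual control software.

```python
# Illustrative sketch of vision-controlled deposition (all names/values hypothetical).

def corrected_deposit(target_height, scanned_height, layer_thickness):
    """Return how much material to deposit at one point on the next layer.

    If the scan shows a "pothole" (surface below target), deposit extra;
    if it shows a bump (surface above target), deposit less—never negative.
    """
    error = target_height - scanned_height  # positive means under-filled
    return max(0.0, layer_thickness + error)

def plan_next_layer(target_surface, scanned_surface, layer_thickness=0.02):
    """Compute a per-point deposition map (in mm) for the next layer."""
    return [
        corrected_deposit(t, s, layer_thickness)
        for t, s in zip(target_surface, scanned_surface)
    ]

# A flat target surface; the scan reveals one pothole and one bump:
target = [1.00, 1.00, 1.00, 1.00]
scanned = [1.00, 0.97, 1.00, 1.02]
deposits = plan_next_layer(target, scanned)
```

Here the printer would lay down extra ink over the under-filled point and skip the over-filled one, so errors are corrected rather than accumulating—no roller or scraper required.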

This isn’t the first time machine vision has been used in 3D printers. But the new system can scan 660 times faster than older ones, and it can analyze the growing structure’s physical shape in less than a second, wrote Kong. This allows the 3D printer to access a much larger library of materials, including substances that support complex structures during printing but are removed later.

Translation? The system can print a new generation of bio-inspired robots far faster than any previous technologies.

As a test, the team printed a synthetic hand with two types of materials: a rigid, load-bearing material to act as a skeleton and a soft bendable material to make tendons and ligaments. They printed channels throughout the hand to control its movement with air pressure and at the same time integrated a membrane to sense touch—essentially, the fingertips.

They hooked the hand to external electrical components and integrated it into a little walking robot. Thanks to its pressure-sensing fingertips, the hand could pick up different objects—a pen or an empty plastic water bottle.

The system also printed a human-like heart structure with multiple chambers. When pressurized, the synthetic heart pumped fluid like its biological counterpart.

Everything was printed in one go.

Next Steps

The results are fascinating because they feel like a breakthrough for a technology that’s already mature, Kong said. Although 3D inkjet printing has been commercially available for decades, adding machine vision gives it new life.

“Excitingly, these diverse examples were printed using just a few materials,” he added. The team aims to expand the range of materials they can print with and to directly embed electronic sensors for sensing and movement during printing. The system could also incorporate other fabrication methods—for example, spraying a coat of biologically active molecules onto the surface of the hands.

Robert Katzschmann, a professor at ETH Zurich and an author on the new paper, is optimistic about the system’s broader use. “You could think of medical implants…[or] use this for prototyping things in tissue engineering,” he said. “The technology itself will only grow.”

Image Credit: ETH Zurich/Thomas Buchner

Shelly Fan
Shelly Xuelai Fan is a neuroscientist-turned-science writer. She completed her PhD in neuroscience at the University of British Columbia, where she developed novel treatments for neurodegeneration. While studying biological brains, she became fascinated with AI and all things biotech. Following graduation, she moved to UCSF to study blood-based factors that rejuvenate aged brains. She is the co-founder of Vantastic Media, a media venture that explores science stories through text and video, and runs the award-winning blog NeuroFantastic.com. Her first book, "Will AI Replace Us?" (Thames & Hudson) was published in 2019.