Robotics

Storybook Time With the PR2 – This Robot Can Read Anything It Sees

Aaron Saenz
May 18, 2011
PR Literacy


The University of Pennsylvania has been elevating students' understanding of literature since 1740...and robots' since 2011. UPenn's GRASP Lab has taught one of Willow Garage's PR2 robots how to read out loud - the man-sized research platform can locate text in its environment and convert it to speech. In the video below, graduate student Menglong Zhu walks the PR2 through an impressive variety of fonts in real-world settings to show off how well the new literacy code works. Posters, emergency signs, even handwriting - the GRASP PR2 reads them all, even at odd angles and orientations. It's pretty damn impressive to see a full-sized robot rolling around and reading what it sees. Makes you wonder when it will be invited to storytime over at the local elementary school.

Zhu does a great job demonstrating all the different real-world text the PR2 can handle in the following video...but it does get a bit repetitive. Feel free to skip around.

The PR2 is far from the first literate robot; automated machines with optical character recognition (OCR) have been around for decades. More recently, 2009 saw a bot in Japan that could read books, and last year a different machine in the UK did the whole "roaming and reading" routine. Two things set the work at the GRASP Lab apart, however. First, the PR2 itself. This isn't a dedicated reading robot, or a small cart on wheels with a camera and OCR - it's a human-sized, general-purpose robot that can accomplish a whole range of different activities. Instead of building a robot to read books, GRASP took an existing robot and taught it to be literate.
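
For a rough sense of what a read-aloud pipeline involves, here's a minimal sketch in Python using off-the-shelf libraries (OpenCV, Tesseract via pytesseract, and pyttsx3 for speech). These are illustrative stand-ins I've chosen, not the GRASP Lab's code, which locates text in cluttered scenes and at odd angles far more robustly than running plain OCR over a whole camera frame.

```python
# Minimal OCR-to-speech sketch: load an image, extract text, speak it.
# Illustrative only - off-the-shelf libraries, not the GRASP Lab's pipeline.
import cv2                # image loading and preprocessing
import pytesseract        # wrapper around the Tesseract OCR engine
import pyttsx3            # offline text-to-speech

def read_aloud(image_path: str) -> str:
    # Load the image and convert to grayscale to help the OCR engine
    img = cv2.imread(image_path)
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

    # Extract whatever text Tesseract can find in the frame
    text = pytesseract.image_to_string(gray).strip()

    # Speak the recognized text, if any
    if text:
        engine = pyttsx3.init()
        engine.say(text)
        engine.runAndWait()
    return text

if __name__ == "__main__":
    print(read_aloud("sign.jpg"))  # hypothetical example image
```

The hard part, and what the GRASP demo actually shows off, is everything before the OCR call: finding candidate text regions in a moving camera view and normalizing them despite odd viewing angles.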

Secondly, and perhaps more importantly, the results produced by Zhu (working with postdoc Kosta Derpanis and Professor Kostas Daniilidis) are open source. As with all the PR2 robots Willow Garage has given to research institutions like UPenn, the work here will be freely shared as open source code through the Robot Operating System (ROS) libraries. It's not just the PR2, then, that benefits from Zhu's work. One day, ROS-enabled machines everywhere could use this code to make themselves literate - it's like Hooked on Phonics for robots.
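
To make the ROS angle concrete, here's a minimal sketch of how a capability like this typically gets shared on a ROS-based robot: one node publishes its results on a topic, and any other node on the system can subscribe to them. The node and topic names below are my own placeholders, not those of the GRASP Lab's package.

```python
#!/usr/bin/env python
# Minimal ROS 1 (rospy) sketch: publish recognized text so other nodes can use it.
# Node and topic names are hypothetical, not from the GRASP Lab's release.
import rospy
from std_msgs.msg import String

def main():
    rospy.init_node("text_reader")
    pub = rospy.Publisher("recognized_text", String, queue_size=10)
    rate = rospy.Rate(1)  # publish once per second

    while not rospy.is_shutdown():
        # In a real system this string would come from the camera + OCR pipeline
        detected = "EXIT"  # placeholder for a recognized string
        pub.publish(String(data=detected))
        rate.sleep()

if __name__ == "__main__":
    main()
```

A navigation node, a speech node, or anything else on the robot could subscribe to that topic and act on the text, which is exactly why publishing code like this through ROS multiplies its usefulness.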


I'm really interested to see how this work gets shoehorned into other projects. We've seen how a single innovation - the adaptation of the Kinect 3D sensor - has led to an explosion of ideas for ROS. Might a reliable literacy program for the PR2 produce similar results? Too soon to tell, but the possibility alone highlights how powerful an accelerant open source robotics can be. Today we're teaching robots how to read. Tomorrow maybe they put what they read to good use. ...That or they get distracted by romance novels. You never know with robots.

Screen capture and video credit: dreamdragon1988 (Menglong Zhu)

Source: ROS

