Robot learns tasks by interpreting written text
If we need help with a certain task, many of us turn to the Internet for guidance. But as a recent article on the MIT Technology Review website reveals, it might not be long before robots are doing the same.
Researchers behind a European project called RoboHow are currently exploring ways robots can learn to interpret language. A prototype robot, called PR2, is already being put through its paces in Germany. PR2 has made pizzas and pancakes simply by reading instructions on the how-to website WikiHow and watching YouTube tutorials.
The aim of the four-year project is to reach a stage where machines can carry out everyday tasks as proficiently as humans. Instead of a robot having to be programmed with a fixed set of motions, the idea is that it will learn new tasks simply by being given instructions in ordinary language.
If successful, the project could have huge implications in the home and in the workplace, as machines and robots become increasingly integrated within our day-to-day lives.
Michael Beetz, head of the Artificial Intelligence Institute at the University of Bremen, where the project is based, commented: "If you have a robot in a factory, you want to say 'Take the screw and put it into the nut and fasten the nut' [...] You want the robot to generate the parameters automatically out of the semantic description of objects."
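Beetz's screw-and-nut example can be illustrated with a small sketch. The object properties, parameter rules, and function names below are invented for illustration; they show only the general idea of deriving low-level motion parameters from a semantic description of an object, rather than hand-programming each motion.

```python
# Hypothetical sketch: turning a semantic task description into motion
# parameters, in the spirit of Beetz's screw-and-nut example.
# All object properties and rules here are invented for illustration.

OBJECT_PROPERTIES = {
    "screw": {"width_mm": 4, "grasp": "pinch"},
    "nut":   {"width_mm": 7, "grasp": "pinch"},
}

def motion_parameters(verb, obj):
    """Map a (verb, object) pair to low-level motion parameters."""
    props = OBJECT_PROPERTIES[obj]
    params = {
        # Open the gripper slightly wider than the object itself.
        "gripper_opening_mm": props["width_mm"] + 2,
        "grasp_type": props["grasp"],
    }
    if verb == "fasten":
        # Assumed right-hand thread for this toy example.
        params["wrist_rotation"] = "clockwise"
    return params

# "Take the screw" -- parameters are computed, not hand-programmed:
print(motion_parameters("take", "screw"))
# "Fasten the nut":
print(motion_parameters("fasten", "nut"))
```

The point of the sketch is that the robot's controller receives only the semantic description ("take", "screw"); the numeric parameters fall out of what the system knows about the object.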
Along with making pizzas and pancakes, PR2 is learning how to carry out simple tasks in the laboratory, like handling chemicals.
When a robot learns the actions required to perform a certain task, the information it gains is uploaded to a database called Open Ease. Other robots can access this database, which enables them to learn the same task without starting from scratch.
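The sharing mechanism described above can be sketched as a simple upload/download interface. This is not the actual Open Ease API; the class and method names below are invented purely to illustrate the idea of one robot publishing a learned task and another retrieving it.

```python
# Hypothetical sketch of the Open Ease idea: one robot uploads a learned
# task recipe, another downloads it. Names and structure are invented.

class SharedTaskDatabase:
    """A minimal stand-in for a shared robot knowledge base."""

    def __init__(self):
        self._tasks = {}

    def upload(self, task_name, steps):
        # Store the learned sequence of actions under the task's name.
        self._tasks[task_name] = list(steps)

    def download(self, task_name):
        # Return the stored steps, or None if no robot has learned the task.
        return self._tasks.get(task_name)

db = SharedTaskDatabase()

# Robot A learns pancake-making and shares the steps.
db.upload("make_pancake", ["pour batter", "wait", "flip", "serve"])

# Robot B retrieves the same task without relearning it.
steps = db.download("make_pancake")
print(steps)  # ['pour batter', 'wait', 'flip', 'serve']
```

The design choice being illustrated is that knowledge gained by one robot becomes immediately available to every other robot with access to the database.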
Siddhartha Srinivasa, a professor at Carnegie Mellon University's Robotics Institute, said that connecting action with language in robots is extremely important but also complex. Success will "require a tight integration of natural language, grounding the understanding via perception, and planning complex actions via manipulation algorithms," he said.