A team of MIT researchers has published a paper detailing a system by which robots can learn complex tasks without being explicitly programmed, simply by observing a human, allowing a robot worker to be trained in the same way as a new human employee.
"The vision is to put programming in the hands of domain experts, who can program robots through intuitive ways, rather than describing orders to an engineer to add to their code," says first author Ankit Shah of the team's work. "That way, robots won’t have to perform preprogrammed tasks any more. Factory workers can teach a robot to do multiple complex assembly tasks. Domestic robots can learn how to stack cabinets, load the dishwasher, or set the table from people at home."
The work resulted in a system dubbed PUnS, or Planning with Uncertain Specifications. "The robot is essentially hedging its bets in terms of what’s intended in a task," Shah explains, "and takes actions that satisfy its belief, instead of us giving it a clear specification."
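The idea of "hedging its bets" can be pictured as the robot holding a probability distribution over candidate task specifications and choosing each action to best satisfy that whole distribution. The sketch below is a toy illustration of that principle, not the team's actual system; the specification names, belief weights, and four-item task are all invented for the example.

```python
# Toy illustration of planning under uncertain specifications:
# the robot keeps a belief (probability weights) over candidate
# ordering constraints inferred from demonstrations, and picks the
# next action with the highest belief-weighted chance of satisfying
# them. All names and weights here are hypothetical.
candidate_specs = {
    "plate_before_fork": 0.6,  # assumed belief weight, not real data
    "fork_before_knife": 0.3,
    "mug_last":          0.1,
}

def satisfies(spec: str, placed: list[str], next_item: str) -> bool:
    """Check whether placing `next_item` after `placed` is consistent
    with one candidate spec (toy semantics for illustration)."""
    if spec == "plate_before_fork":
        return next_item != "fork" or "plate" in placed
    if spec == "fork_before_knife":
        return next_item != "knife" or "fork" in placed
    if spec == "mug_last":
        remaining = {"plate", "fork", "knife", "mug"} - set(placed)
        return next_item != "mug" or remaining == {"mug"}
    return True

def best_action(placed: list[str], options: list[str]) -> str:
    """Pick the action maximizing the belief-weighted probability of
    satisfying the uncertain specification -- the 'hedge'."""
    def expected_satisfaction(item: str) -> float:
        return sum(w for spec, w in candidate_specs.items()
                   if satisfies(spec, placed, item))
    return max(options, key=expected_satisfaction)

# Set a toy four-item table step by step.
placed: list[str] = []
items = ["mug", "fork", "plate", "knife"]
while items:
    choice = best_action(placed, items)
    placed.append(choice)
    items.remove(choice)
print(placed)  # -> ['plate', 'fork', 'knife', 'mug']
```

Because no single specification is trusted outright, the greedy choice at each step respects the high-weight constraints first while avoiding actions that would violate any likely interpretation, which is the behavior Shah describes.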
The robot was given the task of laying a table with eight objects: a mug, a glass, a spoon, a fork, a knife, a dinner plate, a smaller plate, and a bowl. After observing humans setting the objects down, the robot was told to set the table, and did so without error in physical testing and with only six mistakes in 20,000 simulated attempts.
The next step is increasing the system's flexibility and understanding of natural-language commands. "Say a person demonstrates to a robot how to set a table at only one spot. The person may say, 'do the same thing for all other spots,' or, 'place the knife before the fork here instead,'" Shah notes. "We want to develop methods for the system to naturally adapt to handle those verbal commands, without needing additional demonstrations."