WPI Researchers Tackle the Challenges of Putting Robots on the Shop Floor and in Our Homes

February 03, 2014

Two researchers in the pioneering Robotics Engineering Program at Worcester Polytechnic Institute (WPI) have received awards through the National Science Foundation's (NSF) National Robotics Initiative to investigate and overcome fundamental challenges involved with robots working alongside people in settings such as manufacturing plants and the homes of the elderly.

Dmitry Berenson, assistant professor of computer science and robotics engineering, is the principal investigator of a three-year, $600,000 award for a project that will draw on the disparate fields of robot motion planning and human task planning. Berenson will develop algorithms that will permit robots to collaborate with people on manufacturing operations. Sonia Chernova, also an assistant professor of computer science and robotics engineering, is co-principal investigator of a three-year, $425,000 award for research that will draw on the experience of thousands of online "teachers" to learn more about how "everyday people" can most effectively show robots how to do simple tasks.

In the first project, Berenson will collaborate with Julie Shah, assistant professor of aeronautics and astronautics at MIT, and director of the Interactive Robotics Group in MIT's Computer Science and Artificial Intelligence Laboratory, to begin laying the groundwork for robots that can safely and productively cooperate with people to build products in small-scale manufacturing operations.

By helping to make smaller manufacturers more competitive, the researchers hope to play a role in revitalizing American manufacturing, Berenson said. "Right now, if you have a product and you would like to manufacture it competitively, you may have to send that work to factories overseas, which can be a daunting challenge. Robots allow small manufacturers to enjoy the same cost benefits that large companies have already realized. In fact, by putting robots on the manufacturing floor, it may be possible to bring many small-scale manufacturing jobs back home."

Preparing robots to work with people on manufacturing tasks means gaining a greater understanding of how people carry out such operations. To that end, the researchers will spend a great deal of time observing people in the lab. Their goal is to help robots answer two basic questions about their human collaborators: what are they doing, and how are they doing it? "In any manufacturing operation, there is a tradeoff between speed and efficiency on the one hand, and safety on the other," Berenson said. "This is especially true when you add a robot to the mix. By helping robots understand how people move and do things, we can help them enhance productivity without being a hindrance or safety risk."

Answering the "what" question will draw on the field of task planning or discrete optimization, Shah's research focus, which can be used to determine the most efficient process for assembling a product. "A robot needs to understand that process and be able to look at the state of things—where the parts are and what has already been done—and anticipate what the person will do next, so it can do something that helps and doesn't interfere."
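
To make the "what" question concrete, the sketch below shows one simple way a robot might anticipate a person's next assembly step from the current state of the task. The step names, precedence constraints, and likelihood scores are illustrative assumptions, not the actual algorithm being developed in this project.

```python
# Hypothetical sketch: anticipate a person's next assembly step from task structure.
# The steps, precedence constraints, and likelihood scores below are illustrative
# assumptions, not the algorithm developed in the Berenson/Shah project.

PRECEDENCE = {                       # step -> steps that must be finished first
    "attach_bracket": set(),
    "insert_bolts": {"attach_bracket"},
    "mount_panel": {"attach_bracket"},
    "tighten_bolts": {"insert_bolts", "mount_panel"},
}

def feasible_steps(completed):
    """Steps that are not yet done and whose prerequisites are all satisfied."""
    return [s for s, pre in PRECEDENCE.items()
            if s not in completed and pre <= completed]

def anticipate_next(completed, likelihood):
    """Return the feasible step the person is judged most likely to do next."""
    candidates = feasible_steps(completed)
    if not candidates:
        return None
    return max(candidates, key=lambda s: likelihood.get(s, 0.0))

completed = {"attach_bracket"}
likelihood = {"insert_bolts": 0.7, "mount_panel": 0.3}   # e.g., from observing the person
print(anticipate_next(completed, likelihood))            # -> insert_bolts
```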

Berenson is an expert in motion planning, which will be employed to address the "how" question. Different people perform the same task in different ways. In doing so, they may move their limbs in different directions, or at different speeds, and occupy different parts of the work space. "A robot will need to take these differences into account as it plans its own movements so as not to get in the way of or potentially injure its human co-worker."
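
As a rough illustration of the "how" side, the following sketch checks a candidate robot motion against positions a person is predicted to occupy. The point-based representation and safety margin are assumptions made for illustration, not the planners being built in this work.

```python
# Hypothetical sketch: keep a planned robot motion clear of where the person is
# predicted to be. The safety margin and point-based representation are
# illustrative assumptions, not the motion planner being developed.

from math import dist

def clear_of_human(robot_waypoints, predicted_human_points, margin=0.5):
    """True if every planned waypoint stays at least `margin` meters from
    every predicted human position (all points are x, y, z in meters)."""
    return all(dist(p, h) >= margin
               for p in robot_waypoints
               for h in predicted_human_points)

plan = [(0.2, 0.0, 0.3), (0.4, 0.1, 0.3), (0.6, 0.1, 0.2)]
predicted_human = [(0.9, 0.4, 0.3), (1.0, 0.5, 0.3)]

if clear_of_human(plan, predicted_human):
    print("execute plan")
else:
    print("replan to stay out of the person's way")
```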

In laboratory studies at MIT and WPI, volunteers will carry out a variety of assembly tasks as their actions are captured with cameras and laser scanners. These recordings will be used to create a library of human task behaviors that can be integrated with algorithms that will allow the robots to answer the "what" and "how" questions about their human partners. In the final stage of the research, the team will have human volunteers collaborate with robots on assembly tasks to measure the machines' impact on productivity and to see how comfortable people are collaborating with mechanical co-workers.

"Task planning and motion planning have not really been brought together in this way before," Berenson said. "But if robots are to work successfully with people, these two fields must be integrated."

In her research, Chernova is focused on robots that will co-exist with people not as collaborators, but as helpers. Her work builds upon research she is conducting with a $500,000 NSF CAREER Award and is aimed at figuring out how people who have no background in or experience with programming robots can teach robots to perform simple tasks.

The ability for people who are not programmers to train robots will be essential if robots are to assist the elderly or disabled in their homes. One of the challenges of having everyday people teach robots is that different people will teach the same tasks in different ways. Chernova and her colleague, Andrea Thomaz, associate professor in the School of Interactive Computing at Georgia Institute of Technology, who is the principal investigator on the NSF award, will expose robots to a large and diverse group of teachers who will be asked to show the robots how to do simple household tasks. Household tasks were chosen because they are familiar to most people, each has multiple steps, and there is more than one "right" way to complete each one.

The robots will be programmed with a library of "primitive actions" (for example, grasping and lifting an object or moving from one place to another) that can be linked together to create a complete task. "This is called task-level learning," Chernova said. "It differs from motion-controlled learning, where you literally move the robot to show it what you want it to do. Motion-controlled learning would work well in manufacturing, where precise, repeatable movements are important. But if you are going to teach a robot to clean your kitchen, you don’t want to do so at that level."
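
A minimal sketch of what task-level learning from primitives can look like appears below: a teacher's demonstration is stored as an ordered sequence of primitive actions, which the robot can then replay as a learned task. The primitive names, the household task, and the replay logic are hypothetical examples, not the system used in this research.

```python
# Hypothetical sketch of task-level learning: a teacher's demonstration is stored
# as an ordered sequence of primitive actions, and the robot replays that sequence
# as the learned task. The primitives and the household task are illustrative
# assumptions, not the system built in the Chernova/Thomaz project.

PRIMITIVES = {
    "move_to": lambda target: print(f"moving to {target}"),
    "grasp":   lambda obj:    print(f"grasping {obj}"),
    "place":   lambda where:  print(f"placing object at {where}"),
}

def teach(demonstration):
    """Record a demonstrated task as a list of (primitive, argument) pairs."""
    return list(demonstration)

def execute(task):
    """Replay a learned task by invoking each primitive in order."""
    for primitive, arg in task:
        PRIMITIVES[primitive](arg)

# One teacher's way of showing the robot how to set a place at the table:
set_place = teach([("move_to", "cabinet"), ("grasp", "plate"),
                   ("move_to", "table"),   ("place", "table edge")])
execute(set_place)
```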

To engage a large and diverse population of "citizen teachers," the researchers will turn to the Internet. Using RobotsFor.Me, a web-based remote laboratory developed by Chernova and her team, they will recruit hundreds of subjects who will interact with simulated and actual robots online.

The researchers will observe how each subject goes about training the robots to perform such tasks as folding laundry or setting the table. They will then synthesize the diversity of teaching approaches and styles into models that will help robots adapt to a range of teachers.

"People have fairly predictable patterns in how they like to do things," Chernova said, "But in the real world, a robot assistant will likely receive instruction from multiple teachers. Through this work, we will develop models that will allow robots to be more flexible by anticipating how different people might go about teaching them."

One of the challenges that arises when more than one person teaches a robot a new task is how teachers can assess what the robot already knows. "If I were to lecture to a colleague's class," Chernova said, "I could find out what the students already know by checking the syllabus, looking over previous lectures, or just asking them. But how can a teacher find out what a robot knows? And how can a robot communicate to a teacher that it is confused? These are the kinds of questions we hope to answer through this research."
