Robots to handle kitchen chores

A young DTU researcher has brought us one step closer to a kitchen robot that can help with cooking.

Who has not dreamed of getting help with kitchen chores such as peeling carrots, chopping onions, or putting things away? PhD student Adrian Llopart Maurin has set himself the task of solving this challenge with the help of robot technology. In collaboration with the Korean university KAIST, he is developing a humanoid service robot.

While we should not expect to find kitchen helper robots under this year’s Christmas tree, they may become a familiar sight in the next couple of years. Robotics and AI research are moving at incredible speed, driven in particular by knowledge gleaned from self-driving cars, which enables robots to recognize things in their surroundings, such as other road users, traffic signs, and trees.

This knowledge is relatively simple to adapt: instead of traffic-related data, the robot is fed images of kitchenware such as cups, bowls, and glasses, enabling it to recognize these items and respond accordingly.
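For readers curious how such knowledge is adapted in practice, the sketch below shows one common approach: take an object detector pre-trained on general imagery and swap its classification head so it can be fine-tuned on kitchenware. It is a minimal sketch assuming a PyTorch/torchvision setup; the class list and function name are illustrative and not taken from the project itself.

```python
from torchvision.models.detection import fasterrcnn_resnet50_fpn
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

# Hypothetical kitchenware classes; index 0 is the background class by convention.
CLASSES = ["__background__", "cup", "bowl", "glass"]

def build_kitchenware_detector(num_classes: int = len(CLASSES)):
    # Start from a detector pre-trained on generic imagery (the kind of
    # recognition knowledge developed for self-driving cars) and replace
    # its classification head so it predicts kitchenware classes instead.
    model = fasterrcnn_resnet50_fpn(weights="DEFAULT")
    in_features = model.roi_heads.box_predictor.cls_score.in_features
    model.roi_heads.box_predictor = FastRCNNPredictor(in_features, num_classes)
    return model  # fine-tune on labelled images of cups, bowls, and glasses
```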

Adrian Llopart Maurin has conducted research into robots, partly at DTU Electrical Engineering and partly at the internationally recognized Korean technical university KAIST. His research is based on the Mybot humanoid service robot developed at KAIST. During the project, the robot was given a new head, and its name was changed to Siambot.

“A robot such as Mybot or Siambot consists of a multitude of systems, each of which is coded to perform a single task, for example to see and recognize a cup, to lift an arm, to take hold of something, or to move from one place to another. I’ve been working on integrating the various systems and thereby enabling the robot to perform simple tasks,” says Adrian Llopart Maurin.
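To give a rough feel for what that integration looks like, the sketch below chains single-purpose subsystems (vision, arm, mobile base) into one simple fetch-and-carry routine. Every class and method name is a hypothetical illustration, not the actual Mybot/Siambot interface.

```python
from dataclasses import dataclass

class Vision:
    def find(self, label: str):
        """Return the 3D pose of the first detected object with this label."""
        raise NotImplementedError

class Arm:
    def grasp(self, pose):
        raise NotImplementedError
    def release(self):
        raise NotImplementedError

class Base:
    def go_to(self, location: str):
        raise NotImplementedError

@dataclass
class Robot:
    vision: Vision
    arm: Arm
    base: Base

    def fetch(self, label: str, from_location: str, to_location: str) -> None:
        """Chain the single-purpose subsystems into one simple task."""
        self.base.go_to(from_location)   # navigation subsystem
        pose = self.vision.find(label)   # perception subsystem
        self.arm.grasp(pose)             # manipulation subsystem
        self.base.go_to(to_location)
        self.arm.release()               # put the object down at the destination
```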

3D vision
The robot’s eyes consist of two cameras: one films in standard colour in 2D, while the other captures depth, producing a point cloud that reproduces objects in 3D. By feeding the robot a wide range of images, Adrian has enabled it to recognize a cup, for example, irrespective of type or colour, and to determine how best to grip the cup in order to pick it up.
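As a simplified illustration of how the two cameras can be combined, the sketch below takes a detection box from the colour image and the aligned depth image, back-projects the depth pixels into 3D, and uses their centroid as a naive grasp target. The camera intrinsics and function name are assumptions for the example, not values from the Siambot system.

```python
import numpy as np

# Hypothetical camera intrinsics; real values come from the depth camera's calibration.
FX, FY, CX, CY = 525.0, 525.0, 319.5, 239.5

def grasp_point_from_depth(depth_m: np.ndarray, box: tuple[int, int, int, int]) -> np.ndarray:
    """Back-project the depth pixels inside a detection box into 3D camera
    coordinates and return their centroid as a simple grasp target (metres)."""
    x0, y0, x1, y1 = box
    roi = depth_m[y0:y1, x0:x1]
    vs, us = np.nonzero(roi > 0)      # keep only pixels with a valid depth reading
    zs = roi[vs, us]
    if zs.size == 0:
        raise ValueError("no valid depth readings inside the box")
    us, vs = us + x0, vs + y0
    xs = (us - CX) * zs / FX          # pinhole-camera back-projection
    ys = (vs - CY) * zs / FY
    points = np.stack([xs, ys, zs], axis=1)
    return points.mean(axis=0)        # centroid of the object's visible surface
```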

“One of the specific tasks I gave the robot was to put a glass of wine on table A. The only information the robot got in advance was that on the next table, table B, there was a wine bottle and a glass. The robot then understood that it had to move to table B, take the wine bottle and, on the basis of a calculation of the bottle’s height and the location of the glass on the table, pour wine into the glass and place the glass on table A,” explains Adrian Llopart Maurin.
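The sequence the robot has to work out can be sketched as a short script like the one below. Every call is a hypothetical placeholder for a robot subsystem, included only to make the described reasoning concrete; it is not the actual Siambot code.

```python
# Illustrative-only sequence for the wine-serving task described above.
def serve_wine(robot, table_a: str = "table A", table_b: str = "table B") -> None:
    robot.go_to(table_b)                      # only prior: a bottle and a glass are on table B
    bottle = robot.detect("wine bottle")      # 3D pose plus estimated height from the depth camera
    glass = robot.detect("glass")
    robot.grasp(bottle)
    # Pour target: just above the glass rim, offset using the bottle's measured
    # height so the neck clears the rim while tilting.
    robot.pour(over=glass, bottle_height_m=bottle.height, clearance_m=0.05)
    robot.place(bottle)
    robot.grasp(glass)
    robot.go_to(table_a)
    robot.place(glass)                        # finish by setting the glass on table A
```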

The future aim is to create a robot that can help serve at restaurants or assist with kitchen tasks in the home. Who knows—perhaps it will actually be able to create an entire meal on its own?

“Several companies are in the process of developing this type of robot, and there is definitely a huge market for those that succeed. The final breakthrough is expected over the next few years, when we will see robots capable of understanding voice commands and carrying out the cooking tasks they are assigned,” says Ole Ravn, head of DTU’s work on automation.