
The mechanics of eating are more complex than they might appear. For about a decade, researchers in the Personal Robotics Lab at the University of Washington have been working to build a robot that can help feed people who can’t eat on their own.
The researchers’ first breakthrough, back when the lab was at Carnegie Mellon University, was a robotic arm that could use a fork to feed someone a marshmallow. Since then, the robot has graduated from feeding users fruit salads to serving full meals composed of nearly anything that can be picked up with a fork. The team has also investigated how the robot can enhance the social aspects of dining.
Until recently, this work had mostly been evaluated in the lab. But last year, the researchers deployed the assistive-feeding arm in a pair of studies outside the lab. In the first, six users with motor impairments used the robot to feed themselves a meal in a UW cafeteria, an office or a conference room. In the second, one of those users, Jonathan Ko, a community researcher and co-author on the paper, used the system at home for five days, having it feed him ten meals.
The team will present its research March 5 at the ACM/IEEE International Conference on Human-Robot Interaction in Melbourne.
“Our past studies have been in the lab because, if you want to evaluate specific system components in isolation, you need to control all other aspects of the meal,” said lead author Amal Nanavati, a UW doctoral student in the Paul G. Allen School of Computer Science & Engineering. “But that doesn’t capture the diverse meal contexts that exist outside the lab. At the end of the day, the goal is to enable people to feed themselves in real environments, so we should also evaluate the system in those environments.”
The system, which the researchers dubbed ADA (Assistive Dexterous Arm), consists of a robotic arm that can be affixed to something nearby, such as a power wheelchair or a hospital table. The user specifies the bite they want through a web app, and the system then feeds them that bite autonomously (though users can stop the arm at any time with a “kill button”). The arm carries a camera and a force sensor, which it uses to distinguish between foods and to bring each bite to the user’s mouth.
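The article doesn’t publish ADA’s code, but the interaction it describes maps to a simple sense-select-act loop: detect bites on the plate, let the user pick one in the web app, then acquire and transfer the bite while honoring the kill button. The sketch below is a hypothetical Python illustration of that loop under those assumptions; every name in it (Bite, EStop, detect_bites, ask_user, move_arm) is an invented stand-in, not ADA’s actual API.

```python
import threading
from dataclasses import dataclass

@dataclass
class Bite:
    label: str        # e.g. "melon"
    position: tuple   # (x, y, z) in the camera frame

class EStop:
    """Software 'kill button': the web app can latch a stop at any time."""
    def __init__(self):
        self._flag = threading.Event()

    def trigger(self):
        self._flag.set()

    def engaged(self):
        return self._flag.is_set()

# --- Stubs standing in for perception, the web app UI, and motion control ---

def detect_bites():
    """The camera's food detector would return labeled items; stubbed here."""
    return [Bite("melon", (0.10, 0.20, 0.05)), Bite("chicken", (0.15, 0.20, 0.05))]

def ask_user(bites):
    """The web app would show detected bites; here we just pick the first."""
    return bites[0] if bites else None

def move_arm(phase, bite, estop):
    """One motion phase ('acquire' or 'transfer'), aborted if the e-stop fired."""
    if estop.engaged():
        print(f"{phase}: halted by kill button")
        return False
    print(f"{phase}: {bite.label} at {bite.position}")
    return True

def feed_one_bite(estop):
    bites = detect_bites()      # perceive the plate
    choice = ask_user(bites)    # user selects a bite in the web app
    if choice is None:
        return
    if move_arm("acquire", choice, estop):    # skewer the bite with the fork
        move_arm("transfer", choice, estop)   # then bring it to the user's mouth

if __name__ == "__main__":
    feed_one_bite(EStop())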
In both studies, users successfully fed themselves their meals. In the first study, the robot acquired entrees with around 80% accuracy, a rate that users in another study had found to be the threshold for success. In the second study, the varied conditions at home (Ko might be eating while watching TV in low light, or while working in bed) hindered the system’s default functionality. But the researchers had designed the system to be customizable, so Ko was able to control the robot and still feed himself all ten meals.
The team plans to continue improving the system’s effectiveness and customizability.
“It was a really important step to take the robot out of the lab,” Ko said. “You eat in different environments, and there are little variables that you don’t think about. If the robot is too heavy, it might tilt a table. Or if the lighting isn’t good, the facial recognition could struggle, but lighting is something you really don’t think about when you’re eating.”
More information: Paper: “Lessons Learned from Designing and Evaluating a Robot-Assisted Feeding System for Out-of-Lab Use”
Citation: Video: Assistive-feeding robot gets tested outside the lab (2025, March 4), retrieved 4 March 2025 from https://medicalxpress.com/news/2025-03-video-robot-lab.html
