
Autonomous technologies promise passengers travel without concern—the ability to get from Point A to Point B without needing to be engaged in the process. Yet passengers still don’t trust computers the way they trust human drivers, and most autonomous vehicles on the road today are equipped for a human to take over.
The same technologies that allow autonomous vehicles (AVs) to navigate city streets can also give motorized wheelchair users the ability to get from place to place without personally controlling the chair, but wheelchair users also want a way to override the computer. Engineers at the University of Michigan aim to provide that functionality.
Most wheelchairs available to those needing regular transportation are either fully manual or fully autonomous, with few options in between. The research team is harnessing light detection and ranging (LiDAR) sensors and an onboard camera to give people with disabilities both the freedom to let the software drive and the ability to take over when they are less trusting of autonomous decisions.
“The sweet spot is something we call ‘shared control’ or ‘shared autonomy,’ where the robot is assisting you to the extent you want, but it is not ever putting the passenger in a situation where they cannot control their destiny,” said Vineet Kamat, the U-M John L. Tishman Family Professor of Construction Management and Sustainability and a professor of civil and environmental engineering.
“People with physical disabilities primarily want to maintain their independence but have significant navigation and maneuvering challenges operating in the built environment.”
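The article doesn't detail how CoNav arbitrates between rider and robot, but a common shared-autonomy scheme blends the user's joystick command with the autonomous planner's command according to an assistance level the rider sets. Below is a minimal Python sketch of that idea; the VelocityCommand type, the assistance dial and the override threshold are illustrative assumptions, not the team's published design.

from dataclasses import dataclass

@dataclass
class VelocityCommand:
    linear: float   # forward speed in m/s
    angular: float  # turn rate in rad/s

def blend(user: VelocityCommand,
          robot: VelocityCommand,
          assistance: float,
          override_threshold: float = 0.8) -> VelocityCommand:
    """Mix the rider's joystick command with the planner's command.

    assistance: 0.0 means fully manual, 1.0 means fully autonomous.
    Assumes joystick input is already scaled to the chair's speed
    limits. A hard push on the stick (above override_threshold)
    hands control back to the rider entirely, so the system never
    puts them in a situation they cannot steer out of.
    """
    if max(abs(user.linear), abs(user.angular)) >= override_threshold:
        return user  # rider override always wins
    w = min(max(assistance, 0.0), 1.0)
    return VelocityCommand(
        linear=(1.0 - w) * user.linear + w * robot.linear,
        angular=(1.0 - w) * user.angular + w * robot.angular,
    )

In a real chair the assistance level would presumably be exposed as a rider-facing setting, matching Kamat's description of assistance "to the extent you want."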
In many instances, the prospect of having to steer through a complex or crowded pathway may discourage people with disabilities from taking part in social, business or educational opportunities. In the U.S., roughly 2.7 million people each year experience health issues that require the use of a wheelchair.
Kamat and Carol Menassa, a U-M professor of civil and environmental engineering and a John L. Tishman Construction Management Faculty Scholar, have collaborated for years on helping robots understand and reason about built environments, both indoors and outdoors.
This year, their research team outfitted a motorized wheelchair with both LiDAR and a 3D camera, tapped into the wheelchair's drive system and wrote algorithms that allow shared control through a video game controller.
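The piece doesn't say how CoNav's algorithms use the lidar data. One simple possibility, sketched below in Python, is to scale whatever forward speed is requested, whether by the rider or the planner, according to the nearest obstacle in the current lidar sweep; the stop and slow-down distances here are made-up values for illustration.

import math

def safe_forward_speed(requested: float,
                       lidar_ranges: list[float],
                       stop_distance: float = 0.5,
                       slow_distance: float = 1.5) -> float:
    """Clamp a requested forward speed using one lidar sweep.

    Speed ramps down linearly as the nearest obstacle closes from
    slow_distance to stop_distance (meters); inside stop_distance
    the chair refuses to move forward at all.
    """
    if requested <= 0.0:
        return requested  # reversing and turning in place are not gated here
    nearest = min((r for r in lidar_ranges if math.isfinite(r)),
                  default=float("inf"))
    if nearest <= stop_distance:
        return 0.0
    if nearest >= slow_distance:
        return requested
    scale = (nearest - stop_distance) / (slow_distance - stop_distance)
    return requested * scale

A guard like this would sit downstream of any command blending, so that neither the rider nor the autonomy can drive the chair into an obstacle.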
“Trust is very important in this type of situation, because there are so many things at stake,” Menassa said. “You want to trust that you are going to be safe and that any people in the environment are going to be safe.”
It’s the same problem the auto industry is grappling with. In October 2023, J.D. Power reported: “Consumer confidence in fully-automated, self-driving vehicles continues to decline for the second consecutive year… Consumers show less readiness on all metrics, with the lowest level of comfort riding in a fully automated, self-driving vehicle and using fully automated, self-driving public transit.”
The team has been testing its system, called CoNav, in the basement corridors of the G.G. Brown Building on North Campus, with able-bodied volunteers operating the chair.
“Feedback between the user and the system is very important,” Menassa said, “and that is the feedback that’s going to, over time, initiate or establish that trust.”
The research team includes Yifan Xu, a U-M graduate student research assistant; Jordan Lillie, a U-M biomedical engineering technician; and undergraduate student Qianwei Wang.
Following the technical validation and testing that’s currently underway, the team will turn its focus to testing with people with disabilities.
Citation:
Merging autonomy with manual control for wheelchair users (2025, April 21)
retrieved 21 April 2025
from https://medicalxpress.com/news/2025-04-merging-autonomy-manual-wheelchair-users.html