
Engineers combine AI and wearable cameras in self-walking robotic exoskeletons
By Media Relations
Robotics researchers are developing exoskeleton legs capable of thinking and making control decisions on their own using sophisticated artificial intelligence (AI) technology.
The system combines computer vision and deep-learning AI to mimic how able-bodied people walk by seeing their surroundings and adjusting their movements.
“We’re giving robotic exoskeletons vision so they can control themselves,” said Brokoslaw Laschowski, a PhD candidate in systems design engineering who leads a University of Waterloo research project called ExoNet.
Exoskeleton legs operated by motors already exist, but they must be manually controlled by users via smartphone applications or joysticks.
“That can be inconvenient and cognitively demanding,” said Laschowski, also a student member of the Waterloo Artificial Intelligence Institute (Waterloo.ai). “Every time you want to perform a new locomotor activity, you have to stop, take out your smartphone and select the desired mode.”
To address that limitation, the researchers fitted exoskeleton users with wearable cameras and are now optimizing AI computer software to process the video feed to accurately recognize stairs, doors and other features of the surrounding environment.
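For readers curious how frame-by-frame environment recognition of this kind might be wired up, the short sketch below classifies a single wearable-camera image into walking-environment categories using a pretrained image classifier in PyTorch. It is a minimal illustration only: the class list, the MobileNetV2 backbone, and the file name are assumptions for the example, not the ExoNet implementation.

```python
# Minimal sketch (not the ExoNet codebase): classify a wearable-camera
# frame into assumed walking-environment categories with a small CNN.

import torch
import torch.nn as nn
from torchvision import models, transforms
from PIL import Image

# Hypothetical environment categories a controller might care about.
CLASSES = ["level_ground", "incline_stairs", "decline_stairs", "door", "obstacle"]

# Reuse a lightweight ImageNet backbone and swap in a new classifier head.
model = models.mobilenet_v2(weights=models.MobileNet_V2_Weights.DEFAULT)
model.classifier[1] = nn.Linear(model.last_channel, len(CLASSES))
model.eval()

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def classify_frame(image_path: str) -> str:
    """Return the predicted environment label for one camera frame."""
    frame = Image.open(image_path).convert("RGB")
    batch = preprocess(frame).unsqueeze(0)  # shape: (1, 3, 224, 224)
    with torch.no_grad():
        logits = model(batch)
    return CLASSES[int(logits.argmax(dim=1))]

# Example (hypothetical file name):
# print(classify_frame("frame_000123.jpg"))
```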
The next phase of the ExoNet research project will involve sending instructions to motors so that robotic exoskeletons can climb stairs, avoid obstacles or take other appropriate actions based on analysis of the user’s current movement and the upcoming terrain.
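One way to picture that perception-to-control hand-off is a simple mode selector that turns the recognized environment into a high-level locomotion command. The sketch below is an illustrative assumption of the idea, with hypothetical mode names and a stubbed-out send_to_motors() interface rather than any real exoskeleton API.

```python
# Minimal sketch: map a recognized environment class (plus the current
# mode) to a high-level locomotion command. Mode names, the mapping, and
# send_to_motors() are illustrative assumptions.

from enum import Enum, auto

class Mode(Enum):
    LEVEL_WALK = auto()
    STAIR_ASCENT = auto()
    STAIR_DESCENT = auto()
    STOP = auto()

# Hypothetical mapping from perceived environment to target mode.
ENV_TO_MODE = {
    "level_ground": Mode.LEVEL_WALK,
    "incline_stairs": Mode.STAIR_ASCENT,
    "decline_stairs": Mode.STAIR_DESCENT,
    "obstacle": Mode.STOP,
    "door": Mode.STOP,
}

def send_to_motors(mode: Mode) -> None:
    # Placeholder for a low-level motor-controller interface.
    print(f"Switching exoskeleton to {mode.name}")

def update_mode(current_mode: Mode, environment: str) -> Mode:
    """Pick the next locomotion mode and command the motors if it changes."""
    next_mode = ENV_TO_MODE.get(environment, current_mode)
    if next_mode is not current_mode:
        send_to_motors(next_mode)
    return next_mode

# Example: mode = update_mode(Mode.LEVEL_WALK, "incline_stairs")
```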
“Our control approach wouldn’t necessarily require human thought,” said Laschowski, who is supervised by engineering professor John McPhee, the Canada Research Chair in Biomechatronic System Dynamics, in his Motion Research Group lab. “Similar to autonomous cars that drive themselves, we’re designing autonomous exoskeletons that walk for themselves.”
The researchers are also working to improve the energy efficiency of motors for robotic exoskeletons by using human motion to self-charge the batteries.
The latest in a series of papers on the related projects appears in the journal IEEE Transactions on Medical Robotics and Bionics.
The research team also includes engineering professor Alexander Wong, the Canada Research Chair in Artificial Intelligence and Medical Imaging, and William McNally, also a PhD candidate in systems design engineering and a student member of Waterloo.ai.