MAY 31, 2018 3:27 PM PDT

Robot Learns to Put Hospital Gowns on Patients

WRITTEN BY: Julia Travers

A robot at the Georgia Institute of Technology is learning how to assist patients with their gowns -- in fact, with the guidance of researchers, it is teaching itself. The robot is a PR2, a type of personal robot, and it is acquiring new skills by sensing force rather than relying on visual information. To do so, it uses haptic and kinematic feedback (touch, force and vibration) from its end effector, the gripper at the end of its arm that serves as its fingertips. After studying thousands of simulations, the robot learned to predict and estimate the appropriate forces to apply when laying fabric over a human.

The PR2 (credit: Georgia Tech)

With more than 1 million people in the U.S. alone in need of dressing assistance, this robotic function could become very helpful in both medical and residential settings. Zackory Erickson, a Georgia Tech Ph.D. student who is one of the robotics researchers, said:

People learn new skills using trial and error. We gave the PR2 the same opportunity. Doing thousands of trials on a human would have been dangerous, let alone impossibly tedious. But in just one day, using simulations, the robot learned what a person may physically feel while getting dressed.

Erickson also said the robot’s end goal is to put the cloth over the person’s body by selecting actions that are “best for the person.” In 24 hours, the PR2 analyzed about 11,000 simulations of robots putting clothes on people’s arms. The hand, elbow and shoulder are tricky areas, where applying too much or too little pressure could be dangerous. Some of the examples the robot was given were perfect, some were hazardous to the person, and some were mediocre. This range allowed the machine to figure out how best to move fabric around a body, including which movements make cloth taut and which leave it loose in different scenarios. The PR2 was also programmed to look about a fifth of a second ahead while planning its movements; without this allowance, it struggled.
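To make the idea concrete, here is a minimal, hypothetical Python sketch of learning a force predictor from simulated dressing trials. The synthetic data, the features and the simple linear fit are placeholders chosen purely for illustration; the researchers’ system, described in the paper noted below, uses a learned deep model trained on its own simulation data.

```python
# Hypothetical sketch: learning a force-prediction model from simulated
# dressing trials, in the spirit of the approach described above.
# The data, features, and model here are illustrative placeholders,
# not the researchers' actual code or dataset.
import numpy as np

rng = np.random.default_rng(0)

# Pretend each simulated trial records the gripper's planned motion
# (dx, dy, dz over the next step) plus the current measured force,
# along with the force the simulated person felt after that motion.
n_trials = 11_000
features = rng.normal(size=(n_trials, 4))            # [dx, dy, dz, current_force]
true_weights = np.array([1.5, -0.3, 0.8, 0.9])
felt_force = features @ true_weights + rng.normal(scale=0.1, size=n_trials)

# Fit a simple linear force predictor by least squares.
# (The paper describes a learned deep model; a linear fit just
# illustrates predicting felt force from planned motion and feedback.)
weights, *_ = np.linalg.lstsq(features, felt_force, rcond=None)

# Predict the force a candidate motion would cause.
candidate = np.array([0.02, 0.0, 0.01, 1.2])
print("predicted force:", candidate @ weights)
```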

“The key is that the robot is always thinking ahead. It asks itself, ‘if I pull the gown this way, will it cause more or less force on the person’s arm? What would happen if I go that way instead?’” said Charlie Kemp, the project’s lead faculty member and an associate professor at Georgia Tech and Emory University.
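A similarly hypothetical sketch of that look-ahead step is shown below: score a handful of candidate gripper motions by the force each is predicted to cause roughly a fifth of a second from now, and pick the gentlest safe one. The predict_force callback, the candidate motions and the safety threshold are all assumptions made for illustration, not the team’s actual code.

```python
# Hypothetical sketch of the "thinking ahead" step: evaluate candidate
# gripper motions by their predicted force and choose the gentlest one.
# predict_force() stands in for a learned model like the one sketched
# earlier; the candidates and threshold are made up for illustration.
from typing import Callable, Sequence, Tuple

def choose_motion(
    candidates: Sequence[Tuple[float, float, float]],
    current_force: float,
    predict_force: Callable[[Tuple[float, float, float], float], float],
    max_safe_force: float = 5.0,
) -> Tuple[float, float, float]:
    """Return the candidate motion with the lowest predicted force,
    skipping any motion predicted to exceed the safety threshold."""
    best_motion, best_force = None, float("inf")
    for motion in candidates:
        predicted = predict_force(motion, current_force)
        if predicted > max_safe_force:
            continue  # too risky: would press too hard on the arm
        if predicted < best_force:
            best_motion, best_force = motion, predicted
    if best_motion is None:
        return (0.0, 0.0, 0.0)  # no safe option: stop moving
    return best_motion

# Example use with a toy predictor.
toy_predict = lambda motion, force: force + 10 * sum(abs(m) for m in motion)
print(choose_motion([(0.02, 0, 0), (0.0, 0.02, 0), (0.05, 0.05, 0)], 1.0, toy_predict))
```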

After achieving success in simulation, the robot was tested on real people. While they sat in front of the PR2, it held up a hospital gown and slid it onto one of their arms. It can currently dress one arm in about 10 seconds. According to Georgia Tech News, a personal robot that can fully and safely dress a person is still a ways off.

The paper, “Deep Haptic Model Predictive Control for Robot-Assisted Dressing,” was presented in May at the International Conference on Robotics and Automation in Australia.


Source:

Georgia Tech News

About the Author
Julia Travers is a writer, artist and teacher. She frequently covers science, tech, conservation and the arts. She enjoys solutions journalism. Find more of her work at jtravers.journoportfolio.com.