JAN 01, 2018 6:47 AM PST

Robots Engineered to Learn Like Babies

In the first year of life, a baby's brain develops at a lightning-fast pace. Milestones like sitting up, grasping objects, crawling, and eventually walking are all the result of neurons firing and forming connections.
 
Babies don’t realize it, but they gain these skills by imagining their next move. The brain stays a step ahead, and that’s how they pick up so many abilities in such a short time.
 
Researchers at UC Berkeley have used that brain function as the model for their new robot, Vestri. Many labs are developing robots for different applications, but a robot is usually taught to move or pick up objects by programming hundreds of items and movements into its software. That limits the actions the robot can take: once it encounters an object that has not been entered into its database, it won’t know what to do.
 
Vestri is being developed to think ahead, a capability called visual foresight, so that, for example, self-driving cars might avoid accidents or road hazards by anticipating them. Robots that assist someone in the home or the workplace will also need this kind of technology, and engineers have found that mimicking the way the human brain does it is the most efficient approach. Babies learn almost entirely through autonomous play. A baby sitting on the floor banging spoons and pans together might not seem like something robotics researchers would learn from, but it is exactly the example they needed to make robots more helpful.
 
For now, the goal is simply for the robot’s software to anticipate what to do a few seconds into the future. The robot receives images of its surroundings through cameras that serve as its “eyes.” Learning starts with moving objects around on a table, and it’s vital that the robot begin by being left on its own to figure out how to move some objects without knocking over others. Once it has spent time managing these tasks, the algorithms in its software build what is called a “predictive model,” which comes in handy the next time an unfamiliar item is encountered.
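To make that idea concrete, here is a toy sketch of learning a predictive model purely from trial and error, with no human labels. Everything in it is hypothetical and vastly simpler than Vestri’s actual software: the “world” is one-dimensional, pushing an object with some force moves it by an unknown amount, and the robot’s entire model is a single number it refines by comparing each prediction to what actually happened.

```python
import random

# Toy "physics": pushing with force f moves the object by TRUE_GAIN * f.
# The robot does not know TRUE_GAIN; it must learn it by playing.
TRUE_GAIN = 2.0

def world(pos, force):
    """What actually happens when the robot pushes the object."""
    return pos + TRUE_GAIN * force

gain_estimate = 0.0   # the robot's predictive model: one learned parameter
lr = 0.1              # learning rate

for _ in range(500):
    pos = random.uniform(-1.0, 1.0)
    force = random.uniform(-1.0, 1.0)
    predicted = pos + gain_estimate * force   # imagine the outcome first
    actual = world(pos, force)                # then try it and observe
    # Learn from the prediction error alone -- no human feedback needed.
    gain_estimate += lr * (actual - predicted) * force
```

After a few hundred self-supervised pushes, `gain_estimate` converges toward the true value of 2.0, so the robot can now predict the outcome of a push it has never tried before, which is the essence of a predictive model.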
 
Sergey Levine, an assistant professor in Berkeley’s Department of Electrical Engineering and Computer Sciences, runs the lab that is working to produce a prototype that can think ahead in a situation it hasn’t been programmed to handle. He explained, “In the same way that we can imagine how our actions will move the objects in our environment, this method can enable a robot to visualize how different behaviors will affect the world around it. This can enable intelligent planning of highly flexible skills in complex real-world situations.”
 
The robot isn’t complete yet, but what it has managed to do so far has been accomplished entirely through machine learning. The technique is called “dynamic neural advection” (DNA), a way of predicting how the pixels of a video will move from one frame to the next. By letting the robots repeat the same tasks over and over, without any human feedback to correct errors, the software literally learns to predict what will happen next. Just as a baby cruising along the furniture, grasping at toys, finds its own way, this way of engineering robotic machinery will allow the devices to be more human-like. Once the video-prediction part of the equation works, it will eventually replace the practice of programmers entering thousands of items and images into a robot’s database. The video below shows the prototype at work; check it out.
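The pixel-motion idea can be illustrated with a deliberately tiny sketch. This is not the actual DNA architecture, which uses neural networks to predict per-pixel motion from video; it only shows the core intuition that the next frame is built by moving existing pixels rather than generating new ones from scratch. Here the “learned” motion is hard-coded as a one-pixel shift to the right.

```python
import numpy as np

def predict_next_frame(frame):
    """Build the next frame by moving every pixel one column to the right.

    In a real DNA-style model, the per-pixel motion would be predicted by
    a neural network from the video; here it is fixed for illustration.
    """
    nxt = np.zeros_like(frame)
    nxt[:, 1:] = frame[:, :-1]   # advect pixels rightward by one column
    return nxt

frame = np.zeros((4, 4))
frame[1, 1] = 1.0                # a single bright "object" pixel
predicted = predict_next_frame(frame)
```

Because the bright pixel started at column 1 and the assumed motion is one column to the right, the predicted frame has it at column 2; chaining such predictions is what lets the software imagine several frames into the future.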
 
About the Author
  • I'm a writer living in the Boston area. My interests include cancer research, cardiology and neuroscience. I want to be part of using the Internet and social media to educate professionals and patients in a collaborative environment.