SEP 15, 2018 11:37 PM PDT

The Future House Robot?

WRITTEN BY: Nouran Amin

In a recent publication, researchers from MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) made a key advance in robotics: a system that lets robots inspect random objects and visually understand them well enough to complete specific tasks. The new system, called "Dense Object Nets" (DON), treats objects as collections of points that serve as "visual roadmaps." The approach will allow robots to better understand and manipulate items, which could prove especially useful for household tasks.

Image via The Japan Times

"Many approaches to manipulation can't identify specific parts of an object across the many orientations that object may encounter," says PhD student Lucas Manuelli. "For example, existing algorithms would be unable to grasp a mug by its handle, especially if the mug could be in multiple orientations, like upright, or on its side." The research team views potential applications for DON that are not strictly for manufacturing settings but also used at homes.

The DON system produces a series of coordinates on a given object, creating a "visual roadmap" of the object that gives the robot a better understanding of what it needs to grab. The team trained DON to look at objects as points that compose a larger coordinate system, mapping different points together to visualize the object's 3-D shape, much the way multiple photos are stitched together to create a single panorama. “In factories robots often need complex part feeders to work reliably," says Manuelli. "But a system like this that can understand objects' orientations could just take a picture and be able to grasp and adjust the object accordingly."
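The core idea of mapping points on an object across views can be sketched with a simple nearest-neighbor search over dense descriptor maps. The sketch below is a minimal illustration, not the CSAIL team's implementation: it assumes each pixel of an image has already been assigned a descriptor vector (in DON, by a trained neural network), and finds the pixel in a second view whose descriptor best matches a chosen point in the first view.

```python
import numpy as np

def match_point(desc_a, desc_b, pixel_a):
    """Given dense descriptor maps desc_a and desc_b of shape (H, W, D),
    find the pixel in view B whose descriptor is closest (Euclidean
    distance) to the descriptor at pixel_a in view A."""
    target = desc_a[pixel_a[0], pixel_a[1]]            # (D,) descriptor at the query pixel
    dists = np.linalg.norm(desc_b - target, axis=-1)   # (H, W) distance map over view B
    return np.unravel_index(np.argmin(dists), dists.shape)

# Toy example: view B is view A shifted by (2, 3), so the point at
# (0, 0) in A should land at (2, 3) in B.
rng = np.random.default_rng(0)
desc_a = rng.standard_normal((8, 8, 4))
desc_b = np.roll(desc_a, shift=(2, 3), axis=(0, 1))
print(match_point(desc_a, desc_b, (0, 0)))  # → (2, 3)
```

Because the descriptors, not raw pixel colors, are compared, the same object point can be located even when the object is rotated or flipped between views, which is what lets a robot find, say, a mug's handle in any orientation.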

Researchers hope to advance the system in the future so that it can perform certain tasks with a deeper understanding of the corresponding objects.

The study will be presented at the Conference on Robot Learning in Zürich, Switzerland.

Source: MIT Computer Science & Artificial Intelligence Lab

About the Author
  • Nouran earned her BS and MS in Biology at IUPUI and currently shares her love of science by teaching. She also enjoys writing on various topics, including science and medicine, global health, and conservation biology. She hopes that through her writing she can make science more engaging and accessible to the general public.