Robots will soon be able to communicate with humans in different situations

Monday, 15th February, 2021


Chennai: Scientists have created robots that detect a range of physical interactions using ShadowSense technology. Developed by Cornell University researchers, the low-cost method for soft, deformable robots can identify everything from pats to punches to hugs without relying on touch at all. A USB camera located inside the robot captures the shadows of hand gestures on the robot's skin and classifies them with machine-learning software. The study has been published in the Proceedings of the Association for Computing Machinery on Interactive, Mobile, Wearable and Ubiquitous Technologies.

The new ShadowSense technology is the latest project from the Human-Robot Collaboration and Companionship Lab, led by the paper's senior author, Guy Hoffman, associate professor in the Sibley School of Mechanical and Aerospace Engineering. The technology originated as part of an effort to develop inflatable robots that could guide people to safety during emergency evacuations. Such a robot would need to be able to communicate with humans in extreme conditions and environments. Imagine a robot physically leading someone down a noisy, smoke-filled corridor by detecting the pressure of the person's hand.

Instead of installing a large number of contact sensors—which would add weight and complex wiring to the robot and would be difficult to embed in a deforming skin—the team took a counterintuitive approach: in order to gauge touch, they looked to sight. Hu, the lead author, said: "By placing a camera inside the robot, we can infer how the person is touching it and what the person's intent is just by looking at the shadow images." He added: "We think there is interesting potential there, because there are lots of social robots that are not able to detect touch gestures."

The prototype robot consists of a soft inflatable bladder of nylon skin stretched around a cylindrical skeleton, roughly four feet in height, mounted on a mobile base. Under the robot's skin is a USB camera, which connects to a laptop.
The team of researchers developed a neural-network-based algorithm that uses previously recorded training data to distinguish between six touch gestures—touching with a palm, punching, touching with two hands, hugging, pointing and not touching at all—with an accuracy of 87.5 to 96 percent, depending on the lighting.
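The paper describes a neural network trained on recorded shadow images; as a loose illustration of the underlying idea only—classifying downsampled grayscale shadow frames by comparing them to per-gesture templates—here is a minimal NumPy sketch. This is not the authors' implementation, and every name (`ShadowGestureClassifier`, `preprocess`, the gesture labels) is hypothetical.

```python
import numpy as np

# Hypothetical gesture labels mirroring the six classes named in the article.
GESTURES = ["palm", "punch", "two_hands", "hug", "point", "no_touch"]

def preprocess(frame):
    """Pool a grayscale shadow frame into an 8x8 grid and normalize it.

    Downsampling keeps the coarse shadow silhouette while discarding
    pixel-level noise; normalization reduces sensitivity to lighting.
    """
    h, w = frame.shape
    pooled = frame.reshape(8, h // 8, 8, w // 8).mean(axis=(1, 3))
    v = pooled.flatten().astype(float)
    return (v - v.mean()) / (v.std() + 1e-8)

class ShadowGestureClassifier:
    """Nearest-centroid stand-in for the paper's neural network."""

    def __init__(self):
        self.centroids = {}

    def fit(self, frames, labels):
        # Average the feature vectors of each gesture's training frames.
        feats = np.array([preprocess(f) for f in frames])
        for g in set(labels):
            idx = [i for i, lab in enumerate(labels) if lab == g]
            self.centroids[g] = feats[idx].mean(axis=0)

    def predict(self, frame):
        # Assign the gesture whose template is closest in feature space.
        v = preprocess(frame)
        return min(self.centroids,
                   key=lambda g: np.linalg.norm(v - self.centroids[g]))
```

In practice a convolutional network, as the researchers used, learns far richer shadow features than fixed templates, but the pipeline shape is the same: camera frame in, gesture label out.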



