Wristband Enables Wearers to Control a Robotic Hand
Next time you scroll through your phone, take a moment to appreciate the feat: this seemingly mundane act is made possible by the coordination of 34 muscles, 27 joints, and over 100 tendons and ligaments in your hand. Indeed, our hands are among the most nimble parts of our bodies, and mimicking their nuanced gestures has long been a challenge in robotics and virtual reality. Now, MIT engineers have designed an ultrasound wristband that precisely tracks a wearer’s hand movements in real time.
The wristband produces ultrasound images of the wrist’s muscles, tendons, and ligaments as the hand moves, and is paired with an artificial intelligence algorithm that continuously translates the images into the corresponding positions of the five fingers and palm. The researchers can train the wristband to learn a wearer’s hand motions, which the device can communicate in real time to a robot or a virtual environment.
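Conceptually, the pipeline learns a wearer-specific mapping from ultrasound frames to hand pose. The sketch below is purely illustrative: the actual system uses a deep learning model on ultrasound imagery, while here a simple linear least-squares regressor on synthetic data stands in for it, and all dimensions (frame size, six pose outputs for five fingers plus the palm) are assumptions for the example.

```python
import numpy as np

# Illustrative stand-in for the wristband's learned mapping:
# flattened "ultrasound frame" features -> finger/palm positions.
rng = np.random.default_rng(0)

n_frames, n_pixels, n_outputs = 200, 64, 6  # 6 outputs: 5 fingers + palm (assumed)

# Hypothetical ground-truth mapping used only to generate synthetic data
true_W = rng.normal(size=(n_pixels, n_outputs))
frames = rng.normal(size=(n_frames, n_pixels))             # synthetic frames
poses = frames @ true_W + 0.01 * rng.normal(size=(n_frames, n_outputs))

# "Training": fit the wearer-specific frame-to-pose mapping
W, *_ = np.linalg.lstsq(frames, poses, rcond=None)

# "Real-time" inference on a newly captured frame
new_frame = rng.normal(size=(1, n_pixels))
predicted_pose = new_frame @ W                             # estimated positions
print(predicted_pose.shape)
```

In the real device a convolutional network (or similar) would replace the linear fit, but the train-then-stream structure is the same: calibrate once on a wearer's labeled motions, then map each incoming frame to a pose continuously.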
In demonstrations, the team has shown that a person wearing the wristband can wirelessly control a robotic hand. As the person gestures or points, the robot does the same. In a sort of wireless marionette interaction, the wearer can manipulate the robot to play a simple tune on the piano and shoot a small basketball into a desktop hoop. With the same wristband, a wearer can also manipulate objects on a computer screen, for instance pinching their fingers to enlarge or shrink a virtual object.
The team is using the wristband to gather hand motion data from many more users with different hand sizes, finger shapes, and gestures. They envision building a large dataset of hand motions that can be mined, for instance, to train humanoid robots in dexterous tasks, such as performing certain surgical procedures. The ultrasound band could also be used to grasp, manipulate, and interact with objects in video games, design applications, or other virtual settings.
“We think this work has immediate impact in potentially replacing hand tracking techniques with wearable ultrasound bands in virtual and augmented reality,” says Xuanhe Zhao, the Uncas and Helen Whitaker Professor of Mechanical Engineering at MIT. “It could also provide huge amounts of training data for dexterous humanoid robots.”