A hand wearing a wristband controls an app on a tablet.

The University of Utah and Meta have announced a new research collaboration to evaluate consumer-grade wrist wearables for people with different levels of hand mobility, with the goal of making human-computer interactions more accessible.

The project uses Meta Neural Band, an electromyography (EMG) wristband already available to consumers for controlling Meta Ray-Ban Display glasses. This new collaboration will evaluate how the wristband can enable controls and serve as an interface for people with different levels of hand mobility. The wristband measures electrical signals from the user's muscles and translates them into digital commands; building on this capability, the collaborators will co-design custom gestures that can control various other devices, such as smart speakers, blinds, locks, thermostats, and more.

For people with muscular dystrophy, ALS, and other conditions, everyday activities like raising window blinds present a challenge. Motorized smart home devices can help do the work, but their interfaces must be accessible for people with a range of abilities. What might be intuitive to some users — say, turning their hand to control a smart thermostat — might be impossible for others. Meta’s wristband is sensitive enough to detect subtle muscle activity in the wrist, even for those who can’t move their hands; U researchers will test Meta’s advanced research algorithms for customizing gesture controls and adapt the experience based on user feedback.   

These gesture controls go beyond flipping light switches and dialing in thermostats, however. The researchers will also investigate controls for mobility devices, like the University of Utah’s TetraSki. Designed for people with a wide range of physical abilities, the TetraSki currently employs either a joystick or sip-and-puff mouth controls; Meta’s wristband presents a new set of potential control options, with gestures for steering and changing the skis’ wedge angle.     

The University of Utah's TetraSki currently employs either a joystick or sip-and-puff mouth controls.

Leading this effort is Jacob A. George, the Solzbacher-Chen Endowed Professor in the John and Marcia Price College of Engineering’s Department of Electrical & Computer Engineering and the Spencer Fox Eccles School of Medicine’s Department of Physical Medicine and Rehabilitation. Jason Wiese from the Kahlert School of Computing, Russel Butterfield and Mark Bromberg from the Department of Neurology, and Steven Edgley, Jeffrey Rosenbluth, and Colby Hansen from the Department of Physical Medicine and Rehabilitation will provide faculty support in engineering and medicine.  

George directs the Utah NeuroRobotics Lab, where his research focuses on neural interfaces for robotic prostheses; he has previously used EMG to enable users to control the LUKE Arm.

“Meta’s research technology goes beyond just controlling computers; it dovetails with the NeuroRobotics Lab’s work on how we get signals from the nervous system integrated with the world at large,” says George. “To that end, we’ll be testing the wristband’s ability to control mobility devices, like the TetraSki, as well as a suite of smart home devices.” 

“We’re going to be assessing the existing EMG wristband for individuals with muscular dystrophy, stroke, ALS, and other conditions, to see how well it works for them right out of the box,” says George. He and his collaborators will also have access to some of Meta’s research-grade algorithms and will use advanced features to explore customized controls for users with different abilities.

“We’ll be working closely with the participants to ensure the technology is intuitive, effective, and inclusive to all,” says George. “This co-design, between industry, academia, and end users, is what makes this project so unique.”  

They will test their customized gestures in the Craig H. Neilsen Rehabilitation Hospital’s Advanced Rehabilitation Technology Research Studio and Therapy Apartment, where individuals practice everyday activities.   

This initial project is part of a series of academic collaborations Meta is leading within its EMG accessibility portfolio. The partnership reflects a practical approach to turning university research into real-world impact, and the broader agreement opens the door to future collaborations between the U and Meta in robotics and AI.