Meta eggheads demo Bluetooth wristband that decodes muscle twitches for UI control
- Reference: 1753295714
- News link: https://www.theregister.co.uk/2025/07/23/meta_wristband_ui_control/
- Source link:
According to a peer-reviewed paper from scientists at the company formerly known as Facebook, the wrist-worn device prototype allows users to interact with computers through hand gestures, including handwriting and pinch movements. It streams muscle signals wirelessly over Bluetooth and decodes them into computer commands in real time.
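Meta hasn't spelled out the software stack, but the loop described above is conceptually straightforward: pull windows of multi-channel sEMG samples off the Bluetooth link, push them through a pretrained, user-independent decoder, and turn confident predictions into input events. Here's a minimal sketch of that idea - every name, rate, and threshold below is an illustrative placeholder, not Meta's actual API or figures:

```python
# Hypothetical sketch of a real-time sEMG decoding loop.
# All names, rates, and thresholds are placeholders, not Meta's actual API.
import random

WINDOW_SAMPLES = 200         # assumed decoder input window (e.g. 100 ms at 2 kHz)
N_CHANNELS = 16              # assumed electrode count

class FakeBand:
    """Stand-in for the Bluetooth wristband: one multi-channel sample per read."""
    def read_sample(self):
        return [random.gauss(0.0, 1.0) for _ in range(N_CHANNELS)]

class GenericDecoder:
    """Stand-in for a pretrained, user-independent sEMG-to-gesture model."""
    def predict(self, window):
        # A real decoder would run neural-network inference on the window;
        # this toy just thresholds the average signal energy.
        energy = sum(abs(v) for sample in window for v in sample) / len(window)
        return ("pinch", 0.97) if energy > 13.0 else ("idle", 0.99)

def run_loop(band, decoder, emit_event, steps=20):
    for _ in range(steps):
        window = [band.read_sample() for _ in range(WINDOW_SAMPLES)]
        gesture, confidence = decoder.predict(window)
        if gesture != "idle" and confidence > 0.9:
            emit_event(gesture)   # e.g. a click, a 1-D cursor nudge, a character

run_loop(FakeBand(), GenericDecoder(), emit_event=print)
```

The interesting work is entirely in the decoder; the hardware side is just a steady stream of electrode readings.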
The paper, [1]published in the research journal Nature on Wednesday, says the history of the user interface has seen the introduction of keyboards, mice, and touchscreens, all of which require physical contact and are difficult to use on the move. Gesture-capture devices have also been developed, but some require line-of-sight camera sensors while others are intrusive.
Meanwhile, researchers have long imagined tapping brain or muscle electrical signals to control computers without any physical device. In practice, it usually requires invasive implants and software custom-trained to each person's unique signal patterns.
The authors claim their new system can capture and decode hand movements without the need for personalized calibration or invasive procedures.
Led by Meta Reality Labs' research science director Patrick Kaifosh and research VP Thomas Reardon, the team showed how the wristband could be used to recognize gestures in real time to control a one-dimensional cursor, select commands, and even create text on the screen by detecting handwriting gestures.
It's worth noting that handwriting recognition runs at about 20.9 words per minute (WPM), compared with roughly 36 WPM on mobile-phone keyboards and over 40 WPM for proficient typists.
Nonetheless, it's impressive that the wrist-worn surface electromyography (sEMG) hardware can decode movements in a way that generalizes across users. sEMG places electrodes on the skin to record the tiny electrical signals your muscles produce when they contract.
To our knowledge, this is the first high-bandwidth neuromotor interface with performant out-of-the-box generalization across people
“We developed a highly sensitive, easily donned sEMG wristband and a scalable infrastructure for collecting training data from thousands of consenting participants. Together, these data enabled us to develop generic sEMG decoding models that generalize across people,” the authors write in the paper.
“We demonstrate that the decoding performance of handwriting models can be further improved by 16 percent by personalizing sEMG decoding models. To our knowledge, this is the first high-bandwidth neuromotor interface with performant out-of-the-box generalization across people.”
The Reality Labs team is publicly releasing a repository containing over 100 hours of sEMG recordings from 300 participants, covering all three tasks in the publication, in an effort to spur future work on the wrist-action UI model.
They also suggest that the system could be adapted to help those with reduced mobility interact with computers more easily.
[7]Meta reveals plan for several multi-gigawatt datacenter clusters
[8]More trouble for authors as Meta wins Llama drama AI scraping case
[9]Meta – yep, Facebook Meta – is now a defense contractor
[10]Meta calls €200M EU fine over pay-or-consent ad model 'unlawful'
“It is unclear whether the generalized models developed here and trained on able-bodied participants will be able to generalize to clinical populations, although early work appears promising. Personalization can be applied selectively to users for whom the generic model works insufficiently well due to differences in anatomy, physiology or behavior.
"However, all of these new applications will be facilitated by continued improvements in the sensing performance of future sEMG devices, increasingly diverse datasets covering populations with motor disabilities, and potentially combining with other signals recorded at the wrist,” the paper said. ®
[1] https://www.nature.com/articles/s41586-025-09255-w
[7] https://www.theregister.com/2025/07/15/meta_datacenter_build_plan/
[8] https://www.theregister.com/2025/06/27/meta_llama_author_lawsuit/
[9] https://www.theregister.com/2025/05/30/meta_is_now_a_defense/
[10] https://www.theregister.com/2025/07/03/meta_ec_dma_sulk/
Re: I'm not a big fan of Meta...
Any Meta link should only send the one finger salute.
Re: I'm not a big fan of Meta...
Yeah, plus it has an IMU so it could also be used to replace a Wiimote (Extended Data Fig. 1). It's great that their Nature paper is Open Access with some good details (36 pages, linked under "published").
I imagine that on-wrist local processing would be possible with a Helium-capable 1 GHz [1]Renesas RA8P1 Cortex-M85 microcontroller (for inference/use, not training). Their 60-million-parameter handwriting model, which needs 801.7 million FLOPs per output sample, may be a tight fit and slow (before optimization), but the wrist decoder network, at 1.2 million FLOPs per output, should be just fine, imho! (Rough numbers in the sketch below the links.)
The good folks at [2]Plumerai could be the ones to get that tech to run in standalone mode ... (if so inclined)
[1] https://www.renesas.com/en/products/ra8p1
[2] https://plumerai.com/partners/renesas
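For anyone who wants to sanity-check that, here is a back-of-envelope version of the argument. The per-output FLOP counts are the figures quoted above; the sustained throughput of the chip is a plain guess, not a benchmark:

```python
# Back-of-envelope: can a ~1 GHz Cortex-M85-class MCU keep up with those models?
# Per-output FLOP counts are the figures quoted above; the sustained FLOP/s of
# the chip is an assumed round number, not a measured benchmark.
SUSTAINED_FLOPS = 1.0e9  # assume ~1 GFLOP/s sustained with Helium/MVE

models = {
    "handwriting model": 801.7e6,   # FLOPs per output sample
    "wrist decoder":     1.2e6,
}

for name, flops_per_output in models.items():
    outputs_per_second = SUSTAINED_FLOPS / flops_per_output
    print(f"{name}: ~{outputs_per_second:,.0f} outputs/s")

# Roughly 1 output/s for handwriting (tight, as suggested above) versus ~800/s
# for the wrist decoder, which comfortably covers a real-time cursor.
```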
Could be useful for people with disabilities
But it's hard to imagine it being useful day to day for the rest of us. The "holding your hands up and making gestures" UI may look cool in Minority Report, but it isn't practical and won't replace a mouse/keyboard or a phone/tablet touchscreen.
Re: Could be useful for people with disabilities
I actually liked the Johnny Mnemonic interface, and I like the UIs where data is shipped from one display to another for a different form of presentation, or as a way to send data to another device - I think Iron Man had a few of those too. Some of these ideas were actually very usable. Doable, probably not (for instance, we're still not really doing holographics), but some of these ideas have merit.
Re: Could be useful for people with disabilities
No they aren't. It would be at minimum tiring (and painful for many due to injuries or age) to hold your arms up in front of you all the time. The "Iron Man" interface looks cool, but you've only ever seen it in a movie and never considered what it would be like to actually use something like that.
Re: Could be useful for people with disabilities
I disagree. That is, I agree about the posture implications, but I think you need to take a step back and look at the wider picture.
Some UI ideas these people have come up with are pretty good, especially for people who do not use computers on a daily basis. It's underestimated just what a massive impact a UI (and changes to it) has on productivity - if a good UI were valued for the man-hours it saves, Microsoft would have to work (or bribe) a lot harder to stay in business. Just because it hides in OPEX rather than CAPEX doesn't mean the costs aren't there, and I don't think we've reached the end of UI design. Or of UI mistakes from the aforementioned outfit.
All IMHO, of course.
Re: Could be useful for people with disabilities
The ideas are generally good for those who have to present interactively, but for day-to-day work very few of them are of use; how many people are using touchscreens on desktop computers or even on laptops?
Yes, I use touchscreens on tablets, but on clamshell laptops and desktops it's keyboard and trackball/mouse. Hence why I found e-readers problematic: their form factor calls out for a touchscreen, not sequences of button presses.
Smeeeee...
Smeeeeeeeee. Smeeeeeeeggggg...
Sorry, I can't stop reading that in Kryten's voice. I know it's not spelled that way but I can't unsee it now.
Re: Smeeeee...
If they're willingly working for Zuckerberg it is spelled and pronounced that way.
I waved hello to my boss and it deleted our entire production database, sold the backup drives to a Chinese metal-recovery conglomerate, and sent an email to the VP of sales claiming his wife was a perk of the job......
That's my excuse.
I'll take it ...
... but only if certain obscene hand gestures are interpreted as an instruction to the application or operating system to stop doing the thing it just did.
I'm not a big fan of Meta...
But a wristband that could potentially translate sign language directly into speech would be awesome to have!
(As long as it doesn't need to send everything to Meta's servers...)