Speech now streaming from brains in real-time
- Reference: 1743578653
- News link: https://www.theregister.co.uk/2025/04/02/speech_now_streaming_from_brains/
- Source link:
Described in a paper [1] published in Nature Neuroscience this week, the neuroprosthesis is intended to allow patients with severe paralysis and anarthria – loss of speech – to communicate by turning brain signals into synthesized words.
"Our streaming approach brings the same rapid speech decoding capacity of devices like Alexa and Siri to neuroprostheses," said Gopala Anumanchipalli – assistant professor of electrical engineering and computer sciences at University of California, Berkeley and co-principal investigator of the study, done in conjunction with UC San Francisco – in [2]a statement .
"Using a similar type of algorithm, we found that we could decode neural data and, for the first time, enable near-synchronous voice streaming. The result is more naturalistic, fluent speech synthesis."
The project improves on work published in 2023 by reducing the latency to decode thought and turn it into speech, which at the time took about eight seconds to produce a sentence.
As demonstrated in the video below, the new process works roughly eight times faster, operating in near real-time.
[7] YouTube video
It begins by reading the patient's electrical brain signals after the intent to speak has been formed but before the thought has produced a vocal muscle response.
"We are essentially intercepting signals where the thought is translated into articulation and in the middle of that motor control," said co-lead author Cheol Jun Cho, UC Berkeley PhD student in electrical engineering and computer sciences, in a statement.
"So what we’re decoding is after a thought has happened, after we’ve decided what to say, after we’ve decided what words to use and how to move our vocal-tract muscles."
The neuroprosthesis works by passing 80ms chunks of electrocorticogram (ECoG) data through a neural encoder and then using a deep learning recurrent neural network transducer model to convert brain signals to sounds. The researchers used a recording of the patient's pre-injury voice to make the model's output sound more like natural speech.
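For readers curious what that chunked pipeline looks like in code, here is a minimal, hypothetical sketch in Python. It is not the researchers' implementation (their code is linked below [14]); the feature rate, channel count, audio sample rate, and the stand-in encode_chunk and transduce_to_audio functions are all assumptions made purely for illustration.

```python
import numpy as np

# Assumed parameters for illustration only; the real pipeline's rates differ.
ECOG_RATE_HZ = 200                                  # assumed ECoG feature rate
CHUNK_MS = 80                                       # chunk length reported in the paper
CHUNK_SAMPLES = ECOG_RATE_HZ * CHUNK_MS // 1000     # ECoG samples per chunk
AUDIO_RATE_HZ = 16_000                              # assumed synthesis sample rate
N_CHANNELS = 253                                    # assumed electrode/feature count

def encode_chunk(ecog_chunk: np.ndarray, state: np.ndarray):
    """Stand-in for the neural encoder: compresses one 80 ms ECoG chunk into a
    latent vector while carrying recurrent state forward to the next chunk."""
    latent = np.tanh(ecog_chunk.mean(axis=0) + state)
    return latent, latent  # here the new state is simply the latent

def transduce_to_audio(latent: np.ndarray) -> np.ndarray:
    """Stand-in for the transducer + vocoder stage: turns one latent vector
    into 80 ms of synthesized waveform (a toy sine here)."""
    n = AUDIO_RATE_HZ * CHUNK_MS // 1000
    t = np.arange(n) / AUDIO_RATE_HZ
    return 0.1 * np.sin(2 * np.pi * (200.0 + 50.0 * latent[0]) * t)

def stream_decode(ecog_stream: np.ndarray):
    """Consume ECoG features chunk by chunk and yield audio as soon as each
    chunk is decoded, instead of waiting for a whole sentence."""
    state = np.zeros(N_CHANNELS)
    for start in range(0, len(ecog_stream) - CHUNK_SAMPLES + 1, CHUNK_SAMPLES):
        chunk = ecog_stream[start:start + CHUNK_SAMPLES]
        latent, state = encode_chunk(chunk, state)
        yield transduce_to_audio(latent)

# Toy usage: two seconds of fake ECoG features -> audio emitted chunk by chunk.
fake_ecog = np.random.randn(2 * ECOG_RATE_HZ, N_CHANNELS)
audio = np.concatenate(list(stream_decode(fake_ecog)))
print(f"Synthesized {audio.size / AUDIO_RATE_HZ:.2f} s of audio from streamed chunks")
```

The point of the sketch is the structure, not the maths: because audio is emitted per 80 ms chunk rather than after a full sentence has been decoded, the listener starts hearing output almost immediately, which is where the latency improvement over the 2023 system comes from.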
While this particular neuroprosthesis requires a direct electrical connection to the brain, the researchers believe their approach is generalizable to other interfaces, including surgically implanted microelectrode arrays (MEAs) and non-invasive surface electromyography (SEMG).
The work builds on research [13] funded by Facebook that the social media biz abandoned four years ago to pursue more market-friendly SEMG wrist sensors. Edward Chang, chair of neurosurgery at UCSF, who oversaw the Facebook-funded project, is the senior co-principal investigator of this latest study.
Code for the Streaming Brain2Speech Decoder [14] has been posted to GitHub, in case anyone is looking to reproduce the researchers' results. ®
[1] https://www.nature.com/articles/s41593-025-01905-6
[2] https://engineering.berkeley.edu/news/2025/03/brain-to-voice-neuroprosthesis-restores-naturalistic-speech/
[7] https://www.youtube.com/watch?v=MGSoKGGbbXk
[13] https://www.theregister.com/2021/07/15/facebook_dumps_mindreading_neural_interface/
[14] https://github.com/cheoljun95/streaming.braindecoder
Re: that's really cool and
Yeah, this is nothing short of amazing for a first implementation.
Genuinely impressive
----->
A few of these are highly deserved for the team that got this working. What an amazing achievement!
"So what we’re decoding is after a thought has happened, after we’ve decided what to say, after we’ve decided what words to use and how to move our vocal-tract muscles."
What about those who set the mouth in motion before engaging the brain?
Astounding! This is one half of direct mind-to-mind communication, without distance limitations – because the intermediate stage is a computer.
Wrenches
I think this is the moment when wrench manufacturers start shaking with fear.
There is no longer a need to break someone's knee to make them talk.
Just put the contraption on their head and say: "Don't try to speak the password in your mind." or "Don't try to imagine where you hid the rugs."
Like the pink elephant thing.
that's really cool and
A welcome bit of good news.