Meta AI has announced a major breakthrough with AI that can reconstruct sentences from the mind solely by reading brain signals.
In the announcement on 7 February, the company said it has been working on decoding the perception of images and speech from brain activity, and has now made significant progress.
The AI can decode the production of sentences from non-invasive brain recordings, recovering up to 80% of the characters in a sentence and, in many cases, reconstructing complete sentences from brain activity alone.
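To make the "80% of characters" figure concrete: character-level decoding is commonly scored with character error rate (CER), the edit distance between the decoded text and the true text, normalized by the length of the true text. The sketch below is illustrative only, not Meta's evaluation code, and assumes this standard CER definition; decoding 80% of characters roughly corresponds to a CER around 0.2.

```python
def edit_distance(a: str, b: str) -> int:
    """Levenshtein distance: minimum number of single-character
    insertions, deletions, and substitutions turning a into b."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(
                prev[j] + 1,                 # deletion
                curr[j - 1] + 1,             # insertion
                prev[j - 1] + (ca != cb),    # substitution (0 if match)
            ))
        prev = curr
    return prev[-1]


def character_error_rate(decoded: str, reference: str) -> float:
    """Edit distance normalized by reference length (standard CER)."""
    return edit_distance(decoded, reference) / len(reference)


# Hypothetical example: one wrong character out of 19.
decoded = "the quick brown fax"
reference = "the quick brown fox"
cer = character_error_rate(decoded, reference)  # ~0.05, i.e. ~95% of characters correct
```

A CER of 0.2 on a sentence therefore means roughly one character in five is wrong, which is why language-model-style post-processing matters for turning partially decoded characters into readable sentences.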
Getting closer to AMI
AMI, short for advanced machine intelligence, is Meta's envisioned class of AI in which electronic devices can be aware of and respond to people's presence without explicit prompts.
Such systems could improve the quality of life even for people who cannot speak or move. According to Meta, this breakthrough brings the company closer to realizing AMI.
By reconstructing sentences from brain signals, such systems may be able to infer a person's needs and attend to them without the person having to say anything.
The most significant aspect of this breakthrough is that the method is non-invasive. Until now, decoding of this kind relied on invasive brain recordings, which require neurosurgery and are therefore difficult to scale.
Because it works from non-invasive recordings, the new system requires no surgical intervention, and it could eventually help restore communication for people who have lost the ability to speak.
The technology has not been perfected yet and will need further work before it can be used in clinical settings, but Meta expects it to lead to further discoveries in the study of brain-computer interaction.
Next step
Building on this discovery, Meta plans to continue this line of research and is donating $2.2 million to the Rothschild Foundation Hospital to support further work.
The research will focus on understanding brain signals and how the brain transforms thoughts into sequences of words.