How a New AI Translated Brain Activity to Speech With 97 Percent Accuracy
Researchers at the University of California, San Francisco, have developed technology that can translate brain signals into complete sentences with error rates as low as three percent. Although parts of speech have been decoded from brain signals for roughly a decade, those solutions were far from perfect: they could not translate intelligible sentences consistently. Last year, a project was able to use brain signals to animate a simulated vocal tract, but even in that instance only 70 percent of the words were intelligible.
The new solution rested on one key insight: translating brain signals to text closely parallels machine translation between languages using neural networks. Machine translation is already highly accurate for numerous languages, and by recognizing this connection, the researchers were able to adapt those methods into a new and accurate decoder. Rather than attempt to translate individual phonemes, they used the machine-translation approach to decode an entire sentence at once.
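The sentence-at-once framing can be pictured as a standard sequence-to-sequence pipeline: an encoder folds a whole window of neural recordings into a compact state, and a decoder emits words from that state until it produces an end-of-sentence token. The sketch below illustrates only that general idea; the dimensions, vocabulary, weights, and recurrence are hypothetical stand-ins, not the authors' actual model (which was trained, far larger, and operated on real cortical recordings).

```python
import numpy as np

# Illustrative sequence-to-sequence sketch: encoder RNN -> state -> decoder RNN.
# All sizes, the toy vocabulary, and the random (untrained) weights are
# assumptions for demonstration, not the UCSF team's architecture.

rng = np.random.default_rng(0)

N_FEATURES = 16   # channels of (simulated) neural activity per time step
HIDDEN = 32       # RNN hidden-state size
VOCAB = ["<eos>", "the", "dog", "ran", "home"]  # toy word vocabulary

# Randomly initialised weights: an untrained stand-in for a learned model.
W_enc_x = rng.normal(scale=0.1, size=(HIDDEN, N_FEATURES))
W_enc_h = rng.normal(scale=0.1, size=(HIDDEN, HIDDEN))
W_dec_h = rng.normal(scale=0.1, size=(HIDDEN, HIDDEN))
W_dec_e = rng.normal(scale=0.1, size=(HIDDEN, len(VOCAB)))
W_out = rng.normal(scale=0.1, size=(len(VOCAB), HIDDEN))

def encode(signal):
    """Fold an entire (T, N_FEATURES) recording into one hidden state."""
    h = np.zeros(HIDDEN)
    for x in signal:
        h = np.tanh(W_enc_x @ x + W_enc_h @ h)
    return h

def decode(h, max_words=10):
    """Greedily emit words from the encoded state until <eos>."""
    words, prev = [], np.zeros(len(VOCAB))
    for _ in range(max_words):
        h = np.tanh(W_dec_e @ prev + W_dec_h @ h)
        idx = int(np.argmax(W_out @ h))
        if VOCAB[idx] == "<eos>":
            break
        words.append(VOCAB[idx])
        prev = np.eye(len(VOCAB))[idx]  # feed the chosen word back in
    return words

# A fake 50-step recording stands in for one sentence's worth of brain data.
signal = rng.normal(size=(50, N_FEATURES))
print(decode(encode(signal)))
```

The point of the structure is that the decoder sees a summary of the whole utterance rather than a phoneme-by-phoneme stream, which is what lets the sentence be translated as a unit.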