A California neuroscientist said that new AI-powered brain surgery could make it possible to restore vision and hearing, and could even enable a form of "mind reading."
Ann Johnson, who was paralyzed by a stroke in 2005 and lost the ability to speak, has been able to communicate using a cloned voice after having her brain connected to artificial intelligence. More than 250 electrodes were attached to Johnson's head and connected to a computer array through a port at the back of her skull. The system then translated Johnson's brain activity into English, delivered through an AI avatar.
"AI will be used in a variety of applications, and not just to restore function after paralysis," said Dr. Eddie Chang, chair of neurological surgery at the University of California, San Francisco, who led the team that developed Johnson's procedure. "I believe some of the tools will allow us to help restore vision and perhaps certain types of deafness."
“We are far from being able to read minds, but there is potential,” he said.
Chang led a group of scientists from the University of California, Berkeley, and his own university who published a study last week showing how brain signals can be translated into facial expressions and spoken language.
Chang said that "we could be at a stage where similar technologies, and ones related to them, will allow us, quote-unquote, to read the mind," adding that researchers are "one step closer" to understanding how to read signals from the brain and mind.
In May, scientists at the University of Texas at Austin released research supporting the possibility of converting brain activity into language. The Texas team measured participants' brain activity with a functional MRI scanner while they listened to podcasts.
That data was fed into a computer to teach it to interpret brain activity as words. The participants then listened to new stories while in the scanner. According to the study, the computer used the brain activity alone to recreate the new story, closely or exactly matching the meaning of the original phrasing about half the time.
Chang told Fox News that "even five or 10 years ago, we did not have the right AI tools to decipher brain activity and translate it into words." Translating the activity of tens of thousands of neurons into English, he said, is "very complicated."
He continued, “That’s the reason AI has been critical to our strategy because it is very powerful in converting those subtle signals into useful things like words.”
According to the study, Johnson's communication improved dramatically when she used the AI avatar. Chang said that she went from communicating about 15 words per minute to nearly 80; normal conversation runs at about 160 words per minute.
Chang, speaking to Fox News, said that the research “will have profound implications for thinking about the brain and medicine.” The research will also have “a lot of ethical implications, including privacy issues and others.”
He said, “Now is the time to engage and think through what we truly want from this technology.”
Julia Musto is the author of this report.