
Mind-Reading Devices to Help the Speechless Speak

The thoughts are there, but there is no way to express them. For "locked in" patients, many with Lou Gehrig's disease, the only way to communicate tends to be through blinking in code.
Source: Discovery Channel


But now, words can be read directly from patients' minds by attaching microelectrode grids to the surface of the brain and learning which signals mean which words, a development that will ultimately help such patients talk again.

"They're perfectly aware. They just can't get signals out of their brain to control their facial expressions. They're the patients we'd like to help first," said the University of Utah's Bradley Greger, an assistant professor of bioengineering who, with neurosurgery professor Paul House, M.D., published the study in the October issue of the Journal of Neural Engineering.


How to Read Thoughts

Some severely epileptic patients have the seizure-generating parts of the brain surgically removed. This standard procedure requires opening the skull and placing large, button-sized electrodes on the brain to determine exactly what needs to be removed. The electrodes are then taken off the brain.

The University of Utah team worked with an epileptic patient who let them crowd much smaller electrode grids onto his brain prior to surgery, a technique called micro-electrocorticography.

"The microelectrode grids that we placed on top of the brain are actually simple technology," Greger said. Made from platinum wires and silicone, a grid of 16 microelectrodes is less than a centimeter in diameter.

"The hard part for us was to figure out how to take the recordings we got from the microelectrodes and relate it to the words that the patients were speaking," Greger said. 

Microelectrode grids sat on two parts of the volunteer's brain crucial for speech: the facial motor cortex and Wernicke's area. Both grids fed their signals to a computer, which ran them through a pattern-matching algorithm. The researchers asked the volunteer to repeat a string of 10 words out loud while the computer read the brain signals: "yes," "no," "hot," "cold," "hungry," "thirsty," "hello," "goodbye," "more" and "less." 

Looking at the brain patterns from the string of words, the scientists matched the right word to the corresponding signal between 28 percent and 48 percent of the time, well above the 10 percent expected by chance when guessing among 10 words.
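In spirit, this kind of decoding resembles template matching: average the brain signals recorded during repeated utterances of each word into a per-word template, then label a new signal with the word whose template it most closely resembles. The sketch below illustrates the idea on simulated data only; the feature vectors, 16-channel layout, and distance-based classifier are illustrative assumptions, not the team's actual algorithm.

```python
import math
import random

WORDS = ["yes", "no", "hot", "cold", "hungry",
         "thirsty", "hello", "goodbye", "more", "less"]

def make_trial(word_idx, n_channels=16, noise=0.8):
    # Hypothetical stand-in for one microelectrode recording: each word
    # produces a distinct baseline pattern across channels, plus noise.
    return [math.sin(word_idx + ch) + random.gauss(0, noise)
            for ch in range(n_channels)]

def train_templates(trials_per_word=20):
    # Average repeated trials of each spoken word into one template.
    templates = {}
    for i, word in enumerate(WORDS):
        trials = [make_trial(i) for _ in range(trials_per_word)]
        templates[word] = [sum(col) / len(col) for col in zip(*trials)]
    return templates

def classify(signal, templates):
    # Pick the word whose template is closest in Euclidean distance.
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(templates, key=lambda w: dist(signal, templates[w]))

random.seed(0)
templates = train_templates()

# Evaluate on fresh simulated trials, 20 per word.
n_per_word = 20
correct = sum(
    classify(make_trial(i), templates) == WORDS[i]
    for i in range(len(WORDS))
    for _ in range(n_per_word))
accuracy = correct / (n_per_word * len(WORDS))
print(f"accuracy: {accuracy:.0%} (chance: {1 / len(WORDS):.0%})")
```

On this toy data the classifier beats the 10 percent chance level easily; on real brain recordings, overlapping and noisy signals are what held the study's accuracy to 28 to 48 percent.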

"As we increased the number of words, it dropped," Greger said. "We want it to be closer to 90 percent."

Now that they have proof of concept, the team plans to test a larger number of microelectrodes on more volunteers.

"We need to get more electrodes to get more information from the brain," Greger said. "We also need to account for the subtleties of the brain." For example, the word "yes" can be said in many different ways and can carry multiple meanings. 

Ideally, the microelectrodes would be linked to a speech synthesizer to turn the signals into spoken words. 

Philip Kennedy, who heads up Neural Signals, Inc. in Duluth, Ga., also works on using brain signals to help locked-in patients, using a smaller electrode than the University of Utah's microelectrode grid.

He hopes the University of Utah team brings their work to fruition.

"If you could cover the whole cortex with tiny electrodes, if they could get them close together, this will help greatly in not only detecting more words but actually probably doing it faster so we get closer to almost conversational speech," Kennedy said.