The stroke that disconnected Cathy Hutchinson’s brain from her body has kept her silent and unable to move for more than 14 years. But science is starting to change all that.
Researchers have connected the 58-year-old woman’s brain to a computer that runs a robotic arm. As Hutchinson sits at a table staring at a bottled drink and imagining the robot grabbing the bottle and bringing it to her mouth, the robot arm begins to move.
The robot is running on signals detected by sensors implanted in the part of Hutchinson’s brain that would normally control the movements of her right arm. The sensors pick up the sparking of nerve cells and send the signals to the computer, which then translates them into commands for the robotic arm. Suddenly Hutchinson is able to do something she could only dream of before: As she thinks about getting herself a drink, the arm reaches over to the bottle and brings it to her lips, where she is able to sip the drink through a straw.
It’s the first time Hutchinson’s been able to do anything for herself since the stroke.
Hutchinson’s experiences, along with those of another quadriplegic patient, were described in a groundbreaking paper published Wednesday in Nature. Both patients are part of an ongoing government-funded trial testing the new brain-translation technology, BrainGate, which may one day free “locked-in” patients like Hutchinson and give functional limbs to amputees.
It will be years before BrainGate could be available to the general public. But Hutchinson’s happy to enjoy the future today. After realizing she could control the robot arm, she said she was “ecstatic.” Though Hutchinson cannot speak, she can type her thoughts through a device that takes its cues from her eye movements.
She’s optimistic about what the research might one day bring. “I would love to have robotic leg support,” she says.
What’s amazing is how researchers have “taught” their computer to essentially read Hutchinson’s thoughts.
The baby-aspirin-sized sensor implanted in Hutchinson’s brain contains 96 hair-thin electrodes that record the sparking of neurons in the brain’s movement-control center, the motor cortex.
The first step in the learning process is for the computer to “see” which neurons spark, and in what pattern, when a person picks up a bottle and brings it to her lips, explains the study’s lead researcher, Dr. Leigh R. Hochberg, a professor of engineering at Brown University who is also a researcher at the Providence VA Medical Center, a critical care neurologist at Massachusetts General Hospital and Brigham and Women’s Hospital in Boston, and a visiting associate professor of neurology at Harvard Medical School.
Conveniently, it doesn’t matter whether the person actually moves the limb or merely imagines doing it. So, for several trials, Hochberg and his colleagues had the computer observe the sparking patterns of neurons in Hutchinson’s brain as she watched the robot arm pick up the bottle and bring it to her lips.
Once the scientists had taught the computer which patterns would normally make Hutchinson’s arm reach out for the bottle, they hardwired those patterns as the command for the robot arm to do the same thing – but with the signal coming directly from Hutchinson’s brain as she imagined herself grasping the bottle and bringing it to her lips.
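The observe-then-control process described above can be sketched in a few lines. This is a hypothetical illustration of the general idea, not the actual BrainGate decoder: it assumes, for simplicity, a linear mapping from the firing rates on the 96 recorded channels to a 3-D arm-velocity command, fit by least squares during the observation trials.

```python
import numpy as np

rng = np.random.default_rng(0)

N_NEURONS = 96   # electrodes in the implanted sensor
N_SAMPLES = 500  # time bins recorded during the observation trials

# Hypothetical "true" mapping from neural firing rates to 3-D arm velocity.
# In reality this relationship is unknown and must be learned from data.
true_weights = rng.normal(size=(N_NEURONS, 3))

# Observation phase: firing rates recorded while the patient watches the
# robot arm move, paired with the arm's known velocities at each time bin.
firing_rates = rng.poisson(lam=5.0, size=(N_SAMPLES, N_NEURONS)).astype(float)
velocities = firing_rates @ true_weights + rng.normal(scale=0.5, size=(N_SAMPLES, 3))

# Fit a linear decoder by least squares: velocity ~ firing_rates @ decoder.
decoder, *_ = np.linalg.lstsq(firing_rates, velocities, rcond=None)

# Control phase: a new burst of imagined-movement activity is translated
# directly into a velocity command for the robotic arm.
new_activity = rng.poisson(lam=5.0, size=(1, N_NEURONS)).astype(float)
command = new_activity @ decoder
print(command.shape)
```

Real systems use more sophisticated decoders (Kalman filters, for instance) and recalibrate frequently, since the recorded neural signals drift over time.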
“Beyond this, our real dream for this research is for people with paralysis — from a brain stroke or spinal cord injury — to be able to one day reconnect the brain to the limbs,” Hochberg says.
Those kinds of injuries often leave a healthy brain disconnected from a healthy set of nerves that would normally control the movement of the limbs. So the hope is that someday doctors might be able to reconnect things, like an electrician replacing a bit of shorted-out wire.
For her part, Hutchinson is hoping that the researchers come up with a permanent, wireless implant and a way to reconnect her brain to her body.
She imagines what it would be like to cook and garden again. “I know that someday this will happen again,” she says.