Mind Machine
by David Pescovitz
Jose Carmena is also affiliated with the UC Berkeley-UC San Francisco Joint Graduate Group in Bioengineering.
For decades, the ability to control computers and robots with our minds has been the stuff of science fiction. Recently, though, UC Berkeley professor Jose Carmena has made strides toward bringing the underlying technology into the real world. His research on brain-machine interfaces could someday enable disabled individuals to be fitted with bionic prostheses operated by thought alone.
"Many people with severe physical disabilities have intact brains," says Carmena, a professor of electrical engineering and computer sciences, who joined the UC Berkeley faculty this year. "If we could extract and decode the intention of the patient from their brains, the signals could control any device, from a robotic arm to the cursor on a computer screen."
Before coming to Berkeley, Carmena was part of a team at Duke University that famously demonstrated how rhesus monkeys could learn to operate a robot arm with brain signals. The commands came via an array of electrodes implanted in the frontal and parietal lobes of the animals' brains. In 2003, the researchers made headlines by showing that the monkeys weren't treating the robot as an external device. Instead, their brains had adapted to control the appendage as if it were their own arm.
The implant consists of an array of several hundred hair-thin electrodes surgically placed in the frontal and parietal lobes, regions of the brain involved in motor control. Each microwire can detect the signals of as many as four neurons. Last year, Carmena and his colleagues recorded electrical signals from a human patient's brain. The array was temporarily implanted during deep-brain stimulation surgery, a procedure performed to alleviate the tremors caused by Parkinson's disease and other disorders.
"The main benefit of using an array of many microelectrodes is that you can observe the patterns of how signals move through the brain over time," says Carmena, who is also affiliated with UC Berkeley's Group Major in Cognitive Science and the Helen Wills Neuroscience Institute. "From a neuroscience perspective, visualizing these neural functions helps us study how the brain learns and adapts."
An array of hair-thin electrodes that the researchers use to record electrical signals from the brain.
Of course, the brain-machine interface depends on nothing getting lost in the translation between thought and robotic action. At Berkeley, Carmena is attacking every layer of the interface between mind and machine. First, he will collaborate with researchers at the university's state-of-the-art microfabrication facilities to develop better electrode arrays. Currently, small movements of the recording electrodes in the cortex often damage the neurons, weakening the signal over time.
"So far, we've recorded signals for two years, but in a human patient you'd want the electrodes to work for much longer," Carmena says. "We need a device as dependable as a pacemaker, but for the brain."
Along with new electrodes, Carmena and his colleagues are improving the brain-machine interface itself. The aim is a small device that requires very little power but can handle the large bandwidth of brain signals. Then, once the signals are routed to the computer, advanced software algorithms are required to distinguish the signal from the noise. Finally, they must design and test novel prosthetic devices, from robot grippers and exoskeletons to software that could enable patients with neurological disorders such as locked-in syndrome to interact with the external world via computer.
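To make the decoding step concrete, here is a minimal sketch of one common baseline approach: mapping binned spike counts from many recorded channels onto two-dimensional cursor velocity with a regularized linear decoder. Everything in it, from the synthetic data to the channel count and the choice of ridge regression, is an illustrative assumption for the sake of a runnable example, not a description of Carmena's actual algorithms.

    import numpy as np

    # Illustrative only: decode 2-D cursor velocity from binned spike counts
    # with a ridge-regularized linear map, a common baseline in BMI research.
    # The bin count, channel count, and noise levels are arbitrary assumptions.
    rng = np.random.default_rng(0)
    n_bins, n_channels = 5000, 128

    # Synthetic spike counts and synthetic "true" velocities, so the example
    # runs without real recordings.
    spikes = rng.poisson(lam=2.0, size=(n_bins, n_channels)).astype(float)
    true_weights = rng.normal(size=(n_channels, 2))
    velocity = spikes @ true_weights + rng.normal(scale=5.0, size=(n_bins, 2))

    # Fit the decoder on the first 80% of bins, evaluate on the rest.
    split = int(0.8 * n_bins)
    X_train, X_test = spikes[:split], spikes[split:]
    y_train, y_test = velocity[:split], velocity[split:]

    # Ridge regression in closed form: W = (X'X + lambda*I)^(-1) X'Y
    lam = 1.0
    W = np.linalg.solve(X_train.T @ X_train + lam * np.eye(n_channels),
                        X_train.T @ y_train)

    pred = X_test @ W
    corr = [np.corrcoef(pred[:, d], y_test[:, d])[0, 1] for d in range(2)]
    print(f"decoded vs. actual velocity correlation (x, y): {corr[0]:.2f}, {corr[1]:.2f}")

In practice, researchers typically layer more sophisticated filters, such as Kalman or adaptive decoders, on top of this kind of linear baseline, and the decoder must be retrained as the population of recorded neurons drifts over time.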
Eventually, Carmena says, he'd like to close the loop on brain-machine interfaces so that users do more than simply watch the device they're controlling. Feedback, he explains, is essential for proprioception, our sense of the body's place in the physical world. For example, proprioception is what enables us to touch a finger to our nose even with our eyes closed.
"We'd like patients to feel where the artificial arm is in space and perhaps even experience the sense of touch," Carmena says. "We've shown that we can extract signals from the brain, but can we encode information back into it?"
Jose Carmena's home page
Helen Wills Neuroscience Institute
Cognitive Science at UC Berkeley
UCSF/UCB Joint Graduate Group in Bioengineering
"Human Studies Show Feasibility of Brain-Machine Interfaces" (Duke University News and Communications, March 23, 2004)
"Monkeys Consciously Control a Robot Arm Using Only Brain Signals; Appear to 'Assimilate' Arm As If it Were Their Own " (Duke University News and Communications, October 13, 2003)