Researchers at the University of Pittsburgh School of Medicine and UPMC 
describe in PLoS ONE how an electrode array sitting on top of the brain 
enabled a 30-year-old paralyzed man to control the movement of a 
character on a computer screen in three dimensions with just his 
thoughts. It also enabled him to move a robot arm to touch a friend's 
hand for the first time in the seven years since he was injured in a 
motorcycle accident.
With brain-computer interface (BCI) technology, the thoughts of Tim 
Hemmes, who sustained a spinal cord injury that left him unable to move 
his body below the shoulders, were interpreted by computer algorithms 
and translated into intended movement of a computer cursor and, later, a
 robot arm, explained lead investigator Wei Wang, Ph.D., assistant 
professor, Department of Physical Medicine and Rehabilitation, Pitt 
School of Medicine.
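For readers who want a concrete picture of what "interpreted by computer algorithms" can mean in practice, here is a minimal sketch of one common decoding scheme: a linear map from per-electrode signal power to an intended cursor velocity. It is illustrative only; the 28-electrode count comes from the article, but the band-power features, the linear form, and every name below are assumptions, not the study's actual code.

import numpy as np

N_ELECTRODES = 28   # matches the 28-electrode ECoG grid described in the article
N_DIMS = 3          # up/down, left/right, in/out

# Hypothetical decoder weights; in practice these would be learned during
# calibration (see the fitting sketch later in the article).
W = np.zeros((N_DIMS, N_ELECTRODES))
b = np.zeros(N_DIMS)

def decode_velocity(band_power):
    """Map one time step of per-electrode band-power features (shape (28,))
    to an intended cursor velocity (shape (3,))."""
    return W @ band_power + b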
"When Tim reached out to high-five me with the robotic arm, we knew 
this technology had the potential to help people who cannot move their 
own arms achieve greater independence," said Dr. Wang, reflecting on a 
memorable scene from September 2011 that was re-told in stories around 
the world. "It's very important that we continue this effort to fulfill 
the promise we saw that day."
Six weeks before the implantation surgery, the team conducted 
functional magnetic resonance imaging (fMRI) of Mr. Hemmes' brain while 
he watched videos of arm movement. They used that information to place a
postage stamp-size electrocorticography (ECoG) grid of 28 recording
electrodes on the surface of the brain region that fMRI showed 
controlled right arm and hand movement. Wires from the device were 
tunneled under the skin of his neck to emerge from his chest where they 
could be connected to computer cables as necessary.
For 12 days at his home and nine days in the research lab, Mr. Hemmes
 began the testing protocol by watching a virtual arm move, which 
triggered neural signals that were sensed by the electrodes. Distinct 
signal patterns for particular observed movements were used to guide the
up and down motion of a ball on a computer screen. Soon after mastering movement of the ball in two dimensions, up/down and right/left, he was also able to move it in/out accurately on a three-dimensional display.
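One way such a decoder can be calibrated from observation, as a rough sketch under assumed names and a simple least-squares fit (not the study's method): features recorded while the virtual arm is watched are regressed against the velocity of the watched movement, with the number of controlled dimensions growing from one to three.

import numpy as np

def fit_decoder(features, observed_velocity):
    """features: (n_samples, 28) per-electrode band power recorded while
    the virtual arm was watched; observed_velocity: (n_samples, n_dims)
    velocity of the watched movement, n_dims growing from 1 to 3 as
    control dimensions are added. Returns weights W and bias b."""
    X = np.hstack([features, np.ones((features.shape[0], 1))])  # add bias column
    coef, *_ = np.linalg.lstsq(X, observed_velocity, rcond=None)
    return coef[:-1].T, coef[-1]   # W: (n_dims, 28), b: (n_dims,)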
"During the learning process, the computer helped Tim hit his target 
smoothly by restricting how far off course the ball could wander," Dr. 
Wang said. "We gradually took off the 'training wheels,' as we called 
it, and he was soon doing the tasks by himself with 100 percent brain 
control."
The robot arm was developed by Johns Hopkins University's Applied 
Physics Laboratory. Currently, Jan Scheuermann, of Whitehall, Pa., is 
testing another BCI technology at Pitt/UPMC. 
Co-authors of the paper include Jennifer L. Collinger, Ph.D., Alan D.
 Degenhart, Andrew B. Schwartz, Ph.D., Douglas J. Weber, Ph.D., Brian 
Wodlinger, Ph.D., Ramana K. Vinjamuri, Ph.D., and Robin C. Ashmore, 
Ph.D., all of the University of Pittsburgh; Elizabeth C. Tyler-Kabara, 
M.D., Ph.D., and Michael L. Boninger, M.D., of the University of 
Pittsburgh and UPMC; Daniel W. Moran, Ph.D., of Washington University in
 St. Louis; and John W. Kelly, of Carnegie Mellon University.
The study was funded by the National Institute of Neurological 
Disorders and Stroke, part of the National Institutes of Health, the 
University of Pittsburgh's Clinical and Translational Science Institute,
 and UPMC.
Journal Reference:
- Wei Wang, Jennifer L. Collinger, Alan D. Degenhart, Elizabeth C. Tyler-Kabara, Andrew B. Schwartz, Daniel W. Moran, Douglas J. Weber, Brian Wodlinger, Ramana K. Vinjamuri, Robin C. Ashmore, John W. Kelly, Michael L. Boninger. An Electrocorticographic Brain Interface in an Individual with Tetraplegia. PLoS ONE, 2013; 8 (2): e55344. DOI: 10.1371/journal.pone.0055344
 
Courtesy: ScienceDaily 

