The patients used brain-machine interfaces, including a virtual reality system that used their own brain activity to simulate full control of their legs.
The research -- led by Duke University neuroscientist Miguel Nicolelis, Ph.D., as part of the Walk Again Project in São Paulo, Brazil -- offers promise for people with spinal cord injury, stroke and other conditions to regain strength, mobility and independence.
"What we're showing in this paper is that patients who used a brain-machine interface for a long period of time experienced improvements in motor behavior, tactile sensations and visceral functions below the level of the spinal cord injury," he said. "Until now, nobody has seen recovery of these functions in a patient so many years after being diagnosed with complete paralysis."
Several patients saw changes after seven months of training. After a year, four patients' sensation and muscle control changed significantly enough that doctors upgraded their diagnoses from complete to partial paralysis.
Most patients saw improvements in their bladder control and bowel function, reducing their reliance on laxatives and catheters, he said. These changes reduce patients' risk of infections, which are common in patients with chronic paralysis and are a leading cause of death, Nicolelis said.
Brain-machine systems establish direct communication between the brain and external devices such as computers or prosthetics, including robotic limbs. For nearly two decades, Nicolelis has worked to build and hone systems that record hundreds of simultaneous signals from neurons in the brain, extracting motor commands from those signals and translating them into movement.
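The record-decode-translate loop described above can be sketched in code. This is a minimal illustration with entirely synthetic data, assuming a simple linear (Wiener-style) readout of the kind used in early BMI research; it is not the Walk Again Project's actual decoder.

```python
import numpy as np

# Hypothetical illustration: decode 2-D movement velocity from the firing
# rates of simulated neurons with a linear decoder. All data is synthetic;
# neuron counts and noise levels are arbitrary choices for the sketch.

rng = np.random.default_rng(0)

n_neurons, n_samples = 100, 2000
tuning = rng.normal(size=(n_neurons, 2))            # each neuron's tuning to (vx, vy)
velocity = rng.normal(size=(n_samples, 2))          # intended movement
rates = velocity @ tuning.T + 0.5 * rng.normal(size=(n_samples, n_neurons))

# Fit decoder weights by least squares: firing rates -> velocity
W, *_ = np.linalg.lstsq(rates, velocity, rcond=None)

decoded = rates @ W
corr = np.corrcoef(decoded[:, 0], velocity[:, 0])[0, 1]
print(f"decoded vs. intended velocity correlation: {corr:.2f}")
```

With enough simultaneously recorded neurons, even this simple linear mapping recovers the intended movement well, which is why recording "hundreds of simultaneous signals" matters.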
Nicolelis and colleagues believe that, with weekly training, the rehab patients re-engaged spinal cord nerves that survived the car crashes, falls and other trauma that paralyzed their lower limbs.
"One previous study has shown that a large percentage of patients who are diagnosed as having complete paraplegia may still have some spinal nerves left intact," Nicolelis said. "These nerves may go quiet for many years because there is no signal from the cortex to the muscles. Over time, training with the brain-machine interface could have rekindled these nerves. It may be a small number of fibers that remain, but this may be enough to convey signals from the motor cortical area of the brain to the spinal cord."
In one early experiment, Nicolelis used brain-implanted microelectrodes to record the brain activity of rats trained to pull a robotic lever to get a sip of water. Through a brain-machine interface, the rats learned to control the lever using only their brain activity.
In later endeavors, Nicolelis trained rhesus monkeys to use brain-machine interfaces to control robotic limbs, and later, the 3-D movements of avatars -- animated versions of themselves on a digital screen. The animals soon learned they could control the movements by thought alone; there was no need to physically move.
The rhesus monkeys later learned to walk on a treadmill with robotic legs controlled by their brains. They also learned they could use thought to propel a small electric wheelchair toward a bowl of grapes.
The Duke experiments with rats and primates built a foundation for the work in human patients, including a 2004 article with Duke neurosurgeon Dennis Turner, M.D., that established a model for recording brain activity in patients when they used a hand to grip a ball with varied force.
The patients spent at least two hours a week using brain-machine interfaces, or devices controlled through their brain signals. All began the program by learning how to operate their own avatar, or digital likeness, in a virtual reality environment.
The patients wore fitted caps lined with 11 non-invasive electrodes to record their brain activity through EEG. Initially, when participants were asked to imagine walking in the virtual environment, scientists didn't observe the expected signals in the areas associated with motor control of their legs.
"If you said, use your hands, there was modulation of brain activity," Nicolelis said. "But the brain had almost completely erased the representation of their lower limbs."
After months of training, scientists began to observe the brain activity they expected to see when the patients thought about moving their legs. "Basically, the training reinserted the representation of lower limbs into the patients' brains," Nicolelis said.
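The signal the scientists were looking for is commonly measured as a change in EEG rhythm over motor cortex: imagined movement typically suppresses mu-band (8-12 Hz) power, an effect known as event-related desynchronization. The sketch below illustrates that readout on synthetic data; the sampling rate, amplitudes and noise are invented for the example and are not taken from the study.

```python
import numpy as np

# Illustrative sketch (synthetic data): detect imagined movement as a drop
# in mu-band (8-12 Hz) EEG power relative to rest.

fs = 250                          # assumed sampling rate, Hz
t = np.arange(0, 4, 1 / fs)       # one 4-second trial
rng = np.random.default_rng(1)

def mu_band_power(signal, fs):
    """Average spectral power in the 8-12 Hz band, via FFT."""
    freqs = np.fft.rfftfreq(signal.size, 1 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / signal.size
    band = (freqs >= 8) & (freqs <= 12)
    return psd[band].mean()

# Rest: strong 10 Hz mu rhythm. Imagined movement: mu rhythm suppressed.
rest = np.sin(2 * np.pi * 10 * t) + 0.3 * rng.normal(size=t.size)
imagery = 0.2 * np.sin(2 * np.pi * 10 * t) + 0.3 * rng.normal(size=t.size)

erd = 1 - mu_band_power(imagery, fs) / mu_band_power(rest, fs)
print(f"mu-band power drop during imagery: {erd:.0%}")
```

In the patients described here, no such modulation appeared for the legs at first; it emerged only after months of training.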
As they progressed, patients graduated from virtual reality to more challenging equipment that required more control over their posture, balance and ability to use their upper limbs, including two commercially available walking devices used in some physical therapy centers in the U.S.: the ZeroG and the Lokomat. Both use overhead harnesses to support a patient's weight as they build strength and proper gait after paralysis due to injury or neurological conditions such as stroke.
The patients rotated through other training systems that applied robotics, including the brain-controlled exoskeleton Juliano Pinto wore at the 2014 World Cup.
During most of their training, the participants also wore a sleeve equipped with a touch technology called haptic feedback to enrich the experience and train their brains, Nicolelis said. Haptics use varied vibrations to offer tactile feedback, much like the buzzing jolts or kickbacks gamers feel through a handheld controller.
"The tactile feedback is synchronized and the patient's brain creates a feeling that they are walking by themselves, not with the assistance of devices," Nicolelis said. "It induces an illusion that they are feeling and moving their legs. Our theory is that by doing this, we induced plasticity not only at the cortical level, but also at the spinal cord."
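The key idea in the quote is synchronization: the tactile cue must track the walking cycle so the brain can bind it to the legs. A minimal sketch of that mapping, with an entirely hypothetical gait-phase-to-vibration function (the decay constant and peak timing are invented for illustration):

```python
import numpy as np

# Hypothetical sketch: drive a forearm sleeve's vibration intensity from
# the gait cycle, peaking at each heel strike (phase 0) and decaying
# through the stride. Constants are arbitrary illustration choices.

def vibration_intensity(gait_phase):
    """Map gait phase in [0, 1) to a vibration intensity in [0, 1]."""
    return np.exp(-5.0 * (gait_phase % 1.0))

phases = np.linspace(0, 2, 9)  # two full strides
for p, v in zip(phases, vibration_intensity(phases)):
    print(f"phase {p:4.2f} -> intensity {v:.2f}")
```

Because the cue repeats in lockstep with each step, the brain can attribute the sensation to walking itself, which is the illusion Nicolelis describes.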
MEDICA-tradefair.com; Source: Duke University Medical Center