Brain-Machine Interfaces for Rehabilitation

NeuroREX project


Description:

The working principle of wheelchairs for people with lower-limb paralysis has remained the same for centuries. In the last decade, researchers around the world have focused their efforts on building human-machine systems (HMS) that provide mobility closer to the natural walking patterns of able-bodied people. Among commercial robotic exoskeleton systems, the REX exoskeleton (REX Bionics, New Zealand) can walk without external support (such as a walker or canes) through a set of statically balanced configurations. The user initiates walking (as well as turning, sitting, standing, and stepping up/down) with a joystick attached to the handlebar of the system.
At the University of Houston, our research focuses on providing more natural and intuitive ways of supplying command signals to such systems – that is, the brain-machine interface. Most exoskeletons are currently controlled by an external operator or by detecting the user’s intent from upper-body gestures, hand controls, joysticks, or motion sensors embedded in walkers or instrumented canes. Although other control signals are usable, it can be argued that the most natural approach is to harness the user’s own brain activity and use it to drive the system. In this way, the user would simply think about walking or sitting, and the system would interpret the changes in his/her brain activity – that is, his/her motor intentions – as commands to the exoskeleton. The system would do this in real time, leaving the user’s hands, voice, and eyes free for communication and tool use.
We use a non-invasive active EEG system to acquire brain waves that are processed in real time to extract the user’s intent to control exoskeletons such as the REX robot. An EEG cap is placed on the user’s head, and his/her brain waves are measured in real time while he/she performs a series of motions with the exoskeleton. We then use advanced algorithms to map the slow modulation of the user’s brain waves (amplitude modulation of the slow cortical potentials) to each motion performed with the exoskeleton. During an initial calibration phase, we record as few as 5-10 minutes of EEG to build the decoder model that extracts motor intent, specific to the tasks performed with the robot. We then ask the user to control the exoskeleton with thought alone (thereby putting the user in the loop): the user thinks about walking (kinesthetically) while we evaluate the model with his/her real-time EEG data. We also develop real-time artifact removal algorithms to further improve the system’s accuracy and robustness.
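The calibrate-then-decode idea described above can be sketched as a simple linear classifier operating on time-embedded, delta-band-filtered EEG. This is only a minimal illustration with synthetic shapes and a basic least-squares discriminant – not the lab's actual decoder – and all function and variable names are hypothetical:

```python
import numpy as np
from scipy.signal import butter, filtfilt

def delta_band(eeg, fs=100.0, lo=0.1, hi=2.0):
    """Band-pass filter each EEG channel to the slow (delta) band,
    isolating the amplitude modulation of slow cortical potentials."""
    b, a = butter(2, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, eeg, axis=0)

def time_embed(eeg, lags=10):
    """Stack the current and previous `lags - 1` samples of every channel,
    so the decoder sees a short history of the slow-wave modulation."""
    n = eeg.shape[0]
    return np.hstack([eeg[lags - 1 - k : n - k] for k in range(lags)])

def fit_lda(X, y):
    """Two-class linear discriminant fit on calibration data:
    w = Sigma^-1 (mu1 - mu0), with a small ridge term for stability."""
    mu0, mu1 = X[y == 0].mean(0), X[y == 1].mean(0)
    cov = np.cov(X.T) + 1e-6 * np.eye(X.shape[1])
    w = np.linalg.solve(cov, mu1 - mu0)
    b = -0.5 * (mu0 + mu1) @ w
    return w, b

def predict(X, w, b):
    """Classify each time-embedded feature vector as intent 0 or 1."""
    return (X @ w + b > 0).astype(int)
```

In a real-time setting, `predict` would run on a sliding window of freshly filtered EEG, and its output would be mapped to an exoskeleton command (e.g., walk vs. stand).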

 

Team Members:
  • Atilla Kilicarslan
  • Matthew Green

 

Supported by:
  • National Institute of Neurological Disorders and Stroke (NINDS)  R01NS075889-01

 

Representative Publications:
  • Kilicarslan, Atilla, Saurabh Prasad, Robert G. Grossman, and Jose L. Contreras-Vidal. “High accuracy decoding of user intentions using EEG to control a lower-body exoskeleton.” In Engineering in Medicine and Biology Society (EMBC), 2013 35th Annual International Conference of the IEEE, pp. 5606-5609. IEEE, 2013.

 

 

NASA X1 Project


Description:
     Stroke remains a leading cause of disability, limiting independent ambulation in survivors, and consequently affecting quality of life (QOL). Recent technological advances in neural interfacing with robotic rehabilitation devices are promising in the context of gait rehabilitation. Here, the X1, NASA’s powered robotic lower limb exoskeleton, is introduced as a potential diagnostic, assistive, and therapeutic tool for stroke rehabilitation. Additionally, the feasibility of decoding lower limb joint kinematics and kinetics during walking with the X1 from scalp electroencephalographic (EEG) signals – the first step towards the development of a brain-machine interface (BMI) system to the X1 exoskeleton – is demonstrated.

 

Team Members:
  • Yongtian He
  • Kevin Nathan
  • Anusha Venkatakrishnan

 

Collaborators:

 

Representative Publications:
  • An Integrated Neuro-Robotic Interface for Stroke Rehabilitation using the NASA X1 Powered Lower Limb Exoskeleton. In Engineering in Medicine and Biology Society (EMBC), 2014 36th Annual International Conference of the IEEE (accepted).

 

 

 

Human-Machine System for the H2 Lower Limb Exoskeleton (H2-NeuroExo)

 

Supported by:
  • Laboratory of Noninvasive Brain Machine Interface System, University of Houston
  • TIRR Memorial Hermann
  • Consejo Superior de Investigaciones Científicas
  • ClinicalTrials.gov, NCT02114450

 

Description:
Many stroke survivors suffer from gait dysfunction, which makes gait restoration one of the main goals of post-stroke rehabilitation. Compared to traditional therapy, in which a therapist physically guides the impaired leg to reinforce a normal walking pattern, therapies that employ robotic devices provide a more economical and less labor-intensive way to deliver the consistent, repetitive motor practice needed for stroke rehabilitation.

This study investigates the use of a six-degree-of-freedom, lightweight, smart lower-limb robotic exoskeleton, the H2 Exo (developed by CSIC, Spain), in rehabilitation after stroke. It compares robotic-assisted rehabilitation with supervised motor practice, particularly in terms of functional recovery. Additionally, the study examines brain plasticity associated with robotic-assisted training using non-invasive scalp electroencephalography (EEG), as well as changes in lower-limb joint kinematics during training. So far, six stroke patients have participated in the experiment, with data recorded over five weeks of robot assist-as-needed overground walking sessions. Preliminary results showed that most patients followed the normal gait trajectory more closely at the end of the five-week therapy than in the first few sessions. Total walking steps and steps/min also increased modestly across most subjects. EEG analysis further showed that walking intent during robot-assisted walking can be decoded from the patients’ scalp brain waves.
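Two of the session metrics mentioned above – how closely a patient's joint trajectory follows a normative gait template, and walking cadence – can be computed very simply. The sketch below uses a plain RMSE and steps/min; function and variable names are illustrative, not the study's actual analysis code:

```python
import numpy as np

def trajectory_rmse(patient_deg, template_deg):
    """Root-mean-square deviation (degrees) between a patient's joint-angle
    trajectory and a normative gait template, both resampled to the same
    number of samples per gait cycle."""
    p = np.asarray(patient_deg, dtype=float)
    t = np.asarray(template_deg, dtype=float)
    return float(np.sqrt(np.mean((p - t) ** 2)))

def cadence(step_times_s):
    """Steps per minute from a sorted sequence of step timestamps (seconds)."""
    ts = np.asarray(step_times_s, dtype=float)
    duration = ts[-1] - ts[0]
    return float((len(ts) - 1) / duration * 60.0)
```

A decreasing `trajectory_rmse` across sessions would correspond to the "closer to normal gait" trend reported above, and `cadence` tracks the steps/min improvement.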

Taken together, the findings from this research will be used to understand human-robot interaction and to design smart powered orthotic devices that can be controlled directly by brain activity and assist those who have lost all or part of their walking ability due to neurological disease or injury. Moreover, this study will systematically track neuroplasticity associated with functional recovery after stroke, helping determine optimal treatment windows that maximize therapeutic benefit. Lastly, it will help characterize markers of learning to use these new devices, which will be important in the clinical setting for modifying and adapting rehabilitation protocols to suit the changing needs of the patient (user).

 

Team Members:
  • Anusha Venkatakrishnan
  • Magdo Bortole
  • Fangshi Zhu
  • Shuo-Hsiu Chang
  • Jonathan Kung
  • Víctor Ernesto Issa García
  • Theodore Kahn

 

 

Representative Publications:

  • Magdo Bortole, Anusha Venkatakrishnan, Fangshi Zhu, Juan C. Moreno, Jose L. Contreras-Vidal, and Jose L. Pons. “The H2 Robotic Exoskeleton for Gait Rehabilitation After Stroke: Early Findings From a Clinical Study.” Journal of NeuroEngineering and Rehabilitation, 2014 (accepted).
  • Anusha Venkatakrishnan, Gerard E. Francisco, and Jose L. Contreras-Vidal. “Applications of Brain–Machine Interface Systems in Stroke Recovery and Rehabilitation.” Current Physical Medicine and Rehabilitation Reports (2014) 2:93–105.