Brain-Machine Interfaces in Virtual Reality Applications

Avatar Project



This project aims to design neural decoders that translate neural activity acquired via scalp EEG into lower-limb movement kinematics. A second objective is to examine how the cortical representation of gait changes as healthy subjects adapt to virtual cortical lesions or to visual perturbations of gait kinematics. In this study, neural decoders for extracting lower-limb kinematics will be designed, and the predicted walking gait pattern will be used to control a walking avatar. It is hypothesized that subjects will adapt their brain patterns to compensate for the visual kinematic perturbations and virtual cortical lesions.
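A common baseline for this kind of continuous kinematics decoding is a time-lagged linear (ridge) regression from EEG samples to joint angles. The sketch below is purely illustrative and uses synthetic data; it is not the project's actual decoder, and all names (`build_lagged_features`, `fit_ridge_decoder`), dimensions, and the choice of ridge regression are assumptions for the example.

```python
import numpy as np

def build_lagged_features(eeg, n_lags):
    """Stack the current and previous (n_lags - 1) EEG samples at each
    time step, so the decoder can use a short temporal history."""
    n_samples, n_channels = eeg.shape
    X = np.zeros((n_samples - n_lags + 1, n_lags * n_channels))
    for i in range(n_lags):
        X[:, i * n_channels:(i + 1) * n_channels] = eeg[i:n_samples - n_lags + 1 + i]
    return X

def fit_ridge_decoder(X, y, alpha=1.0):
    """Closed-form ridge regression: W = (X'X + alpha*I)^-1 X'y."""
    n_features = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(n_features), X.T @ y)

# Synthetic example: 10 EEG channels decoded into 2 joint angles.
rng = np.random.default_rng(0)
eeg = rng.standard_normal((1000, 10))            # fake EEG, 1000 samples
X = build_lagged_features(eeg, n_lags=5)         # 5 lags x 10 channels = 50 features
true_W = rng.standard_normal((50, 2))            # hypothetical true mapping
angles = X @ true_W + 0.01 * rng.standard_normal((X.shape[0], 2))

W_hat = fit_ridge_decoder(X, angles, alpha=0.1)
predicted = X @ W_hat                            # decoded joint-angle trajectory
```

In practice the decoded trajectory (`predicted` here) would drive the avatar's gait, and decoder weights would be refit as the subject adapts to the perturbations.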

Team Members:

  • Trieu Phat Luu: Postdoctoral Research Associate
  • Yongtian He: PhD Graduate Student
  • Samuel Brown: Technical Staff
  • Kevin Nathan: PhD Graduate Student
  • Sho Nakagome: PhD Graduate Student
  • Fangfei Zhang: Undergraduate Student