Movement intention based Brain Computer Interface for Virtual Reality and Soft Robotics rehabilitation using novel autocorrelation analysis of EEG

Full text not archived in this repository.

Wairagkar, M., Zoulias, I., Oguntosin, V., Hayashi, Y. ORCID: https://orcid.org/0000-0002-9207-6322 and Nasuto, S. ORCID: https://orcid.org/0000-0001-9414-9049 (2016) Movement intention based Brain Computer Interface for Virtual Reality and Soft Robotics rehabilitation using novel autocorrelation analysis of EEG. In: 2016 6th IEEE International Conference on Biomedical Robotics and Biomechatronics (BioRob), 26-29 June 2016.

Abstract/Summary

A Brain Computer Interface (BCI) can be an effective tool for actively engaging patients in motor rehabilitation by enabling them to initiate movement by sending commands to the BCI directly via their brain activity. In this paper, we develop a BCI that uses a novel EEG analysis to control a Virtual Reality avatar and a Soft Robotics rehabilitation device. The BCI is able to identify and predict upper limb movement. Autocorrelation analysis was performed on the EEG to study the complex oscillatory processes involved in motor command generation. The autocorrelation represents the interplay between oscillatory and decaying processes in the EEG, which changes during voluntary movement. To investigate these changes, an exponential decay curve was fitted to the autocorrelation of EEG windows, capturing the autocorrelation decay. Autocorrelation was observed to decay more slowly during voluntary movement and faster otherwise; thus, movement intention could be identified. This new method was translated into online signal processing, allowing the BCI to control the virtual avatar hand and the soft robotic rehabilitation device when the user intends to move an upper limb. The soft robotic device, placed on the joint between the upper and lower arm, inflated and deflated, resulting in extension and flexion of the arm and providing proprioceptive feedback. The avatar arm, viewed in a virtual 3D environment with an Oculus Rift, moved simultaneously, providing strong visual feedback.
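The decay-fitting idea in the abstract can be sketched as follows. This is a minimal illustration, not the authors' pipeline: the synthetic damped-oscillation signals, sampling rate, window length, and the use of `scipy.optimize.curve_fit` on the autocorrelation envelope are all assumptions made here for demonstration. The only property taken from the abstract is that a slower autocorrelation decay (larger fitted time constant) marks a movement-intention window.

```python
# Sketch: fit an exponential decay A*exp(-lag/tau) to the autocorrelation
# of an EEG window; a larger tau (slower decay) would indicate movement
# intention per the abstract. All parameters below are illustrative.
import numpy as np
from scipy.optimize import curve_fit

def autocorr(x):
    """Normalised autocorrelation of a 1-D signal at non-negative lags."""
    x = x - x.mean()
    r = np.correlate(x, x, mode="full")[len(x) - 1:]
    return r / r[0]

def decay_constant(window, fs):
    """Fit A*exp(-lag/tau) to the autocorrelation envelope; return tau (s)."""
    r = autocorr(window)
    lags = np.arange(len(r)) / fs
    env = np.abs(r)  # crude envelope of the oscillatory autocorrelation
    popt, _ = curve_fit(lambda t, a, tau: a * np.exp(-t / tau),
                        lags, env, p0=(1.0, 0.1), maxfev=5000)
    return popt[1]

# Synthetic 1-second windows: a damped 10 Hz (mu-band-like) oscillation
# whose damping differs between a "rest" and a "movement" window.
fs = 256
t = np.arange(fs) / fs
rng = np.random.default_rng(0)
rest = np.exp(-t / 0.05) * np.sin(2 * np.pi * 10 * t) + 0.05 * rng.standard_normal(fs)
move = np.exp(-t / 0.30) * np.sin(2 * np.pi * 10 * t) + 0.05 * rng.standard_normal(fs)

# The "movement" window's autocorrelation should decay more slowly,
# giving a larger fitted time constant than the "rest" window.
print(decay_constant(rest, fs), decay_constant(move, fs))
```

In an online BCI, such a fit would be repeated over sliding EEG windows, with the fitted time constant thresholded or classified to trigger the avatar and robotic device.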

Item Type Conference or Workshop Item (Paper)
URI https://reading-clone.eprints-hosting.org/id/eprint/73375
Refereed Yes
Divisions Life Sciences > School of Biological Sciences > Biomedical Sciences

