February 6th, Wednesday 14:15, Room 303, Jacobs Building
Brain-computer interfaces (BCIs) allow people to interact with external devices using thought alone. Of particular interest is enabling people to control an alternative body, either a virtual avatar or a physical robot, by remapping motor imagery to the corresponding body parts. We have demonstrated this possibility in previous research using an electroencephalogram (EEG)-based BCI. EEG, however, suffers from high noise levels and low spatial resolution. We are therefore now exploring real-time functional magnetic resonance imaging (fMRI) for BCI. Activity in specific regions of interest in the brain is sampled online to identify subjects' intentions and convert them into actions performed by an avatar or a humanoid robot. This process has allowed subjects located in Israel to control a HOAP3 humanoid robot in France, experiencing the whole session through the eyes of the robot. In the talk I will present experimental results as well as preliminary results on improved classification using machine-learning techniques. This research is carried out in collaboration with additional partners as part of the EU project VERE.
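The online pipeline described above (sample ROI activity, decode the intended movement, send a command to the avatar or robot) can be sketched minimally as follows. This is an illustrative assumption, not the actual system: the ROI names, activation values, nearest-centroid classifier, and command names are all invented for the example.

```python
# Hypothetical sketch of intention decoding from ROI-averaged fMRI
# samples: a nearest-centroid rule maps each sample to a motor-imagery
# class, which is then translated into a robot command. All names and
# numbers are illustrative, not the experimental pipeline itself.
from math import dist

# Per-class mean activation over three regions of interest
# (e.g. left-hand, right-hand, and foot motor areas), as might be
# estimated from a localizer run (values invented).
CENTROIDS = {
    "left_hand":  (0.9, 0.1, 0.2),
    "right_hand": (0.1, 0.9, 0.2),
    "feet":       (0.2, 0.2, 0.9),
}

# Decoded intention -> command sent to the avatar/robot (hypothetical).
ACTIONS = {
    "left_hand":  "TURN_LEFT",
    "right_hand": "TURN_RIGHT",
    "feet":       "WALK_FORWARD",
}

def decode(sample):
    """Return the class whose centroid is nearest to the ROI sample."""
    return min(CENTROIDS, key=lambda c: dist(CENTROIDS[c], sample))

def to_command(sample):
    """Map one online ROI sample to a robot command."""
    return ACTIONS[decode(sample)]
```

For instance, `to_command((0.85, 0.15, 0.1))` yields `"TURN_LEFT"`. In practice the classifier would be trained on whole-ROI time courses and run continuously against the scanner's real-time output, but the decode-then-act structure is the same.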