MAIA aims to develop a brain-computer interface that recognizes, with millisecond resolution, the subject’s voluntary intent to perform primitive motor actions and conveys this intention to a robot that implements the low-level details needed to achieve complex tasks. To achieve this objective, we will take a radical departure from current assumptions and approaches in BCI. In particular, we will follow five innovative principles: (1) recognition of the subject’s motor intent from the analysis of high-resolution brain maps, which estimate intracranial potentials from scalp EEG; (2) adaptive shared autonomy between two intelligent agents (the human user and the robot), so that the user gives only high-level mental commands that the robot carries out autonomously; (3) use of haptic feedback to the user to accelerate training and facilitate accurate control; (4) recognition of brain events associated with high-level cognitive states, such as error recognition and alarm, to increase the reliability of brain-actuated robots; (5) on-line adaptation of the interface to the user, to keep the BCI constantly tuned to its user.

These principles will be demonstrated in three applications, against which the S&T objectives will be measured:

  • driving a wheelchair in an indoor environment;

  • controlling a robot arm for reaching and manipulation tasks;

  • handling emergency situations after recognizing the subject’s alarm state (e.g., braking the vehicle or retracting the robot arm).

MAIA ran from September 2004 to December 2007 and was coordinated by Prof. José del R. Millán, then at the Idiap Research Institute, in cooperation with:


  • Brain-Actuated Wheelchair (Navigation)

  • Brain-Actuated Wheelchair (Docking)

  • Simulated Brain-Actuated Wheelchair