Event Abstract

A P300-Based Brain-Robot Interface for Shaping Human-Robot Interaction

  • 1 Bielefeld University, CoR-Lab, P.O. Box 10 01 31, D-33501 Bielefeld, Germany
  • 2 Honda Research Institute Europe GmbH, Germany

Brain-computer interfaces (BCIs) based on the P300 event-related potential (ERP) have been studied widely in the past decade. These BCIs exploit rare stimuli, so-called oddballs, presented on a computer screen in random order to implement a binary selection mechanism. The P300 potential has been linked to human surprise, meaning that P300 potentials are triggered by unpredictable events. This hypothesis is the basis of the oddball paradigm. In this work, we go beyond the standard paradigm and exploit the P300 in a more natural fashion for shaping human-robot interaction (HRI). In HRI, flawless robot behavior is essential to avoid confusion or anxiety in the human user when interacting with the robot. Detecting such reactions in the human user on the fly and providing instantaneous feedback to the robot is crucial. Ideally, the feedback system does not demand additional cognitive load and operates automatically in the background. In other words, providing feedback from the human user to the robot should be an inherent feature of the human-machine interaction framework. Information extracted from the human EEG, in particular the P300, is a well-suited candidate for serving as input to this feedback loop.
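The oddball paradigm described above can be illustrated with a minimal sketch of a stimulus schedule: rare "target" (oddball) stimuli interleaved at random with frequent "standard" stimuli. The function name and the 20% target probability are illustrative assumptions, not parameters from the abstract.

```python
import numpy as np

def oddball_sequence(n_trials, p_target=0.2, seed=0):
    """Hypothetical oddball schedule: each trial is a rare 'target'
    with probability p_target, otherwise a frequent 'standard'."""
    rng = np.random.default_rng(seed)
    return ["target" if rng.random() < p_target else "standard"
            for _ in range(n_trials)]

# Example: a 20-trial sequence with ~20% targets.
sequence = oddball_sequence(20)
```

Because targets are rare and unpredictable, each one constitutes a surprising event for the observer and is expected to elicit a P300.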

We propose to use the P300 as a means for human-robot interaction, in particular to spot moments of surprise in the human user and thereby detect, as they occur, any mistakes in the robot's behavior that the user observes. In this way, the robot can notice its mistakes as early as possible and correct them accordingly. Our brain-robot interface implementing the proposed feedback system consists of the following core modules: (1) a "P300 spotter" that analyzes the incoming preprocessed data stream to identify P300 potentials on a single-trial basis, and (2) a "translation" module that translates the detected P300s into appropriate feedback signals to the robot. The classification relies on a supervised machine learning algorithm that requires labeled training data. This data must be collected per subject to account for the high inter-subject variance typically found in EEG data. The offline training needs to be carried out only once, prior to using the interface. The trained classifier is then employed for online detection of P300 signals. During online operation, the incoming multi-channel EEG data is recorded and analyzed continuously. Each incoming sample vector is added to a new analysis window. Spectral, spatial, and temporal features are extracted from the filtered windows. The resulting feature vectors are classified, and each vector is assigned a probability that it contains a P300. Finally, a feedback signal to the robot is generated based on the classification result, either a class label or a probability between 0 and 1. The proposed framework was tested offline in a scenario using Honda's humanoid robot ASIMO. This scenario is suited for eliciting P300 events in a controlled experimental environment without neglecting the constraints of real robots. We recorded EEG data during interaction with ASIMO and applied our method offline. In the future, we plan to extend our system to a fully online operating framework.
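The two core modules above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the window size, channel count, decimation-based temporal features, the LDA-style linear classifier, and the decision threshold are all assumptions introduced here for concreteness.

```python
import numpy as np

# Hypothetical parameters -- not taken from the abstract.
WIN = 128       # window length in samples (~500 ms at 256 Hz)
CHANNELS = 8    # number of EEG channels
DECIM = 8       # temporal decimation factor for feature extraction

def extract_features(window):
    """Temporal features only (a simplification): decimate each
    channel of a (WIN, CHANNELS) window and flatten."""
    return window[::DECIM].ravel()

class P300Spotter:
    """'P300 spotter' sketch: a regularized LDA-style linear
    classifier, trained offline on labeled feature vectors and used
    online to assign each window a P300 probability."""

    def fit(self, X, y):
        mu0, mu1 = X[y == 0].mean(0), X[y == 1].mean(0)
        cov = np.cov(X.T) + 1e-3 * np.eye(X.shape[1])  # regularized
        self.w = np.linalg.solve(cov, mu1 - mu0)
        self.b = -0.5 * self.w @ (mu0 + mu1)
        return self

    def predict_proba(self, window):
        """Probability that the window contains a P300."""
        z = np.clip(self.w @ extract_features(window) + self.b, -50, 50)
        return float(1.0 / (1.0 + np.exp(-z)))

def feedback_signal(spotter, window, threshold=0.5):
    """'Translation' module sketch: map the classifier output to a
    feedback signal for the robot -- a class label plus the raw
    probability."""
    p = spotter.predict_proba(window)
    return ("error" if p > threshold else "ok", p)
```

In this sketch, the offline training step corresponds to `fit`, run once per subject on labeled windows; online, each incoming window is passed through `feedback_signal` to produce either the class label or the probability mentioned in the text.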

Conference: Bernstein Conference on Computational Neuroscience, Frankfurt am Main, Germany, 30 Sep - 2 Oct, 2009.

Presentation Type: Oral Presentation

Topic: Neurotechnology and brain computer interfaces

Citation: Finke A, Jin Y and Ritter H (2009). A P300-Based Brain-Robot Interface for Shaping Human-Robot Interaction. Front. Comput. Neurosci. Conference Abstract: Bernstein Conference on Computational Neuroscience. doi: 10.3389/conf.neuro.10.2009.14.108

Copyright: The abstracts in this collection have not been subject to any Frontiers peer review or checks, and are not endorsed by Frontiers. They are made available through the Frontiers publishing platform as a service to conference organizers and presenters.

The copyright in the individual abstracts is owned by the author of each abstract or his/her employer unless otherwise stated.

Each abstract, as well as the collection of abstracts, are published under a Creative Commons CC-BY 4.0 (attribution) licence (https://creativecommons.org/licenses/by/4.0/) and may thus be reproduced, translated, adapted and be the subject of derivative works provided the authors and Frontiers are attributed.

For Frontiers’ terms and conditions please see https://www.frontiersin.org/legal/terms-and-conditions.

Received: 27 Aug 2009; Published Online: 27 Aug 2009.

* Correspondence: Andrea Finke, Bielefeld University, CoR-Lab, P.O. Box 10 01 31, D-33501 Bielefeld, Germany, afinke@techfak.uni-bielefeld.de