
Design of auditory P300-based brain-computer interfaces with a single auditory channel and no visual support

  • Research Article
Cognitive Neurodynamics

Abstract

Non-invasive brain-computer interfaces (BCIs) based on the P300, an event-related potential (ERP) component elicited via the oddball paradigm, have been extensively developed to enable device control and communication. While most P300-based BCIs employ visual stimuli in the oddball paradigm, auditory P300-based BCIs also need to be developed for users with unreliable gaze control or limited visual processing. In particular, auditory BCIs that require neither additional visual support nor multi-channel sound sources can broaden the range of BCI applications. This study aimed to design optimal stimuli for such auditory BCIs, comparing artificial (e.g., beep) and natural (e.g., human voice and animal) sounds. In addition, it investigated differences between auditory and visual stimulation in online P300-based BCIs. Natural sounds led to both higher online BCI performance and larger differences in ERP amplitudes between target and non-target stimuli than artificial sounds. However, no single type of sound offered the best performance for all subjects; rather, individual subjects showed different preferences between the human voice and animal sounds. In line with previous reports, visual stimuli yielded higher BCI performance (average 77.56%) than auditory stimuli (average 54.67%). Moreover, the spatiotemporal patterns of the target/non-target ERP amplitude differences were more dynamic with visual stimuli than with auditory stimuli. These results suggest that selecting the natural auditory stimulus optimal for an individual user, as well as making target/non-target ERP amplitude differences more dynamic, may further improve auditory P300-based BCIs.
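For readers unfamiliar with the paradigm, the target/non-target ERP contrast the abstract refers to can be illustrated with a minimal simulation. The sketch below is not taken from the study; the sampling rate, epoch length, amplitudes, and the 20% target probability are illustrative assumptions. It generates an oddball stimulus sequence, simulates EEG epochs in which only the rare target trials carry a P300-like positive deflection near 300 ms, and computes the grand-average target-minus-non-target difference wave.

```python
import numpy as np

rng = np.random.default_rng(0)

fs = 256                       # sampling rate (Hz), assumed
n_trials, p_target = 200, 0.2  # rare "oddball" target probability, assumed
t = np.arange(0, 0.8, 1 / fs)  # 800 ms epoch, time in seconds

# Oddball sequence: 1 = target (rare), 0 = non-target (frequent)
labels = (rng.random(n_trials) < p_target).astype(int)

# Simulated single-trial EEG epochs: Gaussian noise everywhere, plus a
# P300-like positive deflection centered at 300 ms on target trials only.
p300 = 5e-6 * np.exp(-((t - 0.3) ** 2) / (2 * 0.05 ** 2))
epochs = 2e-6 * rng.standard_normal((n_trials, t.size))
epochs[labels == 1] += p300

# Grand-average ERPs and the target-minus-non-target difference wave --
# the quantity the study compares across stimulus types.
erp_target = epochs[labels == 1].mean(axis=0)
erp_nontarget = epochs[labels == 0].mean(axis=0)
diff = erp_target - erp_nontarget

peak_ms = 1000 * t[np.argmax(diff)]
print(f"difference wave peaks at ~{peak_ms:.0f} ms")
```

An online BCI does not average this many trials per decision; it classifies small numbers of single-trial epochs, which is why stimuli that enlarge the target/non-target amplitude difference (as natural sounds did here) translate into higher online accuracy.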


Data availability

The datasets analyzed during the current study are available from the corresponding author on reasonable request.


Funding

This work was supported by Institute of Information & Communications Technology Planning & Evaluation (IITP) Grant funded by the Korea Government (MSIT) (2017-0-00432, Development of non-invasive integrated CI SW platform to control home appliances and external devices by user’s thought via AR/VR interface).

Author information


Contributions

Conceptualization: Y-JC and S-PK; methodology: Y-JC and S-PK; experiment: Y-JC; analysis: Y-JC; supervision: S-PK and O-SK; writing—original draft: Y-JC; writing—review and editing: S-PK and O-SK. All authors have read and agreed to the published version of the manuscript.

Corresponding authors

Correspondence to Oh-Sang Kwon or Sung-Phil Kim.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Informed consent

The studies involving human participants were reviewed and approved by the Ulsan National Institute of Science and Technology Institutional Review Board (UNIST-IRB-21-22-A). The participants provided their written informed consent to participate in this study.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Choi, YJ., Kwon, OS. & Kim, SP. Design of auditory P300-based brain-computer interfaces with a single auditory channel and no visual support. Cogn Neurodyn 17, 1401–1416 (2023). https://doi.org/10.1007/s11571-022-09901-3

