Abstract
Non-invasive brain-computer interfaces (BCIs) based on the P300, an event-related potential (ERP) component elicited via the oddball paradigm, have been extensively developed to enable device control and communication. While most P300-based BCIs employ visual stimuli in the oddball paradigm, auditory P300-based BCIs are also needed for users with unreliable gaze control or limited visual processing. In particular, auditory BCIs that require neither additional visual support nor multi-channel sound sources can broaden the application areas of BCIs. This study aimed to design optimal stimuli for auditory BCIs under such constraints, comparing artificial (e.g., beep) and natural (e.g., human voice and animal) sounds. It also aimed to investigate differences between auditory and visual stimulation for online P300-based BCIs. Natural sounds yielded both higher online BCI performance and larger differences in ERP amplitudes between target and non-target stimuli than artificial sounds. However, no single type of sound performed best for all subjects; rather, individual subjects showed different preferences between the human voice and animal sounds. In line with previous reports, visual stimuli yielded higher BCI performance (77.56% on average) than auditory stimuli (54.67% on average). In addition, the spatiotemporal patterns of target/non-target differences in ERP amplitudes were more dynamic with visual stimuli than with auditory stimuli. These results suggest that selecting the natural auditory stimulus optimal for each user, as well as making target/non-target ERP amplitude differences more dynamic, may further improve auditory P300-based BCIs.
Data availability
The datasets analyzed during the current study are available from the corresponding author on reasonable request.
Funding
This work was supported by Institute of Information & Communications Technology Planning & Evaluation (IITP) Grant funded by the Korea Government (MSIT) (2017-0-00432, Development of non-invasive integrated CI SW platform to control home appliances and external devices by user’s thought via AR/VR interface).
Author information
Authors and Affiliations
Contributions
Conceptualization: Y-JC and S-PK; methodology: Y-JC and S-PK; experiment: Y-JC; analysis: Y-JC; supervision: S-PK and O-SK; writing—original draft: Y-JC; writing—review and editing: S-PK and O-SK. All authors have read and agreed to the published version of the manuscript.
Corresponding authors
Ethics declarations
Conflict of interest
The authors declare that they have no conflict of interest.
Informed consent
The studies involving human participants were reviewed and approved by the Ulsan National Institute of Science and Technology, Institutional Review Board (UNIST-IRB-21-22-A). The participants provided their written informed consent to participate in this study.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Supplementary Information
Rights and permissions
Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Choi, YJ., Kwon, OS. & Kim, SP. Design of auditory P300-based brain-computer interfaces with a single auditory channel and no visual support. Cogn Neurodyn 17, 1401–1416 (2023). https://doi.org/10.1007/s11571-022-09901-3