
TSANet: multibranch attention deep neural network for classifying tactile selective attention in brain-computer interfaces

  • Original Article
  • Published in: Biomedical Engineering Letters

Abstract

Brain-computer interfaces (BCIs) enable direct communication between the brain and a computer, and electroencephalography (EEG) has been widely used to implement BCIs because of its high temporal resolution and noninvasiveness. Recently, tactile-based EEG tasks were introduced to overcome the limitations of visual-based tasks, such as visual fatigue from sustained attention. However, the classification performance of tactile-based BCIs as control signals remains unsatisfactory, so a novel classification approach is required. Here, we propose TSANet, a deep neural network that uses multibranch convolutional neural networks and a feature-attention mechanism to classify tactile selective attention (TSA) in a tactile-based BCI system. We tested TSANet under three evaluation conditions: within-subject, leave-one-out, and cross-subject. TSANet achieved the highest classification performance compared with conventional deep neural network models under all evaluation conditions. Additionally, by inspecting the weights of its spatial filters, we show that TSANet extracts reasonable features for TSA. Our results demonstrate that TSANet has the potential to serve as an efficient end-to-end learning approach for tactile-based BCIs.
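The abstract describes TSANet only at a high level. As a conceptual sketch of the two ingredients it names, the toy example below extracts features from a simulated EEG trial at several temporal scales ("branches") and fuses them with a softmax attention over branch scores. All shapes, filter sizes, and the random scoring vector are illustrative assumptions, not the architecture or parameters reported in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def branch(trial, kernel_size):
    """One 'branch': a temporal moving-average filter per EEG channel,
    followed by global average pooling to a per-channel feature vector."""
    kernel = np.ones(kernel_size) / kernel_size
    filtered = np.array([np.convolve(ch, kernel, mode="valid") for ch in trial])
    return filtered.mean(axis=1)  # shape: (n_channels,)

# Toy EEG trial: 8 channels x 500 time samples (sizes are illustrative only)
trial = rng.standard_normal((8, 500))

# Multibranch feature extraction at three temporal scales
features = [branch(trial, k) for k in (5, 25, 125)]

# Feature attention: score each branch (random scoring vector stands in for
# learned weights), softmax the scores, and reweight the branch features
score_vec = rng.standard_normal(trial.shape[0])
scores = np.array([f @ score_vec for f in features])
weights = softmax(scores)  # one weight per branch; sums to 1 up to float error
fused = sum(w * f for w, f in zip(weights, features))

print(weights)
print(fused.shape)  # (8,) — a single fused feature vector for classification
```

In a trained network the branch filters and the scoring vector would be learned end-to-end; here they only illustrate how attention forms a convex combination of branch features.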

Figs. 1–7 (available in the full article)


Availability of data and materials

All data, as well as the analysis code used to perform the analyses, are available from the corresponding author upon reasonable request.


Acknowledgements

We thank all participants for taking part in the study.

Funding

This work was supported by National Research Foundation of Korea (NRF) grants funded by the Korea government (MSIT) (No. NRF-2022R1A4A1023248 and No. RS-2023-00209794) and by Institute of Information and Communications Technology Planning & Evaluation (IITP) grants funded by the Korea government (No. 2017-0-00451 and No. 2019-0-01842).

Author information

Authors and Affiliations

Authors

Contributions

S.C.J. and S.A. designed the experimental paradigms, and S.A. collected the data. J.S.P. and H.J. performed the data analysis, and J.S.P., H.J., and S.A. wrote the original draft of the manuscript. All authors reviewed and revised the manuscript.

Corresponding author

Correspondence to Sangtae Ahn.

Ethics declarations

Conflict of interest

All authors declare that they have no competing interests.

Ethics approval and consent to participate

All participants provided written informed consent. The study was approved by the Institutional Review Board of the Gwangju Institute of Science and Technology.

Consent for publication

All participants provided written informed consent for publication.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Supplementary file 1 (DOCX 218 KB)

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Jang, H., Park, J.S., Jun, S.C. et al. TSANet: multibranch attention deep neural network for classifying tactile selective attention in brain-computer interfaces. Biomed. Eng. Lett. 14, 45–55 (2024). https://doi.org/10.1007/s13534-023-00309-4

