
Deep learning networks based decision fusion model of EEG and fNIRS for classification of cognitive tasks

  • Research Article
  • Published:
Cognitive Neurodynamics

Abstract

Detecting the cognitive task a subject performs during neuroimaging data acquisition has a wide range of applications: operating brain-computer interfaces (BCIs), detecting neuronal disorders, neurorehabilitation of disabled patients, and many others. Recent studies show that combining, or fusing, electroencephalography (EEG) and functional near-infrared spectroscopy (fNIRS) improves classification and detection performance compared with EEG or fNIRS alone. Deep learning (DL) networks are well suited to classifying large-volume time-series data such as EEG and fNIRS. This study performs decision fusion of EEG and fNIRS: DL networks classify EEG, fNIRS, and decision-fused EEG–fNIRS into cognitive task labels. Two open-source datasets of simultaneously recorded EEG and fNIRS are examined. Dataset 01 comprises 26 subjects performing 3 cognitive tasks: n-back, discrimination or selection response (DSR), and word generation (WG). After data acquisition, the fNIRS signals in Dataset 01 are converted to oxygenated hemoglobin (HbO2) and deoxygenated hemoglobin (HbR). Dataset 02 comprises 29 subjects performing 2 tasks: motor imagery and mental arithmetic. The classification of EEG and fNIRS (or HbO2, HbR) is carried out by 7 DL classifiers: convolutional neural network (CNN), long short-term memory network (LSTM), gated recurrent unit (GRU), CNN–LSTM, CNN–GRU, LSTM–GRU, and CNN–LSTM–GRU. After the single modalities are classified, their prediction scores, or decisions, are combined to obtain the decision-fused modality. Classification performance is measured by overall accuracy and area under the ROC curve (AUC). The highest accuracy and AUC recorded in Dataset 01 are 96% and 100% respectively, both achieved by the decision-fusion modality using CNN–LSTM–GRU.
For Dataset 02, the highest accuracy and AUC are 82.76% and 90.44% respectively, both achieved by the decision-fusion modality using CNN–LSTM. The experimental results show that decision-fused EEG–HbO2–HbR and EEG–fNIRS deliver higher performance than their constituent unimodalities in most cases. Among the DL classifiers, CNN–LSTM–GRU yields the highest performance in Dataset 01 and CNN–LSTM in Dataset 02.
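The decision-fusion step described above, combining the prediction scores of independently trained unimodal classifiers, can be sketched as follows. This is a minimal illustrative sketch, not the paper's exact method: the fusion rule (weighted averaging of softmax scores), the function name `decision_fusion`, and the example score values are all assumptions introduced for illustration.

```python
import numpy as np

def decision_fusion(eeg_scores, fnirs_scores, weights=(0.5, 0.5)):
    """Combine per-class prediction scores from two modality-specific
    classifiers into one fused score vector and a fused class label.

    Hypothetical helper: simple weighted averaging (soft voting) is one
    common decision-level fusion rule; the paper may use a different one.
    """
    fused = (weights[0] * np.asarray(eeg_scores, dtype=float)
             + weights[1] * np.asarray(fnirs_scores, dtype=float))
    return fused, int(np.argmax(fused))

# Example with 3 task classes (e.g. n-back, DSR, WG in Dataset 01);
# the score values below are made up for illustration.
eeg = [0.2, 0.5, 0.3]    # softmax output of the EEG classifier
fnirs = [0.1, 0.7, 0.2]  # softmax output of the fNIRS classifier
fused, label = decision_fusion(eeg, fnirs)
print(fused, label)  # fused scores [0.15, 0.6, 0.25] -> class 1
```

With equal weights this reduces to soft voting; unequal weights let the stronger modality dominate the fused decision.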





Author information


Corresponding author

Correspondence to Sheikh Md. Rabiul Islam.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Rabbani, M.H.R., Islam, S.M.R. Deep learning networks based decision fusion model of EEG and fNIRS for classification of cognitive tasks. Cogn Neurodyn (2023). https://doi.org/10.1007/s11571-023-09986-4

