Article

Classification of Individual Finger Movements from Right Hand Using fNIRS Signals

1 Department of Mechanical, Electronics and Chemical Engineering, OsloMet-Oslo Metropolitan University, 0167 Oslo, Norway
2 Department of Informatics, University of Oslo, 0315 Oslo, Norway
3 Department of Computer Science, OsloMet-Oslo Metropolitan University, 0167 Oslo, Norway
4 Department of Neurosurgery, Oslo University Hospital, 0450 Oslo, Norway
5 Department of Computer Science, Norwegian University of Science and Technology, 7491 Trondheim, Norway
6 Software and Service Innovation, SINTEF Digital, 0373 Oslo, Norway
7 School of Mechanical Engineering, Pusan National University, Busan 46241, Korea
8 Department of Biomedical Engineering, Michigan Technological University, Houghton, MI 49931, USA
* Author to whom correspondence should be addressed.
Sensors 2021, 21(23), 7943; https://doi.org/10.3390/s21237943
Submission received: 8 November 2021 / Revised: 25 November 2021 / Accepted: 26 November 2021 / Published: 28 November 2021
(This article belongs to the Special Issue Signal Processing for Brain–Computer Interfaces)

Abstract: Functional near-infrared spectroscopy (fNIRS) is a comparatively new noninvasive, portable, and easy-to-use brain imaging modality. However, complicated dexterous tasks such as individual finger-tapping, particularly using one hand, have not been investigated using fNIRS technology. Twenty-four healthy volunteers participated in the individual finger-tapping experiment. Data were acquired from the motor cortex using sixteen sources and sixteen detectors. In this preliminary study, we applied the standard fNIRS data processing pipeline, i.e., optical density conversion, signal processing, feature extraction, and classification algorithm implementation. Physiological and non-physiological noise was removed using 4th-order band-pass Butterworth and 3rd-order Savitzky–Golay filters. Eight statistical features were extracted from the oxygenated haemoglobin (ΔHbO) data: signal mean, peak, minimum, skewness, kurtosis, variance, median, and peak-to-peak. Sophisticated machine learning algorithms were applied: support vector machine (SVM), random forest (RF), decision tree (DT), AdaBoost, quadratic discriminant analysis (QDA), artificial neural network (ANN), k-nearest neighbours (kNN), and extreme gradient boosting (XGBoost). The average classification accuracies achieved were 0.75 ± 0.04, 0.75 ± 0.05, and 0.77 ± 0.06 using kNN, RF, and XGBoost, respectively. The kNN, RF, and XGBoost classifiers performed exceptionally well on this multi-class problem. The results need to be investigated further; in the future, a more in-depth analysis of the signal in both the temporal and spatial domains will be conducted to examine the underlying effects. The accuracies achieved are promising and could open up a new research direction leading to the enrichment of control command generation for fNIRS-based brain-computer interface applications.

1. Introduction

Functional near-infrared spectroscopy (fNIRS) is a portable and non-invasive brain imaging modality for the continuous measurement of haemodynamics in the cerebral cortex of the human brain [1]. Over the last decade, the method has gained popularity due to its acceptable temporal and spatial resolutions and its easy-to-use, safe, portable, and affordable monitoring compared to other neuroimaging modalities [2]. fNIRS has been used to monitor a variety of cognitive activities, such as attention, problem-solving, working memory, and gait rehabilitation [3]. The underlying theory behind fNIRS is based on optical spectroscopy and neurovascular coupling [1,4]. Optical spectroscopy uses the interaction of light with matter to measure certain characteristics of molecular structures, while neurovascular coupling defines the relationship between local neuronal activity and subsequent changes in cerebral blood flow due to cerebral activity [5,6,7]. It is known that most biological tissue is relatively transparent in the near-infrared range (700–900 nm); the near-infrared window commonly used in fNIRS is 690–860 nm [8]. Haemoglobin, the protein responsible for delivering oxygen throughout the body via red blood cells, is the major absorber within the near-infrared range of light (roughly 700–1100 nm). In summary, a continuous-wave fNIRS machine uses two near-infrared wavelengths to measure the relative changes in oxygenated haemoglobin (ΔHbO) and deoxygenated haemoglobin (ΔHbR) during cerebral activation.
The most common brain areas studied in neuroimaging are the prefrontal and motor cortices, particularly for cognitive and motor tasks [9,10]. Since the beginning of the 19th century, the finger-tapping test has been used in various brain studies to assess motor abilities and accessory muscular control [11]. Various brain and non-brain signals have been recorded during finger-tapping tasks to assess motor abilities and differentiate movements. Investigating finger movements is particularly important in the field of brain-computer interfaces, where the goal is to decode the neurophysiological signal and generate control commands for external devices [9,12]. Individual finger movements were classified with an average accuracy of 85% using electromyogram (EMG) bio-signals recorded during finger-tapping tasks [13]. Similarly, in another study using surface EMG, individual and combined finger movements were classified with an average accuracy of 98% in healthy persons and 90% in below-elbow amputees [14]. Such high classification accuracies of finger movements are well suited for prosthetic hand development. Other modalities for predicting dexterous individual finger movements include ultrasound imaging of forearm muscle activity, which differentiated finger movements with a high precision of 98% accuracy [15]. Most brain imaging modalities are limited to the movement of larger body parts, such as the upper and lower limbs. However, it is essential to decode dexterous functions from brain signals in cases where other types of imaging are difficult to implement. Among invasive brain signals, electrocorticography (ECoG) was shown to differentiate between individual finger movements with acceptable classification accuracies [12,16,17]. However, to the best of the authors' knowledge, only one study was found during the literature review that utilized noninvasive brain signals, i.e., electroencephalography (EEG), to decode individual finger movements [12]. That study found a broadband power increase and a low-frequency-band power decrease in finger flexion and extension data when the EEG power spectra were decomposed into principal components using principal component analysis (PCA). An average decoding accuracy over all subjects of 77.11% was obtained for the binary classification of each pair of fingers from one hand using movement-related spectral changes and a support vector machine (SVM) classifier.
The prevalent motor execution tasks in fNIRS-based studies include tapping of one or more fingers, single-hand tapping, both-hand tapping, and right and left finger- and hand-tapping. In [18], left and right index finger-tapping were distinguished with a classification accuracy of 85.4% using features from vector-based phase analysis and linear discriminant analysis. In [19], three different tasks, i.e., right- and left-hand unilateral complex finger-tapping and foot-tapping, were performed; the classification accuracy achieved using SVM was 70.4% for the three-class problem. Single-trial classification of motor imagery of thumb and complex finger-tapping tasks achieved an average accuracy of 81% by simply changing the combination of a set of channels, time intervals, and features [20]. In [21], thumb and little finger movements were classified with an accuracy of 87.5% for ΔHbO data. Deep learning approaches are also becoming popular for the classification of these complex finger movements. In [22], using conditional generative adversarial networks (CGAN) in combination with convolutional neural networks (CNN), left finger-, right finger-, and foot-tapping tasks were differentiated with a high classification accuracy of 96.67%. In one recent study, left and right index finger-tapping at different tapping frequencies were distinguished using multi-labelling and deep learning [23]. Different labels, such as rest, 80 bpm, and 120 bpm, were assigned to right and left finger-tapping at the different tapping frequencies, and with this complex combination the deep learning approach achieved an average classification accuracy of 81%. The aforementioned studies are difficult to compare, since different models and finger-tapping exercises were used. However, according to the literature, differentiating finger movement patterns is very challenging using fNIRS. This is supported by legacy studies showing no significant statistical difference between fNIRS signals recorded from the primary and pre-motor cortices during sequential finger-tapping and whole-hand grasping [24]. Furthermore, the dynamic relationship between the brain regions simultaneously activated during motor tasks is becoming better understood. Interesting studies conducted by Anwar et al. [25] and Vergotte et al. [26] describe the effective connectivity of information flow between the sensorimotor cortex, premotor cortex, and contralateral dorsolateral prefrontal cortex during different finger movement tasks using multiple modalities, such as fNIRS, fMRI, and EEG. It was found that there is adequate bi-directional information flow between the cortices mentioned above. These studies also concluded that, compared to fMRI, fNIRS is an attractive and easy-to-use alternative with an acceptable spatial resolution for studying connectivity; in this perspective, multi-modal fNIRS-EEG is also an appealing alternative to fMRI. Hence, it is essential to study the flow and connectivity of individual finger movements from the motor cortex using fNIRS or multi-modal EEG-fNIRS integration. Multi-modal EEG-fNIRS integration has been shown to enhance classification accuracy [27], increase the number of control commands, and reduce the signal-processing time [4,28].
It has been unclear whether fNIRS signals contain enough information to differentiate between individual finger movements. Some underlying limitations of fNIRS may explain this, such as its comparatively low temporal resolution (1–10 Hz for commercially available portable devices), depth sensitivity of about 1.5 cm (depending upon the source-detector distance, which is typically 3 cm), and spatial resolution of up to 1 cm [29]. To shed light on this research area, this study investigates the detection of individual finger-tapping tasks using fNIRS. The study is also a step forward towards understanding the dynamic relationship between the brain regions that are simultaneously activated during motor tasks. We believe that advances in sophisticated machine learning algorithms could help to identify individual finger movements from fNIRS signals. This study is structured and reported in accordance with the guidelines published in [30]. The following sections address the materials and methods (Section 2), results and discussion (Section 3), and conclusion (Section 4).

2. Materials and Methods

This section describes the procedures followed during experimental design, data collection, and processing.

2.1. Participants

Twenty-four healthy right-handed participants, 18 males (M) and 6 females (F), selected from a random university population, participated in the experiment. The mean age (± standard deviation; range) was 30.44 ± 3.03 years (range: 24–34) for males and 29.17 ± 3.06 years (range: 24–34) for females. Healthy young participants were selected in this age range because the frequency of finger-tapping can vary between different age groups. The inclusion criterion for right-handedness was that the participants had to write with the right hand. The participants had normal or corrected-to-normal vision. Exclusion criteria included neurological disorders or limited motor abilities in either hand or any finger. For ethical statements, please see the Institutional Review Board Statement and Informed Consent Statement below.

2.2. Instrumentation

A continuous-wave optical tomography machine, NIRScout (NIRx Medizintechnik GmbH, Germany), was used to acquire brain data at one of the laboratories of the ADEPT (Advanced intelligent health and brain-inspired technologies) research group at Oslo Metropolitan University, Norway. The data were acquired at two wavelengths, i.e., 760 nm (λ1) and 850 nm (λ2), with a sampling rate of 3.9063 Hz.

2.3. Experimental Setup and Instructions

The experiment was performed in a relatively controlled environment. Environmental light, including monitor screen brightness, was shielded to minimise any influence of stimulus changes during the presentation. A black over-cap was used to further reduce the effect of surrounding light, as shown in Figure 1C. The experiment was conducted in a noise-free room. A visual presentation of rest and task (finger-tapping corresponding to each finger) was displayed to the participants on the computer monitor. Before starting the actual experiment, the participants were given explicit instructions about the experimental protocol and procedure, and practice sessions were conducted. The finger-tapping task was performed at a medium-to-fast pace but not at any specific frequency. The number of experiment repetitions for each participant depended upon the participant's comfort and convenience. No data were recorded while a participant experienced inconvenience or discomfort, since this could introduce unwanted signals, such as frustration-related interference, into the brain signals. Data were acquired using the commercial NIRx software NIRStar 15.1. The complete experimental setup is shown in Figure 1.

2.4. Experimental Design

The experiments were designed using blocks of rest (initial rest, final rest, and inter-stimulus rest) and task (thumb, index, middle, ring, and little finger-tapping) of the right hand, as shown in Figure 2. A baseline period of 30 s was set before the first task and after the last task. The stimulus duration necessary to acquire an adequate and robust haemodynamic response corresponding to finger-tapping activity was set to 10 s [31]. A single experimental paradigm consists of three sessions of each finger-tapping trial, where a single trial includes 10 s of rest followed by 10 s of the task; the total length of an experiment was 350 s. Experiments were repeated for each participant one to three times in a single day, depending upon his/her comfort. The rest and task blocks were presented using the NIRx stimulation software NIRStim 4.0.
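To make the timing concrete, the following sketch (our own illustration, not the authors' stimulus code; the exact block ordering is an assumption) builds one schedule consistent with the stated 350 s total:

```python
# Hypothetical reconstruction of the paradigm timing: 30 s initial rest,
# fifteen 10 s task blocks (5 fingers x 3 sessions) separated by 10 s
# inter-stimulus rests, and a 30 s final rest.
fingers = ["thumb", "index", "middle", "ring", "little"]

schedule = [("initial_rest", 30.0)]
tasks = [f for _ in range(3) for f in fingers]  # 3 sessions x 5 fingers
for i, finger in enumerate(tasks):
    if i > 0:
        schedule.append(("rest", 10.0))         # inter-stimulus rest
    schedule.append((finger, 10.0))             # 10 s finger-tapping task
schedule.append(("final_rest", 30.0))

print(sum(duration for _, duration in schedule))  # 350.0 s
```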

2.5. Brain Area and Montage Selection

Before placing the NIRScap on the participant's head, cranial landmarks (inion and nasion) were marked to locate Cz. The emitters and detectors were placed in accordance with the international 10-5 positioning layout. The distance between source and detector was kept at 3 cm using optode holders. Sixteen emitters and sixteen detectors were placed over the motor cortex in accordance with the standard motor 16x16 montage available in NIRStar v15.2, as shown in Figure 3A,B. The sources and detectors were placed over the frontal lobe (F1, F2, F3, F4, F5, F6, F7, and F8), the frontal-central sulcus region (FC1, FC2, FC3, FC4, FC5, and FC6), the central sulcus region (C1, C2, C3, C4, C5, and C6), the central-parietal region (CP1, CP2, CP3, CP4, CP5, and CP6), and the temporal-parietal region (T7, T8, TP7, and TP8). Data were collected from both the left and right hemispheres for further research work; however, in this particular work, only the channels of the left hemisphere were analysed further.

2.6. Signal Preprocessing

Signal processing was performed using the commercial fNIRS data processing software nirsLAB (v2019014) [32] and MATLAB®. Signals were pre-processed in nirsLAB for diverse tasks, such as removing discontinuities and spikes, and truncating the data points before the first stimulus and after the last stimulus. Bad channels were identified in nirsLAB using the criteria of a gain setting of three and a coefficient of variation (CV) of 7.5%. The coefficient of variation equals one hundred times the standard deviation divided by the mean of the raw measurements; a large CV indicates high noise. The gain setting was set to eight for all the data processed. Optical densities were converted into haemoglobin concentration changes using the Modified Beer–Lambert Law in nirsLAB (see details in Section 2.7).
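As an illustration, a minimal sketch of the CV-based bad-channel criterion (our own NumPy helper mirroring the nirsLAB setting, not nirsLAB code) could look as follows:

```python
import numpy as np

def coefficient_of_variation(raw):
    """CV per channel: 100 * std / mean of the raw intensity measurements.
    raw: array of shape (n_samples, n_channels)."""
    return 100.0 * raw.std(axis=0) / raw.mean(axis=0)

def good_channel_mask(raw, cv_threshold=7.5):
    """Keep channels whose CV does not exceed the 7.5% threshold."""
    return coefficient_of_variation(raw) <= cv_threshold

# Usage (hypothetical data file):
# raw = np.load("raw_intensities.npy")
# raw_kept = raw[:, good_channel_mask(raw)]
```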

2.7. Modified Beer–Lambert Law (MBLL)

The changes in optical density were converted using the MBLL into ΔHbO (Equation (1a)) and ΔHbR (Equation (1b)). The parameters for the MBLL, i.e., the differential path length factor (DPF) and the molar extinction coefficients (using the standard W.B. Gratzer spectrum) for ΔHbO and ΔHbR, are shown in Table 1. The molar concentration and MVO2Sat values were set to 75 μM and 70%, respectively.
$$\Delta HbO_i(k) = \frac{\varepsilon_{HbR}^{\lambda_1}\,\dfrac{\Delta OD^{\lambda_2}(k)}{DPF^{\lambda_2}} \;-\; \varepsilon_{HbR}^{\lambda_2}\,\dfrac{\Delta OD^{\lambda_1}(k)}{DPF^{\lambda_1}}}{l_i\left(\varepsilon_{HbR}^{\lambda_1}\varepsilon_{HbO}^{\lambda_2} - \varepsilon_{HbR}^{\lambda_2}\varepsilon_{HbO}^{\lambda_1}\right)} \qquad (1a)$$

$$\Delta HbR_i(k) = \frac{\varepsilon_{HbO}^{\lambda_2}\,\dfrac{\Delta OD^{\lambda_1}(k)}{DPF^{\lambda_1}} \;-\; \varepsilon_{HbO}^{\lambda_1}\,\dfrac{\Delta OD^{\lambda_2}(k)}{DPF^{\lambda_2}}}{l_i\left(\varepsilon_{HbR}^{\lambda_1}\varepsilon_{HbO}^{\lambda_2} - \varepsilon_{HbR}^{\lambda_2}\varepsilon_{HbO}^{\lambda_1}\right)} \qquad (1b)$$
where ΔHbO_i and ΔHbR_i are the concentration changes of oxygenated and deoxygenated haemoglobin; ε(λ) is the extinction coefficient for the given wavelength and haemoglobin species; ΔOD is the variation in optical density at the k-th sample; DPF(λ) is the differential path length factor; i indexes the emitter-detector channel pair with separation l_i; λ1 and λ2 are the two working wavelengths of the fNIRS system; and ε_HbR^λ1, ε_HbO^λ2, ε_HbR^λ2, and ε_HbO^λ1 are the extinction coefficients of HbO and HbR at the two wavelengths.
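For illustration, Equations (1a) and (1b) with the Table 1 parameters can be implemented directly. The sketch below is our own (nirsLAB performed this conversion in the actual pipeline), assuming λ1 = 760 nm, λ2 = 850 nm, and a 3 cm source-detector separation:

```python
import numpy as np

# Extinction coefficients (1/cm per moles/L, W.B. Gratzer spectrum) and DPFs,
# taken from Table 1.
EPS_HBO = {760: 1466.5865, 850: 2526.391}
EPS_HBR = {760: 3843.707, 850: 1798.643}
DPF = {760: 7.25, 850: 6.38}

def mbll(d_od_760, d_od_850, l_i=3.0):
    """Convert optical-density changes at both wavelengths (arrays over
    samples k) into dHbO and dHbR for a channel with separation l_i (cm)."""
    a1 = d_od_760 / DPF[760]            # dOD^λ1 / DPF^λ1
    a2 = d_od_850 / DPF[850]            # dOD^λ2 / DPF^λ2
    det = EPS_HBR[760] * EPS_HBO[850] - EPS_HBR[850] * EPS_HBO[760]
    d_hbo = (EPS_HBR[760] * a2 - EPS_HBR[850] * a1) / (l_i * det)  # Eq. (1a)
    d_hbr = (EPS_HBO[850] * a1 - EPS_HBO[760] * a2) / (l_i * det)  # Eq. (1b)
    return d_hbo, d_hbr
```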

2.8. Signal Filtration

fNIRS data are spontaneously contaminated by physiological and non-physiological noise, such as heart rate (≃1 Hz), respiration (≃0.2 Hz), Mayer waves (≃0.1 Hz), and very-low-frequency oscillations (≲0.04 Hz, VLF), which were removed by applying subsequent filters. Non-physiological noise refers to motion artefacts, measurement noise, and machine drift due to temperature changes in the optical system. The stimulation frequency for the given experimental paradigm was 1/20 s = 0.05 Hz. A stable 4th-order band-pass Butterworth filter with low and high cut-off frequencies of 0.01 Hz and 0.15 Hz, respectively [33], was applied to remove the noise. To avoid phase delay in filtering, the built-in MATLAB® command 'filtfilt' was used. Furthermore, the fNIRS signal was smoothed by applying a Savitzky–Golay filter with the optimal order and frame size recommended in [34], i.e., an order of three and a frame size of nineteen for a frequency band of 0.03–0.1 Hz. We used the same order and frame size because our frequency band is quite similar.
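In Python, the same two-stage filtering can be sketched with SciPy (the authors used MATLAB's 'filtfilt' and nirsLAB; the SciPy equivalents below are our assumption):

```python
from scipy.signal import butter, filtfilt, savgol_filter

FS = 3.9063  # sampling rate (Hz)

# N=2 with btype='bandpass' yields a 4th-order band-pass Butterworth design.
b, a = butter(2, [0.01, 0.15], btype="bandpass", fs=FS)

def filter_channel(x):
    """Zero-phase band-pass filtering followed by Savitzky-Golay smoothing
    with the order (3) and frame size (19) recommended in [34]."""
    x = filtfilt(b, a, x)  # forward-backward filtering avoids phase delay
    return savgol_filter(x, window_length=19, polyorder=3)
```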

2.9. Feature Extraction

The most common statistical features (descriptive and morphological) used in fNIRS are signal mean, peak, minimum, skewness, kurtosis, variance, median, and peak-to-peak [35,36,37,38]. The window length was set to 10 s, which equals the task period. The features extracted from the ΔHbO data are described in Table 2.
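A minimal sketch of the eight features, computed per 10 s window of a single-channel ΔHbO signal (our own helpers; windowing details such as alignment to task onsets are assumptions), is given below:

```python
import numpy as np
from scipy.stats import skew, kurtosis

FS = 3.9063
WINDOW = int(10 * FS)  # 10 s task window (~39 samples)

def window_features(w):
    """The eight statistical features of Table 2 for one window w."""
    return np.array([
        w.mean(),                   # signal mean
        w.max(),                    # signal peak
        w.min(),                    # signal minimum
        skew(w),                    # skewness
        kurtosis(w, fisher=False),  # kurtosis, E[(x-mu)^4]/sigma^4 as in Table 2
        w.var(),                    # variance
        np.median(w),               # median
        np.ptp(w),                  # peak-to-peak
    ])

def extract_features(x):
    """Stack feature vectors over consecutive, non-overlapping 10 s windows."""
    n = len(x) // WINDOW
    return np.vstack([window_features(x[i * WINDOW:(i + 1) * WINDOW])
                      for i in range(n)])
```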

2.10. Classification

Eight commonly used classifiers were evaluated to check the robustness of modern machine learning algorithms for decoding dexterous finger movements: support vector machine (SVM), random forest (RF), decision tree (DT), AdaBoost, quadratic discriminant analysis (QDA), artificial neural network (ANN), k-nearest neighbours (kNN), and extreme gradient boosting (XGBoost). The classifiers' parameter settings are shown in Table 3.
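With the Table 3 settings, the eight classifiers can be instantiated as follows (a sketch assuming scikit-learn and the xgboost package; data splitting and scaling are omitted):

```python
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis
from sklearn.ensemble import AdaBoostClassifier, RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from xgboost import XGBClassifier

classifiers = {
    "QDA": QuadraticDiscriminantAnalysis(priors=None, reg_param=0.0),
    "AdaBoost": AdaBoostClassifier(n_estimators=10, random_state=0,
                                   learning_rate=1.0),
    "SVM": SVC(kernel="rbf", degree=3, random_state=None),
    "ANN": MLPClassifier(hidden_layer_sizes=(5, 2), solver="lbfgs",
                         random_state=1, max_iter=300),
    "DT": DecisionTreeClassifier(criterion="entropy", random_state=0),
    "kNN": KNeighborsClassifier(n_neighbors=5),
    "RF": RandomForestClassifier(n_estimators=10, criterion="entropy",
                                 random_state=0),
    "XGBoost": XGBClassifier(booster="gbtree", verbosity=1),
}

# for name, clf in classifiers.items():      # X: windowed features,
#     clf.fit(X_train, y_train)              # y: six classes (rest + 5 fingers)
#     print(name, clf.score(X_test, y_test))
```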

2.11. Performance Evaluation

Classifiers are commonly evaluated using performance measures such as accuracy, precision, recall, F1 score, the receiver operating characteristic (ROC) curve, and the confusion matrix [39]. All these measures can be derived from the so-called true positives (TP), false positives (FP), true negatives (TN), and false negatives (FN). Reporting a single metric does not give a complete picture of classifier behaviour; hence, it is important to report at least a few of these measures. In this study, we report accuracy, precision, recall, and F1 score. Accuracy is the ratio of correctly classified points to the total number of points and gives the probability of a correct prediction by the model. However, on a highly imbalanced data set, a model that deterministically assigns every point to the majority class will still yield a high classification accuracy, which makes this measure unreliable on its own. The confusion matrix summarises the predicted results in table format, visualising all four of the above-mentioned quantities (TP, FP, TN, FN). Precision and recall indicate how useful and how complete the results are, respectively, and the F1 score is the harmonic mean of precision and recall. All these measures are discussed in the results section, where we discuss classifier performance in decoding individual finger-tapping.
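For a multi-class problem, the reported measures can be computed as sketched below (the averaging scheme is our assumption; note that with weighted averaging, recall coincides with accuracy, which matches the identical Accuracy and Recall rows in Table 4):

```python
from sklearn.metrics import (accuracy_score, confusion_matrix, f1_score,
                             precision_score, recall_score)

def evaluate(y_true, y_pred):
    """Accuracy, precision, recall, F1 score, and the confusion matrix."""
    return {
        "accuracy": accuracy_score(y_true, y_pred),
        "precision": precision_score(y_true, y_pred, average="weighted",
                                     zero_division=0),
        "recall": recall_score(y_true, y_pred, average="weighted"),
        "f1": f1_score(y_true, y_pred, average="weighted"),
        "confusion_matrix": confusion_matrix(y_true, y_pred),
    }
```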

3. Results and Discussion

In this study, we classified individual finger-tapping of right-handed people using fNIRS signals. For that purpose, eight different spatio-statistical features were extracted from ΔHbO, as shown in Table 2. Furthermore, we compared and evaluated the performance of different classifiers, i.e., SVM, RF, DT, AdaBoost, QDA, ANN, kNN, and XGBoost, as shown in Figure 4. Table 4 shows the four performance measures across all subjects for the respective classifiers. The kNN, RF, and XGBoost classifiers yielded the maximum classification accuracies of 0.75 ± 0.04, 0.75 ± 0.05, and 0.77 ± 0.06, respectively. We applied Student's t-test to check whether these classifiers' accuracies were statistically distinguishable from those of the remaining classifiers. The p-values among kNN, RF, and XGBoost were not statistically significant, since these classifiers yielded similar accuracies. On the other hand, the p-values of kNN, RF, or XGBoost versus each of the other classifiers were less than 0.05 for all ΔHbO signals, which establishes the statistical significance of these classifiers' performance. Previous studies showed that thumb-tapping elicits a higher level of cortical activation than the other fingers [40], which is also supported by our current study, as shown in Figure 5f–h. Moreover, the highest peaks in the ΔHbO signal, which correspond to higher brain activity during thumb-tapping, can be seen in Figure 6.
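As an illustration, the significance check can be reproduced from the Table 4 accuracies; the sketch below uses a paired t-test across subjects (whether the original test was paired or unpaired is not stated, so the pairing is an assumption):

```python
from scipy.stats import ttest_rel

# Per-subject accuracies (S01-S24) taken from Table 4.
acc_xgboost = [0.86, 0.64, 0.86, 0.71, 0.74, 0.78, 0.74, 0.77, 0.78, 0.79,
               0.71, 0.76, 0.73, 0.82, 0.80, 0.75, 0.75, 0.77, 0.86, 0.68,
               0.81, 0.78, 0.84, 0.79]
acc_svm = [0.58, 0.57, 0.58, 0.57, 0.56, 0.57, 0.58, 0.59, 0.58, 0.57,
           0.57, 0.62, 0.57, 0.57, 0.64, 0.57, 0.59, 0.58, 0.65, 0.60,
           0.57, 0.58, 0.60, 0.59]

t_stat, p_value = ttest_rel(acc_xgboost, acc_svm)
print(f"t = {t_stat:.2f}, p = {p_value:.2e}")  # p < 0.05: significant difference
```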
Overall, it was noted that most misclassifications assigned samples to the rest class; kNN, for instance, was unable to classify the index finger correctly. We tested kNN with different numbers of neighbours (such as 5, 10, and 15), of which five neighbours performed best, whereas RF performed poorly on classifying the middle finger. Similarly to kNN, we also tested RF with different numbers of estimators and obtained the best results with 10 estimators. XGBoost, on the other hand, classified only the little finger poorly. In general, kNN, RF, and XGBoost performed well.
One of the core objectives of a brain-computer interface is to achieve the maximum number of commands with good classification accuracy. The previous literature in the field of fNIRS shows that most work has used either two-class, three-class, or four-class classification. Power et al. achieved an average classification accuracy of 0.56 for two commands using fNIRS-based brain signals [41]. Hong et al. achieved an average classification accuracy of 0.75 for three commands [42]. Similarly, several studies have reported results for four-class classification [43]. To the best of the authors' knowledge, this is the first work that reports good accuracies for five-class classification of finger movements in the field of fNIRS. The achieved classification accuracies are far above the chance level (i.e., 0.2), which shows that machine learning can lead to a potential increase in the number of commands in fNIRS-based brain imaging.
In the future, the signals will be studied in depth to gain a more precise understanding of the cortical haemodynamic response. After all, the attributes of different brain regions can vary across repeated trials of the same experimental paradigm [44]. Selection of trials or active channels using the 3-gamma function, changing the window length, detection of the initial dip, vector phase analysis, and optimal feature extraction are future directions for data analysis that could help to increase the classification accuracy. Furthermore, deep learning approaches, including deep belief networks and convolutional neural network models, could also help to increase classification accuracy [45]. Moreover, during imagery of left and right finger-tapping, activation is dominant in the premotor and SMA areas compared to motor execution of finger-tapping [46]; in future work, we will focus on averaging over this region of interest to gain a better idea of which regions activate for different finger-tapping tasks. Trial-to-trial variability in the fNIRS signal for finger-tapping tasks could be reduced using seed correlation methods, which can enhance the classification accuracy [47]. We also envisage using estimation algorithms, such as the q-step-ahead prediction scheme and the kernel-based recursive least squares (KRLS) algorithm, to reduce the onset delay of the ΔHbO changes due to finger-tapping for real-time implementation in BCI systems [21,48,49,50]. In this study, we considered only ΔHbO data; the reason is that, although both ΔHbO and ΔHbR are indicators of cerebral blood flow, ΔHbO is more sensitive to changes than ΔHbR [51,52]. As far as total haemoglobin changes (ΔHbT) and cerebral oxygen exchange (COE) are concerned, these quantities are derived from HbO and HbR [53]. In the future, ΔHbR and ΔHbT changes will also be considered in order to achieve a more complete understanding. Moreover, only left-hemisphere channels were considered in this study; investigating the dynamic relationship between the brain regions simultaneously activated during finger-tapping would be an interesting direction for future study. In recent studies, different stimulation durations were investigated to find the appropriate duration that can shorten the command generation time [54]. Keeping in mind the findings of these studies, shorter stimulation durations will also be investigated in the future.

4. Conclusions

Despite the outstanding performance of modern machine-learning algorithms, using functional near-infrared spectroscopy to classify movements of delicate anatomical structures, such as individual fingers, is very challenging. This work presents a classification of individual finger movements (six classes, including rest) from motor cortex signals. We applied eight different classifiers, ranging from simple to sophisticated machine-learning algorithms. Quadratic discriminant analysis (QDA), AdaBoost, support vector machine (SVM), artificial neural network (ANN), and decision tree (DT) performed poorly, with average classification accuracies below 60%. On the other hand, k-nearest neighbours (kNN), random forest (RF), and extreme gradient boosting (XGBoost) performed exceptionally well on such high-order data, with average classification accuracies of 0.75 ± 0.04, 0.75 ± 0.05, and 0.77 ± 0.06, respectively. These are preliminary results in this novel research direction. In the future, a more in-depth analysis in the temporal and spatial domains will be conducted to understand the signals better. Achieving better classification accuracy could be a quantum leap for control command enrichment in brain-computer interface applications.

Author Contributions

Conceptualisation, H.K., P.M. and F.M.N.; methodology, H.K. and F.M.N.; analysis, H.K. and F.M.N.; suggestions and validation, P.M., A.Y., M.Z.U. and M.N.A.K.; writing—original draft preparation, H.K. and F.M.N.; writing—review and editing, P.M., A.Y., M.Z.U. and M.N.A.K.; supervision, P.M. and A.Y.; project administration, H.K. and P.M.; funding acquisition, P.M. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded and supported by the department of MEK, OsloMet-Oslo Metropolitan University and the Norwegian Research Council under a project titled ’Patient-Centric Engineering in Rehabilitation (PACER)’ grant number 273599. Available online: https://prosjektbanken.forskningsradet.no/en/project/FORISS/273599?Kilde=FORISS&distribution=Ar&char (accessed on 1 November 2021).

Institutional Review Board Statement

The experiment was conducted according to the Declaration of Helsinki. The study protocol and risk analysis were approved by the ethical committee of Oslo Metropolitan University. A no-objection certificate was obtained from the Regional Committees for Medical and Health Research Ethics (REC) for the experimental work (Ref. No. 322236).

Informed Consent Statement

Informed consent for voluntary participation, in accordance with the Norwegian Centre for Research Data AS (NSD), was given by all participants before the experiment. The participants' personal data are protected under NSD (Ref. No. 647457).

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
fNIRS   Functional Near-Infrared Spectroscopy
SVM     Support Vector Machine
RF      Random Forest
DT      Decision Tree
QDA     Quadratic Discriminant Analysis
ANN     Artificial Neural Networks
kNN     k-Nearest Neighbours

References

1. Izzetoglu, M.; Izzetoglu, K.; Bunce, S.; Ayaz, H.; Devaraj, A.; Onaral, B.; Pourrezaei, K. Functional near-infrared neuroimaging. IEEE Trans. Neural Syst. Rehabil. Eng. 2005, 13, 153–159.
2. Boas, D.A.; Elwell, C.E.; Ferrari, M.; Taga, G. Twenty years of functional near-infrared spectroscopy: Introduction for the special issue. NeuroImage 2014, 85, 1–5.
3. Khan, R.A.; Naseer, N.; Qureshi, N.K.; Noori, F.M.; Nazeer, H.; Khan, M.U. fNIRS-based Neurorobotic Interface for gait rehabilitation. J. Neuroeng. Rehabil. 2018, 15, 1–17.
4. Khan, H.; Naseer, N.; Yazidi, A.; Eide, P.K.; Hassan, H.W.; Mirtaheri, P. Analysis of Human Gait using Hybrid EEG-fNIRS-based BCI System: A review. Front. Hum. Neurosci. 2020, 14, 605.
5. Villringer, A.; Chance, B. Non-invasive optical spectroscopy and imaging of human brain function. Trends Neurosci. 1997, 20, 435–442.
6. Huneau, C.; Benali, H.; Chabriat, H. Investigating human neurovascular coupling using functional neuroimaging: A critical review of dynamic models. Front. Neurosci. 2015, 9, 467.
7. Hendrikx, D.; Smits, A.; Lavanga, M.; De Wel, O.; Thewissen, L.; Jansen, K.; Caicedo, A.; Van Huffel, S.; Naulaers, G. Measurement of neurovascular coupling in neonates. Front. Physiol. 2019, 10, 65.
8. Kumar, V.; Shivakumar, V.; Chhabra, H.; Bose, A.; Venkatasubramanian, G.; Gangadhar, B.N. Functional near infra-red spectroscopy (fNIRS) in schizophrenia: A review. Asian J. Psychiatry 2017, 27, 18–31.
9. Naseer, N.; Hong, K.S. fNIRS-based brain-computer interfaces: A review. Front. Hum. Neurosci. 2015, 9, 3.
10. Naseer, N.; Qureshi, N.K.; Noori, F.M.; Hong, K.S. Analysis of different classification techniques for two-class functional near-infrared spectroscopy-based brain-computer interface. Comput. Intell. Neurosci. 2016, 2016, 5480760.
11. Jobbágy, Á.; Harcos, P.; Karoly, R.; Fazekas, G. Analysis of finger-tapping movement. J. Neurosci. Methods 2005, 141, 29–39.
12. Liao, K.; Xiao, R.; Gonzalez, J.; Ding, L. Decoding Individual Finger Movements from One Hand Using Human EEG Signals. PLoS ONE 2014, 9, e85192.
13. Kondo, G.; Kato, R.; Yokoi, H.; Arai, T. Classification of individual finger motions hybridizing electromyogram in transient and converged states. In Proceedings of the 2010 IEEE International Conference on Robotics and Automation, Anchorage, AK, USA, 3–7 May 2010; pp. 2909–2915.
14. Al-Timemy, A.H.; Bugmann, G.; Escudero, J.; Outram, N. Classification of finger movements for the dexterous hand prosthesis control with surface electromyography. IEEE J. Biomed. Health Inform. 2013, 17, 608–618.
15. Sikdar, S.; Rangwala, H.; Eastlake, E.B.; Hunt, I.A.; Nelson, A.J.; Devanathan, J.; Shin, A.; Pancrazio, J.J. Novel method for predicting dexterous individual finger movements by imaging muscle activity using a wearable ultrasonic system. IEEE Trans. Neural Syst. Rehabil. Eng. 2013, 22, 69–76.
16. Samiee, S.; Hajipour, S.; Shamsollahi, M.B. Five-class finger flexion classification using ECoG signals. In Proceedings of the 2010 International Conference on Intelligent and Advanced Systems, Kuala Lumpur, Malaysia, 15–17 June 2010; pp. 1–4.
17. Flamary, R.; Rakotomamonjy, A. Decoding Finger Movements from ECoG Signals Using Switching Linear Models. Front. Neurosci. 2012, 6, 29.
18. Nazeer, H.; Naseer, N.; Khan, R.A.A.; Noori, F.M.; Qureshi, N.K.; Khan, U.S.; Khan, M.J. Enhancing classification accuracy of fNIRS-BCI using features acquired from vector-based phase analysis. J. Neural Eng. 2020, 17, 056025.
19. Bak, S.; Park, J.; Shin, J.; Jeong, J. Open-access fNIRS dataset for classification of unilateral finger- and foot-tapping. Electronics 2019, 8, 1486.
20. Holper, L.; Wolf, M. Single-trial classification of motor imagery differing in task complexity: A functional near-infrared spectroscopy study. J. Neuroeng. Rehabil. 2011, 8, 1–13.
21. Zafar, A.; Hong, K.S. Reduction of onset delay in functional near-infrared spectroscopy: Prediction of HbO/HbR signals. Front. Neurorobotics 2020, 14, 10.
22. Wickramaratne, S.D.; Mahmud, M. Conditional-GAN Based Data Augmentation for Deep Learning Task Classifier Improvement Using fNIRS Data. Front. Big Data 2021, 4, 62.
23. Sommer, N.M.; Kakillioglu, B.; Grant, T.; Velipasalar, S.; Hirshfield, L. Classification of fNIRS Finger Tapping Data With Multi-Labeling and Deep Learning. IEEE Sens. J. 2021, 21, 24558–24569.
24. Kashou, N.H.; Giacherio, B.M.; Nahhas, R.W.; Jadcherla, S.R. Hand-grasping and finger tapping induced similar functional near-infrared spectroscopy cortical responses. Neurophotonics 2016, 3, 025006.
25. Anwar, A.R.; Muthalib, M.; Perrey, S.; Galka, A.; Granert, O.; Wolff, S.; Heute, U.; Deuschl, G.; Raethjen, J.; Muthuraman, M. Effective connectivity of cortical sensorimotor networks during finger movement tasks: A simultaneous fNIRS, fMRI, EEG study. Brain Topogr. 2016, 29, 645–660.
26. Vergotte, G.; Torre, K.; Chirumamilla, V.C.; Anwar, A.R.; Groppa, S.; Perrey, S.; Muthuraman, M. Dynamics of the human brain network revealed by time-frequency effective connectivity in fNIRS. Biomed. Opt. Express 2017, 8, 5326–5341.
27. Cicalese, P.A.; Li, R.; Ahmadi, M.B.; Wang, C.; Francis, J.T.; Selvaraj, S.; Schulz, P.E.; Zhang, Y. An EEG-fNIRS hybridization technique in the four-class classification of alzheimer's disease. J. Neurosci. Methods 2020, 336, 108618.
28. Hong, K.S.; Khan, M.J. Hybrid brain–computer interface techniques for improved classification accuracy and increased number of commands: A review. Front. Neurorobotics 2017, 11, 35.
29. Quaresima, V.; Ferrari, M. Functional near-infrared spectroscopy (fNIRS) for assessing cerebral cortex function during human behavior in natural/social situations: A concise review. Organ. Res. Methods 2019, 22, 46–68.
30. Yücel, M.A.; Lühmann, A.V.; Scholkmann, F.; Gervain, J.; Dan, I.; Ayaz, H.; Boas, D.; Cooper, R.J.; Culver, J.; Elwell, C.E.; et al. Best practices for fNIRS publications. Neurophotonics 2021, 8, 012101.
31. Khan, M.A.; Bhutta, M.R.; Hong, K.S. Task-specific stimulation duration for fNIRS brain-computer interface. IEEE Access 2020, 8, 89093–89105.
32. Santosa, H.; Zhai, X.; Fishburn, F.; Huppert, T. The NIRS brain AnalyzIR toolbox. Algorithms 2018, 11, 73.
33. Pinti, P.; Scholkmann, F.; Hamilton, A.; Burgess, P.; Tachtsidis, I. Current Status and Issues Regarding Pre-processing of fNIRS Neuroimaging Data: An Investigation of Diverse Signal Filtering Methods Within a General Linear Model Framework. Front. Hum. Neurosci. 2019, 12, 505.
34. Rahman, M.A.; Rashid, M.A.; Ahmad, M. Selecting the optimal conditions of Savitzky–Golay filter for fNIRS signal. Biocybern. Biomed. Eng. 2019, 39, 624–637.
35. Hong, K.S.; Khan, M.J.; Hong, M.J. Feature extraction and classification methods for hybrid fNIRS-EEG brain-computer interfaces. Front. Hum. Neurosci. 2018, 12, 246.
36. Naseer, N.; Noori, F.M.; Qureshi, N.K.; Hong, K.S. Determining optimal feature-combination for LDA classification of functional near-infrared spectroscopy signals in brain-computer interface application. Front. Hum. Neurosci. 2016, 10, 237.
37. Noori, F.M.; Naseer, N.; Qureshi, N.K.; Nazeer, H.; Khan, R.A. Optimal feature selection from fNIRS signals using genetic algorithms for BCI. Neurosci. Lett. 2017, 647, 61–66.
38. Qureshi, N.K.; Naseer, N.; Noori, F.M.; Nazeer, H.; Khan, R.A.; Saleem, S. Enhancing classification performance of functional near-infrared spectroscopy-brain–computer interface using adaptive estimation of general linear model coefficients. Front. Neurorobotics 2017, 11, 33.
39. Elkan, C. Evaluating Classifiers; University of California: San Diego, CA, USA, 2012.
40. Jorge, A.; Royston, D.A.; Tyler-Kabara, E.C.; Boninger, M.L.; Collinger, J.L. Classification of individual finger movements using intracortical recordings in Human Motor Cortex. Neurosurgery 2020, 87, 630–638.
41. Power, S.D.; Kushki, A.; Chau, T. Automatic single-trial discrimination of mental arithmetic, mental singing and the no-control state from prefrontal activity: Toward a three-state NIRS-BCI. BMC Res. Notes 2012, 5, 141.
42. Hong, K.S.; Naseer, N.; Kim, Y.H. Classification of prefrontal and motor cortex signals for three-class fNIRS-BCI. Neurosci. Lett. 2015, 587, 87–92.
43. Hong, K.S.; Santosa, H. Decoding four different sound-categories in the auditory cortex using functional near-infrared spectroscopy. Hear. Res. 2016, 333, 157–166.
44. Kamran, M.A.; Jeong, M.Y.; Mannan, M. Optimal hemodynamic response model for functional near-infrared spectroscopy. Front. Behav. Neurosci. 2015, 9, 151.
45. Ho, T.K.K.; Gwak, J.; Park, C.M.; Song, J.I. Discrimination of mental workload levels from multi-channel fNIRS using deep learning-based approaches. IEEE Access 2019, 7, 24392–24403.
46. Wu, S.; Li, J.; Gao, L.; Chen, C.; He, S. Suppressing systemic interference in fNIRS monitoring of the hemodynamic cortical response to motor execution and imagery. Front. Hum. Neurosci. 2018, 12, 85.
47. Hu, X.S.; Hong, K.S.; Ge, S.S. Reduction of trial-to-trial variability in functional near-infrared spectroscopy signals by accounting for resting-state functional connectivity. J. Biomed. Opt. 2013, 18, 017003.
48. Naseer, N.; Hong, K.S. Functional near-infrared spectroscopy based brain activity classification for development of a brain-computer interface. In Proceedings of the 2012 International Conference of Robotics and Artificial Intelligence, Rawalpindi, Pakistan, 22–23 October 2012; pp. 174–178.
49. Khan, M.J.; Hong, K.S.; Bhutta, M.R.; Naseer, N. fNIRS based dual movement control command generation using prefrontal brain activity. In Proceedings of the 2014 International Conference on Robotics and Emerging Allied Technologies in Engineering (iCREATE), Islamabad, Pakistan, 22–24 April 2014; pp. 244–248.
50. Xiao, J.; Xu, H.; Gao, H.; Bian, M.; Li, Y. A Weakly Supervised Semantic Segmentation Network by Aggregating Seed Cues: The Multi-Object Proposal Generation Perspective. ACM J. 2021, 17, 1–19.
51. Hoshi, Y.; Kobayashi, N.; Tamura, M. Interpretation of near-infrared spectroscopy signals: A study with a newly developed perfused rat brain model. J. Appl. Physiol. 2001, 90, 1657–1662.
52. Hu, X.S.; Hong, K.S.; Shuzhi, S.G.; Jeong, M.Y. Kalman estimator-and general linear model-based on-line brain activation mapping by near-infrared spectroscopy. Biomed. Eng. Online 2010, 9, 1–15.
53. Zafar, A.; Hong, K.S. Neuronal activation detection using vector phase analysis with dual threshold circles: A functional near-infrared spectroscopy study. Int. J. Neural Syst. 2018, 28, 1850031.
54. Khan, M.N.A.; Hong, K.S. Most favorable stimulation duration in the sensorimotor cortex for fNIRS-based BCI. Biomed. Opt. Express 2021, 12, 5939–5954.
Figure 1. (A) Experimental setup; (B) optodes arrangement; (C) overcap to reduce external light; (D) optodes holder.
Figure 2. Experimental paradigm visualisation. A single experiment consists of three sessions of each finger-tapping trial; a single trial consists of 10 s of rest and 10 s of finger-tapping.
Figure 3. (A) Source-detector placement over the motor cortex. Colour code: red (source), blue (detector), green (channels); black numbers represent channel numbers. (B) Demonstration of total haemoglobin changes over the motor cortex during index finger-tapping.
Figure 4. Comparison of different classifiers on the basis of performance parameters (accuracy, precision, recall, F1 score).
Figure 5. Confusion matrices for all classifiers for subject one (S01). Classes are labelled '0', '1', '2', '3', '4', and '5', standing for the 'Rest', 'Thumb', 'Index', 'Middle', 'Ring', and 'Little' finger-tapping classes, respectively. (a) Quadratic discriminant analysis (QDA). (b) AdaBoost. (c) Support vector machine (SVM). (d) Decision tree (DT). (e) Artificial neural networks (ANN). (f) k-nearest neighbours (kNN). (g) Random forest (RF). (h) Extreme gradient boosting (XGBoost).
Figure 6. Oxygenated haemoglobin signal for a complete experimental trial.
Table 1. Parameters for the Modified Beer–Lambert Law (MBLL).

Wavelength (nm) | DPF (cm) | ε ΔHbO (1/cm per moles/L) | ε ΔHbR (1/cm per moles/L)
760 | 7.25 | 1466.5865 | 3843.707
850 | 6.38 | 2526.391 | 1798.643
Table 2. Statistical features extracted from ΔHbO.

Sr. No. | Statistical Feature | Mathematical Formulation/Description
1 | Signal mean | $\mu_w = \frac{1}{N_w}\sum_{k=k_L}^{k_U} \Delta HbX_w(k)$, where $\mu_w$ is the mean of the sample window w, $N_w$ is the number of samples in the window, $k_L$ and $k_U$ are the lower and upper limits of the window, and $\Delta HbX_w$ stands for ΔHbO or ΔHbR.
2 | Signal peak (signal maximum) | The maximum value in the window.
3 | Signal minimum | The minimum value in the window.
4 | Signal skewness | $skew_w = \frac{E\left[(\Delta HbX_w - \mu_w)^3\right]}{\sigma^3}$, where $E$ is the expectation and $\sigma$ is the standard deviation of $\Delta HbX_w$.
5 | Signal kurtosis | $kurt_w = \frac{E\left[(\Delta HbX_w - \mu_w)^4\right]}{\sigma^4}$, with $E$ and $\sigma$ as above.
6 | Signal variance | A measure of the signal's spread.
7 | Signal median | The value separating the higher half from the lower half of the values in the time window.
8 | Peak-to-peak | The difference between the maximum and minimum values in the time window.
Table 3. Classifier parameters.

Classifier | Parameter Settings
QDA | priors = None, reg_param = 0.0
AdaBoost | n_estimators = 10, random_state = 0, learning_rate = 1.0
SVM | kernel = rbf, degree = 3, random_state = None
ANN | hidden layers = (5, 2), solver = 'lbfgs', random_state = 1, max_iter = 300
Decision Tree | criterion = entropy, random_state = 0
kNN | n_neighbors = 5
Random Forest | n_estimators = 10, criterion = entropy, random_state = 0
XGBoost | booster = gbtree, verbosity = 1, nthread = maximum number of threads
Table 4. Subject-wise comparison of classifier performance parameters (accuracy, precision, recall, F1 score); 'S' stands for subject followed by its number. Each row lists the values for S01 through S24, followed by the mean and standard deviation (STD).

Classifier | Metric | S01 S02 S03 S04 S05 S06 S07 S08 S09 S10 S11 S12 S13 S14 S15 S16 S17 S18 S19 S20 S21 S22 S23 S24 | Mean | STD
SVM | Accuracy | 0.58 0.57 0.58 0.57 0.56 0.57 0.58 0.59 0.58 0.57 0.57 0.62 0.57 0.57 0.64 0.57 0.59 0.58 0.65 0.60 0.57 0.58 0.60 0.59 | 0.59 | 0.02
SVM | Precision | 0.65 0.32 0.34 0.48 0.32 0.49 0.34 0.62 0.53 0.39 0.41 0.46 0.48 0.41 0.67 0.40 0.35 0.43 0.49 0.44 0.45 0.50 0.52 0.65 | 0.46 | 0.10
SVM | Recall | 0.58 0.57 0.58 0.57 0.56 0.57 0.58 0.59 0.58 0.57 0.57 0.62 0.57 0.57 0.64 0.57 0.59 0.58 0.65 0.60 0.57 0.58 0.60 0.59 | 0.59 | 0.02
SVM | F1 Score | 0.47 0.41 0.43 0.42 0.41 0.42 0.43 0.45 0.45 0.43 0.41 0.49 0.43 0.42 0.55 0.42 0.44 0.43 0.52 0.45 0.42 0.44 0.50 0.46 | 0.45 | 0.04
RF | Accuracy | 0.84 0.65 0.84 0.70 0.73 0.77 0.75 0.75 0.75 0.76 0.73 0.73 0.72 0.80 0.78 0.72 0.71 0.75 0.82 0.68 0.77 0.77 0.78 0.78 | 0.75 | 0.05
RF | Precision | 0.84 0.63 0.85 0.70 0.73 0.77 0.75 0.75 0.75 0.77 0.73 0.73 0.72 0.80 0.80 0.73 0.70 0.75 0.82 0.67 0.77 0.78 0.78 0.78 | 0.75 | 0.05
RF | Recall | 0.84 0.65 0.84 0.70 0.73 0.77 0.75 0.75 0.75 0.76 0.73 0.73 0.72 0.80 0.78 0.72 0.71 0.75 0.82 0.68 0.77 0.77 0.78 0.78 | 0.75 | 0.05
RF | F1 Score | 0.83 0.61 0.83 0.67 0.72 0.75 0.73 0.74 0.74 0.75 0.70 0.72 0.70 0.78 0.77 0.70 0.69 0.73 0.81 0.65 0.75 0.76 0.77 0.77 | 0.74 | 0.05
DT | Accuracy | 0.79 0.56 0.76 0.28 0.67 0.68 0.23 0.68 0.71 0.70 0.63 0.71 0.65 0.73 0.76 0.71 0.67 0.69 0.76 0.64 0.72 0.71 0.75 0.71 | 0.66 | 0.13
DT | Precision | 0.79 0.56 0.76 0.49 0.67 0.69 0.53 0.68 0.71 0.70 0.63 0.72 0.65 0.74 0.76 0.71 0.68 0.69 0.78 0.64 0.72 0.72 0.75 0.71 | 0.69 | 0.07
DT | Recall | 0.79 0.56 0.76 0.28 0.67 0.68 0.23 0.68 0.71 0.70 0.63 0.71 0.65 0.73 0.76 0.71 0.67 0.69 0.76 0.64 0.72 0.71 0.75 0.71 | 0.66 | 0.13
DT | F1 Score | 0.79 0.56 0.75 0.32 0.67 0.69 0.27 0.68 0.71 0.70 0.63 0.71 0.65 0.74 0.76 0.71 0.67 0.69 0.77 0.64 0.72 0.71 0.75 0.71 | 0.67 | 0.13
AdaBoost | Accuracy | 0.41 0.55 0.46 0.56 0.52 0.52 0.39 0.50 0.48 0.52 0.51 0.42 0.43 0.51 0.43 0.38 0.49 0.45 0.49 0.53 0.45 0.38 0.34 0.46 | 0.47 | 0.06
AdaBoost | Precision | 0.40 0.41 0.46 0.33 0.46 0.43 0.39 0.44 0.39 0.39 0.44 0.46 0.38 0.44 0.46 0.38 0.44 0.44 0.53 0.41 0.41 0.41 0.47 0.45 | 0.43 | 0.04
AdaBoost | Recall | 0.41 0.55 0.46 0.56 0.52 0.52 0.39 0.50 0.48 0.52 0.51 0.42 0.43 0.51 0.43 0.38 0.49 0.45 0.49 0.53 0.45 0.38 0.34 0.46 | 0.47 | 0.06
AdaBoost | F1 Score | 0.40 0.44 0.46 0.42 0.46 0.45 0.38 0.46 0.43 0.43 0.45 0.43 0.40 0.46 0.43 0.37 0.45 0.44 0.50 0.45 0.42 0.39 0.38 0.43 | 0.43 | 0.03
QDA | Accuracy | 0.28 0.22 0.31 0.28 0.24 0.42 0.23 0.41 0.20 0.21 0.24 0.28 0.32 0.25 0.58 0.32 0.31 0.30 0.36 0.26 0.34 0.24 0.56 0.28 | 0.31 | 0.10
QDA | Precision | 0.59 0.49 0.66 0.49 0.56 0.48 0.53 0.50 0.55 0.52 0.45 0.59 0.61 0.51 0.59 0.49 0.54 0.56 0.64 0.54 0.49 0.54 0.69 0.47 | 0.54 | 0.06
QDA | Recall | 0.28 0.22 0.31 0.28 0.24 0.42 0.23 0.41 0.20 0.21 0.24 0.28 0.32 0.25 0.58 0.32 0.31 0.30 0.36 0.26 0.34 0.24 0.56 0.28 | 0.31 | 0.10
QDA | F1 Score | 0.29 0.25 0.33 0.32 0.24 0.43 0.27 0.42 0.16 0.22 0.26 0.30 0.33 0.28 0.57 0.35 0.33 0.30 0.42 0.30 0.38 0.23 0.58 0.31 | 0.33 | 0.10
ANN | Accuracy | 0.61 0.58 0.60 0.57 0.58 0.58 0.58 0.60 0.60 0.58 0.58 0.63 0.57 0.58 0.64 0.59 0.60 0.61 0.67 0.61 0.59 0.59 0.62 0.59 | 0.60 | 0.02
ANN | Precision | 0.69 0.42 0.56 0.54 0.48 0.54 0.34 0.69 0.67 0.62 0.61 0.54 0.60 0.52 0.60 0.52 0.52 0.60 0.64 0.57 0.56 0.62 0.58 0.58 | 0.57 | 0.08
ANN | Recall | 0.61 0.58 0.60 0.57 0.58 0.58 0.58 0.60 0.60 0.58 0.58 0.63 0.57 0.58 0.64 0.59 0.60 0.61 0.67 0.61 0.59 0.59 0.62 0.59 | 0.60 | 0.02
ANN | F1 Score | 0.52 0.43 0.48 0.45 0.44 0.44 0.43 0.48 0.49 0.44 0.45 0.53 0.46 0.44 0.54 0.46 0.46 0.50 0.59 0.48 0.48 0.48 0.55 0.48 | 0.48 | 0.04
kNN | Accuracy | 0.80 0.65 0.78 0.71 0.69 0.77 0.74 0.74 0.74 0.73 0.72 0.74 0.72 0.78 0.78 0.70 0.73 0.76 0.82 0.68 0.77 0.76 0.79 0.77 | 0.75 | 0.04
kNN | Precision | 0.80 0.63 0.78 0.69 0.68 0.76 0.74 0.74 0.73 0.72 0.72 0.73 0.71 0.78 0.78 0.70 0.72 0.77 0.81 0.66 0.76 0.76 0.79 0.77 | 0.74 | 0.05
kNN | Recall | 0.80 0.65 0.78 0.71 0.69 0.77 0.74 0.74 0.74 0.73 0.72 0.74 0.72 0.78 0.78 0.70 0.73 0.76 0.82 0.68 0.77 0.76 0.79 0.77 | 0.75 | 0.04
kNN | F1 Score | 0.79 0.62 0.78 0.69 0.68 0.76 0.73 0.73 0.73 0.72 0.70 0.73 0.70 0.78 0.77 0.69 0.71 0.75 0.82 0.66 0.76 0.76 0.79 0.77 | 0.73 | 0.05
XGBoost | Accuracy | 0.86 0.64 0.86 0.71 0.74 0.78 0.74 0.77 0.78 0.79 0.71 0.76 0.73 0.82 0.80 0.75 0.75 0.77 0.86 0.68 0.81 0.78 0.84 0.79 | 0.77 | 0.06
XGBoost | Precision | 0.87 0.62 0.86 0.72 0.74 0.79 0.74 0.78 0.79 0.79 0.72 0.76 0.73 0.83 0.80 0.76 0.75 0.77 0.85 0.66 0.82 0.79 0.84 0.79 | 0.77 | 0.06
XGBoost | Recall | 0.86 0.64 0.86 0.71 0.74 0.78 0.74 0.77 0.78 0.79 0.71 0.76 0.73 0.82 0.80 0.75 0.75 0.77 0.86 0.68 0.81 0.78 0.84 0.79 | 0.77 | 0.06
XGBoost | F1 Score | 0.86 0.58 0.85 0.69 0.72 0.77 0.72 0.76 0.76 0.77 0.69 0.75 0.72 0.81 0.78 0.73 0.73 0.75 0.85 0.64 0.80 0.77 0.83 0.78 | 0.75 | 0.07
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Share and Cite

MDPI and ACS Style

Khan, H.; Noori, F.M.; Yazidi, A.; Uddin, M.Z.; Khan, M.N.A.; Mirtaheri, P. Classification of Individual Finger Movements from Right Hand Using fNIRS Signals. Sensors 2021, 21, 7943. https://doi.org/10.3390/s21237943

AMA Style

Khan H, Noori FM, Yazidi A, Uddin MZ, Khan MNA, Mirtaheri P. Classification of Individual Finger Movements from Right Hand Using fNIRS Signals. Sensors. 2021; 21(23):7943. https://doi.org/10.3390/s21237943

Chicago/Turabian Style

Khan, Haroon, Farzan M. Noori, Anis Yazidi, Md Zia Uddin, M. N. Afzal Khan, and Peyman Mirtaheri. 2021. "Classification of Individual Finger Movements from Right Hand Using fNIRS Signals" Sensors 21, no. 23: 7943. https://doi.org/10.3390/s21237943
