Article

An Automatic Sleep Stage Classification Algorithm Using Improved Model Based Essence Features

1 School of Mechatronics Engineering and Automation, Shanghai University, Shanghai 200444, China
2 Faculty of Biomedical Engineering, Drexel University, Philadelphia, PA 19104, USA
* Author to whom correspondence should be addressed.
Sensors 2020, 20(17), 4677; https://doi.org/10.3390/s20174677
Submission received: 22 July 2020 / Revised: 12 August 2020 / Accepted: 13 August 2020 / Published: 19 August 2020
(This article belongs to the Special Issue Sensor Signal and Information Processing III)

Abstract: The automatic sleep stage classification technique can facilitate the diagnosis of sleep disorders and relieve medical experts of labor-intensive work. In this paper, novel improved model based essence features (IMBEFs) are proposed, combining locality energy (LE) and dual state space models (DSSMs), for automatic sleep stage detection on single-channel electroencephalograph (EEG) signals. First, each EEG epoch is decomposed separately into low-level sub-bands (LSBs) and high-level sub-bands (HSBs) by wavelet packet decomposition (WPD). Then, the DSSMs are estimated from the LSBs and the LE calculation is carried out on the HSBs. Third, the IMBEFs extracted from the DSSMs and LE are fed into an appropriate classifier for sleep stage classification. The performance of the proposed method was evaluated on three public sleep databases. The experimental results show that under the Rechtschaffen and Kales (R&K) standard, the six-class sleep stage classification accuracies on the Sleep EDF database and the DREAMS Subjects database are 92.04% and 78.92%, respectively. Under the American Academy of Sleep Medicine (AASM) standard, the five-class classification accuracies on the DREAMS Subjects database and the ISRUC database reached 79.90% and 81.65%. Compared with state-of-the-art methods, the proposed method can be used for reliable sleep stage classification with high accuracy.

1. Introduction

Automatic sleep stage classification is an important research focus because of its value for the study of sleep-related disorders. There are currently two classification criteria for sleep stages. According to the Rechtschaffen and Kales (R&K) recommendations, sleep can be divided into six stages: the awake stage (Awa), the rapid eye movement stage (REM), sleep stage 1 (S1), sleep stage 2 (S2), sleep stage 3 (S3) and sleep stage 4 (S4) [1]. The other sleep stage classification standard was provided by the American Academy of Sleep Medicine (AASM). In this standard, there are five sleep stages: Awa, N1 (S1), N2 (S2), N3 (the merging of stages S3 and S4) and REM [2]. Usually, the detection of each sleep stage requires manual marking by professionals, which takes a great deal of work and may produce erroneous markings. Therefore, it is imperative to study methods for automatic sleep stage classification.
According to the characteristics of the adopted features, the commonly used automatic detection methods can be divided into two categories. The first is based on statistical features (such as spectral energy) extracted from the one-dimensional EEG signal. The other relies on implicit features, which are obtained by training deep-learning based classifiers. Hassan et al. computed various spectral features by the tunable-Q factor wavelet transform (TQWT) on sleep EEG signal segments [3]. With a random forest classifier, they achieved accuracies of 90.38%, 91.50%, 92.11%, 94.80% and 97.50% for 6-stage to 2-stage classification of sleep states on the Sleep-EDF database. Diykh et al. adopted structural and spectral attributes extracted from weighted undirected networks to automatically classify the sleep stages [4]. Kang et al. presented a statistical framework to estimate whole-night sleep states in patients with obstructive sleep apnea (OSA), the most common sleep disorder [5]. In this framework, they extracted 11 spectral features from 60,903 epochs to estimate per-night sleep stages with a 5-state hidden Markov model. Abdulla et al. used the graph modularity of EEG segments as features to feed an ensemble classifier, which achieved an accuracy of 93.1% with 20,265 epochs from the Sleep EDF database [6].
In [7], Ghimatgar et al. constructed a feature pool by relevance and redundancy analysis on sleep EEG epochs. With a random forest classifier and a hidden Markov model, this method was evaluated on three public sleep EEG databases scored according to the R&K and AASM guidelines. They achieved overall accuracies in the ranges of 79.4–87.4% and 77.6–80.4% for six-stage (R&K) and five-stage (AASM) classification, respectively. Taran et al. proposed an optimized flexible analytic wavelet transform (OFAWT) to decompose EEG signals into band-limited bases or sub-bands (SBs) [8]. Their experiments yielded classification accuracies of 96.03%, 96.39%, 96.48%, 97.56% and 99.36% for six-class to two-class sleep stage classification, respectively. Sharma et al. computed discriminatory features, namely fuzzy entropy and log energy, from the wavelet decomposition coefficients [9]. This approach yielded accuracies of 91.5% and 88.5% for the six-class classification task using small and large datasets, respectively. Hassan et al. extracted various statistical moment based features in the empirical mode decomposition (EMD) domain and achieved good performance on a small database [10]. They also decomposed EEG signal segments using ensemble empirical mode decomposition (EEMD) to extract various statistical moment based features and achieved 88.07%, 83.49%, 92.66%, 94.23% and 98.15% for 6-state to 2-state classification of sleep stages on the Sleep-EDF database [11]. Sharma et al. adopted Poincare plot descriptors and statistical measures calculated by the discrete energy separation algorithm (DESA) as features [12]. The classification accuracies for two to six categories on 15,136 epochs from the Sleep-EDF database were 98.02%, 94.66%, 92.29%, 91.13% and 90.02%, respectively.
Besides conventional feature extraction methods, some researchers choose convolutional neural networks (CNNs) to classify sleep stages from time–frequency images converted from one-dimensional EEG signals. Zhang et al. converted EEG data to a time–frequency representation via the Hilbert–Huang transform and employed an orthogonal convolutional neural network (OCNN) as the classifier [13]. They achieved total classification accuracies of 88.4% and 87.6% on two public datasets, respectively. Similarly, Xu et al. employed multiple CNNs on multi-channel EEG signals to classify the sleep stages [14]. Mousavi et al. [15] directly fed the raw EEG signals to a deep CNN with nine layers followed by two fully connected layers, without involving feature extraction or selection. This method achieved accuracies of 98.10%, 96.86%, 93.11%, 92.95% and 93.55% for two- to six-class classification. Long short-term memory (LSTM) is a recurrent neural network (RNN) architecture used in the field of deep learning. It can process not only single data points (such as images) but also entire sequences of data (such as speech or EEG signals). Korkalainen et al. used a combined convolutional and LSTM neural network on a public database and achieved a sleep staging accuracy of 83.7% with a single frontal EEG channel [16]. Michielli et al. proposed a novel cascaded RNN architecture based on LSTM for automated scoring of sleep stages on single-channel EEG signals [17]. The network performed four- and two-class classification with classification rates of 90.8% and 83.6%, respectively.
Most existing studies adopted only a few epochs or a single database when evaluating their performance, and some did not use k-fold cross-validation, which causes large fluctuations in the experimental results. Therefore, although published research has achieved positive results in automatic sleep stage classification, further validation of and improvements to the existing methods are still needed. In this paper, we propose novel IMBEFs extracted from LE and DSSMs for automatically detecting the sleep stages with a high degree of accuracy. The LE and DSSMs are estimated from two sets of coefficients, the LSBs and HSBs, which come from the WPD of each sleep EEG epoch using two separate wavelet bases. After comparison with various kinds of classifiers, Bagged Trees was selected as the suitable classifier for this method to identify the sleep stages. In addition, experiments were conducted on three public sleep databases and the results were compared with state-of-the-art published work in order to fully evaluate and validate the performance of the proposed method.
The paper is organized as follows: In Section 2, the experimental material and the methodology of the proposed method are described in detail. Section 3 presents the experimental results. In Section 4, the results and findings of this paper are discussed. The conclusions are drawn in Section 5.

2. Materials and Methods

2.1. Sleep State Classes

Under the R&K standard, the sleep stages can be grouped into two to six classes; under the AASM standard, into two to five classes. The difference is that the N3 stage of the AASM standard merges the S3 and S4 stages of the R&K standard. Detailed descriptions of the classes considered in this work are shown in Table 1 and Table 2, and the groupings are illustrated in the sketch below.
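As a concrete illustration of these groupings, the mapping below (a hypothetical Python sketch; the stage names and the `relabel` helper are ours, not part of the original method) relabels R&K six-stage annotations for each classification task of Table 1.

```python
# Hypothetical illustration of the class groupings in Table 1 (R&K standard).
RK_TASKS = {
    6: {"Awa": "Awa", "REM": "REM", "S1": "S1", "S2": "S2", "S3": "S3", "S4": "S4"},
    5: {"Awa": "Awa", "REM": "REM", "S1": "S1", "S2": "S2", "S3": "S3/S4", "S4": "S3/S4"},
    4: {"Awa": "Awa", "REM": "REM", "S1": "S1/S2", "S2": "S1/S2", "S3": "S3/S4", "S4": "S3/S4"},
    3: {"Awa": "Awa", "REM": "REM", "S1": "NREM", "S2": "NREM", "S3": "NREM", "S4": "NREM"},
    2: {"Awa": "Awa", "REM": "Asleep", "S1": "Asleep", "S2": "Asleep", "S3": "Asleep", "S4": "Asleep"},
}

def relabel(stages, n_classes):
    """Map a sequence of R&K stage labels to the labels of the n-class task."""
    mapping = RK_TASKS[n_classes]
    return [mapping[s] for s in stages]

# Example: six-stage annotations grouped for the three-class task.
# relabel(["Awa", "S2", "REM", "S4"], 3) -> ["Awa", "NREM", "REM", "NREM"]
```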

2.2. Datasets

2.2.1. Sleep EDF (S-EDF) Database

The S-EDF database has 197 whole-night polysomnography (PSG) sleep recordings, containing EEG, EOG, chin EMG and event markers [18,19]. All the hypnograms (sleep patterns) were manually scored by well-trained technicians according to the R&K criteria. In this study, 34 EEG recordings from 26 subjects aged 25 to 96 years were randomly selected.

2.2.2. DREAMS Subjects (DRMS) Database

The DRMS database consists of 20 whole-night PSG recordings from healthy subjects, annotated in sleep stages according to both the R&K criteria and the newer AASM standard [20]. The data were acquired in a sleep laboratory of a Belgian hospital using a digital 32-channel polygraph (BrainnetTM System of MEDATEC, Brussels, Belgium). The sampling frequency was 200 Hz.

2.2.3. ISRUC (Subgroup 3, ISRUC3) Database

The ISRUC3 database is the third subgroup of the ISRUC database [21]. The data were obtained from human adults, including healthy subjects, subjects with sleep disorders and subjects under the effect of sleep medication. Each recording was randomly selected from among the PSG recordings acquired by the Sleep Medicine Centre of the Hospital of Coimbra University (CHUC).
The S-EDF database was labeled only under the R&K criteria, whereas the ISRUC3 database was labeled only under the AASM criteria. The DRMS database was labeled under both the R&K and AASM criteria. The annotations of the S-EDF and DRMS databases were produced visually by a single expert. The ISRUC3 database was scored by two experts, and the labels made by the second expert were used in this paper. The Pz–Oz channel of the S-EDF database is used according to the recommendations of various studies [3,4,5,6,7]. For the DRMS database, the Cz–A1 channel was adopted, as recommended in [9,10,11,12]. For the ISRUC database, the C3–A2 channel is the best choice [7]. Table 3 lists the detailed information of the three databases.

2.3. Method

Figure 1 shows the schematic outline of the proposed IMBEF based sleep stage classification algorithm, comprising preprocessing, wavelet packet decomposition, locality energy calculation, state space model estimation, feature extraction, classifier training and performance evaluation.

2.3.1. EEG Data Preprocessing

First, all the single-channel data are extracted with the Matlab and EEGLAB [22] tools from the three databases described previously. According to prior work [5,6,7,8,9,10,11], a 0–35 Hz low-pass filter can be used to reject most artifacts. Once the dataset is filtered, it is exported as a one-dimensional vector without time information and saved as a txt file, which can be denoted as Formula (1).
$$X = \{ x_1, x_2, \ldots, x_k, \ldots, x_M \}, \quad k \in [1, M], \; x_k \in \mathbb{R} \tag{1}$$
where $X$ is the vector containing the EEG samples $x_k$ and $M$ is the length of the vector.
Furthermore, we use a window of length $j$ to divide the full data $X$ across time without overlap. That is, $X$ is converted into $[X_1, X_2, \ldots, X_i, \ldots, X_L]^T$, which can be described as (2).
$$\begin{bmatrix} X_1 \\ X_2 \\ \vdots \\ X_i \\ \vdots \\ X_L \end{bmatrix} = \begin{bmatrix} x_1 & x_2 & \cdots & x_j \\ x_{j+1} & x_{j+2} & \cdots & x_{2j} \\ \vdots & \vdots & & \vdots \\ x_{(i-1)j+1} & x_{(i-1)j+2} & \cdots & x_{i \times j} \\ \vdots & \vdots & & \vdots \\ x_{(L-1)j+1} & x_{(L-1)j+2} & \cdots & x_{L \times j} \end{bmatrix}, \quad L = \left\lfloor \frac{M}{j} \right\rfloor, \; i \in [1, L] \tag{2}$$
where $j = T_e \times F_s$; $T_e$ is the length of each epoch in seconds and $F_s$ is the sampling frequency of the database. For the S-EDF database, $T_e = 30$ and $F_s = 100$, so $j = 3000$; for the ISRUC3 database, $T_e = 20$ and $F_s = 200$, so $j = 4000$. A sketch of this step is given below.
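Assuming NumPy and SciPy (the fourth-order Butterworth design is our own choice, since the paper only specifies the 0–35 Hz passband), a filtered single-channel recording can be reshaped into the epoch matrix of Equation (2) as follows:

```python
import numpy as np
from scipy.signal import butter, filtfilt

def preprocess_and_segment(x, fs, epoch_seconds):
    """Low-pass filter a single-channel EEG vector and split it into epochs.

    x: 1-D array of raw samples (the vector X of Eq. (1)).
    fs: sampling frequency Fs in Hz.
    epoch_seconds: epoch length Te in seconds.
    Returns an (L, j) array whose rows are the epochs X_i of Eq. (2).
    """
    # 0-35 Hz low-pass to reject artifacts; the filter order is an assumption.
    b, a = butter(4, 35.0 / (fs / 2.0), btype="low")
    x = filtfilt(b, a, np.asarray(x, dtype=float))

    j = int(epoch_seconds * fs)   # j = Te * Fs (3000 for S-EDF, 4000 for ISRUC3)
    L = len(x) // j               # L = floor(M / j); trailing samples are dropped
    return x[: L * j].reshape(L, j)

# Example: S-EDF uses Te = 30 s at Fs = 100 Hz.
# epochs = preprocess_and_segment(raw_eeg, fs=100, epoch_seconds=30)
```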

2.3.2. Wavelet Package Decomposition

WPD is a powerful tool for analyzing non-stationary EEG signals. In essence, WPD is a wavelet transform in which the discrete-time signal is passed through more filters than in the discrete wavelet transform, which provides a multi-level time-frequency decomposition of signals [23]. Compared with the discrete wavelet transform, WPD provides finer frequency resolution. In the discrete wavelet transform, a signal is split into approximation coefficients and detail coefficients [24]. The approximation coefficients are then themselves split into second-level approximation and detail coefficients, and the process is repeated. A wavelet packet function $\omega_{l,d}^{m}(q)$ is defined as (3):
$$\omega_{l,d}^{m}(q) = 2^{l/2}\, \omega^{m}(2^{l} q - d) \tag{3}$$
where $l$ and $d$ are the scaling (frequency localization) parameter and the translation (time localization) parameter, respectively, and $m = 0, 1, 2, \ldots$ is the oscillation parameter.
The wavelet packet (WP) coefficients of the EEG epoch $X_i$ are the inner products of the signal with every WP function, denoted by $p_{l}^{i,m}(d)$, $d = \ldots, -1, 0, 1, \ldots$ and given below:
$$p_{l}^{i,m}(d) = \sum_{q} x_i(q)\, \omega_{l,d}^{m}(q) \tag{4}$$
where $p_{l}^{i,m}(d)$ denotes the $m$-th set of WPD coefficients at the $l$-th scale and $d$ is the translation parameter. All frequency components and their times of occurrence are reflected in $p_{l}^{i,m}(d)$ through changes in $m$, $l$ and $d$. Each coefficient set measures a specific sub-band frequency content, controlled by the scaling parameter $l$ and the oscillation parameter $m$. The essential operation of WPD is filtering through a low-pass filter $h(d)$ and a high-pass filter $g(d)$. For the $l$-th level of decomposition, there are $2^l$ sets of sub-band coefficients $C_{l,m}^{i}$ of length $j/2^l$. The wavelet packet coefficients of epoch $X_i$ are given as
$$C_{l,m}^{i} = \{\, p_{l}^{i,m}(d) \;|\; d = 1, 2, \ldots, j/2^l \,\} \tag{5}$$
It can be seen from (5) that each node of the WP tree is indexed by a pair of integers $(l, m)$, where $l$ is the corresponding level of decomposition and $m$ is the order of the node position within that level. The level $l_{LE}$ and wavelet basis $\omega_{LE}$ of the WPD on epoch $X_i$ for the LE calculation will be determined in Section 3, as will the wavelet basis $\omega_{DSSM}$ for the DSSMs.
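A minimal sketch of this decomposition step, assuming the PyWavelets package (the helper name is ours): it returns the $2^l$ sub-band coefficient sets of one epoch, ordered by frequency, so the first-level sets can feed the DSSM estimation and the level-$l_{LE}$ sets the LE calculation.

```python
import pywt  # PyWavelets

def wpd_coefficients(epoch, wavelet, level):
    """Return the 2**level sub-band coefficient sets C_{l,m} of one epoch,
    ordered by frequency, via wavelet packet decomposition."""
    wp = pywt.WaveletPacket(data=epoch, wavelet=wavelet, maxlevel=level)
    return [node.data for node in wp.get_level(level, order="freq")]

# Two first-level sub-bands for the DSSMs (db1 is selected in Section 3):
# low_subbands = wpd_coefficients(epoch, "db1", level=1)
# Thirty-two level-5 sub-bands for the LE (db4 is selected in Section 3):
# high_subbands = wpd_coefficients(epoch, "db4", level=5)
```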

2.3.3. Locality Energy Calculation

The wavelet packet energy $E_{l_{LE},m}^{i}$ at the $m$-th node on level $l_{LE}$ of epoch $X_i$ can be defined as follows [25]:
$$E_{l_{LE},m}^{i} = \sum_{d} \left| p_{l_{LE}}^{i,m}(d) \right|^2 = \left\| C_{l_{LE},m}^{i} \right\|^2, \quad m = 1, 2, \ldots, 2^{l_{LE}} \tag{6}$$
Then, the locality energy features (LEFs) of each epoch can be defined as $\{ E_{l_{LE},m}^{i} \mid m = 1, 2, \ldots, 2^{l_{LE}} \}$.
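Following Equation (6), the locality energy of each node is simply the sum of its squared coefficients; a short sketch continuing the PyWavelets assumption above:

```python
import numpy as np

def locality_energy_features(subbands):
    """LEFs of one epoch: the energy |C_{l_LE,m}|^2 of each of the 2**l_LE
    sub-band coefficient sets returned by wpd_coefficients()."""
    return np.array([np.sum(np.square(c)) for c in subbands])

# lefs = locality_energy_features(wpd_coefficients(epoch, "db4", level=5))
# For l_LE = 5 this yields 32 locality energy features.
```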

2.3.4. Dual State Space Models Estimation

As described above, after the wavelet packet decomposition, the low-level (first level) coefficients are used to estimate the dual state space models, which can be denoted by the difference equations in (7).
$$u_{k+1} = A u_k + K e_k, \qquad y_k = B u_k + e_k \tag{7}$$
Here, $y_k \in C_{1,m}^{i}$ is the coefficient at instant $k \in [1, 2, \ldots, j/2]$. The vector $u_k \in \mathbb{R}^{n \times 1}$ is the state vector of the process at discrete time instant $k$ and contains the numerical values of $n$ states. The matrix $A \in \mathbb{R}^{n \times n}$ is the dynamical system matrix, $K \in \mathbb{R}^{n \times 1}$ is the steady state Kalman gain and $B \in \mathbb{R}^{1 \times n}$ is the output matrix, which describes how the internal state is transferred to the outside world in the observations $y_k$. The term $e_k \in \mathbb{R}$ denotes zero mean white noise.
With a traditional subspace algorithm such as N4SID, the matrices $\hat{A}$, $\hat{B}$, $\hat{K}$ of the state space model of the dynamic system can be estimated [26]. In this paper, the order $n_{DSSM}$ of the dual state space models will be determined by the experiments in Section 3. The parameter matrices of the state space model estimated from the first-level wavelet coefficients $C_{1,m}^{i}$ can be expressed as
$$\hat{A}_{1,m}^{i} = \begin{bmatrix} a_{1,1}^{i,m} & \cdots & a_{1,n_{DSSM}}^{i,m} \\ \vdots & \ddots & \vdots \\ a_{n_{DSSM},1}^{i,m} & \cdots & a_{n_{DSSM},n_{DSSM}}^{i,m} \end{bmatrix}, \quad \hat{B}_{1,m}^{i} = \begin{bmatrix} b_{1}^{i,m} & b_{2}^{i,m} & \cdots & b_{n_{DSSM}}^{i,m} \end{bmatrix}, \quad \hat{K}_{1,m}^{i} = \begin{bmatrix} k_{1}^{i,m} & k_{2}^{i,m} & \cdots & k_{n_{DSSM}}^{i,m} \end{bmatrix}^{T}, \quad i \in [1, L], \; m = \{1, 2\} \tag{8}$$
Then the DSSM $S^i$ of $X_i$ can be defined as:
$$S^{i} = \begin{bmatrix} s_{1}^{i} & s_{2}^{i} \end{bmatrix} = \begin{bmatrix} a_{1,1}^{i,1} \cdots a_{n_{DSSM},n_{DSSM}}^{i,1} \; b_{1}^{i,1} \cdots b_{n_{DSSM}}^{i,1} \; k_{1}^{i,1} \cdots k_{n_{DSSM}}^{i,1} & \; a_{1,1}^{i,2} \cdots a_{n_{DSSM},n_{DSSM}}^{i,2} \; b_{1}^{i,2} \cdots b_{n_{DSSM}}^{i,2} \; k_{1}^{i,2} \cdots k_{n_{DSSM}}^{i,2} \end{bmatrix} \tag{9}$$
Thus, the parameters extracted from the DSSM, called the DSSM features (DSSMFs), are defined as $DSSMFs = \begin{bmatrix} s_{1}^{i} & s_{2}^{i} \end{bmatrix}$.
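Subspace identification is provided, e.g., by MATLAB's n4sid [26]; the sketch below therefore treats `estimate_n4sid` as a hypothetical helper returning the $(\hat{A}, \hat{B}, \hat{K})$ triple of Equation (8), and simply flattens the two first-level models into the DSSMFs of Equation (9).

```python
import numpy as np

def dssm_features(first_level_subbands, order=6):
    """DSSMFs of one epoch: flattened (A, B, K) parameters of the state space
    model fitted to each of the two first-level coefficient sets.

    estimate_n4sid is a stand-in for a subspace identification routine such
    as N4SID [26]; it is assumed here, not provided.
    """
    parts = []
    for y in first_level_subbands:           # the two sets C_{1,1} and C_{1,2}
        A, B, K = estimate_n4sid(y, order)   # A: (n, n), B: (1, n), K: (n, 1)
        parts.append(np.concatenate([A.ravel(), B.ravel(), K.ravel()]))
    return np.concatenate(parts)             # length 2*(n**2 + 2n), i.e. 96 for n = 6
```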

2.3.5. IMBEFs Construction

From the previously calculated LEFs $E_{l_{LE},m}^{i}$ and the parameters $S^i$ of the DSSM, the IMBEFs of epoch $X_i$ are given by
$$F_{DSSM}^{i} = \begin{bmatrix} E_{l_{LE},1}^{i} & \cdots & E_{l_{LE},2^{l_{LE}}}^{i} & s_{1}^{i} & s_{2}^{i} \end{bmatrix} \tag{10}$$
The feature dimension can be calculated by Equation (11):
$$Dim_{DSSM} = 2^{l_{LE}} + 2\,(n_{DSSM}^{2} + 2 \times n_{DSSM}) \tag{11}$$
More generally, the features extracted from the LE and multiple state space models (MSSM), estimated from the $l_{MSSM}$-th level WPD coefficients, can be written as Equation (12):
$$F_{MSSM}^{i} = \begin{bmatrix} E_{l_{LE},1}^{i} & \cdots & E_{l_{LE},2^{l_{LE}}}^{i} & s_{1}^{i} & \cdots & s_{2^{l_{MSSM}}}^{i} \end{bmatrix} \tag{12}$$
The dimension of $F_{MSSM}^{i}$ can be calculated by
$$Dim_{MSSM} = 2^{l_{LE}} + 2^{l_{MSSM}}\,(n_{MSSM}^{2} + 2 \times n_{MSSM}) \tag{13}$$
where $n_{MSSM}$ is the order of the MSSM, which usually ranges from 5 to 10. Assuming $n_{MSSM} = 5$, Equation (13) gives $Dim_{MSSM} = 2^{l_{LE}} + 2^{l_{MSSM}} \times 35$, so if $l_{MSSM} > 2$, $Dim_{MSSM}$ becomes too large. Therefore, $l_{MSSM}$ is set to 1 in this paper, which means there are two state space models.
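Putting the two feature groups together gives the IMBEF vector of Equation (10); the sketch below continues the hypothetical helpers above, and the assertion checks Equation (11): for $l_{LE} = 5$ and $n_{DSSM} = 6$ the dimension is $2^5 + 2(6^2 + 2 \times 6) = 128$.

```python
import numpy as np

def imbef(epoch, w_le="db4", l_le=5, w_dssm="db1", n_dssm=6):
    """IMBEFs of one epoch (Eq. (10)): LEFs from the level-l_LE WPD
    concatenated with DSSMFs from the first-level WPD."""
    lefs = locality_energy_features(wpd_coefficients(epoch, w_le, l_le))
    dssmfs = dssm_features(wpd_coefficients(epoch, w_dssm, 1), n_dssm)
    return np.concatenate([lefs, dssmfs])

# Dimension check against Eq. (11):
assert 2**5 + 2 * (6**2 + 2 * 6) == 128
```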

3. Experiments and Results

This section contains four experimental parts. The first is the selection of a suitable classifier among several candidates. The second is the determination of the most suitable wavelet basis $\omega_{DSSM}$ and model order $n_{DSSM}$ for DSSM estimation. The third is the selection of the wavelet basis $\omega_{LE}$ and the level $l_{LE}$ for the LE calculation. Finally, test experiments are conducted on the S-EDF and ISRUC3 databases with the $\omega_{DSSM}$, $\omega_{LE}$, $n_{DSSM}$ and $l_{LE}$ determined in the previous experiments.
In the process of selecting these parameters, the DRMS database was adopted for testing under both the R&K and AASM standards. There are several conventional verification strategies, including k-fold cross-validation, leave-one-subject-out cross-validation (LOOCV) and cross-dataset validation. In this paper, several commonly used databases are adopted to verify the performance of the algorithm, of which the S-EDF and DRMS databases contain many subjects. However, some subjects in these databases have unevenly distributed samples, i.e., incomplete sleep stages. Consequently, 10-fold cross-validation is more suitable for the performance verification in this research. In 10-fold cross-validation, the original sample is randomly partitioned into 10 equally sized subsamples. Of the 10 subsamples, a single subsample is retained as the validation data for testing the model and the remaining nine subsamples are used as training data. The cross-validation process is then repeated 10 times, with each of the 10 subsamples used exactly once as the validation data. The 10 results from the folds are then averaged to produce a single estimate. The advantage of this method is that all observations are used for both training and validation and each observation is used for validation exactly once. The accuracy (ACC) and Cohen's kappa coefficient (Kappa) are computed to evaluate the overall classification performance.
$$ACC = \frac{TP + TN}{TP + TN + FP + FN} \times 100\% \tag{14}$$
$$Kappa = \frac{ACC - p_e}{1 - p_e} \tag{15}$$
where TP, TN, FP and FN represent the numbers of true positive, true negative, false positive and false negative examples, respectively, and $p_e$ is the hypothetical probability of agreement by chance.
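A sketch of this evaluation protocol using scikit-learn (our choice of toolkit; the original experiments were run with MATLAB tools), with a bagging ensemble of decision trees standing in for the Bagged Trees classifier:

```python
import numpy as np
from sklearn.ensemble import BaggingClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import StratifiedKFold
from sklearn.metrics import accuracy_score, cohen_kappa_score

def evaluate_10fold(clf, features, labels, seed=0):
    """10-fold cross-validation; returns mean ACC and mean Cohen's kappa."""
    accs, kappas = [], []
    folds = StratifiedKFold(n_splits=10, shuffle=True, random_state=seed)
    for train_idx, test_idx in folds.split(features, labels):
        clf.fit(features[train_idx], labels[train_idx])
        pred = clf.predict(features[test_idx])
        accs.append(accuracy_score(labels[test_idx], pred))
        kappas.append(cohen_kappa_score(labels[test_idx], pred))
    return float(np.mean(accs)), float(np.mean(kappas))

# A bagging ensemble of decision trees as a stand-in for Bagged Trees:
# acc, kappa = evaluate_10fold(
#     BaggingClassifier(DecisionTreeClassifier(), n_estimators=30), X, y)
```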

3.1. Classifier Comparison and Selection

In this section, an algorithm is designed to search for the best classifier for the proposed method. The detailed steps are shown in Algorithm 1 below. In this algorithm, according to the distribution and characteristics of the samples, the candidate classifiers include Linear Discriminant, Quadratic Discriminant, Quadratic SVM, Fine KNN, Bagged Trees and RUSBoosted Trees. The candidate wavelet bases include db1, db2, db3, db4, db5, db6, db8, db16, db32, sym2, sym8, sym16, coif1, coif3 and dmey. The order of the DSSM ranges from 5 to 10. Here only the DSSMFs are used for training and validation.
Table 4 shows the experimental results of Algorithm 1. As can be seen from Table 4, Bagged Trees is the optimal classifier for the classification of two to six classes, and the corresponding order of the DSSM is 6. In addition, in the two-class classification the optimal wavelet basis is sym2, whereas for the others it is db1. Furthermore, the comparison of different classifiers in two-class classification under the condition $n_{DSSM} = 6$ is listed in Table 5. It can be seen from Table 5 that the accuracy of sym2 is 95.79%, slightly higher than the 95.71% of db1 and 95.72% of db2. Therefore, considering the results in Table 4 and Table 5, Bagged Trees was used as the classifier for the subsequent experiments.
Algorithm 1: Search the Optimal Classifier.
(Algorithm 1 is presented as a pseudocode figure in the original article.)
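The sketch below is our paraphrase of the exhaustive search in Algorithm 1, reusing the hypothetical feature helpers above and an assumed `make_classifier` factory that builds each candidate model by name.

```python
import numpy as np

CLASSIFIERS = ["Linear Discriminant", "Quadratic Discriminant", "Quadratic SVM",
               "Fine KNN", "Bagged Trees", "RUSBoosted Trees"]
WAVELETS = ["db1", "db2", "db3", "db4", "db5", "db6", "db8", "db16", "db32",
            "sym2", "sym8", "sym16", "coif1", "coif3", "dmey"]
ORDERS = range(5, 11)  # candidate DSSM orders 5..10

def search_optimal_classifier(epochs, labels):
    """Exhaustive search over (classifier, wavelet basis, DSSM order),
    scoring each combination by 10-fold cross-validated accuracy on DSSMFs only."""
    best_combo, best_acc = None, -1.0
    for wavelet in WAVELETS:
        for order in ORDERS:
            # DSSMFs from the two first-level WPD coefficient sets of each epoch
            feats = np.array([dssm_features(wpd_coefficients(e, wavelet, 1), order)
                              for e in epochs])
            for name in CLASSIFIERS:
                clf = make_classifier(name)  # assumed factory, not shown here
                acc, _ = evaluate_10fold(clf, feats, labels)
                if acc > best_acc:
                    best_combo, best_acc = (name, wavelet, order), acc
    return best_combo, best_acc
```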

3.2. Wavelet Basis Comparison and Selection

After the classifier is determined, the model order $n_{DSSM}$ and wavelet basis $\omega_{DSSM}$ should be further confirmed through a grid search. This process is shown in Step 1 of Figure 2. The candidate wavelets include db1, db2, db3, db4, db5, db6, db8, db16, db32, sym2, sym8, sym16, coif1, coif3 and dmey. The candidate model orders are 5 to 10. Tables 6–14 below give the experimental results on the DRMS database without LEFs.
From Tables 6–10 we can see that under the R&K standard, when the order of the DSSM is 6 and the wavelet basis is db1, the classification accuracy for three to six classes reaches its highest values. When the wavelet basis is sym2, the two-class accuracy is the highest. Further analysis shows that in the two-class results the difference between the accuracy of db1 and the highest value is very small.
As can be seen from Tables 11–14, when the order $n_{DSSM}$ is 6, the highest classification accuracy is obtained for two- to five-class sleep state classification. In the three- to five-class classifications, the highest accuracy is achieved when the wavelet basis is db1. In the two-class classification, the accuracy with db1 is 0.14% lower than the highest accuracy. Combining the classification results of the above tables, in order to facilitate the subsequent calculations, db1 was uniformly used as the wavelet basis for DSSM estimation and the model order of the DSSM was set to 6.
Then, the wavelet basis $\omega_{LE}$ and level $l_{LE}$ required to calculate the LE were determined from the experimental results of the next step. That is, on the basis of the features previously extracted from the DSSM, LEFs were added, as shown in Step 2 of Figure 2. Tables 15–19 give the classification accuracies for 2–6 classes under the R&K standard.
As can be seen from Tables 15–19, when $l_{LE} = 5$ and $\omega_{LE}$ is db4, the accuracies for two, four and six classes are the highest. When $\omega_{LE}$ is set to db5 and db3, the classification accuracies for three and five classes reach their highest values, respectively. Table 20 shows the confusion matrix of six-class sleep state classification on the DRMS database with IMBEFs under the R&K standard. As shown in Table 20, the sensitivities of Awa, REM, S1, S2, S3 and S4 are 93.68%, 81.16%, 14.37%, 89.29%, 25.71% and 77.99%, respectively, and the overall accuracy of the six-class classification is 78.92%.
Tables 21–24 show the classification accuracies for 2–5 classes with LEFs on the DRMS database under the AASM standard. As can be seen from these tables, after adding LEFs the accuracy of each classification improves greatly. The highest accuracies are obtained when using the LEFs extracted from the 5-level WPD, with three corresponding wavelet bases: db1, db2 and db4. When the wavelet basis is db4, the accuracies for two and four classes reach their highest values. In addition, the accuracies for three and five classes are 88.22% and 79.90%, respectively, which differ little from the corresponding highest accuracies of 88.26% and 79.97%. Therefore, $l_{LE}$ is set to 5 and $\omega_{LE}$ to db4 in this paper.
The confusion matrix of the five-class sleep state classification is listed in Table 25. As can be seen from this table, the overall accuracy is 79.90% and the sensitivities of Awa, REM, N1, N2 and N3 are 92.89%, 81.22%, 17.57%, 85.52% and 78.79%, respectively. Furthermore, the receiver operating characteristic (ROC) curve of the classifier trained on this dataset with the confirmed parameters is shown in Figure 3.
As can be seen in Figure 3, when the positive class is Awa, the true positive rate is 0.93 and the false positive rate is 0.05. When the positive classes are REM, N2 and N3, the corresponding true positive rates are 0.81, 0.86 and 0.79. When the positive class is N1, the area under the curve (AUC) is only 0.18. The issue of the low classification accuracy of S1 (N1) is discussed in Section 4.

3.3. Experiments on S-EDF and ISRUC3 Database

After the experiments on the DRMS database, through comprehensive comparison and selection, the classifier was chosen as Bagged Trees, $n_{DSSM}$ was set to 6, $l_{LE}$ to 5, $\omega_{DSSM}$ to db1 and $\omega_{LE}$ to db4. In order to further evaluate the performance of the method proposed in this paper, these parameters were used to conduct experiments on the S-EDF and ISRUC3 databases.
The classification accuracies and Cohen's kappa coefficients for the 2–6 classes on the S-EDF database are shown in Table 26. Furthermore, the confusion matrix of the six-class classification is listed for further analysis in Table 27.
Similarly, the method proposed in this paper was also tested on the ISRUC3 database. The experimental results are shown in Tables 28 and 29.
As can be seen from Table 28, the classification accuracies for two to five classes are 96.18%, 90.54%, 84.68% and 81.65%, respectively. In the five-class classification, the sensitivities of Awa, REM, N1, N2 and N3 are 90.31%, 83.36%, 57.70%, 81.12% and 87.50%, respectively.

4. Discussion

Table 30 compares the two- to six-class classification accuracies of various published methods and the method proposed in this paper on the DRMS database under the R&K standard.
As can be seen from Table 30, when only the DSSMFs are used, the method proposed in this paper already shows some improvement in accuracy over the others. After adding LEFs to the DSSMFs, the classification accuracies for two to six classes improve by 1.27%, 1.02%, 1.27%, 1.38% and 0.72% compared with our previous study [27].
It can be seen from Table 31 that, compared with the existing methods, the method proposed in this paper shows some improvement in the 3–5-class sleep stage classification on the DRMS database. The N1 sensitivity of this method on the DRMS database is 17.57%, which is higher than the 14.3% of Ghimatgar et al. [7]. Table 32 compares the accuracy of various published methods on the S-EDF database.
It can be seen from Table 32 that even when a large number of samples is used, the accuracy improves over other published methods. The accuracy for the four-class classification is 93.87%, while Sharma et al. [28] achieved 92.1% and Shen et al. [27] achieved 93.0%. In the two-class classification, Abdulla et al. [6] have the highest accuracy of 93%; however, they used only 23,806 epochs. The sensitivity of S1 in this paper is 19.32%, higher than the 18.3% of Ghimatgar et al. [7] and the 15.9% of Shen et al. [27].
The experimental results of the proposed method on the ISRUC3 database are also compared with other methods in Table 33.
As can be seen from Table 33, compared with Ghimatgar et al. [7], the detection accuracies for two and three classes improve by more than 2 percentage points. The sensitivity of S1 in Table 29 is 57.70%, which is higher than the 33% of Ghimatgar et al. [7]. Furthermore, the Cohen's kappa coefficient is also much higher than that of Ghimatgar et al. [7].
It should be noted that the classification of S1 is an enormous challenge for all published methods. From a neurophysiological standpoint, S1 (N1) is a transition phase, a mixture of wakefulness and sleep, so its neural oscillations are similar to those of Awa. In the REM state, the cortex shows 40–60 Hz gamma waves as it does in waking. Thus, the S1 state is often misclassified as REM or Awa even during visual inspection by experts [3,11]. This is why many of the S1 epochs are misclassified as REM, Awa or S2 in this work. In addition, the classification accuracy of S1 (N1) differs across databases. The detection accuracy of N1 on the ISRUC3 database reached 57.7%; on the DRMS and S-EDF databases, however, it is less than 20%. This is also related to the different proportions of S1 stages in each database: under the same AASM standard, S1 accounts for 12.65% of the ISRUC3 database but only 7.3% of the DRMS database. Furthermore, under the R&K standard, the sensitivity of S3 on the S-EDF and DRMS databases is low, only 46.11% and 25.71%, respectively. The main reason is that S3 is a transition phase between S2 and S4, so further research should be conducted to improve the S3 detection accuracy. As can be seen in Table 20, a large number of S3 epochs are misclassified as S2 and another large part as S4. Similarly, in Table 27, almost half of the S3 epochs are misclassified as S2 and a small part as S4. When the S3 and S4 stages are combined into N3 under the AASM standard, the sensitivity of N3 improves: as shown in Table 25, only 761 epochs of N3 were misclassified as N2, whereas in Table 20, 1022 epochs of S3 and 231 epochs of S4 were misclassified as S2. Therefore, the AASM standard is more suitable than the R&K standard for guiding researchers in annotating sleep stages.

5. Conclusions

In this paper, a novel IMBEF based automatic sleep stage classification method is proposed. A grid search strategy is presented to determine a suitable model order $n_{DSSM}$ and wavelet basis $\omega_{DSSM}$ for estimating the DSSM among 15 candidate wavelets and 6 candidate model orders. With the same search strategy, a proper wavelet basis $\omega_{LE}$ and WPD level $l_{LE}$ for the LE calculation are determined among 15 candidate wavelets and multiple decomposition levels. The fused IMBEFs extracted from the DSSM and LE are used as the input features of a suitable classifier, which is selected by comparing the experimental results of a variety of classifiers. In order to verify the performance of the proposed IMBEF based automatic sleep stage classification method precisely, experiments were carried out on three public databases. The comparison with other state-of-the-art methods shows that the proposed algorithm achieves higher accuracy.
The measurable improvements in automatic sleep stage classification demonstrated in this paper support a better understanding and diagnosis of sleep, which is clearly essential in medical, wellness and other fields.

Author Contributions

Conceptualization, H.S. and A.L.; methodology, H.S., A.G. (Allon Guez) and A.L.; software, H.S.; validation, H.S., A.G. (Allon Guez) and A.G. (Aiying Guo); formal analysis, H.S., A.L.; investigation, A.G. (Aiying Guo); resources, F.R.; data curation, A.L.; writing–original draft preparation, H.S. and A.L.; writing–review and editing, H.S., A.G. (Allon Guez) and A.L.; visualization, M.X.; supervision, M.X.; project administration, M.X.; funding acquisition, F.R. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Natural Science Foundation of China grant number 61674100.

Acknowledgments

We acknowledge the support provided by the Microelectronics Research and Development Center of Shanghai University.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Rechtschaffen, A.; Kales, A. A Manual of Standardized Terminology, Techniques and Scoring Systems for Sleep Stages of Human Subjects; National Government Publication: Los Angeles, CA, USA, 1968.
  2. Iber, C.; Ancoliisrael, S.; Chesson, A.; Quan, S.F. The AASM Manual for The Scoring of Sleep and Associated Events: Rules, Terminology and Technical Specifications; American Academy of Sleep Medicine: Darien, IL, USA, 2007. [Google Scholar]
  3. Hassan, A.R.; Bhuiyan, M.I.H. A decision support system for automatic sleep staging from EEG signals using tunable Q-factor wavelet transform and spectral features. J. Neurosci. Methods 2016, 271, 107–118. [Google Scholar] [CrossRef] [PubMed]
  4. Diykh, M.; Li, Y.; Abdulla, S. EEG sleep stages identification based on weighted undirected complex networks. Comput Methods Programs Biomed. 2020, 184, 105–116. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  5. Kang, D.Y.; Deyoung, P.N.; Malhotra, A.; Owens, R.L.; Coleman, T.P. A State Space and Density Estimation Framework for Sleep Staging in Obstructive Sleep Apnea. IEEE Trans. Biomed. Eng. 2017, 65, 1201–1212. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  6. Abdulla, S.; Diykh, M.; Laft, R.L.; Saleh, K.; Deo, R.C. Sleep EEG Signal Analysis Based on Correlation Graph Similarity Coupled with an Ensemble Extreme Machine Learning Algorithm. Expert Syst. Appl. 2019, 138, 112790–112804. [Google Scholar] [CrossRef]
  7. Ghimatgar, H.; Kazemi, K.; Helfroush, M.S.; Aarabi, A. An automatic single-channel EEG-based sleep stage scoring method based on hidden Markov model. J. Neurosci. Methods 2019, 324, 180320–180336. [Google Scholar] [CrossRef]
  8. Taran, S.; Sharma, P.C.; Bajaj, V. Automatic sleep stage classification using optimize flexible analytic wavelet transform. Knowl. Based Syst. 2020, 192, 105367–105374. [Google Scholar] [CrossRef]
  9. Sharma, M.; Patel, S.; Choudhary, S.; Acharya, U.R. Automated Detection of Sleep Stages Using Energy-Localized Orthogonal Wavelet Filter Banks. Arab. J. Sci. Eng. 2020, 45, 2531–2544. [Google Scholar] [CrossRef]
  10. Hassan, A.R.; Bhuiyan, M.I.H. Automatic sleep scoring using statistical features in the EMD domain and ensemble methods. Biocybern Biomed. Eng. 2016, 36, 248–255. [Google Scholar] [CrossRef]
  11. Hassan, A.R.; Bhuiyan, M.I.H. Automated identification of sleep states from EEG signals by means of ensemble empirical mode decomposition and random under sampling boosting. Comput. Methods Programs Biomed. 2017, 140, 201–210. [Google Scholar] [CrossRef]
  12. Sharma, R.; Pachori, R.B.; Upadhyay, A. Automatic sleep stage classification based on iterative filtering of electroencephalogram signals. Neural. Comput. Appl. 2017, 28, 2959–2978. [Google Scholar] [CrossRef]
  13. Zhang, J.; Yao, R.; Ge, W.; Gao, J. Orthogonal convolutional neural networks for automatic sleep stage classification based on single-channel EEG. Comput. Methods Programs Biomed. 2020, 183, 105089–105100. [Google Scholar] [CrossRef] [PubMed]
  14. Zhang, X.; Xu, M.; Li, Y.; Su, M.; Xu, Z.; Wang, C.; Kang, D. Automated multi-model deep neural network for sleep stage scoring with unfiltered clinical data. Sleep Breath 2020, 24, 581–590. [Google Scholar] [CrossRef] [Green Version]
  15. Mousavi, Z.; Rezaii, T.Y.; Sheykhivand, S.; Farzamnia, A.; Razavi, S.N. Deep convolutional neural network for classification of sleep stages from single-channel EEG signals. J. Neurosci. Methods 2019, 324, 108312–108320. [Google Scholar] [CrossRef] [PubMed]
  16. Korkalainen, H.; Aakko, J.; Nikkonen, S.; Kainulainen, S.; Leino, A.; Duce, B.; Leppänen, T. Accurate Deep Learning-Based Sleep Staging in a Clinical Population with Suspected Obstructive Sleep Apnea. IEEE J. Biomed. Health Inform. 2019, 27, 2073–2081. [Google Scholar] [CrossRef] [PubMed]
  17. Michielli, N.; Acharya, U.R.; Molinari, F. Cascaded LSTM recurrent neural network for automated sleep stage classification using single-channel EEG signals. Comput. Biol. Med. 2019, 106, 71–81. [Google Scholar] [CrossRef]
  18. Kemp, B.; Zwinderman, A.H.; Tuk, B.; Kamphuisen, H.A.; Oberye, J.J. Analysis of a sleep-dependent neuronal feedback loop: the slow-wave microcontinuity of the EEG. IEEE. Trans. Biomed. Eng. 2000, 47, 1185–1194. [Google Scholar] [CrossRef] [PubMed]
  19. Goldberger, A.L.; Amaral, L.A.; Glass, L.; Hausdorff, J.M.; Ivanov, P.C.; Mark, R.G.; Mietus, J.E.; Moody, G.B.; Peng, C.K.; Stanley, H.E. PhysioBank, PhysioToolkit and PhysioNet: components of a new research resource for complex physiologic signals. Circ. Res. 2000, 101, e215–e220. [Google Scholar] [CrossRef] [Green Version]
  20. Stephanie, D.; Myriam, K.; Stenuit, P.; Kerkhofs, M.; Stanus, E. Cancelling ECG artifacts in EEG using a modified independent component analysis approach. EURASIP J. Adv. Signal Process. 2008, 1, 747325, zenodo. Available online: https://zenodo.org/record/2650142#.Xztj3igzabh (accessed on 1 January 2009).
  21. Khalighi, S.; Sousa, T.; Santos, J.M.; Nunes, U. ISRUC-Sleep: A comprehensive public dataset for sleep researchers. Comput. Methods Programs Biomed. 2016, 124, 180–192. [Google Scholar] [CrossRef]
  22. Delorme, A.; Makeig, S. EEGLAB: An open source toolbox for analysis of single-trial EEG dynamics including independent component analysis. J. Neurosci. Methods 2004, 134, 9–21. [Google Scholar] [CrossRef] [Green Version]
  23. Zhang, Y.; Liu, B.; Ji, X.; Huang, D. Classification of EEG Signals Based on Autoregressive Model and Wavelet Packet Decomposition. Neural. Process Lett. 2017, 45, 365–378. [Google Scholar] [CrossRef]
  24. Law, L.S.; Kim, J.H.; Liew, W.Y.; Lee, S.K. An approach based on wavelet packet decomposition and Hilbert–Huang transform (WPD–HHT) for spindle bearings condition monitoring. Mech. Syst. Signal Process 2012, 33, 197–211. [Google Scholar] [CrossRef]
  25. Cao, Y.; Sun, Y.; Xie, G.; Wen, T. Fault Diagnosis of Train Plug Door Based on a Hybrid Criterion for IMFs Selection and Fractional Wavelet Package Energy Entropy. IEEE Trans. Veh. Technol. 2019, 68, 7544–7551. [Google Scholar] [CrossRef]
  26. Van, O.P.; De, M.B. N4SID: Numerical Algorithms for State Space Subspace System Identification. In Associated Technologies and Recent Developments, Proceedings of the 12th Triennal World Congress of the International Federation of Automatic Control, Sydney, Australia, 18–23 July 1993; Elsevier: London, UK, 1993; Volume 26, pp. 55–58. [Google Scholar]
  27. Shen, H.; Xu, M.; Guez, A.; Li, A.; Ran, F. An accurate sleep stage classification method based on state space model. IEEE Access 2019, 7, 125268–125279. [Google Scholar] [CrossRef]
  28. Sharma, M.; Goyal, D.; Achuth, P.V.; Acharya, U.R. An accurate sleep stage classification system using a new class of optimally time-frequency localized three-band wavelet filter bank. Comput. Biol. Med. 2018, 98, 58–75. [Google Scholar] [CrossRef]
  29. Liang, S.F.; Kuo, C.E.; Hu, Y.H.; Pan, Y.H.; Wang, Y.H. Automatic stage scoring of single-channel sleep EEG by using multiscale entropy and autoregressive models. IEEE Trans. Instrum. Meas. 2012, 61, 1649–1657. [Google Scholar] [CrossRef]
  30. Hsu, Y.L.; Yang, Y.T.; Wang, J.S.; Hsu, C.Y. Automatic sleep stage recurrent neural classifier using energy features of EEG signals. Neurocomputing 2013, 104, 105–114. [Google Scholar] [CrossRef]
  31. Hassan, A.R.; Subasi, A. A decision support system for automated identification of sleep stages from single-channel EEG signals. Knowl. Based Syst. 2017, 128, 115–124. [Google Scholar] [CrossRef]
  32. Zhu, G.; Li, Y.; Wen, P. Analysis and classification of sleep stages based on difference visibility graphs from a single-channel EEG signal. IEEE J. Biomed. Health Inform. 2014, 18, 1813–1821. [Google Scholar] [CrossRef]
  33. Jiang, D.; Lu, Y.N.; Yu, M.A.; Wang, Y. Robust sleep stage classification with single-channel EEG signals using multimodal decomposition and HMM-based refinement. Expert Syst. Appl. 2019, 121, 188–203. [Google Scholar] [CrossRef]
  34. Rahman, M.M.; Bhuiyan, M.I.H.; Hassan, A.R. Sleep stage classification using single-channel EOG. Comput. Biol. Med. 2018, 10, 211–220. [Google Scholar] [CrossRef] [PubMed]
  35. Supratak, A.; Dong, H.; Wu, C.; Guo, Y. A model for automatic sleep stage scoring based on raw single-channel EEG. IEEE T. Neur. Sys. Reh. 2017, 25, 1998–2008. [Google Scholar] [CrossRef] [PubMed] [Green Version]
Figure 1. A schematic outline of the proposed improved model based essence features (IMBEFs) based sleep stage classification algorithm.
Figure 2. The diagram of the parameter optimization process.
Figure 3. The ROC curve of the classifier for the five-class classification of the DRMS database under the AASM standard.
Table 1. The class description considered in this work under the Rechtschaffen and Kales (R&K) standard.

| Classes | 6 Classes | 5 Classes | 4 Classes | 3 Classes | 2 Classes |
| Stages | Awa vs. REM vs. S1 vs. S2 vs. S3 vs. S4 | Awa vs. REM vs. S1 vs. S2 vs. {S3, S4} | Awa vs. REM vs. {S1, S2} vs. {S3, S4} | Awa vs. REM vs. NREM (S1, S2, S3, S4) | Awa vs. Asleep (REM, S1, S2, S3, S4) |
Table 2. The class description considered in this work under the American Academy of Sleep Medicine (AASM) standard.

| Classes | 5 Classes | 4 Classes | 3 Classes | 2 Classes |
| Stages | Awa vs. REM vs. N1 vs. N2 vs. N3 (S3, S4) | Awa vs. REM vs. {N1, N2} vs. N3 | Awa vs. REM vs. NREM (N1, N2, N3) | Awa vs. Asleep (REM, N1, N2, N3) |
Table 3. The specification of the electroencephalograph (EEG) databases included in this study.

| Scoring manual | R&K criteria | R&K criteria | AASM criteria | AASM criteria |
| Dataset name | S-EDF | DRMS | DRMS | ISRUC3 |
| Epoch length (s) | 30 | 20 | 30 | 20 |
| Number of subjects | 26 | 20 | 20 | 10 |
| Recording files | 34 | 20 | 20 | 10 |
| Age | 25–96 | 20–65 | 20–65 | 30–58 |
| Gender (male–female) | 17–17 | 4–16 | 4–16 | 9–1 |
| Sampling frequency (Hz) | 100 | 200 | 200 | 200 |
| EEG channel | Pz–Oz | Cz–A1 | Cz–A1 | C3–A2 |
| Awa epochs | 73,835 | 5601 | 3559 | 1702 |
| REM epochs | 6744 | 4555 | 3019 | 1238 |
| S1 (N1) epochs | 3017 | 1788 | 1480 | 1123 |
| S2 (N2) epochs | 17,249 | 13,274 | 8251 | 2850 |
| S3 (N3) epochs | 2288 | 2112 | 3956 | 1976 |
| S4 epochs | 1510 | 3071 | – | – |
| Total number of epochs | 104,643 | 30,401 | 20,265 | 8889 |
Table 4. The outputs of Algorithm 1.

| Classes | Optimal classifier | $\omega_{DSSM}$ | $n_{DSSM}$ | Accuracy (%) |
| 2 | Bagged Trees | sym2 | 6 | 95.79 |
| 3 | Bagged Trees | db1 | 6 | 88.29 |
| 4 | Bagged Trees | db1 | 6 | 83.07 |
| 5 | Bagged Trees | db1 | 6 | 81.45 |
| 6 | Bagged Trees | db1 | 6 | 78.57 |
Table 5. Comparison of different classifiers in two-class classification with different wavelets ($n_{DSSM} = 6$).

| Accuracy (%) | db1 | db2 | db3 | db4 | db5 | db6 | db8 | db16 | db32 | sym2 | sym8 | sym16 | coif1 | coif3 | dmey |
| Linear Discriminant | 94.21 | 93.94 | 94.33 | 93.93 | 93.52 | 93.17 | 92.53 | 91.72 | 91.66 | 93.90 | 92.35 | 91.68 | 93.98 | 92.59 | 92.24 |
| Quadratic Discriminant | 92.94 | 93.73 | 94.42 | 92.89 | 91.90 | 91.07 | 91.15 | 87.63 | 87.13 | 93.68 | 90.99 | 88.37 | 94.14 | 91.29 | 87.27 |
| Quadratic SVM | 95.13 | 95.01 | 95.16 | 94.88 | 94.62 | 94.43 | 94.14 | 93.71 | 93.47 | 95.09 | 94.13 | 93.59 | 95.12 | 94.25 | 93.68 |
| Fine KNN | 91.96 | 90.96 | 92.62 | 91.05 | 90.06 | 89.82 | 88.70 | 87.60 | 86.58 | 90.84 | 88.43 | 87.00 | 90.61 | 89.52 | 88.22 |
| Bagged Trees | 95.71 | 95.72 | 95.70 | 95.59 | 95.42 | 95.43 | 95.16 | 94.93 | 95.00 | 95.79 | 95.04 | 94.82 | 95.71 | 95.29 | 95.01 |
| RUSBoosted Trees | 94.28 | 94.08 | 94.68 | 94.41 | 94.07 | 93.94 | 93.75 | 93.02 | 93.08 | 94.05 | 93.78 | 92.79 | 94.20 | 93.76 | 93.20 |
Table 6. The accuracy (%) of two-class sleep stage classification with different wavelet bases and different orders of the DSSM under the R&K standard. Only DSSMFs are used, no LEFs.

| $n_{DSSM}$ \ $\omega_{DSSM}$ | db1 | db2 | db3 | db4 | db5 | db6 | db8 | db16 | db32 | sym2 | sym8 | sym16 | coif1 | coif3 | dmey |
| 5 | 95.52 | 95.55 | 95.69 | 95.59 | 95.17 | 95.16 | 95.09 | 94.98 | 94.96 | 95.72 | 94.94 | 94.92 | 95.54 | 95.07 | 95.24 |
| 6 | 95.71 | 95.72 | 95.70 | 95.59 | 95.42 | 95.43 | 95.16 | 94.93 | 95.00 | 95.79 | 95.04 | 94.82 | 95.71 | 95.29 | 95.01 |
| 7 | 95.63 | 95.54 | 95.57 | 95.70 | 95.61 | 95.38 | 95.28 | 94.70 | 94.80 | 95.59 | 95.31 | 94.58 | 95.60 | 95.21 | 94.94 |
| 8 | 95.47 | 95.51 | 95.51 | 95.64 | 95.50 | 95.23 | 95.15 | 94.85 | 94.47 | 95.45 | 95.16 | 94.75 | 95.44 | 95.20 | 94.62 |
| 9 | 95.57 | 95.52 | 95.54 | 95.62 | 95.71 | 95.34 | 95.12 | 94.93 | 94.33 | 95.49 | 95.26 | 94.55 | 95.45 | 95.21 | 94.45 |
| 10 | 95.38 | 95.45 | 95.46 | 95.57 | 95.54 | 95.28 | 95.23 | 94.72 | 94.12 | 95.48 | 95.10 | 94.76 | 95.44 | 95.23 | 94.32 |
Table 7. The accuracy (%) of three-class sleep stage classification with different wavelet bases and different orders of the DSSM under the R&K standard. Only DSSMFs are used, no LEFs.

| $n_{DSSM}$ \ $\omega_{DSSM}$ | db1 | db2 | db3 | db4 | db5 | db6 | db8 | db16 | db32 | sym2 | sym8 | sym16 | coif1 | coif3 | dmey |
| 5 | 87.90 | 87.92 | 87.72 | 88.04 | 86.70 | 86.73 | 86.18 | 85.50 | 85.25 | 87.86 | 86.11 | 85.55 | 87.54 | 86.03 | 85.43 |
| 6 | 88.29 | 88.03 | 88.26 | 88.05 | 87.85 | 87.70 | 86.72 | 85.38 | 85.25 | 87.82 | 86.54 | 85.20 | 87.90 | 87.09 | 85.65 |
| 7 | 87.88 | 87.72 | 88.13 | 87.94 | 87.67 | 87.76 | 87.20 | 84.85 | 84.27 | 87.88 | 87.43 | 85.00 | 87.96 | 87.18 | 84.82 |
| 8 | 87.67 | 87.87 | 88.10 | 87.96 | 87.86 | 87.68 | 87.24 | 85.66 | 84.15 | 87.88 | 87.53 | 85.83 | 87.71 | 87.16 | 84.20 |
| 9 | 87.81 | 87.79 | 88.07 | 87.83 | 87.96 | 87.81 | 87.55 | 85.99 | 83.85 | 87.84 | 87.55 | 86.37 | 87.67 | 87.24 | 83.79 |
| 10 | 87.62 | 87.88 | 87.84 | 88.02 | 87.80 | 87.78 | 87.58 | 86.14 | 83.93 | 87.92 | 87.52 | 86.62 | 87.27 | 87.16 | 83.53 |
Table 8. The accuracy (%) of four-class sleep stage classification with different wavelet bases and different orders of the DSSM under the R&K standard. Only DSSMFs are used, no LEFs.

| $n_{DSSM}$ \ $\omega_{DSSM}$ | db1 | db2 | db3 | db4 | db5 | db6 | db8 | db16 | db32 | sym2 | sym8 | sym16 | coif1 | coif3 | dmey |
| 5 | 82.61 | 82.69 | 82.51 | 82.93 | 81.36 | 81.37 | 80.29 | 79.83 | 79.53 | 82.88 | 80.59 | 79.59 | 82.13 | 80.49 | 79.76 |
| 6 | 83.07 | 82.71 | 82.76 | 82.78 | 82.29 | 82.23 | 81.40 | 79.63 | 79.16 | 82.71 | 81.07 | 79.43 | 82.58 | 81.75 | 79.79 |
| 7 | 82.38 | 82.78 | 82.81 | 82.42 | 82.20 | 82.35 | 81.45 | 78.91 | 78.37 | 82.76 | 81.70 | 79.03 | 82.45 | 81.69 | 78.62 |
| 8 | 82.29 | 82.40 | 82.69 | 82.32 | 82.33 | 82.06 | 81.51 | 79.78 | 78.02 | 82.39 | 81.83 | 79.69 | 82.36 | 81.36 | 78.28 |
| 9 | 82.29 | 82.56 | 82.54 | 82.49 | 82.24 | 82.25 | 81.76 | 80.00 | 77.68 | 82.35 | 81.99 | 80.35 | 82.53 | 81.58 | 78.00 |
| 10 | 82.09 | 82.48 | 82.71 | 82.71 | 82.10 | 82.38 | 81.95 | 79.95 | 77.70 | 82.56 | 81.65 | 80.50 | 81.93 | 81.57 | 77.48 |
Table 9. The accuracy (%) of five-class sleep stage classification with different wavelet bases and different orders of the DSSM under the R&K standard. Only DSSMFs are used, no LEFs.

| $n_{DSSM}$ \ $\omega_{DSSM}$ | db1 | db2 | db3 | db4 | db5 | db6 | db8 | db16 | db32 | sym2 | sym8 | sym16 | coif1 | coif3 | dmey |
| 5 | 81.14 | 81.15 | 80.66 | 81.17 | 79.79 | 79.83 | 79.04 | 78.55 | 78.30 | 81.20 | 79.27 | 78.41 | 80.51 | 79.30 | 78.72 |
| 6 | 81.45 | 81.38 | 81.42 | 81.17 | 81.00 | 80.60 | 79.92 | 78.38 | 78.13 | 81.33 | 79.66 | 78.15 | 81.19 | 80.23 | 78.71 |
| 7 | 80.80 | 80.87 | 81.00 | 80.88 | 80.63 | 80.48 | 79.87 | 77.75 | 77.08 | 80.91 | 80.27 | 77.53 | 80.87 | 80.07 | 77.61 |
| 8 | 80.95 | 81.07 | 81.08 | 80.92 | 80.62 | 80.52 | 80.13 | 78.13 | 76.89 | 81.00 | 80.15 | 78.48 | 80.85 | 80.16 | 77.20 |
| 9 | 80.95 | 80.90 | 80.75 | 80.88 | 80.67 | 80.56 | 80.27 | 78.73 | 76.72 | 80.86 | 79.99 | 78.93 | 80.64 | 79.97 | 76.78 |
| 10 | 80.64 | 80.86 | 80.97 | 80.94 | 80.71 | 80.72 | 80.29 | 78.42 | 76.62 | 80.93 | 80.22 | 78.94 | 80.47 | 79.90 | 76.12 |
Table 10. The accuracy (%) of six-class sleep stage classification with different wavelet bases and different orders of the DSSM under the R&K standard. Only DSSMFs are used, no LEFs.

| $n_{DSSM}$ \ $\omega_{DSSM}$ | db1 | db2 | db3 | db4 | db5 | db6 | db8 | db16 | db32 | sym2 | sym8 | sym16 | coif1 | coif3 | dmey |
| 5 | 78.01 | 77.96 | 77.56 | 77.94 | 76.47 | 76.46 | 75.44 | 74.99 | 74.58 | 77.67 | 75.80 | 75.02 | 77.17 | 75.58 | 74.99 |
| 6 | 78.57 | 78.20 | 78.26 | 78.16 | 77.75 | 77.56 | 76.88 | 75.41 | 75.07 | 78.40 | 76.43 | 75.02 | 78.38 | 77.02 | 75.34 |
| 7 | 77.31 | 77.39 | 77.52 | 77.38 | 76.89 | 77.00 | 76.32 | 74.11 | 73.63 | 77.39 | 76.54 | 74.22 | 77.03 | 76.77 | 74.09 |
| 8 | 77.36 | 77.43 | 77.47 | 77.40 | 76.93 | 76.84 | 76.15 | 74.76 | 73.07 | 77.58 | 76.55 | 74.75 | 77.18 | 76.21 | 73.54 |
| 9 | 77.25 | 77.38 | 77.56 | 77.44 | 76.94 | 76.89 | 76.39 | 74.91 | 73.01 | 77.22 | 76.79 | 75.21 | 77.07 | 76.34 | 73.10 |
| 10 | 76.76 | 77.31 | 77.66 | 77.59 | 76.99 | 76.89 | 76.64 | 74.64 | 72.88 | 77.24 | 76.34 | 75.22 | 77.08 | 76.45 | 72.47 |
Table 11. The accuracy (%) of two-class sleep stage classification with different wavelet bases and different orders of the DSSM under the AASM standard. Only DSSMFs are used, no LEFs.

| $n_{DSSM}$ \ $\omega_{DSSM}$ | db1 | db2 | db3 | db4 | db5 | db6 | db8 | db16 | db32 | sym2 | sym8 | sym16 | coif1 | coif3 | dmey |
| 5 | 95.09 | 95.01 | 95.20 | 94.99 | 94.63 | 94.53 | 94.63 | 94.67 | 94.44 | 95.10 | 94.40 | 94.54 | 94.91 | 94.54 | 94.56 |
| 6 | 95.59 | 95.63 | 95.68 | 95.55 | 95.42 | 95.29 | 95.10 | 94.75 | 94.79 | 95.73 | 95.02 | 94.79 | 95.72 | 95.12 | 94.90 |
| 7 | 95.19 | 95.21 | 95.50 | 95.30 | 95.11 | 94.89 | 94.91 | 94.33 | 94.26 | 95.13 | 94.99 | 94.35 | 95.18 | 95.00 | 94.30 |
| 8 | 94.82 | 95.14 | 95.09 | 95.31 | 94.90 | 94.73 | 94.71 | 94.25 | 93.83 | 95.17 | 94.62 | 94.16 | 94.99 | 94.74 | 94.03 |
| 9 | 95.16 | 95.28 | 95.28 | 95.10 | 95.26 | 95.04 | 94.96 | 94.16 | 93.79 | 95.22 | 94.68 | 94.25 | 95.11 | 94.81 | 93.97 |
| 10 | 95.07 | 95.20 | 95.18 | 95.40 | 95.09 | 94.98 | 94.95 | 94.24 | 93.41 | 95.10 | 94.72 | 94.20 | 95.17 | 95.09 | 93.46 |
Table 12. The accuracy (%) of three-class sleep stage classification with different wavelet bases and different orders of the DSSM under the AASM standard. Only DSSMFs are used, no LEFs.

| $n_{DSSM}$ \ $\omega_{DSSM}$ | db1 | db2 | db3 | db4 | db5 | db6 | db8 | db16 | db32 | sym2 | sym8 | sym16 | coif1 | coif3 | dmey |
| 5 | 87.12 | 87.04 | 86.90 | 87.17 | 85.81 | 85.84 | 85.13 | 84.83 | 84.69 | 86.97 | 84.81 | 84.82 | 86.45 | 85.45 | 84.85 |
| 6 | 87.52 | 87.35 | 87.55 | 87.63 | 87.12 | 86.88 | 86.16 | 84.49 | 84.61 | 87.26 | 85.87 | 84.48 | 87.21 | 86.46 | 84.85 |
| 7 | 87.62 | 87.32 | 87.83 | 87.49 | 87.15 | 87.30 | 86.58 | 84.55 | 83.95 | 87.49 | 87.00 | 84.38 | 87.44 | 86.85 | 84.18 |
| 8 | 86.92 | 87.56 | 87.59 | 87.44 | 87.15 | 87.26 | 86.73 | 85.55 | 83.23 | 87.20 | 86.77 | 85.10 | 87.14 | 86.67 | 83.53 |
| 9 | 87.52 | 87.28 | 87.68 | 87.43 | 87.34 | 87.43 | 87.16 | 85.56 | 83.36 | 87.55 | 86.95 | 85.74 | 87.20 | 86.86 | 83.07 |
| 10 | 87.38 | 87.51 | 87.52 | 87.62 | 87.14 | 87.46 | 87.32 | 85.64 | 83.19 | 87.37 | 87.02 | 86.09 | 87.25 | 86.87 | 82.79 |
Table 13. The accuracy (%) of four-class sleep stage classification with different wavelet bases and different orders of the DSSM under the AASM standard. Only DSSMFs are used, no LEFs.

| $n_{DSSM}$ \ $\omega_{DSSM}$ | db1 | db2 | db3 | db4 | db5 | db6 | db8 | db16 | db32 | sym2 | sym8 | sym16 | coif1 | coif3 | dmey |
| 5 | 81.26 | 80.82 | 80.65 | 80.97 | 79.27 | 78.90 | 78.14 | 77.80 | 77.40 | 80.52 | 78.48 | 77.86 | 80.22 | 78.13 | 77.87 |
| 6 | 81.51 | 80.87 | 81.18 | 81.12 | 80.44 | 79.86 | 79.35 | 77.16 | 77.01 | 80.89 | 79.27 | 77.35 | 80.81 | 79.40 | 77.56 |
| 7 | 81.13 | 80.91 | 81.40 | 80.86 | 80.29 | 80.21 | 79.61 | 76.98 | 75.92 | 80.74 | 80.05 | 77.19 | 80.48 | 79.60 | 76.52 |
| 8 | 80.22 | 80.60 | 80.34 | 80.56 | 80.10 | 79.67 | 79.37 | 77.82 | 75.32 | 80.66 | 79.42 | 77.89 | 80.15 | 79.41 | 76.05 |
| 9 | 80.66 | 80.75 | 80.77 | 80.47 | 80.25 | 80.38 | 79.72 | 77.46 | 75.54 | 81.07 | 79.89 | 78.11 | 80.89 | 79.82 | 75.01 |
| 10 | 80.50 | 80.83 | 81.06 | 81.29 | 80.33 | 80.43 | 80.21 | 77.78 | 75.52 | 80.69 | 79.53 | 78.34 | 80.15 | 79.87 | 74.99 |
Table 14. The accuracy (%) of five-class sleep stage classification with different wavelet bases and different orders of the DSSM under the AASM standard. Only DSSMFs are used, no LEFs.

| $n_{DSSM}$ \ $\omega_{DSSM}$ | db1 | db2 | db3 | db4 | db5 | db6 | db8 | db16 | db32 | sym2 | sym8 | sym16 | coif1 | coif3 | dmey |
| 5 | 78.70 | 78.49 | 78.22 | 78.83 | 77.07 | 76.94 | 76.10 | 75.16 | 75.45 | 78.70 | 76.34 | 75.41 | 77.90 | 76.14 | 75.82 |
| 6 | 78.72 | 78.55 | 78.54 | 78.67 | 77.74 | 77.73 | 77.08 | 75.36 | 74.99 | 78.58 | 77.22 | 75.15 | 78.29 | 77.21 | 75.72 |
| 7 | 78.57 | 78.49 | 78.38 | 78.15 | 77.95 | 77.77 | 77.42 | 74.81 | 73.54 | 78.41 | 77.19 | 74.64 | 77.93 | 77.02 | 74.52 |
| 8 | 78.06 | 78.11 | 77.94 | 77.83 | 77.43 | 77.56 | 77.09 | 75.55 | 73.45 | 78.23 | 77.35 | 75.33 | 77.87 | 77.18 | 73.64 |
| 9 | 78.28 | 78.30 | 78.30 | 77.74 | 77.77 | 78.03 | 77.64 | 75.49 | 73.27 | 78.60 | 77.41 | 75.88 | 78.14 | 77.11 | 73.29 |
| 10 | 77.91 | 78.29 | 78.26 | 78.29 | 77.87 | 77.83 | 77.63 | 75.12 | 73.25 | 78.20 | 77.49 | 75.55 | 78.26 | 77.37 | 72.38 |
Table 15. The accuracy (%) of two-class sleep stage classification with different $\omega_{LE}$ and $l_{LE}$ under the R&K standard.

| $l_{LE}$ \ $\omega_{LE}$ | db1 | db2 | db3 | db4 | db5 | db6 | db8 | db16 | db32 | sym2 | sym8 | sym16 | coif1 | coif3 | dmey |
| 3 | 95.71 | 95.69 | 95.70 | 95.84 | 95.68 | 95.64 | 95.68 | 95.71 | 95.64 | 95.67 | 95.66 | 95.77 | 95.69 | 95.70 | 95.69 |
| 4 | 95.74 | 95.69 | 95.74 | 95.96 | 95.69 | 95.75 | 95.71 | 95.63 | 95.71 | 95.71 | 95.58 | 95.70 | 95.73 | 95.56 | 95.63 |
| 5 | 95.82 | 95.89 | 95.90 | 96.17 | 95.88 | 95.87 | 95.81 | 95.72 | 95.85 | 95.80 | 95.74 | 95.74 | 95.89 | 95.74 | 95.72 |
| 6 | 95.80 | 95.71 | 95.93 | 96.06 | 95.70 | 95.73 | 95.69 | 95.67 | 95.58 | 95.84 | 95.81 | 95.63 | 95.82 | 95.65 | 95.49 |
| 7 | 95.76 | 95.74 | 95.64 | 95.70 | 95.71 | 95.57 | 95.61 | 95.57 | 95.60 | 95.58 | 95.53 | 95.72 | 95.64 | 95.54 | 95.51 |
Table 16. The accuracy (%) of three-class sleep stage classification with different $\omega_{LE}$ and $l_{LE}$ under the R&K standard.

| $l_{LE}$ \ $\omega_{LE}$ | db1 | db2 | db3 | db4 | db5 | db6 | db8 | db16 | db32 | sym2 | sym8 | sym16 | coif1 | coif3 | dmey |
| 3 | 88.12 | 88.07 | 87.91 | 88.48 | 88.12 | 88.11 | 88.21 | 87.86 | 87.97 | 88.09 | 88.02 | 87.96 | 88.15 | 88.10 | 88.10 |
| 4 | 88.22 | 88.22 | 88.22 | 88.59 | 88.81 | 88.12 | 87.86 | 88.08 | 87.91 | 88.02 | 88.05 | 87.92 | 88.02 | 87.85 | 87.94 |
| 5 | 88.51 | 88.56 | 88.61 | 88.72 | 88.89 | 88.75 | 88.14 | 88.18 | 88.09 | 88.47 | 88.37 | 88.14 | 88.64 | 88.18 | 87.98 |
| 6 | 88.11 | 88.09 | 88.00 | 88.66 | 88.73 | 88.00 | 87.88 | 87.63 | 87.54 | 88.17 | 87.70 | 87.84 | 87.82 | 87.76 | 87.43 |
| 7 | 87.92 | 87.76 | 87.58 | 88.62 | 88.62 | 87.65 | 87.27 | 87.34 | 87.35 | 87.83 | 87.49 | 87.39 | 87.61 | 87.46 | 87.27 |
Table 17. The accuracy (%) of four-class sleep stage classification with different $\omega_{LE}$ and $l_{LE}$ under the R&K standard.

| $l_{LE}$ \ $\omega_{LE}$ | db1 | db2 | db3 | db4 | db5 | db6 | db8 | db16 | db32 | sym2 | sym8 | sym16 | coif1 | coif3 | dmey |
| 3 | 82.87 | 82.76 | 82.98 | 83.06 | 82.99 | 82.80 | 82.87 | 82.98 | 82.75 | 82.85 | 82.82 | 82.91 | 82.87 | 82.97 | 83.02 |
| 4 | 82.97 | 82.91 | 83.23 | 83.54 | 83.02 | 82.89 | 82.85 | 82.86 | 82.83 | 82.93 | 82.92 | 82.68 | 82.89 | 82.79 | 82.75 |
| 5 | 83.53 | 83.52 | 83.53 | 83.97 | 83.71 | 83.25 | 83.04 | 82.99 | 83.18 | 83.57 | 83.07 | 83.07 | 83.38 | 82.98 | 82.84 |
| 6 | 83.25 | 82.91 | 83.67 | 83.80 | 82.55 | 82.76 | 82.25 | 82.38 | 82.40 | 82.91 | 82.42 | 82.11 | 82.87 | 82.37 | 82.17 |
| 7 | 82.57 | 82.57 | 82.52 | 82.60 | 82.40 | 82.29 | 81.93 | 81.90 | 82.03 | 82.56 | 82.13 | 81.92 | 82.43 | 81.85 | 81.89 |
Table 18. The accuracy (%) of five class sleep stage classification with different ω_LE and l_LE under the R&K standard.

| l_LE \ ω_LE | db1 | db2 | db3 | db4 | db5 | db6 | db8 | db16 | db32 | sym2 | sym8 | sym16 | coif1 | coif3 | dmey |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 3 | 81.24 | 81.42 | 81.52 | 81.66 | 81.45 | 81.29 | 81.31 | 81.28 | 81.35 | 81.37 | 81.19 | 81.13 | 81.27 | 81.32 | 81.30 |
| 4 | 81.49 | 81.55 | 81.67 | 81.93 | 81.70 | 81.09 | 81.18 | 81.10 | 81.21 | 81.30 | 81.45 | 81.33 | 81.43 | 81.28 | 80.97 |
| 5 | 81.87 | 81.66 | 82.32 | 82.28 | 81.61 | 81.42 | 81.09 | 80.95 | 81.01 | 81.34 | 80.93 | 80.92 | 81.19 | 81.01 | 80.79 |
| 6 | 81.59 | 81.48 | 81.29 | 81.82 | 81.07 | 80.98 | 80.67 | 80.85 | 80.75 | 81.52 | 80.90 | 80.89 | 81.36 | 80.68 | 80.44 |
| 7 | 81.07 | 81.04 | 80.89 | 81.47 | 80.70 | 80.38 | 80.58 | 80.59 | 80.56 | 80.80 | 80.69 | 80.44 | 80.76 | 80.47 | 80.55 |
Table 19. The accuracy (%) of six class sleep stage classification with different ω_LE and l_LE under the R&K standard.

| l_LE \ ω_LE | db1 | db2 | db3 | db4 | db5 | db6 | db8 | db16 | db32 | sym2 | sym8 | sym16 | coif1 | coif3 | dmey |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 3 | 78.25 | 78.39 | 78.44 | 78.52 | 78.18 | 78.35 | 77.99 | 78.08 | 78.15 | 78.29 | 78.06 | 78.03 | 78.21 | 78.14 | 78.20 |
| 4 | 78.63 | 78.64 | 78.67 | 78.80 | 78.67 | 78.65 | 78.56 | 78.54 | 78.54 | 78.62 | 78.44 | 78.45 | 78.63 | 78.48 | 78.28 |
| 5 | 78.91 | 78.91 | 78.88 | 78.92 | 78.23 | 78.42 | 78.38 | 78.22 | 78.38 | 78.77 | 78.24 | 78.20 | 78.81 | 78.44 | 77.98 |
| 6 | 78.45 | 78.35 | 78.26 | 78.64 | 77.97 | 77.90 | 77.88 | 77.68 | 77.78 | 78.36 | 77.74 | 77.57 | 78.22 | 77.64 | 77.36 |
| 7 | 77.92 | 77.66 | 77.48 | 77.55 | 77.31 | 77.22 | 77.25 | 77.08 | 77.63 | 77.60 | 77.45 | 77.18 | 77.81 | 77.22 | 77.02 |
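Tables 15–19 sweep the wavelet base ω_LE and the decomposition level l_LE used for the locality energy features. As a minimal sketch of how sub-band energies can be computed from a wavelet packet decomposition (assuming 30 s epochs at 100 Hz, and taking the upper half of the frequency-ordered nodes as the high-level sub-bands; the paper's exact HSB selection and LE definition are not restated in this section):

```python
import numpy as np
import pywt

def locality_energy_features(epoch, wavelet="db4", level=5):
    """Sketch of LE-style features: wavelet packet decomposition of one
    epoch, then the log-energy of each sub-band at the chosen level.
    The upper half of the frequency-ordered nodes is taken as the HSBs;
    the paper's exact HSB selection and LE definition may differ."""
    wp = pywt.WaveletPacket(data=epoch, wavelet=wavelet, maxlevel=level)
    nodes = wp.get_level(level, order="freq")
    high_bands = nodes[len(nodes) // 2:]           # assumed HSB selection
    return np.array([np.log(np.sum(n.data ** 2) + 1e-12) for n in high_bands])

# l_LE sweep over the levels reported in Tables 15-19,
# run on one synthetic 30 s epoch (100 Hz).
epoch = np.random.default_rng(1).standard_normal(3000)
for l_le in range(3, 8):
    print(l_le, locality_energy_features(epoch, wavelet="db4", level=l_le).shape)
```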
Table 20. The confusion matrix of six class sleep state classification on the DRMS database under the R&K standard, with l_LE = 5, ω_LE = db4, n_DSSM = 6, ω_DSSM = db1.

| Expert \ Automatic | Awa | REM | S1 | S2 | S3 | S4 | Sen (%) | Overall Accuracy (%) |
|---|---|---|---|---|---|---|---|---|
| Awa | 5247 | 96 | 85 | 155 | 3 | 15 | 93.68 | 78.92 |
| REM | 203 | 3697 | 86 | 566 | 1 | 2 | 81.16 | |
| S1 | 418 | 784 | 257 | 328 | 1 | 0 | 14.37 | |
| S2 | 262 | 731 | 62 | 11852 | 248 | 119 | 89.29 | |
| S3 | 20 | 0 | 0 | 1022 | 543 | 527 | 25.71 | |
| S4 | 174 | 0 | 0 | 231 | 271 | 2395 | 77.99 | |
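The per-stage sensitivities and the overall accuracy in Table 20 follow directly from the matrix: sensitivity is the diagonal entry divided by its row sum, and overall accuracy is the trace divided by the total epoch count. A short check in Python, with the numbers taken from Table 20:

```python
import numpy as np

# Confusion matrix from Table 20 (rows: expert labels; columns: automatic
# labels; order: Awa, REM, S1, S2, S3, S4).
cm = np.array([
    [5247,   96,  85,   155,   3,   15],
    [ 203, 3697,  86,   566,   1,    2],
    [ 418,  784, 257,   328,   1,    0],
    [ 262,  731,  62, 11852, 248,  119],
    [  20,    0,   0,  1022, 543,  527],
    [ 174,    0,   0,   231, 271, 2395],
])

sen = np.diag(cm) / cm.sum(axis=1)        # per-stage sensitivity (recall)
acc = np.trace(cm) / cm.sum()             # overall accuracy
for stage, s in zip(["Awa", "REM", "S1", "S2", "S3", "S4"], sen):
    print(f"{stage}: {s:.2%}")            # 93.68%, 81.16%, 14.37%, ...
print(f"overall: {acc:.2%}")              # 78.92%
```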
Table 21. The accuracy (%) of two class sleep stage classification with different ω_LE and l_LE under the AASM standard.

| l_LE \ ω_LE | db1 | db2 | db3 | db4 | db5 | db6 | db8 | db16 | db32 | sym2 | sym8 | sym16 | coif1 | coif3 | dmey |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 3 | 95.39 | 95.21 | 95.34 | 96.18 | 95.25 | 95.25 | 95.33 | 95.21 | 95.26 | 95.99 | 95.28 | 95.21 | 95.35 | 95.24 | 95.30 |
| 4 | 96.00 | 95.70 | 95.84 | 96.24 | 95.75 | 95.37 | 95.34 | 95.25 | 95.34 | 96.24 | 95.31 | 95.24 | 95.85 | 95.39 | 95.43 |
| 5 | 95.87 | 95.91 | 95.94 | 96.48 | 95.95 | 95.36 | 95.41 | 95.46 | 95.45 | 96.41 | 95.45 | 95.52 | 95.53 | 95.40 | 95.45 |
| 6 | 95.61 | 95.37 | 95.79 | 96.16 | 95.33 | 95.36 | 95.37 | 95.28 | 95.25 | 96.34 | 95.38 | 95.42 | 95.32 | 95.30 | 95.27 |
| 7 | 95.42 | 95.31 | 95.26 | 95.90 | 95.38 | 95.26 | 95.24 | 95.34 | 95.34 | 95.35 | 95.21 | 95.30 | 95.38 | 95.22 | 95.11 |
Table 22. The accuracy (%) of three class sleep stage classification with different ω_LE and l_LE under the AASM standard.

| l_LE \ ω_LE | db1 | db2 | db3 | db4 | db5 | db6 | db8 | db16 | db32 | sym2 | sym8 | sym16 | coif1 | coif3 | dmey |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 3 | 87.99 | 87.92 | 87.62 | 87.56 | 87.75 | 87.67 | 87.52 | 87.85 | 87.55 | 87.63 | 87.64 | 87.77 | 87.73 | 87.51 | 87.59 |
| 4 | 87.80 | 87.65 | 87.67 | 87.71 | 87.68 | 87.64 | 87.51 | 87.69 | 87.77 | 87.76 | 87.78 | 87.70 | 87.79 | 87.58 | 87.88 |
| 5 | 88.00 | 88.26 | 88.04 | 88.22 | 87.96 | 87.99 | 87.85 | 87.92 | 87.84 | 88.17 | 88.04 | 88.04 | 88.04 | 87.99 | 87.87 |
| 6 | 87.89 | 87.78 | 87.75 | 87.71 | 87.42 | 87.64 | 87.59 | 87.34 | 87.55 | 87.64 | 87.58 | 87.34 | 87.82 | 87.44 | 87.26 |
| 7 | 87.13 | 87.71 | 87.43 | 87.92 | 87.23 | 87.62 | 87.38 | 87.39 | 87.52 | 87.75 | 87.55 | 87.34 | 87.51 | 87.58 | 87.11 |
Table 23. The accuracy (%) of four class sleep stage classification with different ω_LE and l_LE under the AASM standard.

| l_LE \ ω_LE | db1 | db2 | db3 | db4 | db5 | db6 | db8 | db16 | db32 | sym2 | sym8 | sym16 | coif1 | coif3 | dmey |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 3 | 81.14 | 81.43 | 81.45 | 81.42 | 81.23 | 81.40 | 81.44 | 81.53 | 81.33 | 81.57 | 81.07 | 81.55 | 81.65 | 81.29 | 81.15 |
| 4 | 81.64 | 81.47 | 81.36 | 81.36 | 81.46 | 81.33 | 81.25 | 81.47 | 81.37 | 81.75 | 81.09 | 81.31 | 81.61 | 81.26 | 81.37 |
| 5 | 81.79 | 82.03 | 81.91 | 82.08 | 81.84 | 81.93 | 81.59 | 81.82 | 81.73 | 82.23 | 81.51 | 81.65 | 82.12 | 81.64 | 81.40 |
| 6 | 81.83 | 81.66 | 81.66 | 81.76 | 81.39 | 81.40 | 81.37 | 81.32 | 80.99 | 81.51 | 81.16 | 81.01 | 81.85 | 80.99 | 80.65 |
| 7 | 81.35 | 81.28 | 81.19 | 81.47 | 81.06 | 80.96 | 80.93 | 80.66 | 80.75 | 81.37 | 80.85 | 80.68 | 81.40 | 80.50 | 80.40 |
Table 24. The accuracy (%) of five class sleep stage classification with different ω_LE and l_LE under the AASM standard.

| l_LE \ ω_LE | db1 | db2 | db3 | db4 | db5 | db6 | db8 | db16 | db32 | sym2 | sym8 | sym16 | coif1 | coif3 | dmey |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 3 | 79.24 | 79.02 | 78.87 | 79.20 | 79.08 | 78.99 | 78.86 | 78.96 | 78.77 | 78.89 | 79.16 | 79.09 | 79.03 | 78.66 | 79.09 |
| 4 | 79.41 | 79.26 | 79.33 | 79.25 | 79.14 | 78.80 | 79.08 | 79.03 | 78.91 | 78.93 | 78.82 | 79.16 | 79.05 | 79.03 | 79.01 |
| 5 | 79.97 | 79.54 | 79.48 | 79.90 | 79.45 | 79.34 | 79.36 | 79.38 | 79.62 | 79.77 | 79.36 | 79.15 | 79.53 | 79.43 | 79.05 |
| 6 | 79.10 | 79.29 | 79.20 | 79.23 | 79.13 | 78.94 | 78.89 | 78.68 | 78.64 | 79.20 | 78.86 | 78.52 | 79.39 | 78.43 | 78.01 |
| 7 | 79.04 | 78.69 | 78.89 | 78.85 | 78.67 | 78.39 | 78.48 | 78.33 | 78.16 | 78.98 | 78.71 | 78.22 | 78.66 | 78.30 | 77.67 |
Table 25. The confusion matrix of five class sleep state classification on the DRMS database under the AASM standard, with l_LE = 5, ω_LE = db4, n_DSSM = 6, ω_DSSM = db1.

| Expert \ Automatic | Awa | REM | N1 | N2 | N3 | Sen (%) | Overall Accuracy (%) |
|---|---|---|---|---|---|---|---|
| Awa | 3306 | 53 | 68 | 111 | 21 | 92.89 | 79.90 |
| REM | 131 | 2452 | 93 | 330 | 13 | 81.22 | |
| N1 | 341 | 480 | 260 | 389 | 10 | 17.57 | |
| N2 | 229 | 499 | 50 | 7056 | 417 | 85.52 | |
| N3 | 77 | 1 | 0 | 761 | 3117 | 78.79 | |
Table 26. The classification accuracy and Cohen's Kappa Coefficient of 2–6 class sleep classification on the S-EDF database.

| | 6 Classes | 5 Classes | 4 Classes | 3 Classes | 2 Classes |
|---|---|---|---|---|---|
| Accuracy | 92.04% | 92.50% | 93.87% | 94.90% | 98.74% |
| Cohen's Kappa Coefficient | 0.8266 | 0.8364 | 0.8646 | 0.8834 | 0.9697 |
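Table 26 reports results as the six R&K stages are merged into progressively coarser label sets. The merging itself is not restated in this section; the sketch below shows the convention used in comparable studies (e.g., [3]), so the exact mapping should be treated as an assumption.

```python
# Assumed R&K label merging behind the 6- to 2-class evaluations; this is
# the convention used in comparable studies and is not restated in this
# section, so treat the exact mapping as an assumption.
MERGE = {
    6: {"Awa": "Awa", "REM": "REM", "S1": "S1", "S2": "S2", "S3": "S3", "S4": "S4"},
    5: {"Awa": "Awa", "REM": "REM", "S1": "S1", "S2": "S2", "S3": "SWS", "S4": "SWS"},
    4: {"Awa": "Awa", "REM": "REM", "S1": "Light", "S2": "Light", "S3": "SWS", "S4": "SWS"},
    3: {"Awa": "Awa", "REM": "REM", "S1": "NREM", "S2": "NREM", "S3": "NREM", "S4": "NREM"},
    2: {"Awa": "Awa", "REM": "Sleep", "S1": "Sleep", "S2": "Sleep", "S3": "Sleep", "S4": "Sleep"},
}

def merge_labels(labels, n_classes):
    """Map six-stage expert labels onto the coarser label set."""
    return [MERGE[n_classes][s] for s in labels]

print(merge_labels(["Awa", "S1", "S3", "REM"], 4))  # ['Awa', 'Light', 'SWS', 'REM']
```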
Table 27. The confusion matrix of six class sleep state classification on the S-EDF database.

| Expert \ Automatic | Awa | REM | S1 | S2 | S3 | S4 | Sen (%) | Overall Accuracy (%) |
|---|---|---|---|---|---|---|---|---|
| Awa | 73165 | 483 | 46 | 141 | 0 | 0 | 99.09 | 92.04 |
| REM | 876 | 4819 | 61 | 988 | 0 | 0 | 71.46 | |
| S1 | 578 | 1174 | 583 | 682 | 0 | 0 | 19.32 | |
| S2 | 375 | 863 | 76 | 15631 | 254 | 50 | 90.62 | |
| S3 | 71 | 0 | 0 | 1003 | 1055 | 159 | 46.11 | |
| S4 | 23 | 0 | 0 | 171 | 256 | 1060 | 70.20 | |
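The Cohen's Kappa values in Table 26 follow from the standard formula κ = (p_o − p_e)/(1 − p_e), where p_o is the observed agreement (the overall accuracy) and p_e is the chance agreement implied by the row and column marginals. A short check in Python reproduces the six-class entries from the Table 27 matrix:

```python
import numpy as np

# Confusion matrix from Table 27 (rows: expert; columns: automatic;
# order: Awa, REM, S1, S2, S3, S4).
cm = np.array([
    [73165,  483,  46,   141,    0,    0],
    [  876, 4819,  61,   988,    0,    0],
    [  578, 1174, 583,   682,    0,    0],
    [  375,  863,  76, 15631,  254,   50],
    [   71,    0,   0,  1003, 1055,  159],
    [   23,    0,   0,   171,  256, 1060],
])

n = cm.sum()
po = np.trace(cm) / n                                  # observed agreement
pe = (cm.sum(axis=1) * cm.sum(axis=0)).sum() / n**2    # chance agreement
kappa = (po - pe) / (1 - pe)
print(f"accuracy = {po:.2%}, kappa = {kappa:.4f}")     # 92.04%, 0.8266
```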
Table 28. The classification accuracy and Cohen's Kappa Coefficient of 2–5 class sleep classification on the ISRUC3 database.

| | 5 Classes | 4 Classes | 3 Classes | 2 Classes |
|---|---|---|---|---|
| Accuracy | 81.65% | 84.68% | 90.54% | 96.18% |
| Cohen's Kappa Coefficient | 0.7629 | 0.7729 | 0.8112 | 0.878 |
Table 29. The confusion matrix for the five class case on the ISRUC3 database.

| Expert \ Automatic | Awa | REM | N1 | N2 | N3 | Sen (%) | Overall Accuracy (%) |
|---|---|---|---|---|---|---|---|
| Awa | 1537 | 13 | 84 | 53 | 15 | 90.31 | 81.65 |
| REM | 36 | 1032 | 89 | 68 | 13 | 83.36 | |
| N1 | 91 | 135 | 648 | 242 | 7 | 57.70 | |
| N2 | 68 | 85 | 128 | 2312 | 257 | 81.12 | |
| N3 | 1 | 5 | 12 | 229 | 1729 | 87.50 | |
Table 30. The accuracy comparison of various published methods on the DRMS database under the R&K standard. The highest accuracy in each case is highlighted in bold.

| Method | Epoch Number | 6 Classes (%) | 5 Classes (%) | 4 Classes (%) | 3 Classes (%) | 2 Classes (%) | Cross-Validation |
|---|---|---|---|---|---|---|---|
| Hassan et al. [3] | 30401 | 70.73 | 73.50 | 79.12 | 84.4 | 93.3 | 10-fold |
| Hassan et al. [11] | 30401 | 68.74 | 73.05 | 78.8 | 82.96 | 94.02 | 0.5/0.5 |
| Shen et al. [27] | 30401 | 78.2 | 80.9 | 82.7 | 87.7 | 94.9 | 10-fold |
| Proposed method without LEFs | 30401 | 78.52 | 81.26 | 82.81 | 87.95 | 95.59 | 10-fold |
| Proposed method with IMBEFs | 30401 | **78.92** | **82.28** | **83.97** | **88.72** | **96.17** | 10-fold |
Table 31. The accuracy comparison of various published methods on the Dreams Subjects database under the AASM standard. The highest accuracy in each case is highlighted in bold.

| Method | Epoch Number | 5 Classes (%) | 4 Classes (%) | 3 Classes (%) | 2 Classes (%) | Cross-Validation |
|---|---|---|---|---|---|---|
| Hassan et al. [3] | 20265 | 72.28 | 79.44 | 83.75 | 95.2 | 10-fold |
| Hassan et al. [11] | 20265 | 74.59 | 80.0 | 85.42 | **97.2** | 10-fold |
| Ghimatgar et al. [7] | 20265 | 78.08 | 80.38 | 86.88 | 94.8 | 20-fold |
| Proposed method without LEFs | 20265 | 78.72 | 80.9 | 87.52 | 95.7 | 10-fold |
| Proposed method with IMBEFs | 20265 | **79.90** | **82.08** | **88.22** | 96.48 | 10-fold |
Table 32. The accuracy comparison of various published methods on the S-EDF database under the R&K standard. The highest accuracy in each case is highlighted in bold.

| Method | Epoch Number | 6 Classes (%) | 5 Classes (%) | 4 Classes (%) | 3 Classes (%) | 2 Classes (%) | Cross-Validation |
|---|---|---|---|---|---|---|---|
| Hassan et al. [3] | 15188 | 90.38 | 91.50 | 92.11 | 94.8 | 97.5 | 0.5/0.5 |
| Abdulla et al. [6] | 23806 | **93** | | | | | - |
| Ghimatgar et al. [7] | 15188 | 89.91 | 91.11 | 92.19 | 94.65 | 98.19 | 0.5/0.5 |
| Ghimatgar et al. [7] | 40100 | 79.13 | 81.86 | 83.71 | 88.39 | 95.98 | 0.5/0.5 |
| Hassan et al. [10] | 15188 | 88.62 | 90.11 | 91.2 | 93.55 | 97.73 | 0.5/0.5 |
| Hassan et al. [11] | 15188 | 88.07 | 83.49 | 92.66 | 94.23 | 98.15 | 0.5/0.5 |
| Sharma et al. [12] | 15139 | 90.03 | 91.13 | 92.29 | 94.66 | 98.02 | 10-fold CV |
| Michielli et al. [17] | 10280 | | 86.7 | | | | 10-fold CV |
| Shen et al. [27] | 103505 | 91.9 | 92.3 | 93.0 | 93.9 | 98.6 | 10-fold CV |
| Sharma et al. [28] | 85900 | 91.5 | 91.7 | 92.1 | 93.9 | 98.3 | 10-fold CV |
| Liang et al. [29] | 3708 | | 83.6 | | | | 0.5/0.5 |
| Hsu et al. [30] | 2880 | | 87.2 | | | | 10-fold CV |
| Hassan et al. [31] | 15188 | 89.6 | 90.8 | 91.6 | 93.9 | 97.2 | 0.5/0.5 |
| Zhu et al. [32] | 14963 | 87.5 | 88.9 | 89.3 | 92.6 | 97.9 | 10-fold CV |
| Jiang et al. [33] | 36972 | | 91.5 | | | | 2-fold CV |
| Rahman et al. [34] | 15188 | 90.26 | 91.02 | 92.89 | 94.1 | 98.24 | 0.5/0.5 |
| Supratak et al. [35] | 41950 | | 79.8 | | | | 20-fold CV |
| Proposed Method | 104368 | 92.04 | **92.50** | **93.87** | **94.90** | **98.74** | 10-fold CV |
Table 33. The accuracy comparison on the ISRUC3 database under the AASM standard. The highest accuracy in each case is highlighted in bold.

| Metric | Method | Epoch Number | 5 Classes | 4 Classes | 3 Classes | 2 Classes |
|---|---|---|---|---|---|---|
| Overall Accuracy (%) | Ghimatgar et al. [7] | 8889 | 77.56 | 82.74 | 88.26 | 93.76 |
| | Proposed Method | 8889 | **81.65** | **84.68** | **90.54** | **96.18** |
| Cohen's Kappa Coefficient | Ghimatgar et al. [7] | 8889 | 0.71 | 0.75 | 0.77 | 0.79 |
| | Proposed Method | 8889 | **0.7629** | **0.7729** | **0.8112** | **0.878** |
