A physiological signal database of children with different special needs for stress recognition

This study presents AKTIVES, a new dataset for evaluating stress detection and game-reaction recognition methods using physiological signals. We collected data during game therapy from 25 children: children with obstetric brachial plexus injury, dyslexia, or intellectual disabilities, and typically developing children. A wristband was used to record physiological data (blood volume pulse (BVP), electrodermal activity (EDA), and skin temperature (ST)). Furthermore, the children's facial expressions were recorded. Three experts watched the children's videos, and the physiological data were labeled "Stress/No Stress" and "Reaction/No Reaction" according to the videos. The technical validation confirmed high-quality signals and showed consistency between the experts.


Background & Summary
Serious games differ from ordinary games in that they can be adjusted based on the user's or patient's situation. For example, minimum/maximum joint angles can be defined while playing a game, which is very important for disabled people who need physical exercise. Serious games also make it possible to collect measurement data such as balance and range of motion (ROM). Serious games focus on the motions patients are required to perform, combined with cognitive actions, while ordinary games focus on entertainment.
Serious games are promising tools to improve upper extremity functions in children with neurological disorders. These games are developed as a rehabilitation option that provides an exercise environment where children with different special needs can perform many repetitions and receive immediate feedback. Serious games have previously been shown to improve cognitive functions and motor skills such as attention, hand-eye coordination, and visual perception, and to facilitate learning [1][2][3][4][5]. In addition, games positively affect motivation, attention, processing speed, concentration, and visual discrimination in neurodevelopmental disorders such as dyslexia and intellectual disability 5,6.
Game therapies are more effective than traditional physiotherapy programs in improving upper extremity functions in children with obstetric brachial plexus injury 7. Furthermore, serious games-based rehabilitation has increased the motivation of children with dyslexia 8. Children's motivation during game therapy is essential; losing it may lead children to withdraw their attention and even drop out of therapy. If a child becomes stressed during game therapy, he/she can lose the motivation to continue the therapy. Thus, game therapies can be more successful if they recognize the children's emotions/stress and modify the games accordingly to increase motivation and involvement.
The AKTIVES dataset consists of physiological signals and camera recordings of 25 children who played serious games: children with different special needs (obstetric brachial plexus injury, dyslexia, and intellectual disabilities) and typically developing children. The physiological signals (blood volume pulse (BVP), electrodermal activity (EDA), and skin temperature (ST)) were collected using the wearable Empatica E4 wristband 36. Three experts annotated whether the children were stressed ("Stress/No Stress") and whether they reacted ("Reaction/No Reaction") throughout the game therapy. The dataset also contains the video files recorded by the camera in MP4 format. The videos show the child's face and upper body during the game therapy.
The AKTIVES dataset may contribute to the development of emotion recognition and stress detection models using physiological signals and facial expressions for children with different special needs. In addition, the AKTIVES dataset may serve to tackle research questions related to (1) multimodal stress detection using physiological signals, (2) emotion recognition using facial expressions, and (3) children's reactions to the developed serious games.

Ethics statement. Ethical approval was obtained from the Research Ethics Committee of Istanbul Medipol University (No E-10840098-772.02-6580). Furthermore, the parents of the children were informed about the experimental procedure and data acquisition setup and signed written consent. An informed consent form was used to inform parents about how the videos of the children would be shared.

Participants.
A total of 25 children participated in the study: 3 children with obstetric brachial plexus injury, 7 with intellectual disabilities, 4 with dyslexia, and 11 typically developing children. The mean age of the children (10 males, 15 females) was 10.2 ± 1.27 years. Inclusion criteria consisted of a diagnosis of obstetric brachial plexus injury, intellectual disability, or dyslexia and an age between 5 and 14 years. Children with other chronic diseases were excluded from the study. Demographic and diagnostic characteristics of the participants are summarized in Table 1. The disability rates provided are 20-39 (individuals with special requirements), 40-49 (individuals with slight special requirements), 60-69 (individuals with advanced special needs), and 70-79 (individuals with very advanced special needs).

Games. The Becure CatchAPet and Becure LeapBall serious games were used in this study. In the Becure CatchAPet game, the child tries to touch the rabbit on the screen with his/her hand in the virtual environment (Fig. 1). The child earns points for each contact with the rabbit and loses points for each missed rabbit. The child uses wrist flexion/extension movements during the game. The wrist angles are set by the physical therapists so that each child can virtually touch a rabbit with the correct range of motion. In the Becure LeapBall game, the child tries to drop the ball on the screen into a bucket of the same color (Fig. 2). The child holds and releases the ball with grasping movements. The child directs the ball in the virtual environment and earns points for throwing it into the correct bucket; the fail score increases when the child throws it into a wrong bucket.
Various measures are recorded at the end of each game on the Becure E-Therapy web portal.
Annotation. The children's videos were recorded while they played Becure CatchAPet and Becure LeapBall. Three experts watched the videos and annotated every ten seconds of each recording as "Stress/No Stress" and "Reaction/No Reaction". The children's stress situations were assessed using their body language. The main manifestations of anxiety and stress on the human face involve the eyes (gaze distribution, blinking rate, pupil size variation), the mouth (mouth activity, lip deformations), and the cheeks, as well as the behavior of the head as a whole (head movements, head velocity). Additional facial signs of anxiety and stress in children may include a strained face, facial pallor, and eyelid twitching. All three experts are occupational therapists with at least 2 years of experience. The experts did not see each other's annotations. For each segment, the majority annotation was kept: for example, if two experts annotated "Stress" and one "No Stress", the segment was labeled "Stress".
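The majority-vote rule described above can be sketched as follows (a minimal illustration; the function name and label strings are ours, not part of the dataset tooling):

```python
from collections import Counter

def majority_label(labels):
    """Majority vote over expert annotations for one 10-second segment.

    With three annotators and binary labels, a strict majority
    always exists, so ties cannot occur.
    """
    return Counter(labels).most_common(1)[0][0]

# e.g. two experts say "Stress", one says "No Stress"
majority_label(["Stress", "Stress", "No Stress"])  # -> "Stress"
```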

Apparatus. Various studies have used the Empatica E4 wristband for emotion recognition and stress detection [37][38][39][40][41][42][43]. Emotions measured via the Empatica E4 wristband in children with hearing disabilities have previously been investigated 44. Further, the Empatica E4 has been utilized for emotion recognition in children with atypical or delayed development 45. In addition, the sympathetic response under stressors of children with autism spectrum disorder (ASD) has been studied using an E4 wristband 46. In this study, the physiological signals blood volume pulse (BVP), electrodermal activity (EDA), and skin temperature (ST) of children with obstetric brachial plexus injury, dyslexia, or intellectual disabilities and of typically developing children were collected using an Empatica E4 wristband 36. The sampling frequency of the BVP signal was 64 Hz, while that of the EDA and ST signals was 4 Hz 36. The E4 wristband was firmly placed on the child's wrist without cutting off the blood flow. The collected data were synchronized and downloaded via the E4 Manager software. The children's upper body and facial expressions were recorded using the web camera of the laptop computer (Lenovo G510 with an i7 processor) on which the Becure Software by INOSENS runs. The videos were in MP4 format and had 720p resolution with image frames of 1280 × 720 pixels. The recording began when the physical therapist started the games and ended automatically when the game ended.
Procedure. The study took place in the Technology Laboratory at Medipol University. The children and their parents were informed about the experimental procedure upon arrival. Then the parents signed the written consent. The occupational therapists asked the children to play the games for approximately 2-3 minutes before the experiment so that the children could become familiar with them. The children were asked to avoid unnecessary movements and to avoid covering their faces. Children were instructed to fixate their gaze on a black screen for 30 seconds, serving as the baseline condition. Following the baseline period, the children engaged in a serious game, either "Becure CatchAPet" or "Becure LeapBall", for 420 seconds. Upon completion of the game, the black screen was again presented for an additional 30 seconds. The procedure was repeated independently for each game.
The experimental procedure is given in Fig. 3. The child's physiological signals and facial expressions were recorded during the session. In this order, each child played both the Becure CatchAPet and Becure LeapBall games. The experimental setup is presented in Fig. 4.

Data processing. The entire experimental setup ran on a single device to ensure ideal data synchronization during the experiment. The E4 wristband was connected to the E4 Manager, and the device's clock was synchronized before the experiment. E4 Manager was installed on the device on which the experimental setup was run, and the clock of that device was kept up to date. When data recording started, the E4 saved a Unix timestamp, which was therefore synchronized with the whole setup. During the experiment, timestamps synchronous with the stimuli were collected from all devices. The recording start and end timestamps (and the video duration) were recorded in the metadata of the recorded video (the file creation time may not give correct results) so that the labels could be synchronized with the physiological data. The face video recording was taken from the device on which the experimental setup was operated to ensure synchronization. The data were kept separately for each participant in Plain Text (.txt) format before preprocessing and stored in Comma-Separated Values (.csv) format (tabular data) after preprocessing.
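Given the shared Unix timestamps, a 10-second annotation window can be mapped onto the corresponding physiological samples. A minimal sketch (function and variable names are illustrative, not part of the dataset tooling):

```python
def window_to_samples(label_start, e4_start, fs, win_s=10):
    """Return the [first, last) sample index range of one annotation window.

    label_start, e4_start : Unix timestamps (seconds) of the window start
    and of the E4 recording start; fs : channel sampling rate in Hz.
    """
    first = int(round((label_start - e4_start) * fs))
    return first, first + int(win_s * fs)

# Third 10-second window of a 4 Hz EDA recording that started at t=1000:
window_to_samples(1020, 1000, fs=4)  # -> (80, 120)
```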
A sixth-order Chebyshev II filter with 18 dB stopband attenuation (Rs) and a 0.1 normalized stopband edge frequency (Wn) was used to preprocess the blood volume pulse (BVP) data 47. The electrodermal activity (EDA) signal was filtered using a fifth-order Savitzky-Golay filter with a frame length of 11 39. Note that two individuals expressing the same emotion might have physiological signals at different levels 40. Thus, the BVP and EDA signals were normalized between 0 and 100.
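Using SciPy, the preprocessing above can be sketched as follows. The filter designs follow the stated parameters; the low-pass response type and the zero-phase application (filtfilt) are our assumptions, since the text does not specify them:

```python
import numpy as np
from scipy.signal import cheby2, filtfilt, savgol_filter

def preprocess_bvp(bvp):
    # Sixth-order Chebyshev II, 18 dB stopband attenuation,
    # 0.1 normalized stopband edge (low-pass response assumed).
    b, a = cheby2(6, 18, 0.1)
    return filtfilt(b, a, bvp)

def preprocess_eda(eda):
    # Fifth-order Savitzky-Golay smoother with frame length 11.
    return savgol_filter(eda, window_length=11, polyorder=5)

def normalize_0_100(x):
    # Min-max normalization to the 0-100 range.
    x = np.asarray(x, dtype=float)
    return 100.0 * (x - x.min()) / (x.max() - x.min())
```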
The main purpose of this study is to provide a dataset consisting of physiological signals labeled as "Stress/No Stress" by the experts. However, the face landmarks and facial emotions of the children detected with well-known methods are also given in the AKTIVES dataset, so that researchers can investigate the correlation between gaming motivation and facial emotions in future research. The videos had a frame rate of 9.7 fps, and their duration depended on the children's gaming performance. The total number of frames in a video file was nearly 4300. The emotions of the children were also recognized from their faces. Initially, the faces of the children were detected using Multi-Task Cascaded Convolutional Neural Networks (MTCNN) 48. Then, the emotions were recognized from the detected face regions.

Data Records
The data recordings (physiological signals, camera recordings, expert annotations) are available at Synapse | Sage Bionetworks 51. The organization of the data records is shown in Fig. 5 and summarized in Table 2.

Technical Validation
Three experts annotated "Stress/No Stress" and "Reaction/No Reaction" throughout the execution of the serious games. The percentage of matching annotations relative to all annotations was calculated to validate the experts' annotations. The agreement percentage was calculated using Eq. 1 for each pair of experts (Experts 1-2, Experts 1-3, Experts 2-3) and for all three experts (Experts 1-2-3).

Agreement = (number of same annotations / number of annotation timestamps) × 100 (1)
The maximum agreement for the "Stress/No Stress" annotations between two experts was 81.45%, while the minimum was 77.24%. The agreement among all three annotators was 68.18% for "Stress/No Stress". The minimum result for the "Reaction/No Reaction" annotations between two experts was 69.50%, and the maximum was 83.24%. The agreement among all three experts was 63.39%. The results are presented in Table 3.
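The agreement percentage of Eq. 1 can be computed as in this minimal sketch (function name ours):

```python
def agreement_percentage(a, b):
    """Share of annotation timestamps with identical labels (Eq. 1)."""
    if len(a) != len(b):
        raise ValueError("annotation sequences must have equal length")
    same = sum(x == y for x, y in zip(a, b))
    return 100.0 * same / len(a)

# Two experts agreeing on 3 of 4 ten-second windows:
agreement_percentage(["S", "S", "N", "S"], ["S", "N", "N", "S"])  # -> 75.0
```

For the three-expert agreement, the same count is taken over timestamps where all three labels coincide.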
The signal-to-noise ratio (SNR) of the physiological signals acquired via the Empatica E4 device was calculated to validate the signal quality. The SNR metric was obtained with the help of the autocorrelation function, utilizing a second-order polynomial to fit the autocorrelation curve 52. Each physiological signal was examined separately, and raw physiological signals were utilized for the SNR estimation. The results can be found in Table 4. According to the SNR findings, the physiological data acquired from the Empatica E4 were good-quality signals.
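The reference implementation of this SNR estimate is in the cited repository 52; the following is only our rough sketch of the underlying idea. The zero-lag autocorrelation holds signal-plus-noise power, while a second-order polynomial fitted to the first nonzero lags and extrapolated back to lag zero approximates the signal power alone (the function name, lag count, and these details are our assumptions):

```python
import numpy as np

def snr_autocorr(x, n_lags=10):
    """Rough SNR estimate (dB) from the autocorrelation function.

    R(0) contains signal + noise power; for roughly white noise,
    R(lag > 0) reflects mostly the signal. A second-order polynomial
    fitted to the first nonzero lags is extrapolated to lag 0 to
    estimate the signal power; the remainder is taken as noise power.
    """
    x = np.asarray(x, dtype=float) - np.mean(x)
    r = np.correlate(x, x, mode="full")[len(x) - 1:] / len(x)
    lags = np.arange(1, n_lags + 1)
    coeffs = np.polyfit(lags, r[1:n_lags + 1], 2)
    signal_power = max(np.polyval(coeffs, 0.0), 1e-12)
    noise_power = max(r[0] - signal_power, 1e-12)
    return 10.0 * np.log10(signal_power / noise_power)
```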

Usage Notes
Stress recognition. The most common approach to stress recognition using physiological signals and facial expressions includes (1) data collection; (2) signal preprocessing, synchronization, and integration; (3) feature extraction and selection; and (4) machine/deep learning model training and validation. A comprehensive overview of all these stages can be found in review papers on stress/emotion detection using wearables [53][54][55].
We recommend the following Python libraries, which we found useful for preprocessing and feature extraction from physiological data, for further processing of the AKTIVES dataset. The NumPy Library (https://numpy.org) can be used for feature extraction from BVP, EDA, and ST signals. The SciPy Library (https://scipy.org), which provides signal processing algorithms such as filtering, can be used to filter the BVP and EDA signals. The Pandas Library (https://pandas.pydata.org) can be used to resample the physiological signals every 10 seconds. The NeuroKit2 Library 56 (https://neuropsychology.github.io/NeuroKit), a user-friendly package, can be used to extract features from EDA signals. For the SNR calculation, the GitHub repository 52 (https://github.com/psychosensing/popane-2021) was used.
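As an example of the 10-second resampling with pandas (synthetic data; the statistical feature set shown is arbitrary, not the one used for the dataset):

```python
import numpy as np
import pandas as pd

# One minute of synthetic 4 Hz EDA attached to a datetime index,
# then aggregated into 10-second windows of simple statistics.
fs = 4
rng = np.random.default_rng(0)
eda = pd.Series(
    rng.normal(5.0, 0.2, fs * 60),
    index=pd.date_range("2023-01-01", periods=fs * 60, freq="250ms"),
)
features = eda.resample("10s").agg(["mean", "std", "min", "max"])
# features has one row per 10-second window (6 rows for 60 seconds)
```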
The state-of-the-art in face detection and emotion recognition includes data collection, feature extraction, and machine/deep learning 57,58. For this dataset, face detection and emotion recognition were implemented in the Python programming language using the OpenCV Library (https://opencv.org/). Emotion detection was performed using the FER Library (https://pypi.org/project/fer/). For facial landmarks, the pre-trained facial landmark recognition in the Dlib Library (http://dlib.net/) was used.

Accessing data. The AKTIVES dataset must be used for academic purposes only. Researchers who want to access the AKTIVES dataset must first sign the End User License Agreement (EULA), which is available in the AKTIVES Dataset repository. To access the AKTIVES dataset, the signed EULA must be sent to aktives.project@gmail.com. Only e-mails sent from an academic e-mail address will be considered.