Review

Augmenting the Senses: A Review on Sensor-Based Learning Support

Welten Institute, Open University of the Netherlands, 177 Valkenburgerweg, Heerlen 6419AT, The Netherlands
* Author to whom correspondence should be addressed.
These authors contributed equally to this work.
Sensors 2015, 15(2), 4097-4133; https://doi.org/10.3390/s150204097
Submission received: 24 November 2014 / Accepted: 29 January 2015 / Published: 11 February 2015
(This article belongs to the Special Issue HCI In Smart Environments)

Abstract

In recent years sensor components have been extending classical computer-based support systems in a variety of application domains (sports, health, etc.). In this article we review the use of sensors for the application domain of learning. For that purpose we analyzed 82 sensor-based prototypes, exploring their learning support. To study this learning support we classified the prototypes according to Bloom's taxonomy of learning domains and explored how they can be used to assist in the implementation of formative assessment, paying special attention to their use as feedback tools. The analysis leads to current research foci and gaps in the development of sensor-based learning support systems and concludes with a research agenda based on the findings.

1. Introduction

The digital and physical worlds are currently merging, opening new possibilities for us to interact with our environment, as well as for our environment to interact with us. This development is mainly driven by two technologies: display technologies and sensor technologies. Display technologies, in the sense of personal mobile displays as well as a variety of embedded public displays, enable the integration and presentation of digital information and services in nearly every situation and context [1]. Sensor technologies enable the development of real-time information systems and allow classical objects to be enhanced and integrated into digital ecosystems. Everyday objects, which previously were not aware of their environment at all, are turning into smart devices with sensing and tracking capabilities. Cisco estimates that by 2020 there will be 50 billion devices connected to the Internet [2], and one of the main drivers for this development is the increasing number of low-cost sensors available [3].

A sensor is commonly defined as “a device that detects or measures a physical property and records, indicates, or otherwise responds to it” [4]. This purely linguistic definition seems restrictive, in the sense that specific computer programs have also been used as sensors, tracking recently played songs, currently open URLs, logs of incoming calls and other non-physical properties [5]. Consequently, the definition of a sensor used in this review is: “a physical or virtual object used for tracking, recording or measuring.” An overview of the identified sensors together with their measured properties and identified usages is shown in Appendix A. Coupling sensors with software components creates new types of tools with the capability to measure, analyze and (immediately) present results of the obtained data. The name for these instruments has not been standardized yet; in previous works they have been referred to as smart sensors [6], sensor systems [7], sensor platforms [8], ecosystems [3], etc. In the remainder of this article these tools will be denoted as sensor-based platforms.

The ability of sensor-based platforms to act according to the data they retrieve and analyze suggests their possible use as learning tools. To get an overview of the state-of-the-art of sensor-based learning support and to find directions for further research, in this literature review we analyzed the learning support of sensor-based platforms that were designed for educational purposes, as well as sensor-based platforms that were designed for other purposes but are also able to support learning through the presentation of relevant information for performance support, analysis and contextual awareness. To get an overview of the different areas of learning that have already been influenced by sensor-based platforms, we started our study by analyzing the connections between the different types of sensor-based platforms and their support for the commonly distinguished learning domains: the cognitive, psychomotor and affective domain [9]. Since one of the current educational challenges is the implementation of formative assessment [10], within the learning domains we focused in particular on exploring whether sensor-based platforms can assist in its implementation. Formative assessment provides learners with information that allows them to improve their performance and learning. In our study we carefully analyzed how sensor-based platforms have been used as feedback tools, since formative assessment includes high-quality feedback, which should be given as soon as possible after submission; be relevant to the task and the pre-defined assessment criteria; and help the student understand how to improve her work (not just highlight strengths and weaknesses) [11]. However, the effort required for this type of assessment easily leads to a work overload for teachers, forcing them to give merely summative instead of formative feedback [12]. Implementing formative assessment with more human workforce is currently not a feasible solution; therefore, in this review we explored whether sensor-based platforms can contribute to it.

To summarize, this article gives an overview of how sensor-based platforms have been used for learning support, by exploring their contribution to the different learning domains, the implementation of formative assessment, and their status as feedback tools. The remainder of this article is organized as follows: Section 2 presents the classification framework used to analyze the prototypes described in the articles. Section 3 outlines the methodology used. Section 4 presents the results of the analysis. Finally, Section 5 discusses the results and presents an outline for further research on the topic.

2. Classification Framework

With the purpose of identifying existing best practices for the use of sensors in learning, as well as identifying directions for further development of the state-of-the-art of sensor-based learning support, in this review we examined the current link between learning support and the state-of-the-art of sensor-based platform prototypes found in the literature. To conduct our research we proposed a classification framework examining:

  • Learning domains: to get an overview of sensors and learning.

  • Formative assessment: to focus our research on sensors and learning, exploring how they can assist with a major current educational challenge.

  • Feedback: to deepen our research on sensors and learning by studying how they have been used for giving feedback, which is a key element of formative assessment and one of the most important interventions in learning.

To get an overview of the type of learning support that has already been touched by sensor applications, we first analyzed and classified the existing sensor-based platforms according to the support they give in the commonly identified learning domains [9]. This classification seems suitable because, to our knowledge, it covers all aspects of learning, allowing us to get an impression of the development of sensor-based learning support and highlighting the areas of learning that have already been influenced by sensors.

The unobtrusive capabilities of sensor-based platforms to measure and analyze data led us to think of their possible support for assessment. Therefore we deepened our analysis by exploring how the state-of-the-art of sensor-based platforms can assist in the implementation of formative assessment, which is a current educational challenge. To study this contribution we analyzed how the state-of-the-art of sensor-based platforms can be used to assist in the nine aspects of formative assessment that have been identified in [13,14]. Feedback is a key aspect of formative assessment and one of the most important influences in learning [15]; hence, to gain insight into the effectiveness of sensor-based platforms as feedback tools, we studied their feedback based on the framework of effective feedback [15].

2.1. Classification Framework for Learning Domains

Currently the most well-known sensor applications on the market, such as the Polar heart rate monitors [16], Nike+ [17], Digifit [18], or Xbox Fitness [19], are used in the field of sports. They are designed to track and give feedback about the physical performance of their users, helping them train their motor skills. With the intention to explore whether the use of sensor data can go beyond that, we explored in the scientific literature the areas where learning support has been given by sensor-based platforms. For that we analyzed the prototypes described in the literature according to the support given in the commonly identified learning domains: the cognitive, affective and psychomotor domain [9] (see Figure 1). The cognitive domain refers to knowledge and the development of intellectual skills. It includes the recall or recognition of facts, and the development of intellectual abilities and skills [9]. This learning domain contains two dimensions: the knowledge dimension and the cognitive process dimension. The knowledge dimension refers to the type of knowledge that can be acquired and consists of four categories: factual, conceptual, procedural and metacognitive knowledge. The cognitive process dimension deals with how the knowledge is used. It contains six categories, ranging from remembering facts to the creation of new concepts and objects using the acquired knowledge [20]. To get an understanding of how sensors can support the cognitive domain of learning, we explored the practices that have been used by sensor-based platforms to support these two dimensions.

The affective domain refers to the way in which learners deal emotionally with things, such as values, feelings, motivations and attitudes. This domain is usually categorized according to the complexity of the behavior incorporated by the learner, ranging from being open to receiving the phenomena, to internalizing these phenomena until they become a characteristic feature of the learner [21]. In this review we explored how the identified prototypes have been used to support learning affectively, enabling us to extract and analyze the strategies used by sensor-based platforms to present support in the affective domain.

The psychomotor domain deals with physical movement, coordination and the use of the motor-skill areas. The development of these skills requires practice, and they are evaluated in terms of precision, distance, speed or technique in execution. Six categories have been identified for this domain: reflex movements, fundamental movements, perceptual abilities, physical activities, skilled movements, and non-discursive communication [22]. To explore the current sensor-based learning support in the psychomotor domain, we investigated which of these categories have already been supported by sensor-based platforms and analyzed how this support has been achieved.

2.2. Classification for Formative Assessment Support

Having gained an overview of the possible use of sensors in learning, we wanted to explore whether they can be used to help solve a current challenge in education and learning. As introduced above, sensor-based platforms can unobtrusively measure and analyze data, thus suggesting their use in assessment tasks. Therefore, in this second dimension of our classification framework we classified the analyzed prototypes according to their functions for formative assessment, investigating in which ways sensor-based platforms can contribute to its implementation. From a broad perspective, formative assessment refers to assessment that provides learners with information that allows them to enhance their learning and performance [11]. By examining the qualities that allow highly competent tutors to contribute to formative assessment [13], and the strategies discussed in the “Keeping Learning on Track® Program” [14], we identified nine aspects that contribute to formative assessment (see Figure 2):

  • Knowledge of subject matter: allows analyzing the performance of learners and identifying the origin of their errors.

  • Knowledge of criteria and standards: allows giving learners tasks according to their current level.

  • Attitudes toward teaching: deals with the empathy of the tutor towards the students and the desire to help students in their development.

  • Skills in setting: refers to the capacity to set assessments that reveal understanding and skills and test the desired outcomes.

  • Evaluative skills: allow making appropriate judgments and dealing with the possible responses of the learners.

  • Sharing learning expectations: identifying the learners' expectations and allowing them to be shared among peers.

  • Self-assessment: structuring opportunities for learners to take responsibility for their own learning.

  • Peer-assessment: structuring opportunities for activating learners as instructional resources for one another.

  • Feedback: the evaluative information on the positive and negative features of the student's work.

In this dimension of the classification framework we investigated how the sensor-based prototypes described in literature support these aspects of formative assessment. The analysis of feedback, an essential aspect of assessment, will be done separately and discussed in the next section.

2.3. Classification Framework for Feedback

Feedback is one of the most powerful interventions in learning [15], and one of the most beneficial things tutors can do for students is to provide them with feedback that allows them to improve their learning [23]. High-quality feedback is a key element of formative assessment [11]. Therefore, we decided to analyze the type of feedback given by the studied prototypes. Feedback in this study is defined as information about a person's behavior or performance of a task, which is used as a basis for improvement [4]. The effective feedback framework in [15] focuses on how feedback can be used to positively influence the learning process. Consequently, we analyzed the alignment between the feedback of the studied prototypes and this framework.

Effective feedback gives answers to the following questions: “where am I going?”, “how am I going?” and “where to next?” (see Figure 3) [15]. The question “where am I going?” refers to the learner's goals; goals produce persistence at task performance in the face of obstacles, and support the resumption of disrupted tasks in the presence of more attractive alternatives [24]. The answer to “how am I going?” provides information relative to a task or performance goal of the user. Finally, the answer to “where to next?” shows the learner the next steps to take towards the completion of her goal. Implementing the answers to these questions in a computerized system is not a straightforward task. To answer the question “where am I going?” it is first important to know the goals of the user. The challenge lies in reminding the user about these goals and presenting the user with feedback on how the current task and performance align with them. Work on feedback loops has suggested that presenting the user with evidence of her current behavior together with its consequences allows her to perceive the alignment between her performance and goals [25]. Sensors can be used as tools to collect this evidence, and presenting this evidence and the potential consequences is something that can be implemented on a sensor-based platform.

In order to answer “how am I going?”, the performance of the user needs to be tracked, and this performance has to be compared with some rules. The proposed way to classify the type of feedback that answers this second question is through the five levels of the complexity of feedback dimension [26], which are:

  • No feedback: no indication provided about the performance of the learner.

  • Simple verification: indication of correct or incorrect performance of the learner.

  • Correct response: shows the learner what the correct performance should be.

  • Elaborated feedback: indicates why the performance of the learner is correct or incorrect.

  • Try again feedback: informs the learner when the performance is incorrect and allows her to attempt to change it.
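To make these levels concrete, they could be modeled in software roughly as follows. This is a hypothetical sketch (the `FeedbackLevel` enum and `give_feedback` function are our own illustration, not code from any of the reviewed prototypes):

```python
from enum import Enum

class FeedbackLevel(Enum):
    """The five complexity levels of feedback from [26]."""
    NO_FEEDBACK = 0
    SIMPLE_VERIFICATION = 1
    CORRECT_RESPONSE = 2
    ELABORATED = 3
    TRY_AGAIN = 4

def give_feedback(level, answer, correct, explanation=""):
    """Return a feedback message for a learner's answer at the given level."""
    is_correct = (answer == correct)
    if level is FeedbackLevel.NO_FEEDBACK:
        return ""
    if level is FeedbackLevel.SIMPLE_VERIFICATION:
        return "correct" if is_correct else "incorrect"
    if level is FeedbackLevel.CORRECT_RESPONSE:
        return f"The correct answer is {correct}."
    if level is FeedbackLevel.ELABORATED:
        verdict = "correct" if is_correct else "incorrect"
        return f"Your answer is {verdict}: {explanation}"
    if level is FeedbackLevel.TRY_AGAIN:
        return "Try again." if not is_correct else "Well done."
```

In a sensor-based platform, the `answer` would not be a quiz response but a measured performance value compared against a target.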

The implementation of the answer to “where to next?” has two basic requirements. First, a map with all steps to achieve the learner's goal is required. Second, it is important to identify the current position of the learner on this map. The measuring and analysis qualities of sensor-based platforms seem suitable to identify the current position of the learner on the learning map. Moreover, sensor-based platforms that make use of system adaptation techniques such as direct guidance, content-based filtering [27], and self-adaptation through feedback loops [28], open the possibility of presenting the learner with a personalized learning map.
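A minimal sketch of these two requirements might look as follows; the learning map and the set of completed steps are invented placeholders, and a real prototype would derive the completed set from sensor data and could personalize the map with the adaptation techniques named above:

```python
def next_step(learning_map, completed):
    """Return the first step on the map not yet observed as completed.

    `learning_map` is an ordered list of steps toward the learner's goal;
    `completed` is the set of steps the sensors have registered as done.
    Returns None when the goal has been reached."""
    for step in learning_map:
        if step not in completed:
            return step
    return None
```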

In this review, we analyzed how these three questions of effective feedback have been answered by the studied prototypes. To identify the answer to the first question, “where am I going?”, we examined whether the described technique of presenting evidence together with its consequences [25] has been used by the prototypes, and explored whether other techniques have been used to address this question.

For “how am I going?” we analyzed how the feedback given by the prototypes relates to the feedback complexity levels [26]. Together with this dimension, we also explored the feedback channel used by the prototypes, which can usually be visual, audio or haptic. The reason for this exploration is to investigate whether empirical evidence exists backing up these feedback practices.

For “where to next?” we explored how the prototypes have implemented an answer to this question, paying attention to the inclusion of system adaptation techniques for personalized answers.

3. Method

The purpose of this study is to get an overview of the state-of-the-art of sensor-based learning support and to explore how existing sensor-based platforms could assist in solving an educational challenge: the implementation of formative assessment. Therefore we collected articles describing studies about sensor-based prototypes and analyzed them according to our classification framework in order to identify their learning support.

The underlying search for articles was conducted using the online repositories of the Education Resources Information Center Digital Library (ERIC), ScienceDirect (Elsevier), the IEEE Computer Society, the Association for Computing Machinery (ACM) and the publisher Springer. ERIC was selected because it is considered the largest repository in education. Elsevier was selected because it contains journals that publish research merging technical and educational aspects. The three other repositories were selected because they contain the largest digital libraries in computing and engineering.

The search for articles was executed in different phases. The first phase was in the context of an internal study, for which we performed an initial search in early 2013 using the keywords “sensor”, “application” and “learning”. We examined the abstracts of these papers looking for computerized applications that have been enhanced by the use of sensors, paying special attention to the ones describing applications designed for human learning. This first search left us with 111 articles that were considered relevant for further study.

With the purpose of including the latest research in our repository and starting formal research on the state-of-the-art of sensor-based learning support, a second search was done in January 2014 using the keywords “sensors”, “software”, “applications” and “learning”, while searching for articles published from 2012 to 2013. The term “software” was added to the query to restrict our search and to exclude research focused on the hardware of sensors rather than on sensor applications. After a scan through the abstracts, looking for applications where sensors have been used for human learning support, 24 articles were selected for deeper study. While studying the literature we decided to explore more cases where systems have used sensors to adapt their behavior in order to support learners; therefore a later search was performed in March 2014 using the keywords “sensor”, “adaptive”, “system adaptation” and “education” for articles published after 2012. An examination of the abstracts of these search results left us with three articles that have been included in this study.

Finally, to make sure that no relevant work on the state-of-the-art of sensor-based learning support was missing, we included in this review eight more articles and three commercial products that have been pointed out by experts in the fields of Technology Enhanced Learning and Human Computer Interaction as representative work in the field of tutoring, feedback and sensor systems.

To select the studies included in our analysis, we followed the criteria of including only articles describing sensor-based prototypes, whose descriptions presented some information on how they can provide learning support to their users. From the 146 reviewed articles and three commercial products, we were able to identify 112 different sensor-based platform prototypes. When analyzing the articles describing these prototypes, we identified that only 82 of them include a description of the communication channel between the prototype and the user. Since this link between the prototype and the user is the element in a sensor-based prototype responsible for supporting learning, we decided to include only these 82 prototypes for further analysis.

We conducted the analysis of the prototypes in three stages. In the first stage we explored the learning support given by the prototypes. This support was classified according to Bloom's taxonomy of learning domains [9] (see Section 2.1). In the second stage we analyzed the contribution of the prototypes to the key identified aspects of formative assessment (see Section 2.2). Finally, in the third stage we closely examined which of the prototypes give feedback to the user and how this feedback compares to the effective feedback framework [15] (see Section 2.3).

4. Results

Out of the 82 prototypes selected for further analysis, 51 were created inside an educational context, specifically designed to support learning; nevertheless, by analyzing the descriptions of their communication channels and reports of their usage, we identified a total of 79 prototypes providing users with relevant information for evaluation and analysis, performance support or contextual awareness, hence providing users with learning support. We recognized 79 prototypes supporting learning in the learning domains, 51 prototypes contributing to at least one key identified aspect of formative assessment and 35 prototypes giving feedback to the learner. An overview of these prototypes can be found in Appendix B.

4.1. Classification for Learning Domains

To get an overview of the learning support that has already been given by sensor applications, we classified the analyzed prototypes according to their support in the different learning domains. Out of the list of 82 prototypes we identified 79 presenting learning support. By examining the output given by the prototypes, we identified that 56 of them present the user with information that can help her remember facts, understand concepts, analyze situations, etc. Therefore, we classified them as prototypes supporting the cognitive domain of learning. Six of them present information with the purpose of engaging users in specific activities, thus we classified them as presenting support to the affective domain of learning. Following this criterion we identified two prototypes supporting both the cognitive and affective domains of learning. The output of 17 of the prototypes presents the learner with information that aims to help her with the improvement of specific movements or her physical abilities. Hence we classified these prototypes as giving support in the psychomotor domain of learning. By analyzing the 56 prototypes that we classified as giving support to the cognitive domain of learning, we could identify three different strategies (see Table 1) that have been used by sensor-based platforms to give this support.

The first strategy identified uses sensors to infer the learner's context, in order to present the learner with relevant contextual information. We identified 22 prototypes following this strategy. The learner's context is commonly inferred by detecting specific objects situated in her surroundings. The most common technique used to identify these objects is attaching Near Field Communication (NFC) or Radio Frequency Identification (RFID) tags to them. The sensors of the prototypes are able to read these tags and to present the learner with relevant contextual information. The information presented by the prototypes determines the category of the cognitive domain [20] that receives the learning support. For example, the prototype in [29] presents support for remembering factual knowledge. For this prototype, NFC tags have been attached to everyday objects. When the prototype senses one of these tags, information about the tagged object is shown to the learner; this information helps her remember specific facts about it. The prototype in [30] uses the same strategy; nevertheless, this prototype supports the category of applying factual knowledge. The purpose of the prototype is to help learners learn Mandarin; to that end it uses GPS sensors to identify the context of the learner and presents him with Mandarin phrases that are suitable to be applied in this context.
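The tag-reading strategy can be illustrated with a small sketch. The tag IDs, objects and facts below are invented examples, not data from the reviewed prototypes:

```python
# Hypothetical tag database: NFC/RFID tag IDs mapped to facts about the
# tagged everyday objects (IDs and facts are invented for illustration).
TAG_FACTS = {
    "04:A2:19:B1": ("kettle", "Water boils at 100 °C at sea level."),
    "04:7F:02:C3": ("globe", "The Earth's circumference is about 40,075 km."),
}

def on_tag_sensed(tag_id):
    """Called when the reader senses a tag; returns the contextual
    information to display, or None for an unknown tag."""
    entry = TAG_FACTS.get(tag_id)
    if entry is None:
        return None
    obj, fact = entry
    return f"{obj}: {fact}"
```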

The second strategy, identified in 11 of the prototypes, is similar to the first one; however, instead of using sensors to track the learner's context, it uses sensors to track specific features of the learner, such as the learning style [31], competences based on the scores of predefined pre-tests [32], attention [33–35], emotional state [36,37], uncertainty while using a tutoring system [38,39], trouble solving problems [40] or driving style [41]. The information presented to the learner by these prototypes depends on the tracked values of these features.

The third strategy identified uses sensors to gather relevant data and presents this data to the learner. We identified 23 of the prototypes following this strategy. For example, the NoiseSpy prototype [42] uses the microphone and GPS of mobile devices to retrieve the amount of noise in different places of a city. These noise measurements are presented on a map, allowing town planners to learn about the noise distribution patterns of a city. In this case the use that the learner makes of this information establishes the cognitive domain category supported by the prototype. This strategy is the only one identified as being used by commercial products [43–45]. These products provide different visualizations of sensor data, which can help learners analyze different phenomena from the natural sciences. The application domain for prototypes using this technique of showing sensor data to support the cognitive domain of learning is broad. It ranges from the field of civil engineering, as in [46], to the field of sports, where due to the advances in wearable sensors, human movements are being studied in new and more precise manners [47–49]. Another common application where sensor data supports learning in the cognitive domain is the monitoring of the activity, behavior and state of patients in order to gain insight into their health [50–54]. These prototypes have been classified as supporting the cognitive domain of learning instead of the psychomotor domain, because the users of these prototypes who are able to make direct use of the sensor data are experts. By analyzing the data, these experts can later use their gained knowledge to give proper advice to patients. This advice might indeed support the patients in the psychomotor learning domain, but it comes from the expert and not from the prototype. The prototype in [52] is an example of this: wearable sensors are used to monitor the movements of people following a stroke, helping doctors select the best therapy for them.
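The data-gathering step of a NoiseSpy-style prototype could be approximated as follows; the grid-cell aggregation shown here is our own simplification of how noise samples might be prepared for a map view, not NoiseSpy's actual implementation:

```python
from collections import defaultdict

def noise_map(samples, cell=0.01):
    """Aggregate (lat, lon, dB) samples into a grid of average noise levels,
    as a NoiseSpy-style prototype might do before rendering its map.
    `cell` is the grid resolution in degrees (value chosen arbitrarily)."""
    sums = defaultdict(lambda: [0.0, 0])
    for lat, lon, db in samples:
        key = (round(lat / cell), round(lon / cell))  # snap to grid cell
        sums[key][0] += db
        sums[key][1] += 1
    return {key: total / n for key, (total, n) in sums.items()}
```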

The affective domain of learning deals with attitudes, motivations, values, etc. We identified eight prototypes in which the information presented to the learner had the purpose of supporting her in this domain. The analysis of these eight prototypes allowed us to recognize three different strategies that have been used to achieve this support (see Table 2).

The strategy of behavior overview and review uses sensors to track certain aspects of the learner's behavior and presents the learner with an overview of it. By doing so, the learner becomes aware of how she is approaching the desired goal, motivating her to change or keep up her current behavior. This strategy has been used by four of the prototypes. The prototype described in [55] exemplifies this strategy. The purpose of this prototype is to engage users in a more active lifestyle; to that end, it uses sensors to track the physical activities performed by the user and displays an overview of them on her mobile device. Watching the presented activity overview motivates the user to engage in a more active lifestyle.

The strategy of social network visualization has been used by two of the prototypes; this strategy lets learners compare themselves with peers in their network, motivating them to perform well in their learning activities. An example of this is the prototype in [56], which presents students of virtual learning environments with smart indicators informing them about their activities, achievements and progress in comparison with their peers.

The strategy of involving learners in data collection has been identified in two of the prototypes; it supports learning in both the affective and the cognitive domain. This strategy has been used to engage learners in scientific activities by letting them participate in the data-gathering phase of the scientific process. Learners use sensor measurements to gather this data. An example of this strategy is the prototype in [57], which allows learners to create scientific experiments that are compiled into mobile applications. These applications use the sensors of the mobile devices to assist the learners in conducting their experiments.

Seventeen prototypes have been identified as supporting the psychomotor domain of learning (see Table 3). For the exploration of this domain we analyzed how the prototypes give support in the six categories of the psychomotor domain of learning [22], identifying support in four of them: fundamental movements, skilled movements, physical activities and non-discursive communication.

Seven of the prototypes support fundamental movements, such as walking, running and sitting. The purpose of these prototypes is to help patients going through a rehabilitation process. They use sensors to track the patients' movements, analyze these movements and give feedback informing the patients whether the movements have been performed correctly or incorrectly. As an example, the prototype in [58] uses wearable inertial sensors to identify the posture of patients undergoing rehabilitation after damage to their motor system. Whenever the posture is incorrect the prototype provides audio feedback.
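The posture-monitoring loop can be approximated in a few lines: compute the trunk tilt from a wearable accelerometer reading and emit an audio cue when it exceeds a limit. The axis convention, the 20° threshold and the cue name are assumptions for illustration, not the actual design of [58].

```python
import math

def trunk_tilt_deg(ax, ay, az):
    """Angle between the measured gravity vector and the vertical axis.

    Assumes the sensor's z-axis points along the trunk; in an upright
    posture the accelerometer reads roughly (0, 0, 1 g).
    """
    g = math.sqrt(ax * ax + ay * ay + az * az)
    return math.degrees(math.acos(max(-1.0, min(1.0, az / g))))

def posture_feedback(ax, ay, az, max_tilt=20.0):
    """Return a (hypothetical) audio cue name when the tilt exceeds the limit."""
    return "warning_tone" if trunk_tilt_deg(ax, ay, az) > max_tilt else None

print(posture_feedback(0.0, 0.0, 1.0))  # upright posture, no cue
print(posture_feedback(0.0, 0.7, 0.7))  # leaning roughly 45 degrees
```

In practice the raw signal would be low-pass filtered first, so that brief accelerations are not mistaken for postural changes.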

Support for learning skilled movements, referring to the movements used in dancing, recreation and sports, has been recognized in seven of the prototypes. The strategy used to support skilled movements is similar to the one used for the fundamental ones: prototypes use sensors to track the learner's movements, analyze how they are performed and show the analyzed results to the learner. The identified areas of this type of learning assistance are: music gestures [59,60], special rehabilitation exercises [61], taekwondo movements [62], snowboarding [63,64] and karate punches [65].

The prototype in [66] is the only one recognized to support physical activities. It uses sensors to track the weather conditions and the current fitness of cross-country runners. Based on the difficulty of the route, the tracked weather conditions and the runner's tracked fitness level, the prototype indicates to the runner which route to take for an optimal workout.

Support for learning non-discursive communication, referring to the acquisition and development of nonverbal communication skills, has been identified in two of the prototypes. The prototype in [67] tracks facial gestures, voice intonation, volume and speaking rate, giving feedback to the learner about the correct use of her nonverbal communication for job interviews. The prototype in [68] is a videogame that tracks the facial expressions of children with autism, teaching them how to smile.

4.2. Classification for Formative Assessment Support

To explore how sensor-based learning support can contribute to the solution of a current educational challenge [10,11], we studied how the investigated prototypes can assist in the implementation of key aspects of formative assessment (see Section 2.3). By looking at the information that the prototypes gave to the users, we identified 51 of them (see Table 4) contributing to at least one of these aspects.

Twelve of the prototypes have been identified to support the aspect of knowledge of subject matter, which allows experts to make better assessments of the students' performance. This support is achieved thanks to the monitoring capabilities of sensors: the sensor data presented to the experts (tutors) helps them to analyze and identify the errors of the learner. This type of support is used in sports and healthcare. An example from the sports field is found in the swimming prototype [47], in which wearable accelerometers are attached to the learner. The data received from these sensors allows the analysis and error identification of the learner's swimming technique. In healthcare, the prototype in [50] uses wearable gyroscopes to analyze the gait of patients. This analysis allows the detection of gait abnormalities or deteriorations that indicate the presence of diseases and pathologies.

The knowledge of criteria and standards, which helps to identify the current learning level of the student, is supported by 16 of the prototypes. The strategy of using sensors to track the learner's performance and to identify her errors, which can be used to support knowledge of subject matter, can also be used to identify the learner's current learning level. Two of the prototypes do so by identifying her physiological state. The prototype in [34] uses an electroencephalogram to track the attention level of the learner while she attends an online lecture. The prototype shows in which parts of the lecture the attention of the learner decreases, allowing tutors to assign the learner tasks on the subjects that need to be reviewed. The study in [59] describes a prototype that emulates musical sounds according to certain gestures of the users. In this study, teachers who observed students using the prototype reported that it allowed them to identify the musical level of the students.
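A minimal sketch of how a prototype might flag lecture segments where attention decreases: smooth the per-interval attention scores with a moving average and report the intervals that fall below a threshold. The window size and the threshold are hypothetical choices; [34] does not publish its exact algorithm here.

```python
def low_attention_segments(attention, threshold=0.5, window=3):
    """Return indices where the smoothed attention signal drops below threshold.

    `attention` is a list of per-interval attention scores in [0, 1],
    e.g., as derived from an EEG headset's attention metric.
    """
    flagged = []
    for i in range(len(attention)):
        lo = max(0, i - window + 1)
        avg = sum(attention[lo:i + 1]) / (i + 1 - lo)  # trailing moving average
        if avg < threshold:
            flagged.append(i)
    return flagged

scores = [0.9, 0.8, 0.4, 0.3, 0.35, 0.7, 0.9]
print(low_attention_segments(scores))  # intervals 4 and 5 fall below the threshold
```

The flagged interval indices could then be mapped back onto the lecture timeline shown to the tutor.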

We identified two prototypes tracking the emotional state of the learner while doing learning tasks and informing the tutor about this [36,37]. This helps the tutor to increase her empathy towards the learner and therefore supports the key aspect of formative assessment identified as attitudes toward teaching.

Support for skills in setting, which deals with the capacity to set assessments that reveal the knowledge and skill level of students, has been identified in eight prototypes. Four of these prototypes support this aspect by setting assessments to the learners in a spatial context. The prototype in [69] acts as a mobile guide in a museum. It identifies the location of the learner using RFID technology and, according to the location, asks the learner specific questions and evaluates her answers. Two of the prototypes support skills in setting by tracking the physiological state of the learner. These prototypes display this state to the tutor, allowing her to set appropriate assessments according to the learner's identified state. The prototype in [52] exemplifies this. It uses wearable accelerometers to track the movements of patients following a rehabilitation program after a stroke. The analysis of the tracked movements allows doctors to select the right set of exercises and therapy for them. The last identified technique to support skills in setting has been used by two of the prototypes. Here learners are required to use sensors to complete the tests that tutors have given them. For example, in the prototype in [57] students have to collect and analyze data using the sensors of their mobile devices to answer the scientific tests set by the teacher.
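The spatial-assessment technique of [69] essentially maps a sensed location to an assessment item. A toy sketch, where the tag identifiers and questions are invented for illustration:

```python
# Hypothetical museum-exhibit question bank, keyed by RFID tag identifier.
QUESTIONS = {
    "exhibit_a": "In which century was this vase made?",
    "exhibit_b": "What material is this statue carved from?",
}

def question_for_tag(rfid_tag):
    """Select a location-specific assessment item, in the spirit of [69].

    Falls back to a generic prompt when the sensed tag is unknown.
    """
    return QUESTIONS.get(rfid_tag, "Explore the nearest exhibit to get a question.")

print(question_for_tag("exhibit_a"))
```

The same lookup pattern applies whether the location comes from RFID, NFC or GPS; only the sensing layer differs.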

Four prototypes support evaluative skills. They achieve this support by evaluating the answers learners give to previously posed questions. The prototype in [32] has been designed to evaluate the answers of learners to predefined tests and makes use of an expert system to present learners with the learning objects that relate to their test results.

Contribution to self-assessment, i.e., structuring opportunities for the student to take responsibility for her own learning, was identified in six prototypes. These prototypes structure such opportunities by showing an overview of the actions and performance of the learner. The prototype in [55] shows an example of this by tracking the physical activity of the user and displaying an overview of it in the form of a virtual garden, where the amount of life displayed in the garden represents the physical activity of the user. By looking at this representation, the learner is able to reflect on and take responsibility for her actions. Support for key elements such as sharing learning expectations and peer assessment has not been identified in the studied prototypes.

4.3. Feedback Analysis

Because of its relevance to formative assessment and learning in general, we decided to dedicate a complete subsection of this review to the analysis of the feedback given by the prototypes. By analyzing the information presented by the prototypes to the user, we identified that 35 of them revealed information about the user's performance, activities or states; therefore we selected them for our feedback analysis. In the following subsections we report our exploration of how the questions for effective feedback [15] have been answered by the prototypes.

4.3.1. Where Am I Going?

The answer to “where am I going?” is related to the goals of the user. Five of the prototypes (see Table 5) explicitly display an answer to this question. For example, the user's goal in the prototype described in [70] is to eat healthier and avoid emotional eating. In order to make the user aware of where she stands with respect to her goals, this prototype follows the technique described in [25] of presenting evidence together with consequences. It shows the overview of the user's eating habits as a tree (evidence), where the color of the tree (consequence) depends on the healthiness of the food intake of the user.

The prototypes described in [55,71] use the same technique. The first prototype shows an overview of the healthy activities performed by the user (evidence) as a garden, where the amount of flowers and life in the garden (consequence) depends on the amount of physical activity. The second prototype uses the same approach, but with the metaphor of an ecosystem: the life of the ecosystem depends on the ecologically friendly trips made by the user.
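The evidence-plus-consequence technique shared by these prototypes can be expressed as a simple mapping from a tracked quantity to a metaphor state. The weekly goal of 150 active minutes and the three garden states below are illustrative choices, not the actual mapping of [55]:

```python
def garden_state(activity_minutes, goal_minutes=150):
    """Map weekly activity (evidence) to a garden metaphor (consequence).

    The thresholds and state names are hypothetical; a real prototype
    would render these states as graphics rather than strings.
    """
    ratio = activity_minutes / goal_minutes
    if ratio >= 1.0:
        return "blooming garden"
    if ratio >= 0.5:
        return "sprouting garden"
    return "wilting garden"

print(garden_state(40))   # far below the goal
print(garden_state(160))  # goal reached
```

Swapping the state table turns the garden into the tree of [70] or the ecosystem of [71]; the mapping logic is the same.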

The relevance of answering the question “where am I going?” with a sensor-based platform has been empirically tested in the work in [72]. This work released two different versions of its prototype; only one of the versions presented the user with an overview of her standing with respect to her goal. The results of this study show that compliance in finishing sampling experiences in experience sampling method studies was 23% higher in the group whose participants used the version of the prototype displaying the overview.

4.3.2. How Am I Going?

To answer the question “how am I going?” sensor-based platforms are required to track the actions or behaviors of the users and provide them with information about their performance in relation to some predefined rules. Twenty-six of the analyzed prototypes have answered this question (see Table 6). The analysis in this section discusses the form and the channel of feedback given by the studied prototypes.

Form of feedback: Looking at the dimension of complexity of feedback [26], feedback can be given at five different levels: no feedback, simple verification, correct response, elaborated feedback and try-again feedback. From the analyzed prototypes, one gives exclusively simple verification feedback, giving the user points when her guesses about her glucose levels are correct [51]. Eight of the prototypes present exclusively try-again feedback, telling the user that her action was wrong and letting her repeat the action until it is performed correctly. Six of the prototypes give both simple verification and try-again feedback. In [73] this has been achieved by playing harmonic sounds when the gait of the user is correct (simple verification) and by playing strong rhythmic sounds pointing out to the user that her gait needs to be corrected (try-again feedback).
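Combining simple verification with try-again feedback, as in [73], amounts to a two-way branch on a correctness measure. A sketch, where the cue names and the tolerance are assumptions rather than the prototype's actual parameters:

```python
def movement_feedback(error_score, tolerance=0.1):
    """Combine simple-verification and try-again feedback, in the spirit of [73].

    `error_score` is a normalized deviation of the tracked movement from
    the target; the tolerance and sound names are illustrative.
    """
    if error_score <= tolerance:
        return "harmonic_sound"  # simple verification: the movement is correct
    return "rhythmic_sound"      # try again: repeat until performed correctly

print(movement_feedback(0.05))
print(movement_feedback(0.4))
```

Extending the branch with an explanation of *why* the movement was wrong would turn this into the elaborated feedback discussed next.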

Ten of the prototypes present elaborated feedback, indicating why the performance of the user is correct or incorrect. To give this feedback, the prototypes present the evidence of the user's actions together with indications of the acceptable standards for conducting her activities. An example of this is the prototype described in [62], which points out the differences between the movements of an expert martial artist (correct technique or standard) and those of the user, letting the user become aware of how to correct her mistakes. The prototype described in [53] used a different feedback strategy, showing that our proposed framework for analyzing the feedback of sensor-based platforms on the question “how am I going?” was not exhaustive. Instead of indicating to the user whether her behavior has been correct or incorrect, this prototype presents her with evidence of her tracked behavior and asks her a question about it, offering her a chance for self-reflection.

Seven studies reported empirical results about the use of the prototype with participants, all of them showing positive results with regard to the purpose of the prototype. Five of these prototypes used the strategy of try-again feedback [33,60,64,76,80] and two of them presented elaborated feedback [67,74].

Channel of feedback: Since users receive feedback through their senses, in theory there is a feedback channel for each one of them: visual, auditory, haptic, gustatory and olfactory. The feedback channels used by the prototypes were audio, visual, haptic and combinations of these. Ten of the analyzed prototypes present their feedback exclusively through the audio channel. The prototype developed in [65] shows an example of this: the sounds played by the prototype depend on the accuracy of the karate punch technique performed by the user. Eight of the prototypes display their feedback through the visual channel. The prototype in [70] uses the screen of the user's mobile device to show a message saying: “let's count slowly to 10 and breath…”. The combination of visual and audio has been used in three prototypes. In [33] the prototype shows the score of the user on the computer screen and plays sounds whenever the user maintains her concentration. Two of the prototypes provided feedback through the haptic channel. The prototype in [60] exemplifies this type of feedback: it consists of a pair of gloves that give haptic feedback when the user, who is learning how to play the violin, performs a specific technique incorrectly.

Positive empirical results with regard to the purpose of the prototype were found for all of the identified feedback channel practices [33,60,64,67,74,76,80]. Notably, the study in [64] showed that for physical activities such as snowboarding, haptic feedback was perceived faster than audio feedback.

4.3.3. Where to Next?

The answer to “where to next?” is about giving the user some guidance on the next steps to follow. Eight prototypes have been identified that present the user with an answer to this question (see Table 7). Five of the prototypes present an indicator of just the next step to take: indicating the next step to solve a problem [40], showing the steps required to correct mistakes [69], showing the next activity to engage in [51], instructing the user in the steps she needs to follow in order to relax and regain self-control during highly emotional situations [70], and showing which direction to take [66]. Three of the prototypes present the user with a complete personalized learning path. This path has been obtained by capturing the user's attention levels during a virtual lecture [34], tracking the user's competences [32], or identifying the user's learning style [31]. While the prototype in [34] merely points out to the user the steps to follow, the prototypes described in [31,32] have used system adaptation techniques to present the user with her personalized path. The system adaptation technique presented by [31] uses a literature-based approach, where the number of visits and the time spent by students working with learning objects are used to automatically identify the student's learning style. This approach tracks the behavior of students in order to get hints about their learning style preferences, then uses a rule-based approach to estimate the preferred learning style from the amount of matching hints. Finally, it presents the learner with a learning path suited to her learning style. None of the prototypes have shown empirical results about their learning support.
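The hint-counting, rule-based estimation described for [31] could be sketched as follows. The object types, the two styles and the one-hint-per-ten-minutes rule are hypothetical simplifications of the actual matching rules:

```python
def estimate_learning_style(visits, time_spent):
    """Rule-based sketch of a literature-based learning-style estimator.

    `visits` maps learning-object types (e.g., "video", "text") to visit
    counts; `time_spent` maps the same types to minutes of use. Both the
    hint rules and the visual/verbal dichotomy are illustrative.
    """
    hints = {"visual": 0, "verbal": 0}
    for obj_type, count in visits.items():
        if obj_type in ("video", "diagram"):
            hints["visual"] += count
        elif obj_type in ("text", "forum"):
            hints["verbal"] += count
    for obj_type, minutes in time_spent.items():
        style = "visual" if obj_type in ("video", "diagram") else "verbal"
        hints[style] += minutes // 10  # one hint per 10 minutes, an arbitrary rule
    # The style with the most matching hints is taken as the preference.
    return max(hints, key=hints.get)

visits = {"video": 5, "text": 2}
time_spent = {"video": 30, "text": 40}
print(estimate_learning_style(visits, time_spent))
```

A learning path would then be assembled by ranking learning objects whose type matches the estimated style.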

5. Discussion

The pairing of sensors with software components has created tools with capabilities to automatically retrieve and analyze data, referred to in this review as sensor-based platforms. In order to explore the use of these tools in learning, we analyzed the prototypes described in the literature according to our classification framework. Starting with an exploration of the areas of learning that have been supported by sensor-based prototypes, this review revealed that sensor-based platforms have been designed and used to give support in each of the three learning domains. The domain with the most support (56 of the 82 studies) is the cognitive domain, mirroring what happens with learning in general, where the cognitive domain is the most used and studied [81]. Remarkably, given the research in [82], which asserts that a comprehensive educational design should merge these domains, in this review we could identify only two prototypes supporting a combination of domains. This presents a research opportunity in exploring the implications of creating sensor-based platforms able to support multiple domains of learning.

In our search to establish whether sensor-based platforms can help solve current educational challenges, we continued our analysis of the prototypes by studying their possible connection with formative assessment. While in this review we did not identify prototypes specifically designed to give formative assessment, our analysis showed that sensor-based platforms have already been used for seven of the nine aspects of formative assessment described in Section 2.2 (see Table 4). The missing aspects were structuring opportunities for peer assessment and sharing learning expectations.

With the intention to deepen our research into an aspect considered fundamental for learning and for formative assessment, our analysis of the prototypes showed that sensor-based platforms are able to retrieve, measure and analyze personal information in order to give feedback on the three questions of effective feedback. For answering the first question, “Where am I going?”, which deals with guiding learners towards their goal, we identified three different representations. These representations consisted of a description of the learners' goals, showing the learners' performance together with its consequences, and showing a metaphor of the goals and performance instead of the real data values. While we recognized ways to answer this question, we did not find prototypes attempting to formulate advice based on it. Additionally, none of the reviewed articles studied the appropriate timing for giving this type of feedback. Answering “where am I going?” relates to the metacognitive skills of a self-regulated learner, and one of its main purposes is to keep the learner motivated, therefore having an impact on the affective domain of learning. As previously seen in the analysis of the learning domains, the affective domain does not receive as much attention as the cognitive domain, which can partly explain the knowledge gaps on how to use sensor-based platforms to answer “where am I going?”.

Continuing with the second question, this review shows that sensor-based platforms have been used to answer “how am I going?”. We recognized several feedback representations used by sensor-based platforms to answer this question. These representations can be classified according to different feedback dimensions [26] such as timing of feedback, feedback channel and complexity of feedback. However, what we missed in the reviewed articles was a study revealing a suitable method to present this answer as feedback to the learner. From the reviewed articles only [60,64,73] present an explanation for the selection of their feedback method. Overall, also in relation to the other two questions discussed, studies about the effectiveness of the different feedback channels and feedback dimensions are limited; we found only one work [64] comparing the receptivity of the auditory and the haptic feedback channel. Moreover, no study has been identified exploring how the different ways of giving feedback using sensor-based platforms play a role in subjects such as the cognitive load [83] and the reflection-in-action and reflection-on-action of the learner [84].

The review shows that sensor-based platforms can be used to show users their next learning steps, therefore answering the question “where to next?”. What we failed to find in the literature is a prototype able to answer all three questions of effective feedback.

This analysis allowed us to identify two main research branches for sensor-based learning support. The first branch deals with the acquisition of relevant data that might be useful for the learner, and the second deals with the presentation of this sensor data to the learner. The number of different prototypes supporting learning for so many different subjects and domains shows that considerable research has already been undertaken on the acquisition of relevant sensor data for learning. However, we did not identify many studies investigating and reporting on the implications of delivering this inferred sensor data in ways that can effectively support learning. The fact that only 35 out of 82 prototypes have been identified to present the learner with feedback reveals this. Furthermore, we found only one study [64] analyzing different types of feedback methods for its prototype, and only in a few cases was the selection of the feedback methods used by the prototypes argued. These research gaps give an indication of the state-of-the-art of sensor-based learning support, which is corroborated by the very few empirical studies found investigating the effectiveness of sensor-based platforms as learning tools. This current state of research in sensors and learning is also reflected in related literature reviews studying the topic of sensors, where the purpose is to analyze these platforms based on techniques to identify objects [85], achieve ambient intelligence [86], augment reality [87], create body sensor networks [88], classify postures and movements using wearable sensors [89], etc. None of the literature studies known to the authors focus on the use of sensors to support learning.
These findings about the current maturity of sensor-based learning support align with the limited use of sensor applications for formal learning, which are not that popular yet and only deal with the presentation of sensor data for the study of natural sciences [43–45], together with the arrival on the market of sensor applications such as Nike+ [17], Digifit [18], Xbox Fitness [19], etc., that support informal learning.

6. Conclusions

In this review we analyzed 82 prototypes found in the literature according to our classification framework in order to identify the state-of-the-art of sensor-based learning support. The analysis revealed sensor-based learning support as an emerging and promising field of study, which has the potential to support learning in several areas and subjects. In this review we identified only a few research studies focusing on the learning aspects of the described sensor-based platforms. This turned out to be a limitation for this review, as it did not allow us to clearly identify and analyze the learning strategies used by the prototypes. Nevertheless, this lack of focus on learning effectiveness points out a research direction for further improvement of the state-of-the-art of sensor-based learning support.

This review shows that the focus of sensor-based applications for learning support is quite broad and that this support can affect all the learning domains. It also shows the potential of sensor-based platforms to contribute to the implementation of formative assessment. Nevertheless, we found a lack of studies focusing on what is required for sensor-based platforms to present their inferred information in ways that learners can assimilate effectively, so that sensor-based platforms can become effective learning tools. This research gap suggests the main research path to follow for the improvement of sensor-based learning support. By following this path, we consider that sensor-based platforms can become reliable learning tools able to reduce the workload of human teachers and therefore contribute to the solution of a current educational challenge: the implementation of formative assessment. While more work needs to be done for sensor-based platforms to become common learning tools introduced into formal and non-formal learning programs, this review can be taken as a basis and inspiration towards this goal.

Acknowledgments

The underlying research project is partly funded by the METALOGUE project. METALOGUE is a Seventh Framework Programme collaborative project funded by the European Commission, grant agreement number: 611073 ( http://www.metalogue.eu).

Author Contributions

Jan Schneider is the main author of the article. With the help of the other authors he developed the classification framework used in this review. He performed part of the retrieval from the selected articles analyzed in this review, analyzed the articles, and with the feedback and supervision of the coauthors wrote the text of this literature review. Marcus Specht helped with the retrieval of the analyzed articles, creation of the classification framework and the initial supervision of the writing process of this article. Dirk Börner and Peter van Rosmalen helped with the creation of the classification framework, and carefully supervised the research and writing process for this article.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Identified Sensors Together with Their Measured Property and Identified Functions
Sensor | Measured Property | Helps with | Installation
Accelerometer | Acceleration | Activity sensing, Context sensing, Environment sensing, Physiological state sensing | Environmental, Wearable
Air pollutants sensors | Amount of toxic particles in the atmosphere | Context sensing, Environment sensing | Environmental, Wearable
Barometer | Pressure | Activity sensing, Context sensing, Physiological state sensing | Environmental, Wearable
Blood glucose meter | Glucose in the blood | Physiological state sensing | Wearable
Bluetooth | Radio signals | Activity sensing, Context sensing | Wearable
Camera | Visual light | Activity sensing, Context sensing, Environment sensing, Physiological state sensing | Environmental, Wearable
Compass | Earth's magnetic field | Activity sensing | Wearable
Electrocardiogram (ECG or EKG) | Heartbeat | Activity sensing | Wearable
Electrodermal activity meter (EDA) | Skin conductance | Physiological state sensing | Wearable
Electroencephalogram (EEG) | Electrical activity along the scalp | Activity sensing, Context sensing, Physiological state sensing | Wearable
Electromyography sensor | Electrical activity produced by skeletal muscles | Activity sensing | Wearable
Force gauge | Force | Activity sensing | Wearable
Galvanic skin response sensor | Skin conductance | Context sensing, Physiological state sensing | Wearable
Global positioning system (GPS) | Earth coordinates | Activity sensing, Context sensing, Environment sensing | Environmental, Wearable
Global system for mobile (GSM) | Radio signals | Context sensing | Environmental, Wearable
Gyroscope | Orientation | Activity sensing, Context sensing, Physiological state sensing | Wearable
Humistor | Humidity | Activity sensing, Physiological state sensing | Wearable
Infrared camera | Infrared light | Activity sensing, Context sensing | Environmental, Wearable
Microphone | Sound waves | Activity sensing, Context sensing, Environment sensing | Environmental, Wearable
Near field communication (NFC) receiver | Radio frequency | Context sensing | Environmental, Wearable
Radio frequency identification (RFID) receiver | Radio frequency | Context sensing | Wearable
Sonar | Sound waves (object detection) | Activity sensing | Wearable
Software sensors | User's actions | Activity sensing, Context sensing | Environmental, Wearable
WiFi | Radio frequency | Context sensing | Environmental, Wearable

Appendix B

List of the Analyzed Prototypes
Prototype | Learning Domain | Formative Assessment Contribution | Sensors Used | Description
Ailisto et al. (2006) [90] | Cognitive | - | Cameras, RFID readers | It reads tags placed in objects in order to present more information about them.
Anderson & Reiser (1985) [40] | Cognitive | Feedback | Software sensors | A tutoring system that helps students having trouble solving problems in the Lisp programming language.
Amaratunga et al. (2002) [46] | Cognitive | - | Accelerometers | It monitors the movement of a flagpole and streams this data through the network, creating a virtual lab.
Arroyo et al. (2009) [36] | Cognitive | Attitudes towards teaching | Camera, galvanic skin conductance sensor, pressure mouse, accelerometers | It detects the emotional state of students while interacting with an intelligent tutoring system.
Baca & Kornfeind (2006) [75], biathlon | Cognitive | Knowledge of subject matter; Knowledge of criteria and standards; Feedback | Camera | It analyzes the movement of the rifle.
Baca & Kornfeind (2006) [75], rowing | Cognitive | Knowledge of subject matter; Knowledge of criteria and standards; Feedback | Force transducer | It analyzes the rowing technique.
Baca & Kornfeind (2006) [75], table tennis | Cognitive | Knowledge of subject matter; Knowledge of criteria and standards; Feedback | Accelerometers | It analyzes the position of the table tennis shots.
Börner et al. (2014) [35] | Cognitive | - | Camera | It is an ambient display that adapts its behavior to capture the attention of learners.
Broll et al. (2011) [91] | Cognitive | - | NFC | It is a game where players need to touch parts of a screen with NFC readers.
Chapel (2008) [92] | Cognitive | - | GPS, NFC, WiFi | It provides a communication system for students in a university.
Chavira et al. (2007) [93] | Cognitive | - | RFID, NFC | It gives contextual information to the participants of a conference.
Chen & Huang (2012) [69] | Cognitive | Skills in setting; Evaluative skills; Feedback | RFID | It gives a tour through a museum.
Chu et al. (2010) [94] | Cognitive | - | RFID | It gives a tour through a botanic garden.
Dung & Florea (2012) [31] | Cognitive | Feedback | Software sensors | It detects the learning style of students and later presents them with learning objects fitting their style.
Edge et al. (2011) [30] | Cognitive | - | GPS | It helps to learn a second language by presenting users with contextual phrases.
Ghasemzadeh et al. (2009) [48] | Cognitive | Knowledge of subject matter; Knowledge of criteria and standards | Accelerometers | It analyzes golf swings.
GlobiLab for middle & high schools [43] | Cognitive | - | - | Commercial software to visualize and analyze sensor data.
Greene (2010) [50] | Cognitive | Knowledge of subject matter; Knowledge of criteria and standards | Gyroscopes | It analyzes the user's gait.
Hester et al. (2006) [52] | Cognitive | Knowledge of criteria and standards; Skills in setting | Accelerometers | It measures the movements of people who have suffered a stroke.
Hicks et al. (2010) [53] | Cognitive | Self-assessment; Feedback | Accelerometer, GPS | It captures the health and behavior of the user with the sensors of a mobile device.
Hsu & Ho (2012) [32] | Cognitive | Knowledge of subject matter; Knowledge of criteria and standards; Evaluative skills; Feedback | NFC | It chooses the learning path of the learner according to the learner's tracked competences.
James et al. (2004) [47], swimming | Cognitive | Knowledge of subject matter; Knowledge of criteria and standards | Accelerometers | It analyzes the movements of the user while swimming.
James et al. (2004) [47], rowing | Cognitive | Knowledge of subject matter; Knowledge of criteria and standards | Accelerometers, GPS, heart rate monitors | It analyzes the movements and applied forces of the user while rowing.
Jraidi & Frasson (2012) [38] | Cognitive | - | EEG | It detects the uncertainty of students while performing exercises in an intelligent tutoring system.
Kaasinen et al. (2009) [29] | Cognitive | - | NFC | It reads tags placed in objects in order to present more information.
Kanjo (2009) [42], MobAsthma | Cognitive | - | Air pollutants sensors, GPS, software sensors | It measures the air pollution and compares it with asthma cases.
Kanjo (2009) [42], NoiseSpy | Cognitive | - | GPS, microphones | It measures the noise pollution.
Kanjo (2009) [42], PollutionSpy | Cognitive | - | Air pollutants sensors, GPS | It measures the air pollution.
Karime et al. (2011) [95] | Cognitive | - | RFID | It is a magic wand that recognizes objects and displays information about them on an ambient screen.
Kozaki et al. (2010) [96] | Cognitive | Knowledge of subject matter; Knowledge of criteria and standards | Accelerometers, ECG | It tracks the motion of the user in order to infer the user's activity.
Kubicki et al. (2011) [97] | Cognitive | - | RFID | It is an interactive tabletop able to identify tangible objects.
Kuflik et al. (2011) [98] | Cognitive | - | RFID | It is a mobile guide for museums.
Lee & Carlisle (2011) [54] | Cognitive | Knowledge of subject matter; Knowledge of criteria and standards | Accelerometers, GPS | It detects falls using the accelerometer of mobile devices.
Linden et al. (1996) [33] | Cognitive | Feedback | EEG | It trains children with ADD to pay attention.
Littlewort et al. (2011) [37] | Cognitive | Attitudes towards teaching | Camera | It tracks the facial expressions of kids while solving problems using a tutoring system.
Logger Pro [44] | Cognitive | - | - | Commercial software to visualize sensor data.
Lu et al. (2009) [99] | Cognitive | - | Microphone | It uses the microphone of the user's mobile device to identify the user's context and present information about it.
Mandula et al. (2011) [100] | Cognitive | Skills in setting; Evaluative skills | RFID | It identifies indoor locations and objects in order to give contextual information to the user.
Maisonneuve et al. (2009) [101] | Cognitive | - | Microphone, GPS | It measures the noise pollution.
Muñoz-Organero et al. (2010) [102] | Cognitive | - | RFID | It uses sensors to identify objects and give information about them.
Muñoz-Organero et al. (2010) [103], Network Lab | Cognitive | Skills in setting | RFID | It identifies objects in a network lab and gives information about them to the users.
Nijholt et al. (2007) [104] | Cognitive | - | Cameras | It captures and displays the non-verbal input of the user.

Ogata et al., (2006) [105]Cognitive-GPSIt helps to learn a second language by presenting users with contextual phrases.
RFID

Pentland (2004) [51] Medical MonitoringCognitiveKnowledge of subject matterAccelerometerIt monitors the health condition of the user.
Blood pressure sensor, EEG
Knowledge of criteria and standards
Heart-rate monitor
Galvanic skin sensor GPS
Thermometers

Pentland (2004) [51] Memory GlassesCognitiveFeedbackBluetoothIt triggers reminders to the user according to its context
GPS
Software Sensors

Pérez-Sanagustín et al., (2012) [106]Cognitive-NFCIt provides contextual information to students inside of the university campus.
RFID

Rahman, & El Saddik (2012) [107]Cognitive-AccelerometersIt gets information about objects by pointing at them.
Infrared cameras

Ramirez-Gonzalez et al.,(2012) [108]CognitiveSkills in settingNFCIt allows teachers to create information about objects so that students can access this information later.

Serbedzija & Fairclough (2012) [41]Cognitive-Heart-rate monitorIt adapts the cockpit of a car according to the user's state and driving rules.
Electromyography
GPS Speedometer

SPARKvue [45]Cognitive-Commercial Software to visualize sensor data.

Spelmezan, Schanowski & Borchers (2009) [49]Cognitive-Bend sensorsIt tracks important moments during snowboarding.
Force sensors
Inertial sensors
Software Compass

Strachan (2005) [79]CognitiveFeedbackGPSIt helps users to navigate through sounds.

Szafir & Mutlu (2013) [34]CognitiveKnowledge of criteria and standardsEEGIt tracks the attention level of students during a virtual lecture, and recommends which subjects to review after it.
Skills in setting
Feedback

Whitehill et al., (2008) [39]CognitiveKnowledge of criteria and standardsCameraThe prototype determines the speed at which lesson material should be presented in a tutoring system according to the facial expressions of the learner.

Garrido (2011) [109] | Cognitive-Affective | Skills in setting; Evaluative skills | NFC, RFID | It is a game where objects can be found by using sensors.
Heggen (2012) [57] | Cognitive-Affective | Skills in setting | Accelerometer, Camera, Microphone, GPS | It uses the sensors of a mobile device to gather scientific data.

Carroll et al. (2013) [70] | Affective | Self-assessment; Feedback | Accelerometer, ECG, Electrodermal activity | It monitors the emotional state of the user and keeps track of his or her eating habits.
Consolvo et al. (2008) [55] | Affective | Self-assessment; Feedback | Accelerometer, Barometer, Camera, Compass, Humistor, Microphone, Thermometer | It monitors and keeps track of the physical activity of the user.
Froehlich et al. (2009) [71] | Affective | Self-assessment; Feedback | Accelerometer, Barometer, Infrared camera, GSM | It monitors and keeps track of the means of travel used by the user.
Hsieh et al. (2008) [72] | Affective | Self-assessment; Feedback | Software sensors | It monitors and keeps track of the user's activities.
Pentland (2004) [51] DiabeNet | Affective | Self-assessment; Feedback | Blood glucose meter, Software sensors | It monitors the glucose level of the user.
Verpoorten et al. (2009) [56] | Affective | Feedback | Software sensors | It adapts a virtual learning environment according to the learner's actions and interests.

Aukee et al. (2004) [74] | Psychomotor | Feedback | Biofeedback (barometer) | It gives feedback about pelvic floor activity and is used to treat incontinence.
Bevilacqua et al. (2007) [59] | Psychomotor | Knowledge of criteria and standards; Feedback | Accelerometers, Gyroscopes | It maps gestures taught in a music lesson to sounds.
Brunelli et al. (2006) [58] | Psychomotor | Feedback | Accelerometers, Inertial sensors | It corrects the posture of people going through a rehabilitation process.
Burish & Jenkins (1992) [76] | Psychomotor | Feedback | Electromyograph, Thermometer | It teaches patients going through chemotherapy how to relax.
Cockburn et al. (2008) [68] | Psychomotor | Feedback | Cameras | It is a game that trains children with autism to perform certain facial gestures.
Hoque et al. (2013) [67] | Psychomotor | Feedback | Cameras, Microphones | It helps learners to develop social skills for job interviews.
Kranz et al. (2006) [61] | Psychomotor | Feedback | Accelerometers, Gyroscopes, RFID | It corrects the movements of patients going through physiotherapy.
Kwon & Gross (2005) [62] | Psychomotor | Feedback | Accelerometers, Cameras | It is a motion training system for martial arts.
Lehrer et al. (2000) [77] | Psychomotor | Feedback | ECG | It trains users to breathe according to their heartbeat.
Li et al. (2012) [78] | Psychomotor | Feedback | Camera | It is a game-based psychomotor skill training for children with autism.
Paradiso et al. (2004) [73] | Psychomotor | Feedback | Accelerometers, Barometers, Gyroscopes, Sonar | It produces different sounds according to the gait of the users.
Spelmezan & Borchers (2008) [63] | Psychomotor | Knowledge of subject matter; Knowledge of criteria and standards; Feedback | Bend sensors, Force sensors, Inertial sensors, Software compass | It helps to train the snowboarding technique.
Spelmezan et al. (2009) [64] | Psychomotor | Feedback | Bend sensors, Force sensors, Inertial sensors, Software compass | It helps to train the snowboarding technique using haptic feedback.
Takahata et al. (2004) [65] | Psychomotor | Feedback | Accelerometers, Cameras | It helps to train karate movements.
Vales-Alonso et al. (2010) [66] | Psychomotor | Feedback | Barometer, Heart-rate monitor, Humistor, Thermometer | It helps cross-country runners with their training.
Van der Linden et al. (2011) [60] | Psychomotor | Feedback | Inertial motion capture sensors | It helps learners to practice certain movements while playing the violin.
Verhoeff et al. (2009) [80] | Psychomotor | Feedback | Accelerometers, Gyroscopes | It gives feedback according to the user's gait.

Chang et al. (2009) [110] | - | - | Humistor, Light sensors, NFC, RFID, Thermometers | It is a house that automatically adapts certain aspects according to the user's preferences.
Hsu (2010) [111] | - | - | RFID | It is a house that automatically adapts the music being played according to the user's preferences.
Krause et al. (2006) [112] | - | - | Accelerometer, Galvanic skin response, Thermometer | It is a mobile phone that changes its behavior according to the user's state and surroundings.

References

  1. Börner, D.; Kalz, M.; Specht, M. Beyond the channel: A literature review on ambient displays for learning. Comput. Educ. 2013, 60, 426–435. [Google Scholar]
  2. Cisco Blog. The Internet of Things. 2011. Available online: http://blogs.cisco.com/news/the-internet-of-things-infographic/ (accessed on 4 March 2014).
  3. Swan, M. Sensor Mania! The Internet of Things, Wearable Computing, Objective Metrics, and the Quantified Self 2.0. J. Sens. Actuat. Netw. 2012, 1, 217–253. [Google Scholar]
  4. Oxford Dictionaries. Available online: http://www.oxforddictionaries.com/ (accessed on 3 March 2014).
  5. Miluzzo, E.; Lane, N.D.; Eisenman, S.B.; Campbell, A.T. CenceMe—Injecting Sensing Presence into Social Networking Applications. Proceedings of the 2nd European Conference on Smart Sensing and Context, Kendal, UK, 23–25 October 2007; pp. 1–28.
  6. Hunter, G.W.; Stetter, J.R.; Hesketh, P.J.; Liu, C.C. Smart Sensor Systems. In Nanodevices and Nanomaterials for Ecological Security; Springer: Berlin, Germany, 2012; pp. 205–214. [Google Scholar]
  7. Guo, Y.; Wu, C.; Tsinalis, O.; Silva, D.; Gann, D. WikiSensing: Towards a Cloud-Based Sensor Informatics Platform for Life in a Digital City. Proceedings of Digital Futures 2012 Conference, Aberdeen, UK, 23–25 October 2012; pp. 23–25.
  8. Torresen, J.; Hafting, Y.; Nymoen, K. A New Wi-Fi Based Platform for Wireless Sensor Data Collection. Proceedings of the International Conference on New Interfaces for Musical Expression, Daejeon & Seoul, Korea, 27–30 May 2013; pp. 337–340.
  9. Bloom, B.S.; Englehart, M.B.; Furst, E.J.; Hill, W.H.; Krathwohl, D.R. Taxonomy of Educational Objectives: The Classification of Educational Goals. Handbook I: Cognitive Domain; David McKay: New York, NY, USA, 1956. [Google Scholar]
  10. Russell, M. Leveraging student engagement with assessments: Collecting intelligence to support teaching, student progress and retention. In Improving Student Retention in Higher Education: The Role of Teaching and Learning; Crosling, G., Thomas, L., Heagney, M., Eds.; Routledge: London, UK, 2008. [Google Scholar]
  11. Gedye, S. Formative assessment and feedback: A review. Planet 2010, 23, 40–45. [Google Scholar]
  12. Berlanga, A.J.; van Rosmalen, P.; Boshuizen, H.P.A.; Sloep, P.B. Exploring formative feedback on textual assignments with the help of automatically created visual representations. J. Comput. Assist. Learn. 2012, 28, 146–160. [Google Scholar]
  13. Sadler, D.R. Formative assessment: Revisiting the territory. Assess. Educ. 1998, 5, 77–84. [Google Scholar]
  14. Bennett, R.E. Formative assessment: A critical review. Assess. Educ. Princ. Policy Pract. 2011, 18, 5–25. [Google Scholar]
  15. Hattie, J.; Timperley, H. The power of feedback. Rev. Educ. Res. 2007, 77, 81–112. [Google Scholar]
  16. Polar. Available online: http://www.polar.com/en (accessed on 17 April 2014).
  17. Nike+. Available online: http://nikeplus.nike.com (accessed on 17 April 2014).
  18. Digifit. Available online: http://digifit.com/ (accessed on 17 April 2014).
  19. Xbox fitness. Available online: http://www.xbox.com/en-US/xbox-one/games/xbox-fitness (accessed on 17 April 2014).
  20. Krathwohl, D.R. A Revision of Bloom's Taxonomy: An Overview. Theory Pract. 2002, 41, 212–218. [Google Scholar]
  21. Krathwohl, D.R.; Bloom, B.S.; Masia, B.B. Taxonomy of Educational Objectives, the Classification of Educational Goals. Handbook II: Affective Domain; David McKay: New York, NY, USA, 1973. [Google Scholar]
  22. Harrow, A. A Taxonomy of Psychomotor Domain: A Guide for Developing Behavioral Objectives; David McKay: New York, NY, USA, 1972. [Google Scholar]
  23. Nicol, D.; Macfarlane-Dick, D. Formative assessment and self-regulated learning: A model and seven principles of good feedback practice. Stud. High. Educ. 2006, 31, 199–218. [Google Scholar]
  24. Bargh, J.A.; Gollwitzer, P.M.; Lee-Chai, A.Y.; Barndollar, K.; Troetschel, R. The automated will: Nonconscious activation and pursuit of behavioral goals. J. Personal. Soc. Psychol. 2001, 81, 1014–1027. [Google Scholar]
  25. Goetz, T. Harnessing the Power of Feedback Loops. Available online: http://www.wired.com/2011/06/ff_feedbackloop/all (accessed on 11 February 2014).
  26. Mory, E.H. Feedback Research Revisited. In Handbook of Research on Educational Communications and Technology; Taylor & Francis: Oxford, UK, 2004; pp. 745–783. [Google Scholar]
  27. Brusilovsky, P. Adaptive navigation support: From adaptive hypermedia to the adaptive web and beyond. Psychnol. J. 2004, 2, 7–23. [Google Scholar]
  28. Brun, Y.; Serugendo, G.D.M.; Gacek, C.; Giese, H.; Kienle, H.; Litoiu, M.; Shaw, M. Engineering self-adaptive systems through feedback loops. In Software Engineering for Self-Adaptive Systems; Cheng, B.H.C., Lemos, R., Giese, H., Inverardi, P., Magee, J., Eds.; Springer: Berlin/Heidelberg, Germany, 2009; Volume 5525, pp. 48–70. [Google Scholar]
  29. Kaasinen, E.; Niemelä, M.; Tuomisto, T.; Välkkynen, P.; Jantunen, I.; Sierra, J.; Kaaja, H. Ubimedia based on readable and writable memory tags. Multimed. Syst. 2009, 16, 57–74. [Google Scholar]
  30. Edge, D.; Searle, E.; Chiu, K.; Zhao, J.; Landay, J.A. MicroMandarin: Mobile Language Learning in Context. Proceedings of the 2011 Annual Conference on Human Factors in Computing Systems, Vancouver, BC, Canada, 7–12 May 2011; ACM Press: New York, NY, USA, 2011; pp. 3169–3178. [Google Scholar]
  31. Dung, P.Q.; Florea, A.M. A literature-based method to automatically detect learning styles in learning management systems. Proceedings of the 2nd International Conference on Web Intelligence, Mining and Semantics (WIMS'12), Craiova, Romania, 13–15 June 2012.
  32. Hsu, C.-C.; Ho, C.-C. The design and implementation of a competency-based intelligent mobile learning system. Expert Syst. Appl. 2012, 39, 8030–8043. [Google Scholar]
  33. Linden, M.; Habib, T.; Radojevic, V. A Controlled Study of the Effects of EEG Biofeedback on Cognition and Behavior of Children with Attention Deficit Disorder and Learning Disabilities I. Biofeedback Self Regul. 1996, 21, 35–49. [Google Scholar]
  34. Szafir, D.; Mutlu, B. ARTFul: Adaptive Review Technology for Flipped Learning. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI'13), Paris, France, 27 April–2 May 2013; pp. 1001–1010.
  35. Börner, D.; Kalz, M.; Specht, M. Lead me gently: Facilitating knowledge gain through attention-aware ambient learning displays. Comput. Educ. 2014, 78, 10–19. [Google Scholar]
  36. Arroyo, I.; Cooper, D.G.; Burleson, W.; Woolf, B.P.; Muldner, K.; Christopherson, R. Emotion Sensors Go To School. Proceedings of the 2009 Conference on Artificial Intelligence in Education: Building Learning Systems that Care: From Knowledge Representation to Affective Modelling, Brighton, UK, 6–10 July 2009; pp. 17–24.
  37. Littlewort, G.C.; Bartlett, M.S.; Salamanca, L.P.; Reilly, J. Automated measurement of children's facial expressions during problem solving tasks. Proceedings of 2011 IEEE International Conference on Automatic Face & Gesture Recognition and Workshops (FG 2011), Santa Barbara, CA, USA; pp. 30–35.
  38. Jraidi, I.; Frasson, C. Student's Uncertainty Modeling through a Multimodal Sensor-Based Approach. Educ. Technol. Soc. 2013, 16, 219–230. [Google Scholar]
  39. Whitehill, J.; Bartlett, M.; Movellan, J. Automatic facial expression recognition for intelligent tutoring systems. Proceedings of 2008 IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops (CVPRW'08), Anchorage, AK, USA, 23–28 June 2008; pp. 1–6.
  40. Anderson, J.R.; Reiser, B.J. The LISP tutor: It approaches the effectiveness of a human tutor. BYTE 1985, 10, 159–175. [Google Scholar]
  41. Serbedzija, N.; Fairclough, S. Reflective pervasive systems. ACM Trans. Auton. Adapt. Syst. 2012, 7, 1–19. [Google Scholar]
  42. Kanjo, E.; Bacon, J.; Roberts, D.; Landshoff, P. MobSens: Making Smart Phones Smarter. IEEE Pervasive Comput. 2009, 8, 50–57. [Google Scholar]
  43. Globisens. GlobiLab for Middle & High Schools. Available online: http://www.globisens.net/k-12-software/globilab (accessed on 8 January 2015).
  44. Vernier Software & Technology. Logger Pro. Available online: http://www.vernier.com/products/software/lp/ (accessed on 8 January 2015).
  45. PASCO. SPARKvue. Available online: http://www.pasco.com/family/sparkvue/index.cfm (accessed on 8 January 2015).
  46. Amaratunga, K.; Sudarshan, R. A Virtual Laboratory for Real-Time Monitoring of Civil Engineering Infrastructure. Proceedings of the International Conference on Engineering Education, Manchester, UK, 18–21 August 2002.
  47. James, D.A.; Davey, N.; Rice, T. An Accelerometer Based Sensor Platform for in situ Elite Athlete Performance Analysis. Proceedings of 2004 IEEE Sensors, Vienna, Austria, 24–27 October 2004; pp. 1373–1376.
  48. Ghasemzadeh, H.; Loseu, V.; Jafari, R. Wearable coach for sport training: A quantitative model to evaluate wrist-rotation in golf. Environments 2009, 1, 1–12. [Google Scholar]
  49. Spelmezan, D.; Schanowski, A.; Borchers, J. Wearable Automatic Feedback Devices for Physical Activities. Proceedings of the 4th International ICST Conference on Body Area Networks, Los Angeles, CA, USA, 1–3 April 2009.
  50. Greene, B.R.; Mcgrath, D.; Donovan, K.J.O.; Neill, R.O.; Burns, A.; Caulfield, B. Adaptive estimation of temporal gait parameters using body-worn gyroscopes. Proceedings of 2010 Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Buenos Aires, Argentina, 31 August–4 September 2010; pp. 1296–1299.
  51. Pentland, A. S. Healthwear: Medical Technology Becomes Wearable. Computer 2004, 37, 42–49. [Google Scholar]
  52. Hester, T.; Hughes, R.; Sherrill, D.M.; Knorr, B.; Akay, M.; Stein, J.; Bonato, P. Using Wearable Sensors to Measure Motor Abilities following Stroke. Proceedings of the International Workshop on Wearable and Implantable Body Sensor Networks (BSN'06), Cambridge, MA, USA, 3–5 April 2006; pp. 5–8.
  53. Hicks, J.; Ramanathan, N.; Kim, D.; Monibi, M.; Selsky, J.; Hansen, M.; Estrin, D. AndWellness: An Open Mobile System for Activity and Experience Sampling. Proceedings of Wireless Health, San Diego, CA, USA, 5–7 October 2010; pp. 34–43.
  54. Lee, R.Y.W.; Carlisle, A.J. Detection of falls using accelerometers and mobile phone technology. Age Ageing 2011, 40, 690–696. [Google Scholar]
  55. Consolvo, S.; Mcdonald, D.W.; Toscos, T.; Chen, M.Y.; Froehlich, J.; Harrison, B.; Landay, J.A. Activity Sensing in the Wild: A Field Trial of UbiFit Garden. Proceedings of the Conference on Human Factors in Computing Systems (CHI'08), Florence, Italy, 5–8 April 2008; pp. 1797–1806.
  56. Verpoorten, D.; Glahn, C.; Kravcik, M.; Ternier, S.; Specht, M. Personalisation of Learning in Virtual Learning Environments. Proceedings of the European Conference on Technology Enhanced Learning, Nice, France, 29 September–2 October 2009; pp. 52–66.
  57. Heggen, S. Integrating participatory sensing and informal science education. Proceedings of the 2012 ACM Conference on Ubiquitous Computing, Pittsburgh, PA, USA, 5–8 September 2012; pp. 552–555.
  58. Brunelli, D.; Farella, E.; Rocchi, L.; Dozza, M.; Chiari, L.; Benini, L. Biofeedback System for Rehabilitation Based on a Wireless Body Area Network. Proceedings of the Fourth Annual IEEE International Conference on Pervasive Computing and Communications Workshops, Pisa, Italy, 13–17 March 2006; pp. 531–536.
  59. Bevilacqua, F.; Guédy, F.; Schnell, N.; Fléty, E.; Leroy, N. Wireless sensor interface and gesture-follower for music pedagogy. Proceedings of the International Conference on New Interfaces for Musical Expression, New York, NY, USA, 6–10 June 2007; pp. 124–129.
  60. Van der Linden, J.; Johnson, R.; Bird, J.; Rogers, Y.; Schoonderwaldt, E. Buzzing to play: Lessons learned from an in the wild study of real-time vibrotactile feedback. Proceedings of the Conference on Human Factors in Computing Systems (CHI'11), Vancouver, BC, Canada, 7–12 May 2011; pp. 533–543.
  61. Kranz, M.; Holleis, P.; Spiessl, W.; Schmidt, A.; Tusker, F. The Therapy Top Measurement and Visualization System—An Example for the Advancements in Existing Sports Equipments. J. Comput. Sci. 2006, 5, 76–80. [Google Scholar]
  62. Kwon, D.Y.; Gross, M. Combining Body Sensors and Visual Sensors for Motion Training. Proceedings of Advances in Computer Entertainment Technology, Valencia, Spain, 15–17 June 2005; pp. 94–101.
  63. Spelmezan, D.; Borchers, J. Real-Time Snowboard Training System. Proceedings of the Extended Abstracts on Human Factors in Computing Systems (CHI'08), Florence, Italy, 5–8 April 2008; pp. 3327–3332.
  64. Spelmezan, D.; Jacobs, M.; Hilgers, A.; Borchers, J. Tactile motion instructions for physical activities. Proceedings of the Conference on Human Factors in Computing Systems (CHI'09), Boston, MA, USA, 4–9 April 2009; pp. 2243–2252.
  65. Takahata, M.; Shiraki, K.; Sakane, Y.; Takebayashi, Y. Sound Feedback for Powerful Karate Training. Proceedings of New Interfaces for Musical Expression, Hamamatsu, Japan, 3–5 June 2004; pp. 13–18.
  66. Vales-Alonso, J.; López-Matencio, P.; Gonzalez-Castaño, F.J.; Navarro-Hellín, H.; Baños-Guirao, P.J.; Pérez-Martínez, F.J.; Martínez-Álvarez, R.P.; González-Jiménez, D.; Gil-Castiñeira, F.; Duro-Fernández, R. Ambient Intelligence Systems for Personalized Sport Training. Sensors 2010, 10, 2359–2385. [Google Scholar]
  67. Hoque, M.E.; Courgeon, M.; Martin, J.-C.; Mutlu, B.; Picard, R.W. MACH: My Automated Conversation Coach. Proceedings of the International Joint Conference on Pervasive and Ubiquitous Computing, Zurich, Switzerland, 8–12 September 2013; pp. 697–707.
  68. Cockburn, J.; Bartlett, M.; Tanaka, J.; Movellan, J.; Pierce, M. SmileMaze: A Tutoring System in Real-Time Facial Expression Perception and Production for Children with Autism Spectrum Disorder. Proceedings of IEEE International Conference on Automatic Face & Gesture Recognition, Amsterdam, The Netherlands, 17–19 September 2008; pp. 978–986.
  69. Chen, C.-C.; Huang, T.-C. Learning in a u-Museum: Developing a context-aware ubiquitous learning environment. Comput. Educ. 2012, 59, 873–883. [Google Scholar]
  70. Carroll, E.A.; Czerwinski, M.; Roseway, A.; Kapoor, A.; Johns, P.; Rowan, K.; Schraefel, M.C. Food and Mood: Just-in-Time Support for Emotional Eating. Proceedings of 2013 Humaine Association Conference on Affective Computing and Intelligent Interaction (ACII), Geneva, Switzerland, 2–5 September 2013; pp. 252–257.
  71. Froehlich, J.; Dillahunt, T.; Klasnja, P.; Mankoff, J.; Consolvo, S.; Harrison, B.; Landay, J.A. UbiGreen: Investigating a Mobile Tool for Tracking and Supporting Green Transportation Habits. Proceedings of the Conference on Human Factors in Computing Systems (CHI'09), Boston, MA, USA, 4–9 April 2009; pp. 1043–1052.
  72. Hsieh, G.; Li, I.; Dey, A.; Forlizzi, J.; Hudson, S.E. Using Visualizations to Increase Compliance in Experience Sampling. Proceedings of the International Conference on Ubiquitous Computing, Seoul, Korea, 21–24 September 2008; pp. 164–167.
  73. Paradiso, J.A.; Morris, S.J.; Benbasat, A.Y.; Asmussen, E. Interactive Therapy with Instrumented Footwear. Proceedings of the Extended Abstracts on Human Factors in Computing Systems (CHI'04), Vienna, Austria, 24–29 April 2004; pp. 1341–1343.
  74. Aukee, P.; Immonen, P.; Laaksonen, D.E.; Laippala, P.; Penttinen, J.; Airaksinen, O. The effect of home biofeedback training on stress incontinence. Acta Obstet. Gynecol Scand. 2004, 83, 973–977. [Google Scholar]
  75. Baca, A.; Kornfeind, P. Rapid Feedback Systems for Elite Sports Training. IEEE Pervasive Comput. 2006, 5, 70–76. [Google Scholar]
  76. Burish, T.G.; Jenkins, R.A. Effectiveness of Biofeedback and Relaxation Training in Reducing the Side Effects of Cancer Chemotherapy. Health Physiol. 1992, 11, 17–23. [Google Scholar]
  77. Lehrer, P.M.; Vaschillo, E.; Vaschillo, B. Resonant frequency biofeedback training to increase cardiac variability: Rationale and manual for training. Appl. Psychophysiol. Biofeedback 2000, 25, 177–191. [Google Scholar]
  78. Li, K.-H.; Lou, S.-J.; Tsai, H.-Y.; Shih, R.-C. The Effects of Applying Game-Based Learning to Webcam Motion Sensor Games for Autistic Students' Sensory Integration Training. Turk. Online J. Educ. Technol. 2012, 11, 451–459. [Google Scholar]
  79. Strachan, S. GpsTunes—Controlling Navigation via Audio Feedback. Proceedings of the 7th International Conference on Human Computer Interaction with Mobile Devices & Services, Lisbon, Portugal, 7–10 September 2010; pp. 275–278.
  80. Verhoeff, L.L.; Horlings, C.G.C.; Janssen, L.J.F.; Bridenbaugh, S.A.; Allum, J.H.J. Gait & Posture Young and Elderly. Gait Posture 2009, 30, 76–81. [Google Scholar]
  81. Wirth, K.R.; Perkins, D. Learning about Thinking and Thinking about Learning; Innovations in the Scholarship of Teaching and Learning at the Liberal Arts Colleges: St. Olaf and Carleton College, MN, USA, 16–18 February 2007. [Google Scholar]
  82. Van Merrienboer, J.J.G.; Kirschner, P.A. Ten Steps to Complex Learning: A Systematic Approach to Four-Component Instructional Design; Lawrence Erlbaum: Mahwah, NJ, USA, 2007. [Google Scholar]
  83. Paas, F.; Renkl, A.; Sweller, J. Cognitive Load Theory: Instructional Implications of the Interaction between Information Structures and Cognitive Architecture. Instr. Sci. 2004, 32, 1–8. [Google Scholar]
  84. Schön, D. The Reflective Practitioner: How Professionals Think in Action; Basic Books: New York, NY, USA, 1993. [Google Scholar]
  85. Atzori, L.; Iera, A.; Morabito, G. The Internet of Things: A Survey. Comput. Netw. 2010, 54, 2787–2805. [Google Scholar]
  86. Aztiria, A.; Izaguirre, A.; Augusto, J.C. Learning patterns in ambient intelligence environments: A survey. Artif. Intell. Rev. 2010, 34, 35–51. [Google Scholar]
  87. Carmigniani, J.; Furht, B.; Anisetti, M.; Ceravolo, P.; Damiani, E.; Ivkovic, M. Augmented reality technologies, systems and applications. Multimed. Tools Appl. 2010, 51, 341–377. [Google Scholar]
  88. Garg, M.K.; Kim, D.; Turaga, D.S. Multimodal Analysis of Body Sensor Network Data Streams for Real-time Healthcare. Proceedings of the 11th ACM SIGMM International Conference on Multimedia Information Retrieval, Philadelphia, PA, USA, 29–31 March 2010; pp. 469–478.
  89. Ugulino, W.; Cardador, D.; Vega, K.; Velloso, E.; Milidiu, R.; Fuks, H. Wearable Computing: Accelerometers' Data Classification of Body Postures and Movements. In Advances in Artificial Intelligence—SBIA 2012; Springer: Berlin, Germany, 2012; pp. 52–61. [Google Scholar]
  90. Ailisto, H.; Pohjanheimo, L.; Välkkynen, P.; Strömmer, E.; Tuomisto, T.; Korhonen, I. Bridging the physical and virtual worlds by local connectivity-based physical selection. Pers. Ubiquitous Comput. 2006, 10, 333–344. [Google Scholar]
  91. Broll, G.; Graebsch, R.; Scherr, M.; Boring, S.; Holleis, P.; Wagner, M. Touch to Play—Exploring Touch-Based Mobile Interaction with Public Displays. Proceedings of the Third International Workshop on Near Field Communication, Hagenberg, Austria, 22–23 February 2011; pp. 15–20.
  92. Chapel, E. Mobile technology: The foundation for an engaged and secure campus community. J. Comput. High. Educ. 2008, 20, 15–23. [Google Scholar]
  93. Chavira, G.; Nava, S.W.; Hervas, R.; Bravo, J.; Sanchez, C. Combining RFID and NFC Technologies in an AmI Conference Scenario. Proceedings of the Eighth Mexican International Conference on Current Trends in Computer Science, Michoacan, Mexico, 24–28 September 2007; pp. 165–172.
  94. Chu, H.-C.; Hwang, G.-J.; Tsai, C.-C.; Tseng, J.C.R. A two-tier test approach to developing location-aware mobile learning systems for natural science courses. Comput. Educ. 2010, 55, 1618–1627. [Google Scholar]
  95. Karime, A.; Hossain, M.A.; Rahman, A.S.M.M.; Gueaieb, W.; Alja'am, J.M.; El Saddik, A. RFID-based interactive multimedia system for the children. Multimed. Tools Appl. 2011, 59, 749–774. [Google Scholar]
  96. Kozaki, T.; Nakajima, S.; Tsujioka, T. Estimation of Human Movements from Body Acceleration Monitoring for Ubiquitous Health Care. Proceedings of the 12th International Conference on Advanced Communication Technology (ICACT), Gangwon-Do, Korea, 7–10 February 2010; pp. 430–435.
  97. Kubicki, S.; Lepreux, S.; Kolski, C. RFID-driven situation awareness on TangiSense, a table interacting with tangible objects. Pers. Ubiquitous Comput. 2011, 16, 1079–1094. [Google Scholar]
  98. Kuflik, T.; Stock, O.; Zancanaro, M.; Gorfinkel, A.; Jbara, S.; Kats, S.; Kashtan, N. A visitor's guide in an active museum. J. Comput. Cult. Herit. 2011, 3, 1–25. [Google Scholar]
  99. Lu, H.; Pan, W.; Lane, N.D.; Choudhury, T.; Campbell, A.T. SoundSense: Scalable Sound Sensing for People-Centric Applications on Mobile Phones. Proceedings of the 7th Annual International Conference on Mobile Systems, Applications, and Services (MobiSys), Krakow, Poland, 22–25 June 2009; pp. 165–178.
  100. Mandula, K.; Meda, S.R.; Jain, D.K.; Kambham, R. Implementation of Ubiquitous Learning System Using Sensor Technologies. Proceedings of the 2011 IEEE International Conference on Technology for Education (T4E), Chennai, India, 14–16 July 2011; pp. 142–148.
  101. Maisonneuve, N.; Stevens, M.; Niessen, M.E.; Hanappe, P.; Steels, L. Citizen noise pollution monitoring. Proceedings of the 10th Annual International Conference on Digital Government, Puebla, Mexico, 17–20 May 2009; pp. 96–103.
  102. Muñoz-Organero, M.; Ramírez-González, G.A.; Muñoz-Merino, P.J. A Collaborative Recommender System Based on Space-Time Similarities. Pervasive Comput. 2010, 9, 81–87. [Google Scholar]
  103. Muñoz-Organero, M.; Ramírez-González, G.; Muñoz-Merino, P.J.; Kloos, C.D. Evaluating the Effectiveness and Motivational Impact of Replacing a Human Instructor by Mobile Devices for Teaching Network Services Configuration to Telecommunication Engineering Students. Proceedings of the 10th IEEE International Conference on Advanced Learning Technologies, Sousse, Tunisia, 5–7 June 2010; pp. 284–288.
  104. Nijholt, A.; Zwiers, J.; Peciva, J. Mixed reality participants in smart meeting rooms and smart home environments. Pers. Ubiquitous Comput. 2007, 13, 85–94. [Google Scholar]
  105. Ogata, H.; Yin, C.; Yano, Y. JAMIOLAS: Supporting Japanese Mimicry and Onomatopoeia Learning with Sensors. Proceedings of the Fourth IEEE International Workshop on Wireless, Mobile and Ubiquitous Technology in Education (WMUTE'06), Athens, Greece, 16–17 November 2006; pp. 111–115.
  106. Pérez-Sanagustín, M.; Ramirez-Gonzalez, G.; Hernández-Leo, D.; Muñoz-Organero, M.; Santos, P.; Blat, J.; Delgado Kloos, C. Discovering the campus together: A mobile and computer-based learning experience. J. Netw. Comput. Appl. 2012, 35, 176–188. [Google Scholar]
  107. Rahman, A.S.M.M.; El Saddik, A. Mobile PointMe-based spatial haptic interaction with annotated media for learning purposes. Multimed. Syst. 2012, 19, 131–149. [Google Scholar]
  108. Ramirez-González, G.; Cordoba-Paladinez, C.; Sotelo-Torres, O.; Palacios, C.; Muñoz-Organero, M.; Delgado-Kloos, C. Pervasive Learning Activities for the LMS .LRN through Android Mobile Devices with NFC Support. Proceedings of the 2012 IEEE 12th International Conference on Advanced Learning Technologies, Rome, Italy, 4–6 July 2012; pp. 672–673.
  109. Garrido, P.C.; Miraz, G.M.; Ruiz, I.L.; Gomez-Nieto, M.A. Use of NFC-Based Pervasive Games for Encouraging Learning and Student Motivation. Proceedings of the 2011 Third International Workshop on Near Field Communication, Hagenberg, Austria, 22–23 February 2011; pp. 32–37.
  110. Chang, Y.-S.; Hung, Y.-S.; Chang, C.-L.; Juang, T.-Y. Toward a NFC Phone-Driven Context Awareness Smart Environment. Proceedings of the 2009 Symposia and Workshops on Ubiquitous, Autonomic and Trusted Computing, Brisbane, Australia, 7–9 July 2009; pp. 298–303.
  111. Hsu, J.-M. Design and evaluation of virtual home objects with music interaction in smart homes. J. Intell. Manuf. 2010, 23, 1281–1291. [Google Scholar]
  112. Krause, A.; Smailagic, A.; Siewiorek, D.P. Context-Aware Mobile Computing: Learning Context-Dependent Personal Preferences from a Wearable Sensor Array. IEEE Trans. Mob. Comput. 2006, 5, 113–127. [Google Scholar]
Figure 1. Sensor-based learning support on the learning domains.
Figure 2. Sensor-based support on formative assessment.
Figure 3. Framework used for the analysis of sensor-based support on effective feedback.
Table 1. Strategies supporting learning in the cognitive domain.
| Sensor Usage (Design) | Number of Prototypes | Example of Sensors Used | Cognitive Domain Category |
|---|---|---|---|
| Contextual information acquisition for filtering | 22 | NFC, RFID, GPS, microphones | Depends on the information attached to the context |
| Learner's feature identification and user modeling | 11 | EEG, software sensors, NFC, cameras, heart-rate monitor | Depends on the information attached to the feature |
| Sensor data for contextual reflection and change notification | 23 | Accelerometers, air pollutant sensors, cameras, ECG, EEG, gyroscopes, microphones | Depends on the use of the information by the learner |
Table 2. Strategies supporting learning in the affective domain.
| Strategy | Number of Prototypes | Example of Sensors Used |
|---|---|---|
| Behavior overview and review | 4 | Accelerometers, barometer, camera, compass, GPS, humistor, microphone, software sensors, thermometer |
| Social network visualization | 2 | Blood glucose meter, software sensors |
| Involving learners in data collection | 2 | Accelerometers, camera, microphone, thermometers |
Table 3. Overview of the support for learning in the psychomotor domain.
| Category Supported | Number of Prototypes | Example of Sensors Used |
|---|---|---|
| Reflex movements | 0 | - |
| Fundamental movements | 7 | Accelerometers, cameras, ECG, electromyography sensor, gyroscopes |
| Perceptual | 0 | - |
| Physical activities | 1 | Heart-rate monitor, thermometer |
| Skilled movements | 7 | Accelerometers, cameras, force gauge, gyroscopes |
| Non-discursive communication | 2 | - |
Table 4. Support for the aspects of formative assessment.
| Aspects of Formative Assessment | Number of Prototypes | Strategies Used | Example of Sensors Used |
|---|---|---|---|
| Knowledge of subject matter | 12 | Presenting sensor data about the learner's performance | Accelerometers, cameras, gyroscopes, software sensors |
| Knowledge of criteria and standards | 15 | Presenting sensor data about the learner's performance; presenting sensor data about the learner's physiological state | Accelerometers, cameras, EEG, heart-rate monitors, galvanic skin response sensor, gyroscopes, software sensors |
| Attitudes toward teaching | 2 | Informing the tutor about the emotional state of the learner while performing learning tasks | Camera, galvanic skin conductance, pressure mouse, accelerometers |
| Skills in setting | 8 | Setting assessments according to the learner's location; setting assessments according to the learner's physiological state | GPS, EEG, heart-rate monitors, NFC, RFID, software sensors |
| Evaluative skills | 4 | Evaluating answers of learners | GPS, NFC, RFID, software sensors |
| Sharing learning expectations | 0 | - | - |
| Self-assessment | 6 | Presenting an overview of the learner's performance | Accelerometers, GPS, software sensors |
| Peer-assessment | 0 | - | - |
| Feedback | 35 | Presenting information about the learner's performance, behavior or state | Accelerometers, cameras, EEG, heart-rate monitors, galvanic skin response sensor, gyroscopes, software sensors |
Table 5. Prototypes answering to “where am I going?”.
| Prototype | Topic | Strategy Used to Answer the Question |
|---|---|---|
| Carroll et al. (2013) [70] | Healthy eating | Evidence: overview of eating habits represented as a tree. Consequences: the color of the tree changes. |
| Consolvo et al. (2008) [55] | Healthy living | Evidence: overview of the user's activities represented as a garden. Consequences: life in the garden depends on the activities. |
| Froehlich et al. (2009) [71] | Eco-traveling | Evidence: overview of means of transportation as an ecosystem. Consequences: life in the ecosystem depends on the means. |
| Hicks et al. (2010) [53] | Healthy habits | Asks questions about performed activities to encourage reflection on goals. |
| Hsieh et al. (2008) [72] | Physical activities | Evidence: overview of the user's performance presented together with the goals. |
Table 6. Prototypes answering to “how am I going?”.
| Prototype | Topic | Strategy Used to Answer the Question | Channel of Feedback |
|---|---|---|---|
| Aukee et al. (2004) [74] | Incontinence | Elaborate feedback | Visual |
| Baca & Kornfeind (2006) [75] | Rifle movements in biathlon | Elaborate feedback | Visual |
| Baca & Kornfeind (2006) [75] | Exerted forces in rowing | Elaborate feedback | Visual |
| Baca & Kornfeind (2006) [75] | Shot position and cadence in table tennis | Elaborate feedback | Visual |
| Bevilacqua et al. (2007) [59] | Musical level | Try again; simple verification | Audio |
| Brunelli et al. (2006) [58] | Posture | Try again | Audio |
| Burish & Jenkins (1992) [76] | Relaxation | Try again; simple verification | Audio |
| Carroll et al. (2013) [70] | Healthy eating | Elaborate feedback | Visual |
| Cockburn et al. (2008) [68] | Teaching to smile | Try again; simple verification | Visual |
| Hicks et al. (2010) [53] | Healthy habits | Questions are asked, letting the user reflect on the answer | Visual |
| Hoque et al. (2013) [67] | Interview coaching | Elaborate feedback | Visual |
| Kranz et al. (2006) [61] | Physiotherapy | Try again | Audio, visual |
| Kwon & Gross (2005) [62] | Martial arts | Elaborate feedback | Visual |
| Lehrer et al. (2000) [77] | Breathing technique | Try again | Audio |
| Li et al. (2012) [78] | Coordination training | Try again; simple verification | Audio, visual |
| Linden et al. (1996) [33] | Attention level | Try again | Audio, visual |
| Paradiso et al. (2004) [73] | Gait | Try again; simple verification | Audio |
| Pentland (2004) [51] | Diabetes | Simple verification | Audio |
| Spelmezan & Borchers (2008) [63] | Snowboarding | Try again | Audio |
| Spelmezan et al. (2009) [64] | Snowboarding | Try again | Haptic |
| Strachan (2005) [79] | Sound navigation | Try again | Audio |
| Takahata et al. (2004) [65] | Martial arts | Try again; simple verification | Audio |
| Vales-Alonso et al. (2010) [66] | Cross-country running | Elaborate feedback | Visual |
| Van der Linden et al. (2011) [60] | Violin playing | Try again | Haptic |
| Verhoeff et al. (2009) [80] | Gait | Elaborate feedback | Audio |
| Verpoorten et al. (2009) [56] | Indicators for virtual learning environments | Elaborate feedback | Visual |
Table 7. Prototypes answering to “where to next?”.
| Prototype | Strategy Used to Answer the Question |
|---|---|
| Anderson & Reiser (1985) [40] | Informs the user which next step to take. |
| Carroll et al. (2013) [70] | Informs the user which next step to take. |
| Chen & Huang (2012) [69] | Presents a corrective step to follow. |
| Dung & Florea (2012) [31] | Presents a personalized learning path. |
| Hsu & Ho (2012) [32] | Presents a personalized learning path. |
| Pentland (2004) [51] (Memory Glasses) | Informs the user which next step to take. |
| Szafir & Mutlu (2013) [34] | Points out the steps to follow. |
| Vales-Alonso et al. (2010) [66] | Tells the user which direction to take. |

Share and Cite

MDPI and ACS Style

Schneider, J.; Börner, D.; Van Rosmalen, P.; Specht, M. Augmenting the Senses: A Review on Sensor-Based Learning Support. Sensors 2015, 15, 4097-4133. https://doi.org/10.3390/s150204097
