Article

Lower Limb Exoskeleton for Rehabilitation with Flexible Joints and Movement Routines Commanded by Electromyography and Baropodometry Sensors

by Yukio Rosales-Luengas 1, Karina I. Espinosa-Espejel 1, Ricardo Lopéz-Gutiérrez 2, Sergio Salazar 1 and Rogelio Lozano 1,3,*

1 Centro de Investigación y de Estudios Avanzados del Instituto Politécnico Nacional (CINVESTAV), Av. IPN #2508, San Pedro Zacatenco, Mexico City 07360, Mexico
2 Investigador por México-Consejo Nacional de Humanidades, Ciencias y Tecnologías (IXM-CONAHCYT), Av. de los Insurgentes Sur #1582, Crédito Constructor, Benito Juárez, Mexico City 03940, Mexico
3 CNRS UMR 7253 Heudiasyc, Université de Technologie de Compiègne, 60203 Compiègne, France
* Author to whom correspondence should be addressed.
Sensors 2023, 23(11), 5252; https://doi.org/10.3390/s23115252
Submission received: 17 February 2023 / Revised: 30 May 2023 / Accepted: 30 May 2023 / Published: 1 June 2023
(This article belongs to the Special Issue Advances in Intelligent Robotics Systems Based Machine Learning)

Abstract:
This paper presents the development of an instrumented exoskeleton with baropodometry, electromyography, and torque sensors. The six-degrees-of-freedom (DoF) exoskeleton has a human intention detection system based on a classifier of electromyographic signals coming from four sensors placed on the muscles of the lower extremity, together with baropodometric signals from four resistive load sensors placed at the front and rear parts of both feet. In addition, the exoskeleton is instrumented with four flexible actuators coupled with torque sensors. The main objective of the paper was the development of a lower limb therapy exoskeleton, articulated at the hips and knees, that performs three types of motion depending on the detected user’s intention: sitting to standing, standing to sitting, and standing to walking. In addition, the paper presents the development of a dynamical model and the implementation of feedback control in the exoskeleton.

1. Introduction

In recent years, rehabilitation systems have been developed to assist patients with their movements and restore their motor ability. Exoskeletons have contributed to patients’ health by helping them carry out physiotherapeutic routines [1]. Among the tasks with which exoskeletons assist, we can mention support for walking, for standing, and for the execution of specific therapy routines [2,3,4]. Exoskeletons for patient or elderly care help people with some type of weakness carry out their daily tasks, as in [5], in which the authors developed a lower limb exoskeleton to assist the sit-to-stand and stand-to-sit tasks for people with spinal cord injury or elderly people. In addition, exoskeletons applied to rehabilitation have become an important tool for therapists, reducing the risks to their health caused by moving patients [6]. There are various architectures that can be used for bilateral lower-limb training, mainly for adults, as in [7], where three healthy male subjects aged 27, 25, and 24, and three subjects undergoing stroke rehabilitation, two men and one woman aged 54, 68, and 30, were enrolled in experiments with walking rehabilitation exoskeletons. Some authors place great emphasis on the mechanical and electronic design of the exoskeleton as a fundamental part of its development, as in [8,9], where the authors present in detail the mechanical design of their exoskeletons.
An important part of the research on exoskeletons concerns the actuators to be used. In [10], many papers on exoskeletons developed using springs as part of their structure were reviewed, covering series elastic actuators (SEA), parallel elastic actuators (PEA), and magneto-rheological series elastic actuators (MRSEA), among others. According to this paper, the use of SEA actuators is important given that they ensure that the coupling between the user and the motor is compliant, thereby protecting the user’s joints from impact loads and other undesirable interactions. Furthermore, as mentioned in [10], “the compliance introduced by the spring facilitates a torque-based control strategy by transforming the torque/force control problem into a position control problem based on the measurement of the spring’s deformation”. These actuators, widely used in exoskeletons, allow smooth force transmission, accurate force control, lower output impedance, shock tolerance, energy efficiency, and back-drivability in human–robot physical interactions. The spring acts as an impact damper and reduces the actuator inertia felt by the user, thus increasing safety and comfort. A further advantage is the reduction in peak motor power, exploiting the spring’s capacity to store energy. In [11], the authors outline the hardware of two printed circuit board (PCB) designs for collecting and conditioning sensor feedback from two SEA subsystems and an inertial measurement unit (IMU).
One of the recently explored areas is the detection of the intention of motion to activate specific movement routines in exoskeletons [12,13]. The human intention (HI) is defined as the determination of a person to perform a movement, which is a fundamental component in the development of exoskeletons and is the goal of this paper. Human intention detected by the correct sensors produces a human–machine interaction in which the patient collaborates with the device during the recovery process. Electromyographic signals can be used as an alternative to detecting the human intention, which are manifestations of the movement generated 30 ms before an action is performed [14,15]. Some studies that apply surface electromyography (sEMG) use the sEMG-driven musculoskeletal (MS) model, which establishes a relationship between signals and joint moment, angular velocity or acceleration [16].
Another group of studies implements machine learning (ML). Machine learning performs discrete motion classification or continuous motion estimation by mapping sEMG inputs to the human movement intention through techniques such as neural networks (NN), support vector machines (SVM), and K-means (KM), among others [14]. In [16], the authors estimated, in real time, the torque of the knee joint exerted by the set of muscles used for knee extension and flexion, using a Hill-type musculoskeletal model. In [17], the authors accomplished the estimation of the torque using sEMG and the optimal control of adaptive impedance during rehabilitation assisted by an active knee orthosis. The optimization model was evaluated by comparing the estimated sEMG torque with the torque generated by the inverse dynamics tool of the OpenSim software. As an alternative solution, they proposed a multilayer perceptron neural network (NN) to map the EMG signals to the user’s torque. In [18], the authors presented an electric assistance robot that adapts to human intention and a method to recognize the successive phases that a subject performs during the movements of sitting and standing. Surface electromyographic signals (sEMG) and the ground reaction force (FGR) were used as inputs of the algorithm. A neural network ensemble and a window of 0.1 s were used to identify each phase.
In this paper, an exoskeleton was developed with rotational elastic actuators, a variation of SEA actuators, whose operation is explained in more detail in Section 3.2. The exoskeleton prototype has six degrees of freedom, divided into four DoF for the lower limbs (two for the knee joints and two for the hip joints) and two DoF in the mechanism that allows standing and sitting exercises. Mechanically, the exoskeleton can be divided into two main parts: the lifting system and the lower limb system. Figure 1 shows these two parts that make up the complete prototype. The lifting system holds the patient by the torso using a safety harness, supports the full weight of the patient, and is actuated by two linear motors and a four-bar system used to sit and stand the patient. The lower limb system is an anthropomorphic exoskeleton that supports the patient’s legs and is actuated by four motors whose axes coincide with the axes of rotation of the patient’s hip and knee joints. The mechanical design allows the exoskeleton to take the patient from a sitting to a standing position and later assist him/her in gait rehabilitation exercises.
The exoskeleton prototype also contains a human intention detection system based on electromyography; these signals along with the signals from the baropodometry sensors are classified by an algorithm that decides which routine (walking, sitting or standing) the exoskeleton should perform. These routines are already programmed into a desired trajectory suitable for each joint of the exoskeleton so that the classification system only generates a movement command.
Finally, the exoskeleton contains a feedback control system, which performs trajectory tracking. When the desired trajectory is finished, the human intention detection system waits again for an effort from the patient to classify the signals obtained and again decide which cycle to start reproducing.
To develop the control law, a dynamical model of the exoskeleton was obtained considering its flexible joints, but a dynamical model of the human is not obtained because we are performing only passive rehabilitation. That means that the patient must not make any effort during the performance of the exoskeleton movement routine. If for some reason the human generates any movement or effort, these are considered disturbances and the proposed control is capable of rejecting them.
Then, the goal of this paper was the use of a signal classifier of baropodometry, electromyography, and torque sensors as a human detection system to develop a lower limb therapy exoskeleton with a feedback control law that considers the non-linearities of the mathematical model to obtain numerical and real-time experimental results.
The paper is organized as follows. In Section 2, the design and mechanical development of the platform are presented. In Section 3, the development of the exoskeleton dynamical model with elastic joints and the procedure for obtaining the motion trajectories are presented. The electromyography and baropodometry sensors are described in Section 4. The design and development of the control law are presented in Section 5. The numerical and real-time experimental results are presented in Section 6. Finally, the conclusions are presented in Section 7.

2. Platform Development

The developed lower limb exoskeleton was specifically designed to meet the following requirements:
  • Adjust to the body sizes of the Latin American population;
  • Support the patient’s weight without the need for a crane;
  • Assist in the task of gait retraining, as well as in the tasks of sitting down and standing up.
In addition, the design must be ergonomic and easy to place.
The final design consists of three parts: the exoskeleton legs with actuators and sensors, the exoskeleton lifting system, and the control unit. The exoskeleton rehabilitation movements are made through a series of links attached by joints that correspond to the patient’s knee and hip joints.

2.1. Exoskeleton Lower Limb System

The exoskeleton’s legs, shown in Figure 2, were designed with an anthropomorphic approach to allow easy adjustment to the patient’s legs. In this sense, the exoskeleton was specifically designed to fit on the front of the patient’s body rather than on the back, so that the patient can be fitted while sitting, helped to stand up, and then start the rehabilitation exercises. In addition, from the sitting position, the exoskeleton can assist in exercises such as knee extension and the process of sitting down and getting up. Considering the variations in population measures, an extension system was added in the links of the legs and trunk to cover a slightly wider range of patients.
The required torque for each joint is provided by a Bosch DC motor of the AHC2 family through an elastic transmission, which generates up to 20 Nm of torque. Two absolute encoders, model AMT20 from CUI Inc. (Tualatin, OR, USA), with a resolution of 12 bits and an SPI interface, are used in each joint to measure the angular positions of both the link and the motor, from which the angular speeds and accelerations can be calculated.
To keep the user safe, hyperextension of the joints must be avoided. Mechanical stops are added in all the joints to enforce the range of motion (RoM) described in Table 1.

2.2. Exoskeleton Lifting System

The lifting system of the exoskeleton is responsible for carrying the entire weight of the patient, from the sitting position to the standing position and throughout the entire rehabilitation session, leaving only the rehabilitation task to the legs. The exoskeleton lifting system is made up of a double four-bar mechanism operated by two linear actuators, model LACT6P, which can withstand a load of 50 kg each; see Figure 3. It is anchored at one end to the hand supports of a standard treadmill and at the other end joins the mechanical legs, allowing them to go up and down while always maintaining their vertical orientation.
The patient is supported by a harness (not shown) that is attached to the upper bar of the harness frame, which is joined to the four-bar system in the same place where the mechanical legs are connected. Thus the exoskeleton is able to comfortably lift patients weighing more than average.

2.3. Control System

The control system consists of the real-time control electronic card and the host computer with the interface for the physiotherapist.
The control electronic card, manufactured by National Instruments© (Austin, TX, USA), is responsible for running the controller in real-time, receiving data from absolute encoders and analog sensors, and operating the motors through PWM outputs which are connected to H-bridge controllers; it communicates with the host computer via a USB connection, and also has an emergency stop system available for the patient. It has two secondary cards designed ad hoc for the connection of all the sensors and motors of the exoskeleton, which has six degrees of freedom.
The main computer runs the physiotherapist’s graphic interface, allowing them to command all the movements and exercises that the exoskeleton can execute. During development stages, the main computer is used to program new control laws and test their performance; in this case, the interface shows graphs of all the variables of interest as well as the desired trajectories.

3. Mathematical Model

In this section, we present the mathematical models and the desired trajectories for the exoskeleton.

3.1. Dynamical Model of the Exoskeleton

Initially, the dynamical model of the exoskeleton was obtained by considering the simplified model with rigid joints shown in Figure 4, in which there are two links corresponding to the thigh (link 1) and the leg (link 2), with lengths $l_1$ and $l_2$ and masses $m_1$ and $m_2$, respectively.
The joints are rotational and correspond to the hip joint $q_1(t)$ and the knee joint $q_2(t)$; together they form the vector of joint positions $q(t) \in \mathbb{R}^2$.
The dynamical equations of a limb of the exoskeleton in matrix form are:
\[
\tau = M(q)\ddot{q} + C(q,\dot{q})\dot{q} + g(q), \qquad
\begin{bmatrix} \tau_1 \\ \tau_2 \end{bmatrix} =
\begin{bmatrix} M_{11}(q) & M_{12}(q) \\ M_{21}(q) & M_{22}(q) \end{bmatrix}
\begin{bmatrix} \ddot{q}_1 \\ \ddot{q}_2 \end{bmatrix} +
\begin{bmatrix} C_{11}(q,\dot{q}) & C_{12}(q,\dot{q}) \\ C_{21}(q,\dot{q}) & C_{22}(q,\dot{q}) \end{bmatrix}
\begin{bmatrix} \dot{q}_1 \\ \dot{q}_2 \end{bmatrix} +
\begin{bmatrix} g_1(q) \\ g_2(q) \end{bmatrix},
\]
where $M \in \mathbb{R}^{2\times 2}$ is the inertia matrix, $C \in \mathbb{R}^{2\times 2}$ is the Coriolis matrix, $g \in \mathbb{R}^2$ is the vector of gravitational forces, and $\tau \in \mathbb{R}^2$ contains the actuator torques. Furthermore, the elements of the $M$ and $C$ matrices are:
\begin{align*}
M_{11}(q) &= I_1 + I_2 + l_1^2 m_2 + l_{c1}^2 m_1 + l_{c2}^2 m_2 + 2 l_1 l_{c2} m_2 \cos(q_2) \\
M_{12}(q) &= I_2 + l_{c2}^2 m_2 + l_1 l_{c2} m_2 \cos(q_2) \\
M_{21}(q) &= I_2 + l_{c2}^2 m_2 + l_1 l_{c2} m_2 \cos(q_2) \\
M_{22}(q) &= I_2 + m_2 l_{c2}^2 \\
C_{11}(q,\dot{q}) &= -m_2 l_1 l_{c2} \sin(q_2)\,\dot{q}_2 \\
C_{12}(q,\dot{q}) &= -m_2 l_1 l_{c2} \sin(q_2)\,[\dot{q}_1 + \dot{q}_2] \\
C_{21}(q,\dot{q}) &= m_2 l_1 l_{c2} \sin(q_2)\,\dot{q}_1 \\
C_{22}(q,\dot{q}) &= 0 \\
g_1(q) &= [m_1 l_{c1} + m_2 l_1]\, g \sin(q_1) + m_2 g l_{c2} \sin(q_1 + q_2) \\
g_2(q) &= m_2 g l_{c2} \sin(q_1 + q_2).
\end{align*}
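As an illustration, the dynamics terms above can be evaluated numerically. The following Python sketch uses placeholder parameter values (the masses, lengths, and inertias are not the exoskeleton’s identified parameters) and checks that $M(q)$ is symmetric and positive definite.

```python
import numpy as np

def dynamics_terms(q, qd, m1=5.0, m2=3.0, l1=0.4, lc1=0.2, lc2=0.2,
                   I1=0.1, I2=0.05, g=9.81):
    """Return M(q), C(q, qd), g(q) for the rigid two-link limb model."""
    q1, q2 = q
    qd1, qd2 = qd
    c2, s2 = np.cos(q2), np.sin(q2)
    M = np.array([
        [I1 + I2 + l1**2*m2 + lc1**2*m1 + lc2**2*m2 + 2*l1*lc2*m2*c2,
         I2 + lc2**2*m2 + l1*lc2*m2*c2],
        [I2 + lc2**2*m2 + l1*lc2*m2*c2,
         I2 + lc2**2*m2],
    ])
    C = np.array([
        [-m2*l1*lc2*s2*qd2, -m2*l1*lc2*s2*(qd1 + qd2)],
        [ m2*l1*lc2*s2*qd1,  0.0],
    ])
    gv = np.array([
        (m1*lc1 + m2*l1)*g*np.sin(q1) + m2*g*lc2*np.sin(q1 + q2),
        m2*g*lc2*np.sin(q1 + q2),
    ])
    return M, C, gv
```

With these signs, $\dot{M} - 2C$ is skew-symmetric, the property used later in the Lyapunov analysis of the control law.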

3.2. Dynamical Model of Elastic Rotational Actuators

The exoskeleton has elastic rotational actuators. This type of actuator has a spring between the motor shaft and the mechanical joint that attaches to the person. The springs ensure that the coupling between the user and the motor is compliant, thereby protecting the user’s body from impact loads and other undesirable interactions.
Since the actuators transmit the movement through gears that are not totally rigid, modeling was performed with elastic joints.
An elastic joint is represented by the motor rotor (in yellow) and the link load (in blue). The angular position of the motor rotor is denoted by $q_m \in \mathbb{R}^2$ and the angular position of the link after the elastic gear by $q_e \in \mathbb{R}^2$, as shown in Figure 5.
The generalized coordinates are $[q_e^T \; q_m^T]^T$, where $q_e = [q_{e1} \; q_{e2}]^T$ are the angular positions of links 1 and 2, and $q_m = [q_{m1} \; q_{m2}]^T$ are the angular positions of motors 1 and 2.
The kinetic energy considering elastic joints is the sum of the kinetic energy of the links and of the rotors:
\[
k(q_e, \dot{q}_e, \dot{q}_m) = \tfrac{1}{2}\,\dot{q}_e^T M(q_e)\,\dot{q}_e + \tfrac{1}{2}\,\dot{q}_m^T J \dot{q}_m,
\]
where $M(q_e)$ is the inertia matrix of the rigid robot (considering an infinite stiffness value $k_{ii}$), and $J$ is a positive definite diagonal matrix whose diagonal entries are the products of the rotor inertia moments and the squares of the gear ratios: $J = \mathrm{diag}\{J_1 r_{1,1}^2,\, J_2 r_{1,2}^2\}$.
The potential energy is the sum of the gravitational energy and the energy stored in the torsional springs:
\[
U(q_e, q_m) = U_1(q_e) + \tfrac{1}{2}\,[q_e - q_m]^T K [q_e - q_m],
\]
where $U_1(q_e)$ is the energy due to gravity considering the robot with rigid joints, and $K$ is the positive definite diagonal matrix with the torsion constants on its diagonal: $K = \mathrm{diag}\{k_1, k_2\}$.
Finally, using the Euler–Lagrange formalism and considering a viscous friction in the motors, it follows that:
\[
M(q_e)\ddot{q}_e + C(q_e,\dot{q}_e)\dot{q}_e + g(q_e) + K[q_e - q_m] = 0, \qquad
J\ddot{q}_m + B\dot{q}_m - K[q_e - q_m] = \tau,
\]
where $B$ is a positive definite diagonal matrix whose entries are the viscous friction parameters of each motor: $B = \mathrm{diag}\{b_1, b_2\}$.
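A minimal numerical sketch of Equation (5) for a single joint (scalar case, gravity omitted, and all parameter values are placeholders rather than the identified ones) shows how the spring couples the link and motor dynamics: with the spring initially deflected and zero input torque, the motor’s viscous friction gradually dissipates the stored energy.

```python
import math

def flexible_joint_step(qe, qe_d, qm, qm_d, tau, dt,
                        M=0.5, k=50.0, J=0.05, b=0.1):
    """One semi-implicit Euler step of the scalar flexible-joint model:
    M*qe_dd + k*(qe - qm) = 0            (link side, gravity omitted)
    J*qm_dd + b*qm_d - k*(qe - qm) = tau (motor side)."""
    qe_dd = -k * (qe - qm) / M
    qm_dd = (tau - b * qm_d + k * (qe - qm)) / J
    qe_d += dt * qe_dd
    qm_d += dt * qm_dd
    return qe + dt * qe_d, qe_d, qm + dt * qm_d, qm_d

def total_energy(qe, qe_d, qm, qm_d, M=0.5, k=50.0, J=0.05):
    """Kinetic energy of link and rotor plus spring potential energy."""
    return 0.5*M*qe_d**2 + 0.5*J*qm_d**2 + 0.5*k*(qe - qm)**2
```

Because the link side has no friction of its own, all dissipation flows through the spring to the motor damping, which is why a compliant joint behaves as an impact damper from the user’s point of view.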

3.3. Trajectories of Motion

The motion trajectories of each joint were obtained by video analysis of the three movement routines: sitting to standing (STS), standing to sitting (STA), and standing to walking (SW). Markers were placed on the joints to obtain the angular positions. Figure 6 shows the images of the four phases of the cycle from sitting to standing (A-Start, B-Incline, C-Elevation and D-Stabilization) and their respective markers. The angular positions were obtained with respect to the hip and the flexion and extension values were obtained using the coordinates of the markers.
The angular trajectories were generated by an approximation of trigonometric polynomials given by:
\[
q_{di}(t) = \frac{a_0}{2} + \sum_{k=1}^{n} \big( a_{ik}\cos(k\omega_i t) + b_{ik}\sin(k\omega_i t) \big), \qquad i = 1, 2,
\]
where $q_{di}$ is the trajectory of the $i$-th joint, $\omega_i$ is the angular frequency of the $i$-th joint, and $n$ is the degree of the polynomial. The angular frequency is defined as $\omega_i = 2\pi/T_i$, with period $T_i \in \mathbb{R}^+$. The desired trajectories corresponding to the hip and knee joints are denoted by $q_{d1}$ and $q_{d2}$, respectively.
The angular trajectories obtained for the routines sitting to standing, standing to sitting, and standing to walking, denoted by $q_d^{STS} \in \mathbb{R}^2$, $q_d^{STA} \in \mathbb{R}^2$, and $q_d^{SW} \in \mathbb{R}^2$, respectively, can be found below. A period $T_i = 2\pi$ is considered so that $\omega_i = 1$ for $i = 1, 2$ (the period $T_i$ can be chosen as appropriate), and the order of the polynomial is chosen as $n = 8$. Figure 7 shows the desired trajectories obtained from Equations (7)–(12), which are smooth and bounded since the positions and velocities are bounded. The upper bounds are shown in Table 2.
Trajectories from sitting to standing:
\begin{align*}
q_{d1}^{STS}(t) ={}& 22.35 - 3.66\cos(\omega t) - 0.68\cos(2\omega t) - 0.32\cos(3\omega t) - 0.26\cos(4\omega t) \\
&- 0.13\cos(5\omega t) - 0.26\cos(6\omega t) - 0.09\cos(7\omega t) - 0.27\cos(8\omega t) \\
&- 19.31\sin(\omega t) - 8.77\sin(2\omega t) - 5.64\sin(3\omega t) - 3.9\sin(4\omega t) \\
&- 3.39\sin(5\omega t) - 2.65\sin(6\omega t) - 2.29\sin(7\omega t) - 1.98\sin(8\omega t)
\end{align*}
\begin{align*}
q_{d2}^{STS}(t) ={}& 34.48 - 3.17\cos(\omega t) - 0.36\cos(2\omega t) - 0.52\cos(3\omega t) - 0.47\cos(4\omega t) \\
&- 0.23\cos(5\omega t) - 0.67\cos(6\omega t) + 0.13\cos(7\omega t) - 0.5\cos(8\omega t) \\
&- 29.08\sin(\omega t) - 12.31\sin(2\omega t) - 7.62\sin(3\omega t) - 5.34\sin(4\omega t) \\
&- 4.69\sin(5\omega t) - 3.8\sin(6\omega t) - 3.2\sin(7\omega t) - 2.9\sin(8\omega t)
\end{align*}
Trajectories from standing to sitting:
\begin{align*}
q_{d1}^{STA}(t) ={}& 22.35 - 3.05\cos(\omega t) - 0.12\cos(2\omega t) + 0.21\cos(3\omega t) + 0.23\cos(4\omega t) \\
&+ 0.41\cos(5\omega t) + 0.24\cos(6\omega t) + 0.41\cos(7\omega t) + 0.23\cos(8\omega t) \\
&+ 19.42\sin(\omega t) + 8.79\sin(2\omega t) + 5.64\sin(3\omega t) + 3.9\sin(4\omega t) \\
&+ 3.37\sin(5\omega t) + 2.65\sin(6\omega t) + 2.25\sin(7\omega t) + 1.99\sin(8\omega t)
\end{align*}
\begin{align*}
q_{d2}^{STA}(t) ={}& 34.48 - 2.26\cos(\omega t) + 0.42\cos(2\omega t) + 0.2\cos(3\omega t) + 0.2\cos(4\omega t) \\
&+ 0.51\cos(5\omega t) + 0.05\cos(6\omega t) + 0.83\cos(7\omega t) + 0.24\cos(8\omega t) \\
&+ 29.16\sin(\omega t) + 12.31\sin(2\omega t) + 7.64\sin(3\omega t) + 5.36\sin(4\omega t) \\
&+ 4.66\sin(5\omega t) + 3.86\sin(6\omega t) + 3.1\sin(7\omega t) + 2.94\sin(8\omega t)
\end{align*}
Trajectories from standing to walking:
\begin{align*}
q_{d1}^{SW}(t) ={}& 8.29 + 21.19\cos(\omega t) + 0.83\cos(2\omega t) - 1.01\cos(3\omega t) + 0.01\cos(4\omega t) \\
&+ 0.25\cos(5\omega t) + 0.24\cos(6\omega t) - 0.35\cos(7\omega t) + 0.05\cos(8\omega t) \\
&- 8.43\sin(\omega t) + 6.89\sin(2\omega t) + 1.95\sin(3\omega t) - 1.07\sin(4\omega t) \\
&+ 0.8\sin(5\omega t) + 0.77\sin(6\omega t) + 0.6\sin(7\omega t) - 0.22\sin(8\omega t)
\end{align*}
\begin{align*}
q_{d2}^{SW}(t) ={}& 26.575 - 3.24\cos(\omega t) + 14.12\cos(2\omega t) - 2.1\cos(3\omega t) - 1.02\cos(4\omega t) \\
&- 0.06\cos(5\omega t) + 0.68\cos(6\omega t) - 0.27\cos(7\omega t) + 0.07\cos(8\omega t) \\
&- 30.54\sin(\omega t) + 3.13\sin(2\omega t) + 9.27\sin(3\omega t) - 0.99\sin(4\omega t) \\
&+ 1.43\sin(5\omega t) + 1.27\sin(6\omega t) + 1.12\sin(7\omega t) + 0.59\sin(8\omega t)
\end{align*}
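Trigonometric polynomials of this form can be evaluated with a short routine. The following is an illustrative Python sketch of the trajectory generator (the coefficient values in the test are arbitrary, not the fitted values listed above).

```python
import math

def fourier_trajectory(t, a0, a, b, w=1.0):
    """Evaluate q_d(t) = a0/2 + sum_{k=1}^{n} (a[k-1]*cos(k*w*t)
    + b[k-1]*sin(k*w*t)), where n = len(a) = len(b)."""
    q = a0 / 2.0
    for k, (ak, bk) in enumerate(zip(a, b), start=1):
        q += ak * math.cos(k * w * t) + bk * math.sin(k * w * t)
    return q
```

Since the series is analytic, the desired velocities and accelerations needed by the controller can be obtained by differentiating each term in closed form rather than numerically.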

4. Detection of Human Intention

The human intention can be detected using the flexible part of the robot joints. When the user slightly flexes the hip or a knee, the elastic element of the exoskeleton joint contracts, generating feedback to the exoskeleton control that sends a signal to move the prototype motors. This is the simplest way to control the exoskeleton but it requires the user to have at least a small degree of mobility in the joints.
When the user has some impediment to moving but still generates electromyography signals, the human intention can be detected with a system that interprets these signals and generates commands to perform the desired tasks.
The detection system proposed and developed in this paper has two inputs, corresponding to the surface electromyography and baropodometry sensors, which are read by the myRIO device, where the signals are processed. The detection system recognizes three different movements and then generates the corresponding movement routine for the exoskeleton, as shown in Figure 8.
The electromyographic acquisition is divided into four stages: the skin preparation, the electrode position, the analog conditioning, and the analog-digital conversion. The human skin preparation consists of cleaning the skin to provide sEMG recordings with low noise levels. It ensures the removal of body hair, oils, and flaky skin layers and consequently reduces the impedance at the electrode–gel–skin interface. Using an abrasive solution and wetting clean skin with water reduces the impedance of the skin and electrodes. It also minimizes allergic responses [20].
The surface electrodes are made of silver/silver chloride (Ag/AgCl), which act as transducers of the sEMG signal. The Covidien type H124SG Ag/AgCl electrode with a 24 mm diameter is used because it satisfies the requirements of the European concerted action Surface EMG for Non-Invasive Assessment of Muscles (SENIAM) [21].
The assembly of the electrodes in a bipolar configuration is located between the zone of innervation and the regions of the tendon [22]. The Myoware sensor electrodes have a distance between electrodes of 30 mm, which are positioned in the middle of the muscle body aligned with the orientation of the muscle fibers on the four muscles: Rectus femoris (RF), Biceps femoris (BF), Tibialis Anterior (TA) and Gastrocnemius (GAS).
The Myoware sensor performs the conditioning of the sEMG signal; its stages are: input to the AD8226 instrumentation amplifier, rectification, smoothing, and signal amplification.
The sEMG signal is acquired with the myRIO analog-to-digital converter. Following the Nyquist–Shannon sampling theorem, with a maximum electromyographic signal frequency $f_{max}$ of 500 Hz, the sampling frequency must satisfy $f_s \geq 2 f_{max} = 1000$ Hz [23]; hence, the sampling frequency used is 1 kHz. The signals are filtered with a fourth-order Butterworth band-pass filter from 10 to 250 Hz to retain the informative band.
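The filtering stage described above can be sketched as follows. This is a hedged Python example using SciPy; the choice of `filtfilt` for zero-phase filtering is our assumption, since the paper does not state which filtering routine is applied.

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 1000.0  # sampling frequency, Hz (1 kHz, as in the text)

def bandpass_semg(x, low=10.0, high=250.0, order=4, fs=FS):
    """Fourth-order Butterworth band-pass from 10 to 250 Hz,
    applied forward and backward (zero-phase) to the sEMG record x."""
    b, a = butter(order, [low/(fs/2), high/(fs/2)], btype="band")
    return filtfilt(b, a, x)
```

A quick sanity check is that a slow drift well below 10 Hz is strongly attenuated while a component inside the pass band is essentially preserved.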

4.1. Human Intention

The identification system detects the human intention to execute three different movements: getting up from a chair, sitting down on a chair, and starting to walk. The two possible initial positions are sitting or relaxed standing. The system first identifies the initial position: from the sitting position, it detects the intention of getting up from the chair; from the relaxed standing position, it distinguishes between the sitting and walking intentions.
Characteristics of the movements:
The characteristics of the three movements are described by the activation of the muscles and the ground reaction forces at the front and rear of each foot. $F_0$ is the reaction force of both feet in the starting position, $F_1$ is the initial force to move for both feet, $F_i$ is the reaction force at the left limb, and $F_d$ is the reaction force at the right limb.
Getting up from a chair:
In the movement of getting up from a chair, the rectus femoris muscle, a knee extensor, contracts, and the gastrocnemius muscle contracts to perform the plantar flexion. The ground reaction force increases from its initial value $F_0$ to $F_1$, being higher at the back; in general, $F_1 > F_0$, as can be seen in Figure 9A.
Sitting on a chair:
In the initial movement of sitting on a chair, the rectus femoris contracts to flex the hip and the biceps femoris contracts to flex the knee. With respect to the ground reaction force, the initial value $F_0$ increases at the back of both feet ($F_{1p}$) and decreases at the front of both feet ($F_{1a}$), the sum being equal to the initial reaction force, $F_0 = F_{1p} + F_{1a}$, as can be seen in Figure 9B.
Start walking:
At the beginning of walking, the rectus femoris contracts for hip flexion and the tibialis anterior contracts for dorsiflexion. With respect to the ground reaction force, the initial value $F_0$ increases on the supporting foot (in this case, the right foot, $F_d$) and the force on the left foot $F_i$ decreases, where $F_d > F_i$ and $F_d > F_0$, as can be seen in Figure 9C.

Signal Threshold Detection

The signal threshold detection method uses the physiological description of the sEMG signals from the muscles and the physics of the FGR sensors given in Section 4.1.
When a person gets up from a chair, the gastrocnemius contracts to perform plantar flexion, as seen in Figure 10. The sEMG of the gastrocnemius (pink line) starts to increase in amplitude, indicating the intention to get up. Then, the rectus femoris muscle, a knee extensor, contracts, indicating the end of the motion. With respect to the ground reaction force in Figure 11, the FGR signals at the front part of both feet, left front (LF) and right front (RF), start to increase, indicating the initial force when a person uses the front part of the feet to get up.
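The threshold logic described in this section can be illustrated with a simplified classifier. Everything below (feature names, threshold values, and the way forces are aggregated) is a hypothetical sketch, not the authors’ implementation. Note that the text reuses “RF” both for the rectus femoris muscle and the right-front force sensor; the two dictionaries keep them separate.

```python
def classify_intention(semg, fgr, emg_th=0.3, f_th=0.1):
    """semg: normalized envelopes {'RF','BF','TA','GAS'} (muscles);
    fgr: normalized force changes vs. rest {'LF','LB','RF','RB'} (sensors)."""
    gas, rf, bf, ta = semg["GAS"], semg["RF"], semg["BF"], semg["TA"]
    front = fgr["LF"] + fgr["RF"]  # change at the front of both feet
    back = fgr["LB"] + fgr["RB"]   # change at the back of both feet
    left = fgr["LF"] + fgr["LB"]
    right = fgr["RF"] + fgr["RB"]
    # Getting up: gastrocnemius + rectus femoris, front forces rising
    if gas > emg_th and rf > emg_th and front > f_th:
        return "sit-to-stand"
    # Sitting down: rectus + biceps femoris, load shifts to the back
    if rf > emg_th and bf > emg_th and back > f_th and front < -f_th:
        return "stand-to-sit"
    # Starting to walk: rectus femoris + tibialis anterior, asymmetric feet
    if rf > emg_th and ta > emg_th and abs(right - left) > f_th:
        return "stand-to-walk"
    return "rest"
```

In practice, the thresholds would be calibrated per subject from the normalized recordings, and the decision would be taken over a short time window rather than a single sample.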
The identification percentages of the three movement routines are presented in Table 3. Furthermore, the confusion matrix to evaluate the performance of the detection is shown in Table 4.

5. Feedback Control

In this section, the control approach is described. Let us recall that, first, the human intention is detected using electromyography and baropodometry; then, the system chooses which task will be performed (sitting to standing, standing to sitting, or standing to walking) and selects one of the desired trajectories $q_d^{STS}$, $q_d^{STA}$, or $q_d^{SW}$, respectively, which were obtained by video analysis and are already programmed in the exoskeleton’s computer.
To obtain the control law, the system is considered as a cascade connection of the dynamics of the robot links and the dynamics of the motors. The link dynamics is actuated by the motor angles $q_m$ through the flexible joints, while the motor dynamics is actuated by the motor torques $\tau$. From Equation (5), which corresponds to the dynamical model, we have:
\[
M(q_e)\ddot{q}_e + C(q_e,\dot{q}_e)\dot{q}_e + g(q_e) + K q_e = K q_m, \qquad
\ddot{q}_m = J^{-1}\big(\tau - K(q_m - q_e) - B\dot{q}_m\big).
\]
The methodology used [24] considers $q_m$ as the input of the first equation in (13), and a control law $q_m^{des} \in \mathbb{R}^2$ is proposed for $q_m$ as:
\[
q_m^{des} = q_e + K^{-1}\big[ M(q_e)\dot{v} + C(q_e,\dot{q}_e)\,v + g(q_e) - K_d r \big],
\]
where $K_d \in \mathbb{R}^{2\times 2}$ is diagonal, constant, and positive definite, and
\[
v = \dot{q}_e^{des} - \lambda_1 \tilde{q}_e, \qquad
\tilde{q}_e = q_e - q_e^{des}, \qquad
r = \dot{q}_e - v,
\]
where $v$, $\tilde{q}_e$, $r \in \mathbb{R}^2$, and $\lambda_1 \in \mathbb{R}^{2\times 2}$ is diagonal, constant, and positive definite.
Defining $\tilde{q}_m = q_m - q_m^{des} \in \mathbb{R}^2$ and substituting into the first equation of (13), we have:
\[
M(q_e)\dot{r} + C(q_e,\dot{q}_e)\,r + K_d r = K\tilde{q}_m.
\]
Lyapunov’s candidate function is chosen as:
\[
V_1 = \tfrac{1}{2}\, r^T M(q_e)\, r.
\]
The derivative along the trajectories of the system is:
\[
\dot{V}_1 = \tfrac{1}{2}\, r^T \dot{M}(q_e)\, r + r^T M(q_e)\dot{r} = -r^T K_d r + r^T K \tilde{q}_m,
\]
using the skew-symmetry of $\dot{M}(q_e) - 2C(q_e,\dot{q}_e)$.
If $\tilde{q}_m = 0$, then $\dot{V}_1 < 0$ and $r$ tends to zero as $t \to \infty$; since $r = \dot{\tilde{q}}_e + \lambda_1 \tilde{q}_e$, both $\dot{\tilde{q}}_e$ and $\tilde{q}_e$ tend to zero as $t \to \infty$.
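The outer-loop computation of $q_m^{des}$, $v$, and $r$ can be sketched numerically. The gains and dynamics terms below are placeholders; this is an illustrative sketch of the outer-loop law, not the authors’ code.

```python
import numpy as np

def outer_loop(qe, qe_d, qe_des, qe_d_des, qe_dd_des,
               M, C, g, K, Kd, lam):
    """Compute q_m^des = q_e + K^{-1}[M v_dot + C v + g - Kd r]."""
    qe_t = qe - qe_des                 # link position error
    qe_t_d = qe_d - qe_d_des           # link velocity error
    v = qe_d_des - lam @ qe_t          # auxiliary reference velocity
    v_dot = qe_dd_des - lam @ qe_t_d   # its time derivative
    r = qe_d - v                       # filtered tracking error
    qm_des = qe + np.linalg.solve(K, M @ v_dot + C @ v + g - Kd @ r)
    return qm_des, v, r
```

Under perfect tracking ($q_e = q_e^{des}$, $\dot{q}_e = \dot{q}_e^{des}$, zero desired acceleration) and with the Coriolis and gravity terms vanishing, $r = 0$ and the commanded motor angle reduces to the link angle itself, as expected from the spring coupling.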
As a second step, $\tau$ is considered as the input of the second equation of (13). Differentiating $\tilde{q}_m$ gives $\dot{\tilde{q}}_m = \dot{q}_m - \dot{q}_m^{des}$ and $\ddot{\tilde{q}}_m = \ddot{q}_m - \ddot{q}_m^{des}$; substituting into (13), we have:
\[
\ddot{\tilde{q}}_m = J^{-1}\big(\tau - B\dot{q}_m + K(q_e - q_m)\big) - \ddot{q}_m^{des}.
\]
The proposed control law is:
\[
\tau = -J K_1 \dot{\tilde{q}}_m - J K_2 \tilde{q}_m + J \ddot{q}_m^{des} + B\dot{q}_m - K(q_e - q_m),
\]
where $K_1$ and $K_2$ are positive diagonal matrices.
Then,
\[
J^{-1}\big(\tau - B\dot{q}_m + K(q_e - q_m)\big) - \ddot{q}_m^{des} = -K_1 \dot{\tilde{q}}_m - K_2 \tilde{q}_m;
\]
therefore,
\[
\ddot{\tilde{q}}_m = -K_1 \dot{\tilde{q}}_m - K_2 \tilde{q}_m.
\]
Lyapunov’s candidate function is chosen as:
\[
V_2 = \tfrac{1}{2}\,\dot{\tilde{q}}_m^T \dot{\tilde{q}}_m + \tfrac{1}{2}\,\tilde{q}_m^T \tilde{q}_m.
\]
The derivative along the trajectories of the system is:
\[
\dot{V}_2 = \dot{\tilde{q}}_m^T \ddot{\tilde{q}}_m + \tilde{q}_m^T \dot{\tilde{q}}_m
= -\dot{\tilde{q}}_m^T K_1 \dot{\tilde{q}}_m - \dot{\tilde{q}}_m^T (K_2 - I)\,\tilde{q}_m,
\]
where $I$ is the identity matrix.
If $K_2 > I$, then $\dot V_2 < 0$, so $\tilde q_m \to 0$, $\dot{\tilde q}_m \to 0$, and $\ddot{\tilde q}_m \to 0$; from (16), it then follows that $r \to 0$ and hence $\tilde q_e \to 0$.
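The motor-side error dynamics above form a linear system, so the stability condition ($K_1$ positive diagonal, $K_2 > I$) can be checked numerically. A minimal sketch (the specific gain values below are assumed for illustration, not taken from the paper):

```python
import numpy as np

# Motor-side error dynamics from the derivation, in first-order form:
#   d/dt [q~_m, dq~_m] = A [q~_m, dq~_m],  with  A = [[0, I], [-K2, -K1]]
# Exponential convergence of q~_m requires every eigenvalue of A to lie
# in the open left half-plane.
K1 = np.diag([20.0, 20.0])   # assumed example gain, positive diagonal
K2 = np.diag([15.0, 15.0])   # assumed example gain, satisfies K2 > I

A = np.block([[np.zeros((2, 2)), np.eye(2)],
              [-K2, -K1]])
print(np.linalg.eigvals(A).real.max())  # negative => q~_m -> 0
```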

6. Results

6.1. Numerical Results

6.1.1. Detection Algorithm Implementation

The detection algorithm for the human intention was tested offline with the signals obtained from the sEMG and FGR sensors of the system described in Section 4.1. The database contains 10 s long signals acquired at a sampling frequency of 1 kHz from a healthy female subject during the performance of the three movements: sitting to standing, standing to sitting, and standing to walking, over 10 trials per movement. Ten trials suffice to evaluate the detection algorithm, since the signal behavior was consistent across four recording sessions performed under identical conditions. Considering the three movements, each session comprises 30 trials in total; sessions were limited to 30 trials because the muscular fatigue induced by additional trials could degrade the signals. Figure 10 shows a record of normalized sEMG signals from sitting to standing. The four channels correspond to the four muscles: Rectus Femoris (RF) in red, Biceps Femoris (BF) in blue, Tibialis Anterior (TA) in green, and Gastrocnemius (GAS) in pink. The envelopes of the electromyographic signals were obtained by the MyoWare® sensors, and an increase in amplitude represents the contraction of the muscle. The normalized FGR signals from sitting to standing of the four sensors, corresponding to the positions Left Front (LF), Left Back (LB), Right Front (RF), and Right Back (RB), are shown in Figure 11. An increase in the FGR signals represents the force exerted by both feet.
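The classifier itself is specified elsewhere in the paper; purely as an illustrative sketch, windowed features of the four sEMG envelopes and four FGR channels could feed a minimal nearest-centroid classifier as below. All signal shapes, feature choices, and parameters here are invented for illustration and are not the paper's:

```python
import numpy as np

rng = np.random.default_rng(0)
classes = ["STS", "STA", "SW"]

def features(emg, fgr):
    """Feature vector: mean rectified sEMG envelope per channel (4 values)
    plus net FGR change over the window per channel (4 values)."""
    return np.concatenate([np.abs(emg).mean(axis=1), fgr[:, -1] - fgr[:, 0]])

def make_trial(kind, n=1000):
    """Synthetic 1 s trial sampled at 1 kHz; signal shapes are invented."""
    t = np.linspace(0.0, 1.0, n)
    emg = 0.05 * rng.standard_normal((4, n))   # baseline envelope noise
    fgr = np.zeros((4, n))
    if kind == "STS":        # sitting to standing: RF/TA burst, FGR rises
        emg[[0, 2]] += 0.8 * np.sin(np.pi * t)
        fgr += t
    elif kind == "STA":      # standing to sitting: BF/GAS burst, FGR falls
        emg[[1, 3]] += 0.8 * np.sin(np.pi * t)
        fgr += 1.0 - t
    else:                    # standing to walking: rhythmic, asymmetric FGR
        emg += 0.5 * np.abs(np.sin(4.0 * np.pi * t))
        fgr[[0, 1]] += t
        fgr[[2, 3]] += 1.0 - t
    return emg, fgr

# "Train" a nearest-centroid classifier on 10 trials per movement
centroids = {c: np.mean([features(*make_trial(c)) for _ in range(10)], axis=0)
             for c in classes}

def classify(emg, fgr):
    f = features(emg, fgr)
    return min(classes, key=lambda c: np.linalg.norm(f - centroids[c]))

print(classify(*make_trial("SW")))  # classifies a fresh synthetic trial
```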

6.1.2. Implementation of the Exoskeleton’s Control

The simulation of the control applied to the lower limb robot was carried out in MATLAB/Simulink. The desired position was entered as a column vector containing the two desired joint values $q_{d1}$ and $q_{d2}$. The control law of Equations (14) and (19) was simulated together with the dynamical model of Equation (5), considering the parameters indicated in Table 5.
The initial conditions of the angular positions are $q_1(0) = 0$ and $q_2(0) = 0$, and the control gains are $\lambda_1 = \mathrm{diag}\{10, 10\}$ and $K_d = \mathrm{diag}\{1, 1\}$.
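As an illustration of the simulated cascade scheme, the following minimal sketch integrates a single flexible joint (a scalar version of the model and of Equations (14) and (19)) with the gains above. The link parameters follow Table 5 (link 1); the spring stiffness, motor inertia, motor damping, and inner-loop gains are assumed values, and the derivatives of $q_m^{des}$ are obtained by finite differences as a simulation shortcut:

```python
import math

# Single flexible joint, scalar sketch of the cascade controller.
# link:   I_l*ddqe + m*g0*lc*sin(qe) + k*(qe - qm) = 0
# motor:  J*ddqm + B*dqm + k*(qm - qe) = tau
# m, I_l, lc from Table 5 (link 1); k, J, B are assumed values.
m, I_l, lc, g0 = 0.5, 0.12, 0.026, 9.81
k, J, B = 50.0, 0.01, 0.1
lam, Kd = 10.0, 1.0          # gains used in the paper's simulation
K1, K2 = 20.0, 20.0          # assumed inner-loop gains (K1 > 0, K2 > 1)
qd = 0.5                     # constant desired joint angle (rad)

dt, steps = 1e-4, 50_000     # 5 s of simulated time
qe = dqe = qm = dqm = 0.0
qmdes_prev = dqmdes_prev = None

for _ in range(steps):
    # Outer loop: desired motor angle, scalar form of Eq. (14)
    v, dv = -lam * (qe - qd), -lam * dqe
    r = dqe + lam * (qe - qd)
    g_q = m * g0 * lc * math.sin(qe)
    qmdes = qe + (I_l * dv + g_q - Kd * r) / k
    # Derivatives of qmdes via finite differences (simulation shortcut)
    dqmdes = 0.0 if qmdes_prev is None else (qmdes - qmdes_prev) / dt
    ddqmdes = 0.0 if dqmdes_prev is None else (dqmdes - dqmdes_prev) / dt
    qmdes_prev, dqmdes_prev = qmdes, dqmdes
    # Inner loop: motor torque, scalar form of Eq. (19)
    tqm, dtqm = qm - qmdes, dqm - dqmdes
    tau = -J * K1 * dtqm - J * K2 * tqm + J * ddqmdes + B * dqm - k * (qe - qm)
    # Semi-implicit Euler integration of link and motor dynamics
    ddqe = (-g_q - k * (qe - qm)) / I_l
    ddqm = (tau - B * dqm - k * (qm - qe)) / J
    dqe += dt * ddqe
    qe += dt * dqe
    dqm += dt * ddqm
    qm += dt * dqm

print(round(qe, 3))  # expected to settle near qd = 0.5
```

The steady state is consistent with the flexible-joint model: the motor settles slightly beyond the link angle, by $g(q_e)/k$, to balance gravity through the spring.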
The results of the trajectory tracking are shown in Figure 12 and Figure 13. The desired trajectories are shown in blue, the trajectory of the links is shown in black and the trajectory of the motors in red. The tracking position is displayed for the Sitting to Standing (STS), Standing to Sitting (STA), and Standing to Walking (SW) movement routines.
The trajectories of the links $q_1$ and $q_2$ (black line) tend to the desired trajectories $q_{1d}$ and $q_{2d}$ (blue dotted line). The trajectories of the motors $q_{1m}$ and $q_{2m}$ (red dotted line) tend to the desired trajectories with some noise; this behavior is due to the springs in the elastic joints (Figure 12 and Figure 13).
For the hip joint, the variation of the angular position is greater than for the knee joint because of the larger inertia that must be moved.
The tracking errors, denoted $e_q = q_d - q$ for the links and $e_{qm} = q_d - q_m$ for the motors of the flexible-joint model, are shown in Figure 14 and Figure 15.
The tracking error of the links $e_q$ (black line) oscillates around zero with little noise; the tracking error of the motors $e_{qm}$ (blue line) fluctuates due to the spring dynamics.

6.2. Real-Time Experimental Results

This section presents the results obtained from the experimental evaluation of the tracking control, implemented on a National Instruments myRIO data acquisition card and programmed in LabVIEW.
The human intention identification system was tested with a healthy 27-year-old female subject weighing 55 kg and measuring 1.6 m in height.
The objective of the controller is trajectory tracking, and the flexible-joint exoskeleton performs the movement corresponding to the identified task; see Figure 16.
The exoskeleton trajectory tracking is shown at the top of Figure 17. The Standing to Walking (SW) movement achieved by the hip is shown in Figure 17A and by the knee in Figure 17B. The joint trajectories for Standing to Walking are denoted by $q_1^{SW}$ and $q_2^{SW}$ (blue dotted line), and the desired joint trajectories by $q_{1d}^{SW}$ and $q_{2d}^{SW}$ (black line). The motor trajectories are denoted by $q_{1m}^{SW}$ and $q_{2m}^{SW}$.
Figure 17C,D likewise show the exoskeleton movement from Sitting to Standing (STS), carried out by the joints $q_1^{STS}$ and $q_2^{STS}$ (blue dotted line), which follow the desired trajectories $q_{1d}^{STS}$ and $q_{2d}^{STS}$ (black line).

7. Conclusions

A motion intention detection system was developed using electromyography sensors and ground reaction force sensors that operate as a baropodometry sensor. Four muscles were selected whose signals are read by the detection system: the Biceps Femoris (BF), the Rectus Femoris (RF), the Tibialis Anterior (TA), and the Gastrocnemius (GAS).
The human intention detection system achieves 90% correct detection. In addition, the performance metrics show an average precision of 0.9, indicating a good performance of the system.
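Both metrics can be reproduced directly from the confusion matrix of Table 4; a short sketch:

```python
import numpy as np

# Confusion matrix from Table 4 (rows: predicted STS/STA/SW, columns: actual)
cm = np.array([[9, 2, 0],
               [1, 8, 0],
               [0, 0, 10]])

accuracy = np.trace(cm) / cm.sum()        # fraction of correct detections
precision = np.diag(cm) / cm.sum(axis=1)  # per-class precision (row-wise)
# accuracy is 0.9 and the mean precision rounds to 0.9, matching the text
print(accuracy, np.round(precision, 2), round(precision.mean(), 2))
```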
An important contribution of this paper is to show an effective way to obtain the human intention using electromyography and baropodometry signals. However, sensing the intention of movement mechanically is essential for other types of active exercises; for this reason, the exoskeleton was designed with flexible actuators containing torque sensors.
The trajectories required for the motion of the exoskeleton's joints were obtained from video-based motion analysis and approximated by trigonometric polynomials. The trajectories used in the implementation of the exoskeleton control were suitable for the 3D-CAD motion simulation.
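A trigonometric-polynomial (truncated Fourier series) approximation of sampled joint angles can be obtained by linear least squares. A minimal sketch with an invented sample trajectory standing in for the video-derived data, which are not reproduced here:

```python
import numpy as np

def trig_fit(t, q, n_harm, w):
    """Least-squares fit of a0 + sum_k (a_k cos(k*w*t) + b_k sin(k*w*t))."""
    cols = [np.ones_like(t)]
    for kk in range(1, n_harm + 1):
        cols += [np.cos(kk * w * t), np.sin(kk * w * t)]
    A = np.column_stack(cols)                      # design matrix
    coeffs, *_ = np.linalg.lstsq(A, q, rcond=None)
    return coeffs, A @ coeffs

t = np.linspace(0.0, 2.0, 200)   # a 2 s routine sampled at 100 Hz
w = np.pi                        # fundamental frequency (2 s period)
q = 30 + 20 * np.sin(w * t) - 5 * np.cos(2 * w * t)  # invented hip angle, deg
coeffs, q_fit = trig_fit(t, q, n_harm=3, w=w)
print(np.max(np.abs(q - q_fit)))  # residual near machine precision here
```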
The proposed control law is based on the dynamic model of the exoskeleton, which mobilizes the hip and knee joints. The implemented control achieves good trajectory-tracking performance, meaning that the tracking errors tend to zero. When a rehabilitation routine is performed, it is most convenient to adjust the control gains; this is simple, requiring only that the gains preserve closed-loop stability ($K_1$ must be positive and $K_2 > I$). The error in the link model is smaller than in the motor model, which is explained by the dynamics of the elastic joints of the motors.

Author Contributions

Conceptualization, Y.R.-L. and R.L.-G.; methodology, Y.R.-L., K.I.E.-E., R.L.-G., S.S. and R.L.; software, Y.R.-L. and K.I.E.-E.; validation, S.S. and R.L.; formal analysis, K.I.E.-E.; investigation, Y.R.-L., K.I.E.-E., R.L.-G., S.S. and R.L.; resources, R.L.; data curation, K.I.E.-E.; writing-original draft preparation, Y.R.-L., K.I.E.-E. and R.L.-G.; writing-review and editing, R.L.-G., S.S. and R.L.; visualization, R.L.-G.; supervision, R.L.; project administration, S.S. and R.L.; funding acquisition, R.L. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Chávez-Cardona, M.A.; Rodríguez-Spitia, F.; Baradica-López, A. Exoskeletons to enhance human capabilities and support rehabilitation: A state of the art. Rev. Ing. Bioméd. 2010, 4, 69–80.
  2. Dujardin, F.; Tobenas-Dujardin, A.; Weber, J. Anatomía y fisiología de la marcha, de la posición sentada y de la bipedestación. EMC-Apar. Locomot. 2009, 42, 1–20.
  3. Schenkman, M.; Berger, R.; Riley, P.; Mann, R.; Hodge, W. Whole-body movements during rising to standing from sitting. Phys. Ther. 1990, 70, 638–648, discussion 648–651.
  4. Yoshida, K.; An, Q.; Yozu, A.; Chiba, R.; Takakusaki, K.; Yamakawa, H.; Tamura, Y.; Yamashita, A.; Asama, H. Visual and Vestibular Inputs Affect Muscle Synergies Responsible for Body Extension and Stabilization in Sit-to-Stand Motion. Front. Neurosci. 2019, 12, 1042.
  5. Hernández Hernández, J.; Cruz, S.; López-Gutiérrez, R.; González-Mendoza, A.; Lozano, R. Robust nonsingular fast terminal sliding-mode control for Sit-to-Stand task using a mobile lower limb exoskeleton. Control Eng. Pract. 2020, 101, 104496.
  6. Luciani, B.; Braghin, F.; Pedrocchi, A.; Gandolla, M. Technology Acceptance Model for Exoskeletons for Rehabilitation of the Upper Limbs from Therapists' Perspectives. Sensors 2023, 23, 1721.
  7. Fan, Y.; Yin, Y. Active and Progressive Exoskeleton Rehabilitation Using Multisource Information Fusion From EMG and Force-Position EPP. IEEE Trans. Biomed. Eng. 2013, 60, 3314–3321.
  8. Ma, G.; Li, M.; Wang, Q. Mechanical design of a whole-arm exoskeleton rehabilitation robot based on PNF. In Proceedings of the 2016 13th International Conference on Ubiquitous Robots and Ambient Intelligence (URAI), Xian, China, 19–22 August 2016; pp. 777–780.
  9. Pratt, J.; Krupp, B.; Morse, C.; Collins, S. The RoboKnee: An exoskeleton for enhancing strength and endurance during walking. In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA '04), New Orleans, LA, USA, 26 April–1 May 2004; Volume 3, pp. 2430–2435.
  10. Tiboni, M.; Borboni, A.; Vérité, F.; Bregoli, C.; Amici, C. Sensors and Actuation Technologies in Exoskeletons: A Review. Sensors 2022, 22, 884.
  11. Herron, C.W.; Fuge, Z.J.; Kogelis, M.; Tremaroli, N.J.; Kalita, B.; Leonessa, A. Design and Validation of a Low-Level Controller for Hierarchically Controlled Exoskeletons. Sensors 2023, 23, 1014.
  12. Aissaoui, R.; Dansereau, J. Biomechanical analysis and modelling of sit to stand task: A literature review. In Proceedings of the 1999 IEEE International Conference on Systems, Man, and Cybernetics (SMC '99), Tokyo, Japan, 12–15 October 1999; Volume 1, pp. 141–146.
  13. Lerner, Z.; Damiano, D.; Bulea, T. A lower-extremity exoskeleton improves knee extension in children with crouch gait from cerebral palsy. Sci. Transl. Med. 2017, 9, 404.
  14. Zhang, L.; Liu, G.; Han, B.; Wang, Z.; Zhang, T. sEMG Based Human Motion Intention Recognition. J. Robot. 2019, 2019, 3679174.
  15. Moore, K.; Agur, A.; Dalley, A. Fundamentos de Anatomía con Orientación Clínica, 7th ed.; Wolters Kluwer: Philadelphia, PA, USA, 2013.
  16. Ai, Q.; Ding, B.; Liu, Q.; Meng, W. A Subject-Specific EMG-Driven Musculoskeletal Model for Applications in Lower-Limb Rehabilitation Robotics. Int. J. Humanoid Robot. 2016, 13, 1650005.
  17. Peña, G.; Consoni, L.; Santos, W.; Siqueira, A. Feasibility of an optimal EMG-driven adaptive impedance control applied to an active knee orthosis. Robot. Auton. Syst. 2019, 112, 98–108.
  18. Shen, H.; Song, Q.; Deng, X.; Zhao, Y.; Yu, Y.; Ge, Y. Recognition of phases in sit-to-stand motion by Neural Network Ensemble (NNE) for power assist robot. In Proceedings of the 2007 IEEE International Conference on Robotics and Biomimetics (ROBIO), Sanya, China, 15–18 December 2007; pp. 1703–1708.
  19. Roach, K.; Miles, T. Normal hip and knee active range of motion: The relationship to age. Phys. Ther. 1991, 71, 656–665.
  20. Sandoval-Gonzalez, O.; Aguilar-Serena, R.; Rosa, D.; Herrera-Aguilar, I.; González-Sánchez, B. Diseño de un sistema de adquisición de señales electromiográficas inalámbrico. In Proceedings of the Congreso Internacional sobre Innovación y Desarrollo Tecnológico, Cuernavaca, Mexico, 23–25 March 2013.
  21. Merlo, A.; Campanini, I. Technical Aspects of Surface Electromyography for Clinicians. Open Rehabil. J. 2010, 3, 98–109.
  22. Cavalcanti Garcia, M.; Vieira, T. Surface electromyography: Why, when and how to use it. Rev. Andal. Med. Deporte 2011, 4, 17–28.
  23. Tan, L.; Jiang, J. "Signal Sampling and Quantization", 3rd ed.; Elsevier: London, UK, 2019.
  24. Pham, D. Modeling and Control of Robot Manipulators by L. Sciavicco and B. Siciliano. Robotica 1998, 16, 701. ISBN 0-07-057217-8.
Figure 1. On the left is the real prototype of the exoskeleton for gait rehabilitation during a walking routine and on the right is the CAD design of the prototype with a full view.
Figure 2. CAD drawing of the exoskeleton legs. In the upper part there is an isometric view and in the lower part a side view of the sitting process.
Figure 3. CAD drawing of the exoskeleton lifting system consisting of a double four-bar system operated by two linear actuators.
Figure 4. Schematic diagram of the exoskeleton; link 1 corresponds to the thigh and link 2 corresponds to the leg with lengths l 1 and l 2 , and with masses m 1 and m 2 , respectively.
Figure 5. CAD schematic of an elastic rotational actuator; $q_m$ is the angular position of the motor and $q_e$ is the angular position of the link.
Figure 6. Frames of video analysis to obtain the trajectories of the cycle from sitting to standing, using markers in hip, knees, and ankles; (A) Fully seated, (B) Lean to stand, (C) Bow on Hip, knees and ankles, and (D) Fully standing.
Figure 7. Motion trajectories of the exoskeleton for sitting to standing ( q d STS ), standing to sitting ( q d STA ), and standing to walking ( q d SW ), for the hip and knee joints, denoted by q d 1 and q d 2 , respectively.
Figure 8. Block diagram of the human intention identification system, integrating electromyography sensors (A: front view; B: rear view), baropodometry sensors, and the data acquisition card that sends the signals to the classifier in the computer.
Figure 9. Effects obtained from the human intention detection system for: (A) Getting up, (B) Sitting, and (C) Starting to walk.
Figure 10. Electromyographic signals obtained during the movement from sitting to standing: Rectus Femoris (RF) in red, Biceps Femoris (BF) in blue, Tibialis Anterior (TA) in green, and Gastrocnemius (GAS) in pink.
Figure 11. FGR signals of the movement from sitting to standing, Left Front (LF), Left Back (LB), Right Front (RF), and Right Back (RB).
Figure 12. Hip joint tracking of movement from (A) Sitting to Standing (STS), (B) Standing to Sitting (STA) and (C) Standing to Walking (SW); q 1 : the trajectory of the link, q 1 d : desired trajectory, q 1 m : the trajectory of the motor.
Figure 13. Knee joint tracking of movement from (A) Sitting to Standing (STS), (B) Standing to Sitting (STA) and (C) Standing to Walking (SW); q 2 : the trajectory of the link, q 2 d : desired trajectory, q 2 m : the trajectory of the motor.
Figure 14. Hip joint tracking error of movement from (A) Sitting to Standing (STS), (B) Standing to Sitting (STA) and (C) Standing to Walking (SW); e q 1 : error of the trajectory of the link, e q 1 m : error of the trajectory of the motor.
Figure 15. Knee joint tracking error of movement from (A) Sitting to Standing (STS), (B) Standing to Sitting (STA) and (C) Standing to Walking (SW); e q 2 : error of the trajectory of the link, e q 2 m : error of the trajectory of the motor.
Figure 16. Block diagram of the control system, comprising the human intention detection system (electromyography sensors, A: front view and B: rear view, and baropodometry sensors), the control law, and the exoskeleton prototype.
Figure 17. Exoskeleton tracking trajectory of (A) Hip from Standing to Walking (SW), (B) Knee from Standing to Walking (SW), (C) Hip from Sitting to Standing (STS) and (D) Knee from Sitting to Standing (STS); q i : the trajectory of the link, q i d : desired trajectory, q i m : the trajectory of the motor with i = 1 , 2 .
Table 1. Supported range of motion for the leg joints.

| Joint | Human Min. Angle (deg) [19] | Human Max. Angle (deg) [19] | Robot Min. Angle (deg) | Robot Max. Angle (deg) |
|---|---|---|---|---|
| Hip | −22 | 122 | −14 | 90 |
| Knee | 0 | 134 | 0 | 110 |
Table 2. Upper bound in trajectories.

| Routine | Trajectory | Upper Bound |
|---|---|---|
| Sitting to standing | q d 1 STS ( t ) | 51.6276 deg |
| | q d 2 STS ( t ) | 72.9801 deg |
| | q ˙ d 1 STS ( t ) | 118.5088 deg/s |
| | q ˙ d 2 STS ( t ) | 115.8713 deg/s |
| Standing to sitting | q d 1 STA ( t ) | 51.1969 deg |
| | q d 2 STA ( t ) | 74.2387 deg |
| | q ˙ d 1 STA ( t ) | 118.7486 deg/s |
| | q ˙ d 2 STA ( t ) | 167.3876 deg/s |
| Standing to walking | q d 1 SW ( t ) | 38.3282 deg |
| | q d 2 SW ( t ) | 70.3928 deg |
| | q ˙ d 1 SW ( t ) | 66.8698 deg/s |
| | q ˙ d 2 SW ( t ) | 99.3652 deg/s |
Table 3. Identification percentages.

| Identification Percentages | STS | STA | SW |
|---|---|---|---|
| % correct | 90% | 80% | 100% |
| % incorrect | 10% | 20% | 0% |
Table 4. Confusion matrix.

| | STS Actual | STA Actual | SW Actual | Total |
|---|---|---|---|---|
| STS Predicted | 9 | 2 | 0 | 11 |
| STA Predicted | 1 | 8 | 0 | 9 |
| SW Predicted | 0 | 0 | 10 | 10 |
| Total | 10 | 10 | 10 | |
Table 5. Simulation parameters.

| Variable | Meaning | Value |
|---|---|---|
| m 1 | Link 1 mass | 0.5 kg |
| m 2 | Link 2 mass | 0.6 kg |
| I 1 | Link 1 moment of inertia | 0.12 kg m² |
| I 2 | Link 2 moment of inertia | 0.12 kg m² |
| l 1 | Link 1 length | 0.26 m |
| l 2 | Link 2 length | 0.30 m |
| l c 1 | Link 1 center-of-mass distance | 0.026 m |
| l c 2 | Link 2 center-of-mass distance | 0.026 m |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
