Real-Time Detection of 3D Human Hand Orientation Based on Morphology

This paper describes a new methodology for tracking human hand movement in 3D space and estimating its orientation and position in real time. The ultimate objective of this algorithm is its use in robotic spherical-wrist applications. The method combines image processing and morphology techniques with various mathematical formulae to calculate the hand position and orientation. Its advantage is that it does not require the continuous camera calibration needed by other conventional methods in similar applications. The results show that the proposed method correctly identifies a large number of hand movements and orientations. It could therefore be used with different types of tele-operated robotic manipulators or in other human-computer interaction applications. The method is robust and computationally inexpensive, unlike approaches that rely on costly learning functions; high performance was achieved during experimental testing thanks to accurate hand-movement identification and a low computational load.


Introduction
Vision-based tracking of hand motion is one of the most active areas of research, and many researchers are working in this area, e.g. [1], [2], and [3]. However, the problem of correctly identifying a hand and completely simulating its movement is not yet solved. Several previously proposed algorithms extract features from the hand and then classify them using saved data or information (e.g. [4] and [5]). Although some of these techniques have reported a degree of success, they suffer from many restrictions, such as requiring gestures and positions to be fed into the system beforehand; other solutions require invasive devices. In this paper, a new method is presented to identify the human hand in real time and to analyse the resulting information to obtain the hand's position and orientation without any prior database. A plain background is used for hand detection; the results show high consistency across hands of different sizes and colours, and the simulated movement and rotation matched those of the subject's hand.
After the hand's orientation and position in 3D space are obtained and matched, these results are used as the input to a simulated robotic manipulator, and the position and orientation of its end-effector accurately matched the user's hand. Furthermore, additional information from the hand, such as the opening or closing of the end-effector gripper, is obtained and sent directly to the simulated manipulator, which emulated the hand action [6].

Deduction of 3D Hand Orientation
First the hand position is determined; then the same set of features can be used to estimate the orientation of the hand in 3D space. The three quantities to be determined are the angles through which the hand has rotated about the three principal axes, namely the x, y and z-axes.
The rotation about the z-axis (depth) occurs on the plane parallel to the screen and can be determined easily. This angle is measured using two previously obtained features: the reference point (located in the middle of the wrist) and the tip of the hand (the point on the hand farthest from the reference point). After the coordinates of these points on the plane parallel to the screen (the xy-plane) are obtained, a line is drawn between them, and the angle through which the hand has rotated about the z-axis can be computed using the atan2 function in MATLAB. In practice, the rotation of a hand lies in the range 0°-180°.
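As a concrete illustration, the z-axis angle computation can be sketched as follows. This is a minimal Python sketch, not the authors' MATLAB code; the point names, the image-coordinate convention (y growing downward), and the folding into 0°-180° are assumptions based on the text.

```python
import math

def z_rotation_deg(ref_point, tip_point):
    """Angle of the hand about the z-axis on the image (xy) plane.

    ref_point -- (x, y) of the wrist reference point
    tip_point -- (x, y) of the hand tip (farthest point from the reference)
    Assumes image coordinates with y increasing downward.
    """
    b = tip_point[0] - ref_point[0]   # x-coordinate difference
    a = tip_point[1] - ref_point[1]   # y-coordinate difference
    angle = math.degrees(math.atan2(a, b))
    # Fold into the 0-180 degree range used in practice.
    return angle % 180.0

# A hand pointing straight up on screen (tip directly above the wrist)
# yields approximately 90 degrees, as in Fig. 1(a):
print(z_rotation_deg((100, 200), (100, 80)))
```

A hand lying flat along the image x-axis gives an angle near 0°, and intermediate poses interpolate between the two.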
Here, b is the difference between the x coordinate of the reference point and the x coordinate of the hand tip, and a is the difference between their y coordinates. Using these values, the first angle of the hand's orientation is obtained. In addition, if the hand rotates towards the screen, the upper half of the hand shape becomes larger; refer to Fig. 1(c). Calculating the next two angles required to complete the detection of the hand's orientation is more difficult. The first of these, the angle about the y-axis (the vertical axis), uses the previously obtained hand-width property. As noted, the hand width is recorded at the beginning of the algorithm, and any subsequent change in this property is compared with the originally registered width.
The original registration of the hand is performed at a 0° rotation about the y-axis, so the width at this initialization stage is the maximum value. Each decrease in hand width therefore reflects a relative rotation about the y-axis. If it is assumed that a full rotation of 90° results in a hand width of half the maximum, a relationship between the angle and the width can be formed, as shown in (1).
θy = 180° × (1 − w/wmax)    (1)

where w is the currently measured hand width and wmax is the width registered at initialization. This variation, which differs between humans and even between a person's two hands (right or left), can be accommodated by changing the equation that relates the width to the rotation about the y-axis. Fig. 2 shows an example demonstrating the algorithm used to determine the rotation about the y-axis.
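Under the stated assumption (full width at 0°, half width at 90°, linear in between), the width-to-angle relation can be sketched as follows; the clamping to the model's valid range is an added safeguard, not something the paper specifies.

```python
def y_rotation_deg(width, max_width):
    """Estimate rotation about the vertical (y) axis from apparent hand width.

    Assumes, as in the text, that the width registered at initialization
    (max_width, at 0 deg rotation) shrinks linearly to half its value at a
    full 90 deg rotation; the exact per-user form may differ.
    """
    ratio = max(0.5, min(1.0, width / max_width))  # keep within the model's range
    return 180.0 * (1.0 - ratio)

print(y_rotation_deg(100, 100))  # full width          -> 0.0
print(y_rotation_deg(75, 100))   # three-quarter width -> 45.0
print(y_rotation_deg(50, 100))   # half width          -> 90.0
```

Per-user calibration would amount to re-fitting the slope of this line, as the text notes.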

Robotic Wrist Manipulation
Finally, the hand position and orientation in 3D space obtained from the proposed methodology are fed into a robotic manipulator to control the position and orientation of its end-effector so that it mimics the hand movements in real time. The manipulator used in this study is a modified SCARA robot: a SCARA manipulator with a spherical wrist as its end-effector, which gives it complete freedom of movement in 3D space with regard to both position and orientation.
Each link of the robotic manipulator is assigned a coordinate frame. The Denavit-Hartenberg (DH) algorithm provides a matrix method for systematically assigning coordinate systems to each link of an articulated chain. The axis of revolute joint i is aligned with axis z_{i-1}, and the x_{i-1} axis is directed along the common normal from z_{i-1} to z_i (for intersecting axes, parallel to z_{i-1} × z_i). These parameters and conventions are applied to our robotic manipulator; the resultant DH parameters are shown in Table 1.
Table 1. DH parameters for the proposed simulated robotic manipulator.

Using these parameters, the inverse kinematics model describing each link angle and link length with respect to the desired position and orientation of the manipulator wrist can be calculated using (2) and (3), where z_des is the desired orientation and position of the robot arm, which should be identical to the human hand position and orientation. After θ2 is obtained, the system for sin θ1 and cos θ1 shown in (4) can be solved, where θ1 = atan2(sin θ1, cos θ1). The desired orientation (rotation) matrix uses the notation shown in (5). The corresponding rotation matrices for the three rotation angles can then be multiplied, as shown in (6), to obtain R_des. The orientation matrix can then be used to obtain the desired rotation angles, as shown in (7). Consequently, for any desired orientation and position of the robotic manipulator, we can obtain the required link angles and lengths. Matching the state of the human hand in 3D space to a real robot can thus be accomplished through simple mathematical calculations. An example of this mapping is shown in Fig. 5 and Fig. 6, where the simulated manipulator follows the hand movements.
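Since the explicit forms of Eqs. (2)-(7) are not reproduced above, the following Python sketch shows their standard textbook counterparts: the two-link planar inverse kinematics of a SCARA arm and the recovery of rotation angles from an orientation matrix. The link lengths a1, a2, the elbow-down branch, and the ZYZ Euler convention are assumptions, not taken from the paper.

```python
import math

def scara_arm_ik(x, y, a1, a2):
    """Planar two-link inverse kinematics (textbook law-of-cosines form).

    Returns (theta1, theta2) for a wrist target (x, y); link lengths a1, a2
    are assumed names. The elbow-down solution branch is chosen.
    """
    c2 = (x * x + y * y - a1 * a1 - a2 * a2) / (2.0 * a1 * a2)
    s2 = math.sqrt(max(0.0, 1.0 - c2 * c2))      # elbow-down branch
    theta2 = math.atan2(s2, c2)
    theta1 = math.atan2(y, x) - math.atan2(a2 * s2, a1 + a2 * c2)
    return theta1, theta2

def euler_zyz_from_R(R):
    """Recover rotation angles from a 3x3 orientation matrix R.

    Uses the ZYZ convention (the paper's Eq. (7) may use a different one);
    the singular case beta = 0 is not handled in this sketch.
    """
    beta = math.atan2(math.hypot(R[0][2], R[1][2]), R[2][2])
    alpha = math.atan2(R[1][2], R[0][2])
    gamma = math.atan2(R[2][1], -R[2][0])
    return alpha, beta, gamma
```

With the arm fully stretched along the x-axis (x = a1 + a2, y = 0), both joint angles come out as zero, which is a quick sanity check on the formulas.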

Application and Results
In this manuscript, a new algorithm has been proposed that correctly and accurately extracts the 3D position and orientation of a human hand; the algorithm was then applied directly to a simulated robotic manipulator to test its accuracy. In all our experiments, the manipulator was successfully tele-operated using a single webcam; the detailed representation of the robot parts corresponding to the human hand parts is shown in Fig. 6. The extraction of the human hand rotation about the x and z-axes was obtained. In addition, the range of angles for the hand orientation is realistic but may be insufficient for applications in which the exact calibrated angle of the 3D hand orientation is required.

Otherwise, when the hand is rotated backwards, away from the screen, the upper half of the hand image becomes smaller; refer to Fig. 1(b), where this is recognisable by eye.

Fig. 1 .
Fig. 1. Rotation of the human hand about the z-axis: (a) angle of rotation approximately 90°, (b) angle of rotation less than 90°, and (c) angle of rotation greater than 90°.
The link and joint parameters may be summarized as follows: (i) the link length a_i is the offset distance between axes z_{i-1} and z_i along the x_i-axis; (ii) the link twist α_i is the angle from the z_{i-1}-axis to the z_i-axis about the x_i-axis; (iii) the link offset d_i is the distance from the origin of frame i−1 to the x_i-axis along the z_{i-1}-axis; (iv) the joint angle θ_i is the angle between the x_{i-1} and x_i-axes about the z_{i-1}-axis.
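For reference, the homogeneous transform built from these four DH parameters takes the standard textbook form, sketched here in Python (this is the generic DH matrix, not code from the paper):

```python
import math

def dh_transform(a, alpha, d, theta):
    """Homogeneous transform from frame i-1 to frame i built from the four
    standard DH parameters: link length a, link twist alpha, link offset d,
    and joint angle theta."""
    ct, st = math.cos(theta), math.sin(theta)
    ca, sa = math.cos(alpha), math.sin(alpha)
    return [
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ]
```

Chaining one such transform per row of Table 1 yields the forward kinematics of the manipulator.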

Fig. 5 .
Fig. 5. Simulation of the configuration of a human hand with a robotic manipulator: (a) original registered configuration, (b) rotation about two angles, and (c) rotation and translation.

Fig. 6 .
Fig. 6. Representation of the hand feature points on the SCARA manipulator robot.