Hand Gesture Based Wireless Robotic Arm Control for Agricultural Applications

One of the major challenges in agriculture is harvesting. It is hard, and sometimes even unsafe, for workers to go to each plant and pluck fruits. Robotic systems are increasingly combined with new technologies to automate or semi-automate labour-intensive work such as grape harvesting. In this work we propose a semi-automatic method to aid fruit harvesting and thereby increase productivity per man-hour. A robotic arm fixed to a rover roams the orchard, and the user controls it remotely using a hand glove fitted with various sensors. These sensors position the robotic arm remotely to harvest the fruits. In this paper we discuss the design of the sensor-fitted hand glove, the design of the 4-DoF robotic arm and the wireless control interface. The system setup and its testing and evaluation under lab conditions are also presented.


INTRODUCTION
Harvesting fruits and coconuts is a severe challenge to the agriculture industry, as the number of traditional human harvesters is shrinking while workers take up other jobs that pay better incomes. Rising literacy is another reason many prefer white-collar jobs to agricultural work. Mechanical machines already handle some agricultural tasks, such as sowing seeds and spraying pesticides. In this age of robots, agricultural robots are adding automation to several agricultural sectors: they help in weeding, harvesting, spraying fertilizers, seeding and so on. As the robotics industry is still at a nascent stage, there is ample scope for out-of-the-box research that applies robotics and automation to agriculture in India, specifically suited to Indian climate and agricultural practices. In our work, we exploit the latest advances in the robotics and automation field for agricultural applications, particularly fruit harvesting. The user wears a glove fitted with various sensors and controls the rover remotely. The rover, carrying the robotic arm and a camera, roams over the field or orchard.

RELATED WORKS
There are various aspects of agricultural robotics to be analyzed and studied. The authors of [3] discuss a streamlined approach to Precision Autonomous Farming with autonomous farming vehicles carrying unmanned sensing and machinery systems. The article on unmanned robotic service units categorizes automatic agricultural vehicles into four groups: guidance, detection, action and mapping [4]. It also discusses the various sensors used in the action group, including LiDAR, laser range sensors, artificial vision systems and sonar range sensors, and elaborates on algorithms for positioning the autonomous vehicle under the mapping group. Orchard fruit segmentation using multi-spectral features is presented in [5], which details image processing techniques and algorithms applied to the fruit segmentation problem for a robotic agricultural surveillance mission. Current classification approaches for segmentation in agriculture use hand-crafted, application-specific features; the authors of [5] report better results using a feature learning algorithm combined with a conditional random field applied to multispectral image data. A similar work on vision-based navigational robots, which can form an integral part of unmanned agricultural vehicles, is presented in [6]. Paper [7] addresses drop-on-demand weed control within Precision Agriculture (PA), where herbicide application is controlled down to individual droplets; it discusses the fluid dynamics and electronic design needed to control droplet dispensing in such weed-control robots. Our earlier paper discusses a low-cost robotic arm for pruning and fruit harvesting applications [8].

SYSTEM ARCHITECTURE
Figure 1 shows the system architecture of the proposed glove, harvester and interface. The glove is the Human Control Interface (HCI). The user wears the glove, which is embedded with accelerometer, gyroscope and flex sensors.
The accelerometer gives acceleration and tilt, and the gyroscope provides angular velocity and orientation. The harvester is the rover on which the arm is fixed. The sensors embedded in the hand glove are interfaced to the microcontroller unit (MCU): three accelerometers to control the three links of the 4-DoF arm and two flex sensors to control the cutter. The user HCI is the transmitter and the robotic harvester is the receiver. The transmitter block contains all the sensors, Bluetooth and an MCU, all integrated into a wearable device. The wearable device transmits the real-time joint angles of the user's arm to the robotic arm.

IMU Sensors and Flex Sensors
We use three MPU6050 IMU sensors, each of which integrates a 3-axis accelerometer and a 3-axis gyroscope on a single chip and uses an I2C serial interface. The wearable device is designed so that it positions an IMU on the user's arm, forearm and hand. The flex sensor has a flat resistance of 25 kΩ and a bend resistance of 45-125 kΩ. In our application, the flex sensor is placed at the index finger of the glove to measure the amount by which it is bent, as shown in Figure 2. The index finger is mapped to the end effector of the robotic arm.
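As a minimal sketch of how the flex reading can be interpreted, the snippet below converts a 10-bit ADC count back into a bend estimate, assuming the sensor sits in a voltage divider with a hypothetical 27 kΩ fixed resistor (the divider wiring and resistor value are illustrative assumptions, not from the paper; only the 25 kΩ flat and 125 kΩ full-bend resistances come from the text above).

```python
# Sketch (not the authors' code): estimating flex-sensor bend from an ADC
# reading. The 27 kOhm fixed resistor and divider topology are assumptions.

FLAT_R = 25_000.0       # flat resistance, from the figures above
MAX_BEND_R = 125_000.0  # resistance at full bend
R_FIXED = 27_000.0      # assumed fixed divider resistor (hypothetical)
ADC_MAX = 1023          # 10-bit ADC, as on the ATmega328

def adc_from_resistance(r_flex):
    """ADC count when the flex sensor is the lower leg of the divider."""
    return round(ADC_MAX * r_flex / (r_flex + R_FIXED))

def bend_fraction(adc):
    """Invert the divider and map resistance linearly to a 0..1 bend."""
    r_flex = R_FIXED * adc / (ADC_MAX - adc)
    frac = (r_flex - FLAT_R) / (MAX_BEND_R - FLAT_R)
    return max(0.0, min(1.0, frac))
```

The linear resistance-to-bend map is a simplification; a real deployment would calibrate against the specific sensor.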

Arduino Pro Mini and HC05 Bluetooth Module
The Arduino Pro Mini MCU communicates with all the sensors on the wearable device. The Arduino Pro Mini is a microcontroller board based on the ATmega328. It uses the I2C protocol to access the IMUs and the on-chip ADC to sample the input from the flex sensor. The MCU calculates the angles between the different parts of the arm from the IMU data and sends these angles, along with the flex sensor data, to the robotic arm via the HC-05 Bluetooth module.

Robotic Arm
The robotic arm is built on a fixed base. It comprises three aluminium channels actuated by four HS-805BB servo motors from ServoCity, which operate in the 4.8-6 V range and have a maximum travel of 199.5 degrees. The arm is designed similar to a human arm, with joints for the shoulder, elbow and wrist. A total of four degrees of freedom can be achieved: three pitch motions and one roll motion. The end effector can be considered an additional degree of freedom, closing its claws to cut in the desired way. The robotic arm and glove are shown in Figures 3a and 3b.
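To make the link structure concrete, here is a small planar forward-kinematics sketch for the three pitch joints (shoulder, elbow, wrist) described above; the link lengths are assumed values for illustration, not dimensions from the paper.

```python
import math

# Sketch: planar forward kinematics for the three pitch joints.
# Link lengths are assumed, not taken from the paper.
L1, L2, L3 = 0.30, 0.25, 0.10  # assumed link lengths in metres

def end_effector_xy(shoulder, elbow, wrist):
    """Cutter position in the vertical plane; joint angles in degrees,
    accumulated from the horizontal."""
    a1 = math.radians(shoulder)
    a2 = a1 + math.radians(elbow)
    a3 = a2 + math.radians(wrist)
    x = L1 * math.cos(a1) + L2 * math.cos(a2) + L3 * math.cos(a3)
    y = L1 * math.sin(a1) + L2 * math.sin(a2) + L3 * math.sin(a3)
    return x, y
```

With all joints at 0 degrees the arm is fully extended horizontally, reaching the sum of the link lengths; the roll joint only reorients the cutter and does not change this planar reach.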

Angle Calculation
The MCU obtains 6-DoF data from each of the IMUs. The angle of each IMU is calculated using both accelerometer and gyroscope data. The gyroscope gives the time rate of change of the angle about a particular axis, so the current angle (orientation) is calculated by integrating its output:

θ[n] = θ[n-1] + ω[n]·Ts

where Ts is the sampling period and ω[n] is the measured angular rate. The integration is approximated by the sum of a finite number of samples, assuming θ = 0 at the start. This approximation in the digital domain introduces a drift error: the error grows with time and the calculated angle does not return to 0 in the rest position. The accelerometer is used to overcome this problem. An accelerometer measures the forces acting on it, and since numerous forces act on a moving body, the angle calculated from an accelerometer alone is unreliable. At rest, however, only the gravity vector is seen on the accelerometer, and the tilt angle can be calculated from it as

θacc = tan⁻¹(ay / az)

So, at rest, the accelerometer data is used to keep the output angle from drifting away. For this purpose we use a complementary filter:

θ[n] = (1 - α)·(θ[n-1] + ω[n]·Ts) + α·θacc[n]

where α is a small value < 1. Angles are calculated for each IMU sensor; the difference between the angles of two sensors gives the angle at the particular joint.
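The complementary filter above can be sketched as follows; the 100 Hz sample rate and the α value are illustrative assumptions, not values stated in the paper.

```python
import math

# Sketch of the complementary filter described above.
# TS and ALPHA are illustrative choices, not from the paper.
TS = 0.01     # sampling period in seconds (assumed 100 Hz)
ALPHA = 0.02  # small accelerometer weight (< 1), as in the text

def accel_angle(ay, az):
    """Tilt angle from the gravity vector, valid at rest."""
    return math.degrees(math.atan2(ay, az))

def complementary_filter(theta_prev, gyro_rate, ay, az):
    """theta[n] = (1 - alpha)*(theta[n-1] + omega*Ts) + alpha*theta_acc[n]."""
    gyro_est = theta_prev + gyro_rate * TS  # integrated gyro estimate
    return (1 - ALPHA) * gyro_est + ALPHA * accel_angle(ay, az)
```

Because the accelerometer term is weighted by a small α, it corrects the slow gyro drift without letting short-lived acceleration transients disturb the angle.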

Data Acquisition and Transmission Of Data
The Arduino MCU communicates with the IMU sensors using the I2C protocol. Data acquisition is carried out as follows: a) each IMU is initialized by waking it from sleep; b) the Arduino enables each IMU by separately supplying +5 V to the individual IMUs one at a time; c) communication is then initiated with the selected IMU chip, and the MCU requests a total of 12 registers, corresponding to the 6 degrees of freedom, one at a time from the sensor; after all the registers are read, the communication is dropped and the MCU moves on to the next sensor; d) the MCU repeats this process three times to acquire data from each of the 3 IMU sensors before operating on the acquired data. The flex sensor value is mapped to the range 50-230. After the calculations are complete, a string of the following format is formed: <shoulder_angle>,<elbow_angle>,<wrist_angle>,<end_effector_roll>,<end_effector_value>. The string is sent to the Bluetooth module via the serial port, which transmits it to the slave Bluetooth module at the receiver end.
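The command-string packing described above can be sketched as follows; the helper name is hypothetical, and the assumption that the flex value is mapped linearly from the 10-bit ADC range to 50-230 is ours (the paper states only the target range).

```python
# Sketch (hypothetical helper): packing the joint angles and flex value
# into the comma-separated command string described above.

def pack_command(shoulder, elbow, wrist, roll, flex_raw):
    """Build '<shoulder>,<elbow>,<wrist>,<roll>,<end_effector>'.
    flex_raw (0-1023 ADC count) is mapped linearly onto 50-230."""
    end_eff = 50 + round((230 - 50) * flex_raw / 1023)
    return f"{shoulder},{elbow},{wrist},{roll},{end_eff}"
```

For example, a fully bent index finger (ADC count 1023) produces an end-effector value of 230, the top of the mapped range.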

Reception of Data and Data Decoding
The Arduino processes the received data by storing the values taken from the Bluetooth module: the value up to the first comma is stored as the shoulder angle, and likewise for the remaining comma-separated fields, each in its own memory location. The decoded angles are tested for validity and are considered valid only if each angle lies within the specified MIN_VALUE to MAX_VALUE range for its joint. Valid angles are mapped to the 0 to 180 degree range expected by the servo motors and written to the respective servos.

EXPERIMENTS AND RESULTS
The test setup consisted of a 4-DoF robotic arm built with lightweight aluminium channels, one of which acts as the fixed link and the remaining three as mobile links. The end effector, a cutter, is attached at the last link and cuts the thin twigs held within its ambit in this test setup. The user wears the glove with the sensors and MCU as shown in Fig. 3b. The cutter action is imitated using the thumb and index finger. The tests conducted so far are success/failure tests and reaction time tests.

Success/Failure Tests
The success/failure tests measure how often the test setup, i.e. the 4-DoF arm, succeeded when the user tried to control it remotely. Ten different users were asked to repeat the predefined gestures of Fig. 4 ten times each while wearing the glove. In Fig. 4, the solid line shows the arm and the forearm with the palm at the end, the dotted line indicates the final position of the arm after the gesture is made, and the arrow marks indicate the rotation of the arm, forearm and palm. The averaged results are tabulated in Table 2; due to page limitations, only the results of 4 of the users are presented there. With all 10 users performing the different predefined gestures 10 times each, the average success rate is 99% for G1 and 98% for both G3 and G4, while G2 achieves 100%. The 1-2% reduction in the success rate is, in most cases, traced to Bluetooth connectivity issues, so the success rate depends on the type of wireless connectivity, its bandwidth and its range. With reliable connectivity, we expect the success rate for all the gestures listed in Figure 4 to approach 100%.

Reaction Time Calculations
For each gesture, a video of the arm's reaction is captured and analyzed for the reaction time. The end-to-end Bluetooth transmission and reception time is a constant for any gesture, T_C1. Similarly, the Bluetooth pairing time between the transmitter (the glove) and the receiver (the robotic arm) is also a constant; it is ignored in the delay calculation. The variation in delay comes only from how fast the sensors respond to the hand gesture and pass the signal to the MCU at the user HCI, and how fast this MCU converts these signals into commands; call this time T_1. At the robotic harvester side, the received commands must be decoded by the MCU before control signals are generated for the respective servo motors at each DoF; call this time T_2. The motor reaction time is again a constant, T_C2. The reaction time of the robotic arm is therefore the sum of T_C1, T_C2, T_1 and T_2. Table 3 lists this reaction time, measured from the video captured during the success/failure tests. Table 4 lists the gestures and reaction timings of four users, and Table 5 the average reaction timings over all ten users. The reaction time for all four gestures across the four users lies in the range 800 to 980 ms, a spread of 180 ms, and the averages in Table 5 show that for the chosen gestures the reaction time is well within 1 second.

FUTURE WORKS
The system, implementation, experiments and results presented in this work indicate that we are proceeding in the right direction. However, these results are preliminary, and the arm still needs to be tested in a real-world scenario: mounted on an autonomous robotic harvesting vehicle and operated autonomously, for example to cut grape bunches in a vineyard, or to harvest mangoes, oranges or apples. An entire system built on a robot operating system and standard robotic interfaces such as Modbus or EtherCAT is another possibility.

CONCLUSION
As agricultural robotics is revolutionizing the agricultural sector in the developed countries, developing nations like India should not miss the opportunity to adopt such technologies and keep pace with innovation in the agricultural sector. In this context, we presented the design and implementation of a robotic arm that can be remotely controlled using a hand glove fitted with various sensors. This will enable the user to control the arm from a control room rather than from the field. The tests carried out in the lab, including the success/failure tests and reaction time measurements, produced very encouraging results.