Review

Deliberation on Design Strategies of Automatic Harvesting Systems: A Survey

Institute of Multidisciplinary Research for Advanced Materials, Tohoku University, Katahira 2-1-1, Aoba-ku, Sendai, Miyagi 980-8577, Japan
Robotics 2015, 4(2), 194-222; https://doi.org/10.3390/robotics4020194
Submission received: 2 March 2015 / Accepted: 24 April 2015 / Published: 16 June 2015

Abstract

In Asia, the decreasing farmer and labor populations caused by various factors are a serious problem that leads to higher labor costs, higher harvesting energy consumption and poorer resource utilization. To solve these problems, researchers are engaged in providing long-term and low-tech alternatives for the mechanization and automation of agriculture by way of efficient, low-cost and easy-to-use solutions. This paper reviews various design strategies in recognition and picking systems, as well as developments in fruit harvesting robots during the past 30 years in several countries. The main objectives of this paper are to gather the available information on fruit harvesting robots; to focus on the technical developments so far achieved in picking devices; to highlight the problems still to be solved; and to discuss the future prospects of fruit harvesting robots.

1. Introduction

Agriculture and food are the backbone of many developed and developing countries, helping them to improve their economic, social and individual status. Agriculture is also one of the main forces that brought humans together, resulting in the establishment and development of human civilizations around the globe over the past 10,000 years. The high-tech, precise and quality-oriented large-scale modern agriculture industry of today is the result of gradual evolution and numerous inventions in agriculture. The present era of modern high-tech and controlled-environment agriculture produces good quality food, taking care to meet the basic nutritional needs for human health. The major changes in agriculture have occurred through the domestication of crops and animals, weed control techniques, water management, fertilizer/pesticide application, genetic engineering and the large-scale mechanization that ensued in the mid-1990s. These major changes helped the agriculture sector to grow rapidly with mechanization and precision technologies, spawning remarkable innovations and bringing about various revolutions around the world.
In recent decades, advanced technology and the latest results of scientific research have been widely applied in agriculture in order to improve the quality of products and to increase productivity. The rapid growth in the world population demands a constant supply of quality food. In Asia, the decline in farmer and agricultural labor populations due to various factors is a serious problem, especially in Japan [1]. To solve this problem, researchers are engaged in providing long-term, low-tech solutions for the mechanization and automation of the agriculture sector by using highly sophisticated robots that can replace manpower in tasks where a person may perform worse than an automatic device in terms of precision, consistency and working cycle. The application of automation in greenhouses is very common these days; in particular, modern high-tech greenhouses are equipped with automatic machines and control systems derived from numerically controlled machines.
Fruit harvesting is an important application in greenhouse horticulture that helps to save on labor costs and harvesting energy consumption, and to improve resource utilization [2,3,4]. In agriculture, some damage-resistant products like olives and almonds can be harvested using trunk or branch shakers [5]. However, delicate fruits destined for fresh markets, such as tomatoes, oranges, apples or strawberries, cannot be harvested using aggressive methods like shakers. If such methods were used, the fruit could be damaged by impact with the branches of the tree during the fall or by hitting the ground directly; the fruit would therefore lose quality, and this would result in a reduction of trading income from the fresh produce market. There is also the chance of detaching unripened or small, immature fruits by shaking the trunk or branches of a tree [6]. In addition, manpower would still be required to collect the fruits dropped on the ground after shaking, resulting in increased labor and harvesting operation costs.
On the other hand, manual fruit harvesting is highly labor intensive and inefficient in terms of both economy and time. Intensive manual harvesting requires a large labor force, while labor wages are constantly rising. The only way to maintain or reduce labor costs per unit of output is to increase labor productivity or to increase the volume of output. Competing on low labor costs is infeasible, given world trade laws and costs of living. Hence, mechanization is the only answer, since it potentially offers the only option for reducing harvesting labor expenses, so that growers can stay competitive in the years ahead and markets can even expand [7]. Mechanization also plays a vital role in securing the future of fruit growers in developed countries. Moreover, in addition to reducing the drudgery of harvest labor and being the only solution for maintaining harvest productivity, harvest machinery improves the ability of farmers to perform operations in a timely manner. It also reduces the risks associated with the need for large amounts of seasonal hand labor for short periods of time and lessens the social problems which accompany an excessive influx of low-wage workers. Machine harvesting systems are a partial solution to these issues, removing fruits from the trees efficiently; they thus reduce the harvesting cost, which accounts for about 35%–45% of the total production cost [2], and help to save labor costs and harvesting energy consumption and to improve resource utilization in agricultural activities [8]. Considering the above mentioned issues and the necessity for, and potential of, fruit harvesting robotics in agriculture, this review was prepared to provide the relevant data and developments of the last 30 years focused on a single issue. Most of the data regarding fruit harvesting robotics are scattered throughout the scientific and technical journals; this paper brings that information together in one place as a basis for novice researchers to build on. Although several review papers are available, this paper provides an updated insight into developments in fruit harvesting robotics over three decades throughout the world. This descriptive paper addresses various design strategies in recognition and picking systems, and developments in fruit harvesting robots over the past 30 years, to address the above mentioned issues. Section 2 presents the design strategies for picking and recognition systems. Section 3 gives an insight into the developments that took place in fruit harvesting robotics over the last three decades in chronological order, while Section 4 discusses the present challenges and future prospects of fruit harvesting robots as a commercial product.

2. Design Strategies

Fruit harvesting robots usually consist of three main units: a recognition system, in which the identification and location of fruits are determined; a picking system, in which the grasping and cutting operations are performed; and a moving system, a program-controlled sub-unit that moves the robot inside the farm or along the furrows during a harvesting operation in greenhouses. Depending on the agricultural application and on the workspace in which the robot will operate, a rotational joint, linear joint, twisting joint, revolving joint, orthogonal joint or a combination of these joints is used to connect the links, which form a revolute, spherical, cylindrical, rectangular or telescopic robot structure. The links are further equipped with actuators such as hydraulic pumps, air cylinders, linear actuators or electric motors to produce output motion. Mechanical components such as gears, bearings, belts or linkages are used to transmit the output motion from the actuators. Feedback sensors such as optical encoders, resolvers, thermocouples, cameras or motion detectors are used to measure the various parameters and provide feedback to the control unit. A motion controller is used to generate set points that serve as reference commands, while a drive or amplifier is used to transform the control signals into actuator power. The motion of the robot can be governed by several control functions such as velocity control, position control, pressure or force control, and electronic gearing, and each function can be realized by several methods or mechanisms. Robot operations and movements can be controlled by sequential loop programs running on a computer, as sketched below. Nowadays, a number of commercial software programs are available to study the dynamic and kinematic behavior of robots and to obtain motion trajectories.
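As an illustration of the feedback loop described above, the following minimal sketch implements proportional position control of a single joint in Python. The SimulatedJoint class, the gain and the limits are illustrative placeholders, not taken from any of the cited systems; on a real robot the encoder reading and the velocity command would come from the drive hardware.

```python
import math

class SimulatedJoint:
    """Stand-in for one actuated joint: an encoder angle updated by the
    commanded velocity. On a real harvesting robot the angle would come
    from an optical encoder and the velocity command would go to a drive."""

    def __init__(self) -> None:
        self.angle = 0.0      # rad, what the encoder would report
        self.velocity = 0.0   # rad/s, what the drive is currently commanded

    def read_encoder(self) -> float:
        return self.angle

    def set_velocity(self, omega: float) -> None:
        self.velocity = omega

    def step(self, dt: float) -> None:
        self.angle += self.velocity * dt


def move_joint_to(joint: SimulatedJoint, target: float, kp: float = 2.0,
                  max_omega: float = 0.5, tol: float = 1e-3,
                  dt: float = 0.01) -> None:
    """Proportional position control: the motion controller supplies the set
    point (target) and the encoder closes the feedback loop."""
    while abs(target - joint.read_encoder()) > tol:
        error = target - joint.read_encoder()
        omega = max(-max_omega, min(max_omega, kp * error))  # respect actuator limit
        joint.set_velocity(omega)
        joint.step(dt)
    joint.set_velocity(0.0)


joint = SimulatedJoint()
move_joint_to(joint, math.radians(45))       # move the joint to 45 degrees
print(f"final angle: {math.degrees(joint.read_encoder()):.2f} deg")
```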
In recognition systems, cameras such as CCD, infrared, high-speed or multispectral cameras are used, along with artificial lighting systems if required. A captured image is transferred to the computer, and a specific image processing algorithm based on particular feature attributes and color specification models is adopted to process the images. The image processing system discriminates the fruit from the natural background and provides the three-dimensional location (X, Y, Z coordinates) and orientation of the fruit, i.e., the length of the fruit stem and the vertical angle of the stem. The location helps to move the picking unit towards the target fruit, while the orientation helps to determine the grasping and cutting points. The information used to detect and locate the fruits on trees includes shape, size, edges or color, while the methods and algorithms used to discriminate fruit change with the physical, chemical or geometrical properties of the fruit [9,10,11,12]. There are also numerous approaches to image processing and data analysis used in recognizing fruits, which shows the importance of fruit recognition systems in harvesting robotics [13]; a minimal color-based example is sketched after Figure 1. Figure 1 represents a conceptual design of a fruit harvesting robot.
Figure 1. Overview of a fruit harvesting robot.
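To make the color-discrimination step above concrete, the sketch below thresholds an image in HSV space and returns candidate fruit centroids using OpenCV. The hue/saturation ranges and the minimum blob area are illustrative values for red, ripe fruit, not a calibration taken from any cited system.

```python
import cv2
import numpy as np

def detect_red_fruit(bgr_image: np.ndarray, min_area: int = 500):
    """Threshold red regions in HSV space and return the pixel centroid of
    each candidate fruit region. Thresholds are illustrative, not calibrated."""
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    # Red wraps around the hue axis, so two ranges are combined.
    lower = cv2.inRange(hsv, np.array([0, 80, 60]), np.array([10, 255, 255]))
    upper = cv2.inRange(hsv, np.array([170, 80, 60]), np.array([180, 255, 255]))
    mask = cv2.morphologyEx(lower | upper, cv2.MORPH_OPEN,
                            np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    centroids = []
    for contour in contours:
        if cv2.contourArea(contour) < min_area:
            continue                      # skip small blobs (noise, leaf gaps)
        m = cv2.moments(contour)
        centroids.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return mask, centroids

# Example use (assuming an image file is available):
# image = cv2.imread("greenhouse_scene.jpg")
# mask, fruit_centroids = detect_red_fruit(image)
```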

2.1. Picking System

Robot grippers for fresh fruit and vegetable manipulation in horticultural applications have to fulfill special requirements such as high-speed actuation, adaptation to a variety of shapes, maximum adherence with minimal pressure, no damage to the product, low maintenance, high reliability, low weight, approval for contact with foodstuffs, low energy consumption, the positional precision required for both gripping and releasing the product, ease of cleaning, and easy and fast ejection of the product (important for products of low weight). Considering these special requirements, Blanes et al. [14] classified the manipulation strategies used to design grippers based on the above mentioned factors.
For fruit harvesting robots, the direct-contact strategy using pneumatic, hydraulic or electrical methods is generally efficient, as it causes less damage to the fruit. To reduce mechanical damage to the fruit during harvesting, several grasp theories and stability analyses have been presented [15,16,17,18,19,20,21,22,23,24,25] that consider the curvature of both the fingers and the object at multiple contact points, and the effect of curvature and stiffness on the stability of grasps with and without friction at the contact points has been investigated. According to these grasp theories, to facilitate control, a simple end-effector with two parallel curved fingers consistently shows better and steadier grasping stability than plate fingers. Table 1 compares electric, pneumatic and hydraulic gripping systems, which helps to determine the appropriate strategy for a particular application. In agriculture, electric and pneumatic grippers have shown good results for single and clustered fruits, and their high accuracy, repeatability, easy maintenance and small size make them popular for fruit harvesting robots these days.
Table 1. Comparison of electric, pneumatic and hydraulic gripper actuation.
| | Electric Grippers | Pneumatic Grippers | Hydraulic Grippers |
|---|---|---|---|
| Accuracy, strength and speed | High accuracy and repeatability, good strength, high speed | High accuracy, good strength, high speed | Good accuracy, high strength, high speed |
| Space | Less floor space | Less floor space | Large floor space |
| Advantage | Low cost and easy maintenance | Easy maintenance | Mechanical simplicity |
| Disadvantage | Easy to damage | Needs precise system control | Usually used for heavy payloads |
| Application | Good results for single fruits | Good results for cluster fruits | Good results for single fruits |
After determining the gripping and manipulation strategies, it is important to determine the interactive factors and design parameters relevant to those strategies and to the desired gripping application. All the possible interactive factors and input parameters relevant to gripper design need to be framed together, which provides the steps toward an optimal gripper design. The factors that significantly influence gripper selection were given by Monkman et al. [26], such as the robot and machine system, the position of components in the installation/equipment, size and shape, mass and material properties, motion sequence, velocity and acceleration, forces acting on the gripper during motion, grasping points, gripper drive parameters, etc.; the crucial conditions for both dynamic and static components are interconnected with the optimal gripper design. After deciding on the important parameters and a preliminary design, it is good practice to perform modeling and simulation of the design in appropriate 3D mechanical CAD software to verify the specified parameters and performance under a controlled environment. Edan et al. [27] reported a finite element modeling technique and optimization of modeling parameters for melon grippers, while Bachche et al. [28] and Bachche and Oka [29] described simulation and modeling methods using SolidWorks for a sweet pepper harvesting robot hand (Figure 2). Modeling the system provides the optimal parameters, which can then be used to build the prototype and improve on the performance obtained in the simulation process. The simulation process helps to determine the static and fatigue characteristics and the effectiveness of the system. In case of any failure within the system, a part of the system can be redesigned or its material can be changed to obtain a highly stable and efficient end-effector before the actual manufacturing takes place. These techniques always help to optimize the picking system parameters through several design studies, and it is also possible to perform motion analysis of the model without actually prototyping it. The final optimized parameters, which interact with the environment, can be used for prototyping, which saves time, manufacturing costs and complicated calculations. Static characteristics such as stress, strain and displacement analyses help to determine the performance of the system in a working environment, while fatigue characteristics such as fatigue life and fatigue load factor plots help in assessing individual components for better performance of the designed system. A back-of-envelope static check of this kind is sketched below.
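Before (or alongside) a full FEM study, a quick hand calculation can bound the stress and deflection of a gripper finger. The sketch below treats a rectangular finger as a cantilever loaded at the tip, using the standard beam formulas sigma = F*L*(t/2)/I and delta = F*L^3/(3*E*I) with I = w*t^3/12; the dimensions, load and material values are illustrative assumptions, not taken from the cited designs.

```python
def finger_static_check(force_n: float, length_m: float, width_m: float,
                        thickness_m: float, e_pa: float, yield_pa: float):
    """Cantilever approximation of a rectangular gripper finger loaded at the
    tip. Returns (max bending stress [Pa], tip deflection [m], safety factor).
    This is only the rough check that a detailed FEM study would refine."""
    inertia = width_m * thickness_m ** 3 / 12.0        # second moment of area
    max_stress = force_n * length_m * (thickness_m / 2.0) / inertia
    deflection = force_n * length_m ** 3 / (3.0 * e_pa * inertia)
    return max_stress, deflection, yield_pa / max_stress

# Illustrative numbers only: a 100 mm aluminium finger, 15 mm wide, 4 mm thick,
# gripping with 5 N at the tip (E ~ 69 GPa, yield strength assumed ~ 240 MPa).
stress, deflection, sf = finger_static_check(5.0, 0.10, 0.015, 0.004, 69e9, 240e6)
print(f"stress {stress/1e6:.1f} MPa, deflection {deflection*1e3:.2f} mm, "
      f"safety factor {sf:.0f}")
```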
Monta et al. [30] developed two types of end-effectors for tomato harvesting robots based on the physical properties of the tomato. The first prototype has two parallel plate fingers and a suction pad, while in the second prototype air pressure pads replace the suction cup. The first prototype was unable to harvest fruits with a short peduncle, while the second could harvest fruits regardless of peduncle length. Sakai et al. [31] provided designs based on parallel-type manipulation for heavy-material handling manipulators in agriculture, for products such as watermelons, pumpkins, cabbage and lettuce. Ling et al. [32] developed a four-finger prosthetic hand and an embedded hand controller for a tomato harvesting robot; the sensing and picking success rates were 95% and 85%, respectively, compared with a previous prototype. Liu et al. [33] developed a multi-sensory end-effector for a spherical fruit harvesting robot using a vacuum pressure sensor, distance sensors, proximity sensors and force sensors. A laser cutting system composed of a high-power fiber-coupled laser diode was used for cutting, while a suction pad device with a two-finger gripper was used for grasping the spherical fruit.
Figure 2. Prototype simulation results of end-effector [28,29].
Bachche et al. [34], Bachche et al. [35] and Bachche and Oka [36] developed a thermal cutting system for sweet pepper harvesting robots based on current and voltage potentials (Figure 3). These systems help to avoid virus transmission, reduce fungal vulnerability and increase the shelf life of fruits by adopting a thermal cutting approach. The design consists of two parallel gripper bars mounted on a frame connected by a specially designed notch plate and operated by a servo motor. Based on voltage and current, two different types of thermal cutting prototypes, an electric arc type and a temperature arc type, were developed. In the electric arc type, a special electric device was developed to obtain the high voltage needed to perform the cutting operation; at this voltage, the electrodes generate a thermal arc which cuts the stem of the sweet pepper. In the temperature arc type, Nichrome wire was mounted between two electrodes and current was applied directly to the electrodes, generating a high-temperature arc between them that performs the cutting operation. These prototypes were tested under several variable field conditions, in which the temperature arc system was found to be effective and took 1.5 s to perform the cutting operation. Post-harvest inspection of the harvested fruits confirmed an increase in their shelf life and the prevention of fungal infection and virus transmission. Fruits harvested by the thermal arc cutting system could be preserved for more than 15 days under normal room conditions.
Figure 3. Overview of thermal cutting end-effector [34,35,36].
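The electrical sizing of such a resistive cutting element follows directly from R = rho*L/A and P = V^2/R. The sketch below estimates the resistance and dissipated power of a short Nichrome element; the wire length, diameter and drive voltage are illustrative assumptions, not the values used in the cited prototypes, and the resistivity default is a typical handbook figure for Nichrome 80/20.

```python
import math

def nichrome_element(voltage_v: float, length_m: float, diameter_m: float,
                     resistivity_ohm_m: float = 1.1e-6):
    """Resistance and dissipated power of a resistive cutting element driven
    at a fixed voltage: R = rho * L / A and P = V^2 / R. The resistivity
    default is a typical handbook value for Nichrome 80/20."""
    area = math.pi * (diameter_m / 2.0) ** 2
    resistance = resistivity_ohm_m * length_m / area
    power = voltage_v ** 2 / resistance
    return resistance, power

# Illustrative numbers only (not the values of the cited prototypes):
# a 30 mm length of 0.3 mm diameter wire driven at 5 V.
r_ohm, p_w = nichrome_element(5.0, 0.030, 0.0003)
print(f"R = {r_ohm:.2f} ohm, P = {p_w:.0f} W")
```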

2.2. Recognition System

2.2.1. Color Camera Recognition System

For the last several years, computers have been used extensively for analyzing images and extracting data from them. However, due to the variability of agricultural objects, it is very difficult to adapt existing industrial algorithms to the agricultural domain. To cope with this variability, methods and algorithms for the agricultural domain need to be studied that can accommodate the variations in field environment conditions and the physical flexibility of agricultural objects. There are many processes in agriculture where decisions are made based on the appearance of the product [37]. The techniques used for these applications are mostly successful under the constrained conditions for which they were designed, but the algorithms are not directly usable in other applications. In principle, computers are flexible because they can be re-programmed, but in practice it is difficult to modify machine vision algorithms to run for slightly or completely different applications because of the assumptions and rules made to achieve the specific application [38].
On the other hand, in the agricultural field, the configuration of the trees significantly alters the percentage of visible fruits on the tree. For tree-row configurations with a hedge appearance, the visibility of the fruit can reach 75%–80% of the actual number of fruits, which is much better than the 40%–50% visibility for conventional plantings [39]. One major difficulty in developing machinery to selectively harvest fruits is determining the location, size and ripeness of individual fruits. These specifications are needed to guide a mechanical arm towards the target object. The computer vision strategies used to recognize a fruit rely on four basic features which characterize the object: intensity, color, shape and texture. Apart from these basic characteristics, many researchers are engaged in developing different approaches to recognize fruits against natural backgrounds. Research work on the detection of different fruits and vegetables such as apple [40,41,42,43], cherry fruit [44], cucumber [45,46], orange [47,48], tomato [49,50,51], strawberry [52,53], melon [54,55] and sweet pepper [12,56,57,58] (Figure 4) has been undertaken.
Figure 4. Discrimination and computation of 3D location information of sweet peppers using color CCD cameras based on the parallel stereovision principle [12,58]. The positional information of the detected green sweet peppers was found to be highly reliable, with minimal depth accuracy errors and disparity (parallax) errors when the distance between the cameras and the fruit was 500–600 mm and the baseline between the two cameras was maintained at 100 mm.
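For a rectified parallel stereo pair like the one in Figure 4, depth follows from the disparity of matched points: Z = f*B/d. The sketch below applies this with the 100 mm baseline mentioned in the caption; the focal length and the pixel coordinates are illustrative values, not a published calibration.

```python
def stereo_depth_mm(x_left_px: float, x_right_px: float,
                    focal_length_px: float, baseline_mm: float = 100.0) -> float:
    """Depth from disparity for a rectified parallel stereo pair: Z = f*B/d.
    The default 100 mm baseline matches the setup described in Figure 4;
    the focal length in pixels would come from camera calibration."""
    disparity = x_left_px - x_right_px
    if disparity <= 0:
        raise ValueError("non-positive disparity: point must lie in front of both cameras")
    return focal_length_px * baseline_mm / disparity

# Illustrative values: a fruit centroid at x = 412 px in the left image and
# x = 267 px in the right image, with f ~ 800 px from calibration.
print(f"estimated depth: {stereo_depth_mm(412.0, 267.0, 800.0):.0f} mm")
```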
A detailed review of computer vision methods for locating fruits on trees is given by Jimenez, Ceres and Pons [13], covering the main features of recognition approaches, the sensing systems used to capture the images, the image processing strategies used to detect the fruits and the results obtained in previous studies. Another detailed review was given by McCarthy et al. [59] on applied machine vision of plants, with implications for field deployment in automated farming operations, in which previous research studies were grouped into monocular vision with RGB cameras, stereo vision and 3D structure, multispectral imaging, and range sensing. Each group focuses on the recognition strategies and approaches within that group, and potential methods to enhance machine vision system design for application in the agricultural field were discussed. Kapach et al. [60] provided a comprehensive review of classical and state-of-the-art machine vision solutions employed in harvesting robot systems, with special emphasis on the visual cues, computational approaches and machine vision algorithms used. Studies on image processing approaches for spherical and non-spherical fruits based on visual cues such as color, spectral reflectance, thermal response, texture and shape, and on machine vision algorithms such as segmentation, clustering, template matching, shape inference, voting and machine learning, were discussed in detail according to the applications and the algorithms developed for the specific needs of particular applications. Pal, N. and Pal, K. [9] also reviewed studies on image segmentation techniques, focusing on fuzzy and non-fuzzy methods for color segmentation, edge detection, surface-based segmentation, gray-level thresholding and neural-network-based approaches. Adequate attention was paid to the segmentation of range images, magnetic resonance images and the quantitative evaluation of segmentation.
In 1984, Baylou et al. [61] developed a detection system for asparagus using a stereoscopic visual sensor; this system could detect the asparagus and also provide its 3D location. Humburg and Reid [62] developed and tested a machine vision system for the identification and location of harvestable asparagus spears in which a videotape of a row of asparagus was used to simulate the vision system. Accumulated errors were found in the recognition of spears at given confidence interval levels, and these intervals were used to determine the size of the cutting mechanism. An effective vision algorithm was presented by Kondo et al. [63] to detect the positions of many small fruits and to identify cluster information along with 3D position. The experimental results showed that this visual-feedback-control-based harvesting method was effective, with a success rate of 70%. Edan et al. [64] reported experimental results obtained by replacing the visual system of the melon harvesting robot presented by Edan and Miles [65]. The new vision system consists of a black-and-white image processing system with an algorithm able to detect fruits and then compute positional information. The intelligent control system consists of a distributed blackboard system with autonomous modules for sensing, planning and control. With the new vision system, this robot had a fruit harvesting success rate of more than 85%.
Takahashi et al. [66] reported a method for measuring the 3D location of apples with a binocular stereo vision system, which had a 90% recognition rate when the image was sparsely populated with fruit and a 65%–70% rate when the image was densely populated with red fruit. A machine vision system and a laser ranging system were used by Bulanon et al. [67] to determine the 3D location of apples. The laser ranging system was found to be more effective, as it determines the distance from the end-effector to the fruit easily and accurately, while the machine vision system was found to be computationally expensive and time consuming. Tarrio et al. [68] reported a recognition method to detect small fruits in bunches based on 3D stereoscopic vision. The method uses passive and active 3D reconstruction techniques, stereoscopic vision and structured lighting. Two CCD cameras with movable laser diode panels were used to illuminate the scene, and a color transformation from RGB to HSV was used in image processing.
Lak et al. [69] described an apple fruit recognition system under natural illumination using machine vision. Edge detection and a combination of color and shape analyses were used to segment images of red apples obtained under natural lighting. The edge-detection-based algorithm was found to be unsuccessful, while the color-shape-based algorithm could detect apple fruits in 83.33% of the images. Li et al. [70] reported the development of a real-time fruit recognition system for pineapple harvesting robots in China using a color space transformation from RGB to HSI. Ji et al. [71] described a procedure for developing an automatic recognition system based on color and shape features and a new classification algorithm based on support vector machines for apple recognition; the method was found to be efficient, with an 89% apple recognition rate and a 352 ms average recognition time per fruit (a minimal classifier of this kind is sketched below). A recognition system for olives using neural networks based on color and shape features extracted from captured RGB images was reported by Gatica et al. [72]. Two cases, viz. the recognition of olives and the overlapping of olives, were analyzed during the neural-network-based decision making.
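As a toy illustration of the color/shape-feature plus SVM approach mentioned above [71], the sketch below trains a support vector classifier on hand-made feature vectors (mean hue, mean saturation, circularity, area). The training data here are random and purely illustrative; a real system would extract such features from labelled field images.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Feature vectors per candidate region: (mean hue, mean saturation,
# circularity, area in pixels). Synthetic data for illustration only.
fruit = np.column_stack([rng.normal(8, 3, 200), rng.normal(180, 30, 200),
                         rng.normal(0.85, 0.05, 200), rng.normal(3000, 600, 200)])
background = np.column_stack([rng.normal(60, 15, 200), rng.normal(120, 40, 200),
                              rng.normal(0.40, 0.15, 200), rng.normal(1500, 800, 200)])
X = np.vstack([fruit, background])
y = np.concatenate([np.ones(200), np.zeros(200)])   # 1 = fruit, 0 = background

# Scale the features, then fit an RBF-kernel support vector classifier.
classifier = make_pipeline(StandardScaler(), SVC(kernel="rbf")).fit(X, y)

candidate = np.array([[10.0, 190.0, 0.90, 2800.0]])
print("fruit" if classifier.predict(candidate)[0] == 1 else "background")
```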

2.2.2. Multispectral Recognition System

Multispectral imaging is a technique for recognizing and characterizing the physical properties of materials using the principle that objects absorb (or emit) different wavelengths of light to varying degrees. This technique has been applied in various areas of science such as medicine, forensics, geology and meteorology. The wavelengths used in multispectral imaging usually lie within the infrared (IR) and near-infrared (NIR) ranges. In contrast to hyperspectral imaging, which characterizes materials by measuring the variation in light intensity over continuous ranges of wavelengths, multispectral imaging utilizes a relatively small set of specific wavelengths. The desired wavelengths may be selected by a set of dichroic interference filters of specific wavelength and pass-band. The variation of light intensity versus wavelength is measured by appropriate sensor techniques for each point in the scene being imaged. This technique was originally developed for space-based imaging and can allow the extraction of additional information that the human eye fails to capture with its receptors for red, green and blue. For different purposes, different combinations of spectral bands can be used, represented by red, green and blue channels; the mapping of bands to colors depends on the purpose of the image and on personal preference. Over the last few decades, NIR spectroscopy has shown considerable promise for the non-destructive analysis of food products and is ideally suited for on-line measurements in the agro-food industry due to its advantages: minimal or no sample preparation, versatility, speed and low-cost analysis [73]. A minimal band-arithmetic example is sketched below.
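Many multispectral discrimination schemes reduce to simple arithmetic between co-registered band images, e.g. a normalized difference index that is then thresholded. The sketch below shows this basic operation; the 850/970 nm band names echo the filters used by Van Henten et al. [79] discussed later, but the data and the threshold here are random, illustrative placeholders.

```python
import numpy as np

def normalized_difference(band_a: np.ndarray, band_b: np.ndarray) -> np.ndarray:
    """Normalized difference index between two co-registered band images,
    (A - B) / (A + B), giving values in [-1, 1]."""
    a = band_a.astype(np.float64)
    b = band_b.astype(np.float64)
    return (a - b) / np.clip(a + b, 1e-9, None)

# Placeholder reflectance images standing in for 850 nm and 970 nm acquisitions.
band_850 = np.random.default_rng(1).uniform(0.1, 0.9, (480, 640))
band_970 = np.random.default_rng(2).uniform(0.1, 0.9, (480, 640))

index = normalized_difference(band_850, band_970)
fruit_mask = index > 0.2     # threshold chosen per crop/application, illustrative
print(f"pixels above threshold: {int(fruit_mask.sum())}")
```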
Most well-known applications of NIR spectroscopy to fruits have focused on the quantitative prediction of chemical composition, internal damage and ripening stage in produce such as apple, avocado, banana, caraway, coriander, carrot, Chinese bayberry, citrus fruit, dates, dill, fennel, grape, green beans, guava, Japanese pear, kiwifruit, macadamia, mango, melon, mushroom, nectarine, olive, onion, papaya, peach, pepper, plum, tangerine, tomato, etc. The use of multispectral-image-based perception methods has been studied extensively to assess crop nutrition level based on crop canopy reflectance in multiple spectral bands [74,75]. Optical properties are based on the reflectance, transmittance, absorbance or scattering of light by the product. Features related to these properties are often chosen for various purposes, covering external qualities such as size, shape, color, gloss, texture and defects, and internal qualities such as flavor, texture, nutrition, defects, pH level, total soluble solids content, dry matter, sugar content, nitrogen level, starch, moisture content, essential oil content, different acids, chlorophyll, etc. Detailed reviews of the non-destructive measurement of fruit and vegetable quality by means of NIR multi- and hyperspectral imaging techniques, of different spectrophotometer designs and measurement principles, and of other spatial techniques for quality measurement are given by Osborne and Hindle [73], Abbott [76] and Nicolai et al. [77]. The potential of NIR spectroscopy and imaging as a rapid, non-destructive and multi-parametric technique is supported by its applications in fruit and vegetable quality measurement, not only in the food industry and post-harvest applications but also for horticultural crops.
In the last few decades, most NIR work has been done either for quality determination, quality inspection and measurement applications or for post-harvest applications, as mentioned in the above section. Though the NIR technique has shown significant results for the non-destructive quality measurement of fruits and vegetables, delivering satisfactory applications for quality measurement, inspection and post-harvest operations, there has still not been enough research on NIR multispectral imaging for discriminating fruits against a natural background. Czarnowski and Cebula [78] investigated the spectral properties of red, green, yellow and cream colored sweet peppers. The relationship between the spectral properties of fruits and leaves was studied, and the results showed that up to 700 nm the fruit and leaves had almost the same reflectance and absorbance of light, while from 700 nm to 1100 nm there was a significant difference between reflectance and absorbance. Van Henten et al. [79] studied the possibility of using spectral properties to detect cucumbers against a natural background. Two monochrome cameras were used simultaneously with 850 nm and 970 nm band filters. The spectral band filters gave better detection results, as a significant difference was found between the spectral properties of cucumbers and leaves. Hemming [80] used the same vision system [79] to study the possibility of detecting sweet peppers; the study reported that the spectral vision system can be used successfully for the detection of sweet peppers.
Safren et al. [81] used hyperspectral imaging for the detection of green apples against a green natural background. A multistage algorithm was developed that uses several techniques, such as principal component analysis (PCA) and extraction and classification of homogeneous objects (ECHO) for analyzing the hyperspectral data, as well as machine vision techniques such as morphological operations, watershed and blob analysis. The recognition rate was reported as 88.1% and the error estimated as 14.1% due to the overlapping of fruits. Rath and Kawollek [82] performed experiments on robotic harvesting of Gerbera jamesonii based on the detection and 3D modeling of cut-flower pedicels. Two high-resolution CCD cameras with near-infrared filters were used for image capture. From the data of both images and eight plant positions, three-dimensional models of the pedicels were created by triangulation. The evaluated 3D model was used to calculate spatial coordinates for the applied robot control. Based on the results, a pneumatic harvest grabber was developed, which harvested the pedicels by cutting them off; the pedicel harvest rate was recorded as 80%. Using binocular stereovision and spectral imaging, the possibility of cucumber detection was verified by Yuan et al. [83]. A stereo vision system was used to capture monochrome near-infrared images. A fruit detection algorithm was used in image processing and the 3D location was computed using a triangulation model. The fruit recognition rate was found to be 86%, while the distance errors of the grasping position were reported as less than 8.6 mm.
Bulanon et al. [84] used a CCD camera with six band-pass filters to examine the possibilities of multispectral imaging for citrus fruit detection; the 600 nm, 650 nm and 700 nm band-pass filters were found to be suitable for discriminating citrus fruit. The latest work in multispectral imaging is reported by Bachche [85], in which a multispectral recognition system was developed using 780, 800, 900 and 960 nm infrared optical filters to investigate the spectral properties of green sweet peppers and the recognition rate, fruit visibility percentage and maturity determination of recognized sweet peppers under variable field conditions. The 960 nm wavelength was found to be feasible and effective for discriminating the sweet peppers, as a significant difference between the reflectance and absorbance of light was observed at this wavelength. Also, the reflectance of the chlorophyll content at this wavelength was used as a correlation factor with the saturated red-wavelength absorption region. Histogram ratios based on the transformed chlorophyll absorption ratio index and principal component analysis were used during multispectral image processing. The results confirmed higher recognition rates, maximum fruit visibility percentage and 76% true maturity determination decisions for the recognized sweet peppers (Figure 5). Bac et al. [86] reported a quantitative study to classify different plant parts under variable lighting using a multispectral system. This system was specially developed for the detection of sweet peppers in the greenhouse using multispectral cameras; the image processing algorithm provides pixel-based classification of plant parts under varying light.
Figure 5. Results of the multispectral recognition and maturity determination system [85]. (The first image shows the pre-known status of the detected fruits, while the second shows the result image with histograms for maturity determination: green—matured; other colors—not matured).
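One common way to fuse several narrow NIR bands, as in the system above, is to project the band stack onto its principal components and threshold or histogram the result. The sketch below does this with plain NumPy; the four-band stack stands in for the 780/800/900/960 nm images but is filled with random data, and the cited system additionally uses a chlorophyll-absorption ratio index that is not reproduced here.

```python
import numpy as np

def first_principal_component(band_stack: np.ndarray) -> np.ndarray:
    """Project an (H, W, B) multispectral band stack onto its first principal
    component, compressing several NIR bands into one image whose fruit/leaf
    contrast can then be thresholded or histogrammed."""
    h, w, b = band_stack.shape
    pixels = band_stack.reshape(-1, b).astype(np.float64)
    pixels -= pixels.mean(axis=0)                  # center each band
    _, _, vt = np.linalg.svd(pixels, full_matrices=False)
    return (pixels @ vt[0]).reshape(h, w)          # scores of the first component

# Placeholder 4-band stack standing in for the 780/800/900/960 nm images.
stack = np.random.default_rng(3).uniform(0.0, 1.0, (480, 640, 4))
pc1 = first_principal_component(stack)
print(pc1.shape)
```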

3. Fruit Harvesting Robots

Among the various farm management operations, harvesting is an important operation that requires not only labor but also high energy and resource inputs. Most other farm management operations can be carried out by highly precise, accurate and commercialized mechanization techniques, but the harvesting operation has still not reached a similar level of commercialization, which motivates researchers to study and develop agricultural robot applications for harvesting purposes. The study of agricultural robot applications for plant production presumably started with a mechanical citrus harvesting system in 1968 [87].
In 1984, Japanese researchers Kawamura et al. [88] developed the first fruit harvesting robot, for tomatoes, at Kyoto University, Japan. This robot consists of a 5 DOF manipulator, an end-effector, a stereovision system and a battery car as a travelling device. In the following year, Spain and France together started the MAGALI project [40], which focused on developing a fruit harvesting robot for apples. Under the MAGALI project, several versions of a harvesting robot were built and tested successfully. Prussia [89] reported on the ergonomic aspects of manual harvesting, focusing on several ergonomic principles related to manual harvesting and emphasizing its merits and demerits by analyzing visual acuity, color sensitivity, strength, capacity and productivity.
The first review on robotics and intelligent machines was given by Sistler [90], which highlights the important achievements in mechanization and deliberates on the possibilities of practical agricultural robots. It further describes in detail the various problems that researchers need to solve, the barriers to the development of agricultural robots, and how agricultural productivity can be increased by exploiting the capabilities of intelligent machines and robots. Grand D'Esnon et al. [91] reported a second version of the MAGALI robot, which consists of a spherical manipulator with a vacuum gripper and a camera; four ultrasonic telemeters were used for navigation during harvesting, and hydraulic actuators were used to move the manipulator. In 1989, extensive work was performed on fruit harvesting robots: Amaha et al. [92] developed a cucumber harvesting robot at the University of Tokyo, Japan; Whitney and Harrell [93] developed a robotic arm with a picking rate of one fruit every 5 seconds from the outer canopy; an Italian company built a fruit harvesting prototype for citrus with a 65% harvesting success rate [94]; and Sevilla et al. [95] conducted a feasibility study of a harvesting robot for grapes in France.
Tillett [96] reported a mechatronic mushroom harvester developed in the United Kingdom, which consists of a black-and-white vision system, a computer-controlled Cartesian robot and an end-effector. This robot had an 84% fruit detection rate and a 57% harvesting success rate. Sandini et al. [97] developed an autonomous robot for several operations in greenhouses. This robot uses two PAL color cameras with a bit-slice microprocessor card for fast image processing. The research was mainly focused on locating tomatoes in greenhouses using a visual feedback system and navigating the robot through greenhouses to perform several simple operations. Harrell, Adsit, Pool and Hoffman [47] developed a mobile grove-lab to study the use of robotic harvesting of citrus under actual production conditions. This robotic arm was an operational multiple-arm harvesting system equipped with several sensors and actuators to perform real-time tasks, controlled through a computer program.
Kubota Co., Sakai, Japan [98] developed a 4 DOF manipulator to harvest oranges. A vacuum pad with an optical proximity sensor in the gripper and a stroboscope light with a color camera protected by a fork-shaped cover were used in this manipulator. Pool and Harrell [99] reported developments of a previously built prototype for citrus harvesting. The new design consists of a cutting device with rotating lips that detaches the fruit once it is enclosed in a tube attached to a tubular arm for picking soft fruits. This prototype showed a 69% harvesting success rate, while 37% of the fruits were physically damaged. Kassay [41] reported the first Hungarian AUFO robot for harvesting apples. The AUFO robot consists of two color cameras for automatic fruit detection and 3D positioning of fruit using a stereovision system; six arms with movement in the vertical plane; four padded fingers with compensation springs to grip the fruit at an accurate grasping pressure; and a robotic mobile platform to navigate the robot around the trees. In Israel, Benady and Miles [100] manufactured a melon harvesting robot with a Cartesian manipulator mounted on a frame moved by a tractor. The robot's near-vision system was equipped with a laser line projector that illuminates the scene; based on the curvature of the profile where the light strikes the melon, melons were detected and positional information was obtained using profile transformation and a triangulation method. To harvest melons at a successful rate, Edan and Miles [65] presented animated, visual and numerical simulations to optimize picking time, actuator speed and the planting distance between melons.
Sarig [101] presented a detailed review of technical developments and different aspects of harvesting problems in different countries from 1982 to 1992. Tillett [102] reviewed robot manipulators used to handle biological materials within the horticultural industry; potential applications were discussed, and the scope for future developments based on economic as well as technical considerations was deliberated. The developments in the locating and picking performance of a mushroom harvesting robot were discussed by Reed and Tillett [103]. A black-and-white vision system with a mushroom-locating image analysis algorithm, a computer-controlled Cartesian robot and a specialized mushroom-picking end-effector were used to test the robot's performance. In total, 84% of the mushrooms were located by the image analysis algorithm, while 57% of the mushrooms were harvested successfully. Under an Italian project, Grattoni et al. [104] reported a mobile robot equipped with a suitable manipulator and driven by a stereo-vision module to harvest asparagus.
The development of a grape harvesting robot in Japan was reported by Monta et al. [105]. This robot consists of a manipulator, a visual sensor, a travelling device and end-effectors. The visual sensor locates the peduncle of the grape cluster, and the end-effector then grasps and cuts the peduncle. Further, a bagging system was also developed to cover the cluster, grasp and cut the peduncle and then transfer it to a container. A Spanish project was conducted to develop a human-assistive robotic system for greenhouse operations known as Agribot [106]. The operator uses a joystick to move a laser pointer until the laser spot is in the middle of the fruit, from which the 3D position of the fruit is obtained by a computer. Then a manipulator with a pneumatic gripping system and an optical proximity sensor, capable of detaching fruits, is moved towards the target fruit. Edan [107] reviewed several developments in autonomous agricultural robots, including guidance systems, greenhouse autonomous systems and fruit harvesting robots. A general concept for a field-crop robotic machine to selectively harvest easily bruised fruit and vegetables was presented in this review, and future trends that must be pursued in order to make robots a viable option for agricultural operations were discussed.
Several types of fruit harvesting robots, such as tomato, petty-tomato and cucumber harvesters, were reported by Kondo, Monta and Fujiura [49], and robotic systems such as manipulators, end-effectors, visual sensors and travelling devices were discussed. Arima, Kondo and Monta [45] developed a robotic system for cucumber harvesting which consists of a 6 DOF articulated manipulator, a monochrome TV camera with an 850 nm optical filter and a peduncle-detector type end-effector. The visual sensor discriminates the green fruits from the green background based on morphological characteristics, and the end-effector harvests the fruit. Meanwhile, a Spanish project developed an autonomous mobile robot for greenhouse operations known as AURORA [108]. This was a multi-tasking, remotely supervised and controlled robot developed for greenhouse operations, governed by a control architecture that supports both autonomous navigation and shared human control. This robot has been successfully tested in different greenhouses for autonomous navigation and spraying tasks.
Arndt et al. [109] discussed the trends and developments of an automated selective asparagus harvesting robot that has been operational since 1989 at the Centre for Advanced Manufacturing and Industrial Automation (CAMIA), University of Wollongong, Australia. The CAMIA prototype showed effective results, with a 94% harvesting success rate. The specific designs and experimental results of AgriBot were presented by Ceres et al. [110]. The robot was tested in the laboratory with artificial trees to check its performance. The overall performance of AgriBot was found to be effective and satisfactory; the gripping-cutting cycle took 2 s and the torque requirement was reduced by 3%. Kondo and Monta [111] developed two types of strawberry harvesting robots: one for hydroponic systems and the other for soil systems. The robot consists of a 3 DOF manipulator, aspirating-type pneumatic end-effectors and visual sensors. The pneumatic end-effector grasps the fruit with pressure and then twists the peduncle, which detaches the fruit.
Reed et al. [112] reported a robot capable of automatic mushroom detection, sizing, selection, picking, trimming, conveying and transfer. A suction cup end-effector was designed for picking the mushrooms, while flexible fingers, high-speed knives and a padded pneumatic gripper were designed to handle delicate operations such as conveying, trimming and transferring. A monochrome video camera equipped with a 32 W circular fluorescent lighting system and image processing algorithms were used for the recognition of mushrooms. This robot showed effective performance in the selected operations, with an 80% average picking rate. Brown [113] performed a labor productivity and harvest cost analysis for citrus based on results obtained from eight mechanical harvesting methods. Van Henten, Hemming, Van Tuijl, Kornet, Meuleman, Bontsema and Van Os [79] from the Netherlands developed an autonomous robot for harvesting cucumbers. This robot consists of a 7 DOF manipulator, an autonomous vehicle, an end-effector, two computer vision systems for detection and 3D imaging of the fruit, and a control scheme that generates collision-free motions for the manipulator during harvesting. A thermal cutting device was used in this robot to prevent the spreading of viruses. The robot had a 95% detection rate and an 80% harvesting success rate, and required 45 s to pick one fruit. At the same time, in Japan, Hayashi et al. [114] developed an eggplant harvesting robotic system which includes a machine vision algorithm combining color segmentation, a visual feedback fuzzy control system for the manipulator and a peduncle cutting mechanism with a gripper. This robot showed a 62.5% harvesting success rate and took 64.1 s to cut one fruit. Cho et al. [115] from South Korea developed a 3 DOF lettuce harvesting robot using machine vision and fuzzy logic control. This robot consists of a manipulator with an end-effector, a lettuce-feeding conveyor, an air blower, a machine vision system, six photoelectric sensors and fuzzy logic controllers. The robot showed effective performance, with a 94.12% harvesting success rate, and took 5 s to cut one lettuce.
Van Henten et al. [116] further presented field tests, under variable field conditions, of the cucumber harvesting robot reported in 2002; the robot took a cycle time of 124 s per harvested cucumber and the average harvesting success rate was 74.4%. Arima et al. [117] developed a Cartesian-coordinate type 4 DOF manipulator with a suction cup and a visual sensor to harvest strawberries on a table-top structure. Hannan and Burks [48] described the main challenges related to fruit detection and robotic harvesting of citrus, highlighting the mechanical designs required in fruit harvesting robots and the uses of visual systems for future developments. Burks et al. [118] provided primarily a literature survey and synthesis which tries to identify the key issues that robotic system developers and horticultural scientists should consider for optimizing plant-machine system performance. Sanders [8] reviewed various aspects of orange harvesting systems such as fruit selection, methods of fruit removal, mechanical harvesting systems, manual picking and orchard arrangement. This review was supported by a large amount of research data and analysis based on previous work and reviews on orange harvesting.
Kitamura and Oka [56] developed a sweet pepper harvesting robot, a sliding-rail, timing-gear based prototype. The robot consists of a mobile base controlled by an S-box, a vision system with artificial lighting, an HSI binarization algorithm for image processing and a picking system consisting of parallel metal fingers and a pruner. During harvesting experiments, the prototype took 29 s per harvesting cycle per fruit, and judgment errors due to overlapping fruits were reported. Kondo et al. [119] updated the results for the strawberry harvesting robot developed in 2004; the updates focus on the development of new end-effector techniques for grasping and cutting operations. Muscato et al. [120] proposed three end-effector designs for an orange picking robot. The first design uses a mirror placed beneath the fruit to help the camera center the end-effector around the fruit, and a de-pressurizable tube to cut and hold the fruit; this end-effector was never constructed because of concerns over cost and robustness. The second design prototype consists of three pneumatically activated flexible fingers which grasp the fruit by pneumatic pressure; a force-sensor-based circular micro-saw cutter was attached to the wrist of the manipulator to cut the stem of the grasped fruit. This prototype showed a high success rate, but the cost of implementation was too high for a commercial model. The third design prototype involved two jaws which grasp the fruit and guide the stem to a V-shaped cutter located on the upper jaw; when the jaws close, the fruit detaches from the plant.
Foglia and Reina [121] developed an agricultural robot for radicchio harvesting that consists of a double four-bar linkage manipulator, a special gripper and computer vision to localize the plants in the field based on intelligent color filtering and morphological operations. The performance of the computer vision system was analyzed in terms of accuracy, robustness to noise and variations in lighting. Belforte et al. [122] developed a multi-tasking robotic system able to perform several tasks in a greenhouse. This work also illustrates precision spraying and precision fertilization operations and discusses the important features and requirements for robots in a greenhouse.
Further, Ota et al. [123] performed experiments with a cucumber leaf picking device which consists of a picking rotor composed of knives and brushes, a motor and a vacuum cleaner. The rate of smooth cut surfaces of the leaf stalk, and of smooth cut surfaces with a small amount of skin, was 90% when the rotor was configured with two knives and two brushes at an insertion speed of 50 mm/s. The average execution time per leaf was 1.1–1.3 s, a much higher performance than that of the de-leafing robot developed by Van Henten et al. [124]. Kondo et al. [125] developed an end-effector and manipulator control system for a tomato cluster harvesting robot. The robot moves towards the main stem and grabs it, followed by a gripping and cutting action by the end-effector. A pushing device at the end-effector holds the harvested tomato cluster and transfers it to the container without vibrating it. A quick motion control with a modified input shaping method, considering the natural frequency of the fruit cluster, was used to damp the vibrations.
Tanigaki et al. [126] developed a cherry harvesting robot which consists of a 4 DOF manipulator, a 3D vision system, an end-effector, a computer control program and a travelling device. The 3D vision sensor was equipped with red and infrared laser diodes which scan the object simultaneously. By processing the images from the 3D vision sensor, the locations of the fruits and obstacles were recognized, and the trajectory of the end-effector was determined. Baeten et al. [127] designed an autonomous apple harvesting robot using a 6 DOF industrial manipulator, a silicone funnel gripper and a camera mounted inside the gripper. A three-stage approach was used during the harvesting operation. The apple harvesting success rate was 80%, with 8–10 s needed to pick one apple.
Irie et al. [128] developed an asparagus harvesting robot coordinated with a 3D vision sensor. A telescopic robot hand, which includes a set of DC motors, and the 3D vision sensor are used to grip and cut the plants in the greenhouse. The cycle time for the harvesting operation was recorded as 13.9–23.9 s. Van Henten et al. [129] illustrated optimal manipulator design for a cucumber harvesting robot using a task-specific manipulation technique and parameter optimization simulations. Several types of robots with different link parameters were analyzed for a performance index, and a four-link PPRR type manipulator was found to be effective. Scarfe et al. [130] reported the development of an autonomous kiwifruit picking robot based on an intelligent vision system. The robot receives instructions by radio link and operates autonomously as it navigates through the orchard, picking fruit, unloading full bins of fruit, fetching empty bins and protecting the picked fruit from rain. The robot has four picking arms, each of which picks one fruit per second. Chatzimichali et al. [131] used an integrated robotic system able to move in the field, identify white asparagus stems, grasp them and then cut them without any physical damage.
Aljanobi et al. [132] used a ready-made 6 DOF industrial manipulator to harvest dates. Hayashi et al. [133] evaluated a strawberry harvesting robot which consists of a cylindrical manipulator, an end-effector with a suction device, a machine vision unit, a storage unit and a travelling unit. The fruit detection rate was recorded as 60%, the harvesting success rate was 41.3% with the suction device and 34.9% without it, and the picking cycle time was 11.5 s. De-An et al. [134] presented the design and control of an apple harvesting robot using a PRRRP-structure manipulator, a spoon-shaped end-effector with a pneumatically actuated gripper and an image-based visual servo control system. A fruit recognition algorithm based on a support vector machine was applied to detect apples. The harvesting success rate was recorded as 77%, with an average picking time of 15 s per apple. Kohan et al. [135] developed a Rosa damascena harvesting robot using stereoscopic machine vision. A 4 DOF manipulator with a stereovision technique was used, and the relation between the camera-flower distance and the camera-camera distance was evaluated. It was found that increasing the distance between the cameras reduces the stereoscopic error, while increasing the distance between the cameras and the flowers increases the error. The overall harvesting success rate was noted as 82.22%. Li et al. [136] reviewed fruit harvesting methods for fruit harvesting robots; a large body of research data is categorized according to harvesting methods, machine vision systems and image data analysis.
Feng et al. [137] described a new strawberry harvesting robot for elevated-trough culture using a sonar camera sensor and an autonomous navigation system. A 6 DOF industrial manipulator was used with a gripping and cutting tool. The harvesting success rate was recorded as 86%, with an average harvesting time of 31.3 s, and the average fruit location error was less than 4.6 mm. Wang et al. [138] reported the design and co-simulation of a tomato harvesting robot using a 4 DOF manipulator and a machine vision servo system. The viability and validity of the tomato harvesting robot were preliminarily confirmed by co-simulation of the electromechanical system. Yang et al. [139] described the design and experiments of an intelligent monorail cucumber harvesting system. The intelligent harvester runs on a monorail and carries a harvest box, a monorail assembly bracket system and a control system. An infrared sensor and an intelligent camera are used for image processing, and mature cucumbers can be distinguished using a gray transformation algorithm, an image-edge trimming algorithm and a locally maximal between-class variance threshold algorithm. The harvesting success rate was reported as 97.28% when the operating speed was 0.6 m/s. Hemming et al. [140] reported a robot for harvesting sweet peppers in greenhouses in which a 9 DOF redundant manipulator, two 5 megapixel 2/3" CCD RGB color cameras, a 3D time-of-flight camera and an artificial lighting rig were used. Two different types of end-effector for detaching sweet peppers were used: the first had a combined grip-and-cut mechanism, while the second first stabilized the fruit using suction cups, after which two lips enclosed the fruit and cut the fruit peduncle. Fruit detection and localization were performed at two different levels using the color and 3D time-of-flight cameras. The modular robotic system was tested under simplified laboratory conditions, in which the detection rate was 97%, the localization rate was 86% and the harvesting efficiency was 79%. In a commercial greenhouse, this system proved able to harvest sweet peppers autonomously.
Bac et al. [141] analyzed the state of the art and provided insight into future perspectives for harvesting robots in high-value crops. The review covers crops harvested in a production environment, performance indicators, design process techniques, hardware design decisions and algorithm characteristics. The current challenges and limitations in developing commercial fruit harvesting robots were discussed and directions for future harvesting robots were deliberated. For the 50 selected fruit harvesting robots, the article reports average rates for localization, detachment, harvest success, fruit damage and peduncle damage of 85%, 75%, 66%, 5% and 45%, respectively, and an average cycle time of 33 s.
The timeline of the development of fruit harvesting robots around the world can be seen in Figure 6, which shows the horticultural product and the country where each robot was developed. In the 1990s, considerable work was carried out on fruit harvesting robots in Japan and the USA, while in the 21st century many other countries also began to contribute to this field.
Figure 6. Timeline for research work in fruit harvesting robots around the world.

4. Discussion

Over the centuries, agriculture has transformed into the modern bio-industry it is today, something inconceivable when agriculture started in traditional hunter-gatherer societies. The major changes in agriculture have occurred through mechanization, the use of sophisticated high-tech farm management techniques, the adoption of cutting-edge technologies and the pushing of engineering to its limits for precision farming and protected cultivation. These changes and developments have been seen around the globe in the form of remarkable revolutions and a succession of innovations in machines and robots. The application of robots to harvesting horticultural products has also shown significant and promising results, especially as the agricultural labor population decreases while labor wages and harvesting energy consumption increase.
So far, considerable and remarkable research work has been carried out around the world, and many researchers are still engaged in solving specific task-oriented problems and developing cutting-edge technologies that will help modern agriculture grow. On the other hand, no harvesting robot has reached the stage of commercialization, owing to several problems and difficulties: low operating speed, low fruit recognition rates under variable field conditions, low rates of obtaining spatial information on fruit, low fruit harvesting success rates, complexity of manipulator movements, complicated manipulator control methods, difficulties in mobile navigation of robots, problems in simultaneous see-grasp-cut operations, gaps in accuracy, repeatability and cycle time, and the high costs involved in research and manufacturing. Beyond these, there are further problems concerning the physical and morphological properties of fruits, horticultural constraints and variable field conditions.
To overcome these problems and to realize automatic fruit harvesting systems in the agriculture sector, ideally three main problems need to be solved [13]: (1) the guidance of the robot through the crops, (2) the location and characterization of the fruit on the trees, and (3) the grasping and detachment of each piece. The first problem is not critical and can be solved by using an operator to guide the robot through the crops or by adopting a line-tracing mobile base system. The other two problems have received remarkable attention during the last 30 years, although no commercial harvesting robot is yet available. To solve the problem of locating and characterizing fruit on trees, an efficient recognition system is required that can locate the fruits on trees together with their positional information, i.e., the location and orientation of each fruit. Further, the recognition system should be able to locate occluded fruits, or fruits partially covered by leaves, in variable field environments. To solve the problem of grasping and detaching fruits, an effective gripping and cutting system is required that can harvest fruits under various conditions without causing any physical damage. Moreover, the picking system should be able to handle soft, delicate fruits of various shapes and sizes at harvest time without damaging the trees, and should be able to perform the harvesting operation quickly and precisely.
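Framed as code, the last two sub-problems reduce to a simple see-grasp-cut loop. The Python sketch below is a hypothetical skeleton only: every function (recognize_fruits, plan_and_reach, grasp_and_cut) is a stub standing in for a real vision, manipulator or end-effector module, and guidance is assumed to be handled by an operator or a line-tracing base, so it is not modeled.

from dataclasses import dataclass
from typing import List

@dataclass
class Fruit:
    x: float          # position in the robot frame (m)
    y: float
    z: float
    occluded: bool    # partially covered by leaves?

def recognize_fruits(image) -> List[Fruit]:
    # Vision stub: a real module would return detected fruits with their
    # 3D position and orientation.
    return [Fruit(0.40, 0.10, 0.75, occluded=False),
            Fruit(0.55, -0.05, 0.90, occluded=True)]

def plan_and_reach(fruit: Fruit) -> bool:
    # Manipulator stub: move the end-effector to the fruit; here an
    # occluded fruit is simply skipped.
    return not fruit.occluded

def grasp_and_cut(fruit: Fruit) -> bool:
    # End-effector stub: grasp (ideally at the peduncle) and cut.
    return True

def harvest_cycle(image) -> int:
    harvested = 0
    for fruit in recognize_fruits(image):
        if plan_and_reach(fruit) and grasp_and_cut(fruit):
            harvested += 1
    return harvested

print("fruits harvested this cycle:", harvest_cycle(image=None))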
The research methods and approaches presented to solve the discrimination problem for green fruit provide significant insight into the recognition and computation of positional information of green fruits against natural backgrounds. Extending this type of research to other agricultural fruits can follow three distinct approaches: first, determining the optimal color-space model for each individual fruit; second, applying the same color-space model to all agricultural products as a universal solution; and third, combining the color-space models that have a significant effect on the color attributes and features of the fruits to be detected. In the first case, the fruit recognition rate will be largely increased, but the method is time consuming, as considerable time is needed to collect images of each individual fruit, process and analyze the data and draw conclusions; applying this type of research to the major agricultural fruits would widen the scope for fruit harvesting robots. In the second case, time can be saved, but the recognition rate and detection accuracy will be lower, as every fruit shows considerable differences in physical and chemical properties. In the third case, the search for an optimal combination of color spaces is a tough challenge that relies on sophisticated research, but it would not only increase detection accuracy but could also serve as a universal solution for detecting fruits against natural backgrounds.
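As an illustration of the third approach, the Python/OpenCV sketch below combines evidence from two color spaces, requiring a greenish HSV hue and a CIELAB a* value on the green side of neutral. The channel ranges and the synthetic test patch are illustrative assumptions that would need tuning per crop; they are not values taken from the cited studies.

import numpy as np
import cv2

def green_fruit_mask(bgr):
    # Convert to two color spaces and keep pixels that look green in both:
    # HSV hue in a greenish band, and CIELAB a* below the neutral value
    # (128 in OpenCV's 8-bit encoding). Ranges are assumed for illustration.
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
    lab = cv2.cvtColor(bgr, cv2.COLOR_BGR2LAB)
    hue = hsv[:, :, 0].astype(int)
    a_star = lab[:, :, 1].astype(int)
    return (hue >= 35) & (hue <= 85) & (a_star < 120)

# Synthetic test patch: a green "fruit" square on a brownish background.
img = np.zeros((60, 60, 3), dtype=np.uint8)
img[:] = (40, 70, 120)              # brownish background (B, G, R)
img[20:40, 20:40] = (40, 180, 60)   # green fruit pixels
mask = green_fruit_mask(img)
print("candidate green-fruit pixels:", int(np.count_nonzero(mask)))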
In agriculture, fruits and vegetables show great diversity in their properties and, because of that, researchers have had to design and develop different systems for each product; a robot designed for one specific product is generally not feasible for another. This problem is common to many developments in science and technology, and there are hopes of creating multi-tasking and multi-sensory devices. So, instead of developing separate robotic systems for each product, researchers should try to find a universal 'one size fits all' solution. This type of work of course needs time, money and a great deal of research, but it would be a remarkable engineering achievement in agriculture. For example, in recognition systems, a single multispectral or hyperspectral vision system with intelligent image processing algorithms could, based on feature attributes, detect and obtain spatial information for several types of fruits or vegetables. Using one color camera and one multispectral camera [142], fruits could be recognized successfully and their maturity status determined; this would ensure that only mature fruits are harvested, reducing operational waste and increasing yield. For end-effectors, a multi-tasking, multi-sensory end-effector performing several operations in a sequential loop could be developed. So far, researchers have been developing gripping systems that grasp the fruit or fruit cluster itself, and this type of gripping requires special care to avoid physical damage to fruits of various sizes and shapes. This situation could be changed by developing multi-sensory grippers that grasp the peduncle of the fruit or fruit cluster instead of grasping the fruit directly; such a device would fit almost all types of fruits or fruit clusters and would also help to avoid physical damage. A combination of these types of robotic systems would help to create real-time intelligent robotic systems for fruit harvesting. A real-time, cost-effective and fully automatic robotic fruit harvester may still be a long way from a final commercial prototype; however, there remains considerable room for future research to develop such automatic fruit harvesters and harvesting systems.
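A minimal sketch of such RGB-plus-multispectral gating is given below in Python/NumPy: an RGB redness cue is fused with an NDVI-like band ratio so that only pixels consistent with mature fruit pass to the harvesting stage. The thresholds, band values and synthetic scene are assumptions for illustration and do not reproduce the classification scheme of the cited work [142].

import numpy as np

def maturity_gate(rgb, nir, red_band, ripeness_thresh=0.45, ndvi_thresh=0.3):
    # Fuse a simple RGB redness cue with a multispectral NDVI-like index:
    # mature fruit pixels are expected to be relatively red and to show a
    # lower NDVI than the surrounding leaf canopy. Thresholds are assumed.
    r, g, b = [rgb[..., i].astype(float) for i in range(3)]
    redness = r / (r + g + b + 1e-6)
    ndvi = (nir - red_band) / (nir + red_band + 1e-6)
    return (redness > ripeness_thresh) & (ndvi < ndvi_thresh)

# Synthetic 4x4 scene: top half "ripe fruit", bottom half "leaf canopy".
rgb = np.zeros((4, 4, 3), dtype=np.uint8)
rgb[:2] = (200, 40, 30)          # red fruit pixels (R, G, B order)
rgb[2:] = (40, 160, 40)          # green leaf pixels
nir = np.where(np.arange(4)[:, None] < 2, 0.35, 0.80)       # leaves reflect more NIR
red_band = np.where(np.arange(4)[:, None] < 2, 0.50, 0.10)
print(maturity_gate(rgb, nir, red_band))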

Conflicts of Interest

The author declares no conflict of interest.

References

  1. FAO. Core data statistics on agriculture labor population in the world. 2012. Available online: http://www.faostat.fao.org (accessed on 10 August 2012).
  2. Kondo, N.; Ting, K.C. Robotics for plant production. Artif. Intell. Rev. 1998, 12, 227–243. [Google Scholar] [CrossRef]
  3. Kondo, N.; Monta, M. Fruit harvesting robotics. J. Robot. Mechatron. 1999, 11, 321–325. [Google Scholar]
  4. Hashimoto, Y. Agro-robotics. J. Robot. Mechatron. 1999, 11, 171–172. [Google Scholar]
  5. Ferguson, L.; Rosa, U.A.; Castro-Garcia, S.; Lee, S.M.; Guinard, J.X.; Burns, J.; Krueger, W.H.; O'Connell, N.V.; Glozer, K. Mechanical harvesting of california table and oil olives. Adv. Hortic. Sci. 2010, 24, 53–63. [Google Scholar]
  6. Coppock, G.E. Harvesting early and midseason citrus fruit with tree shaker. Fla. Agric. Exp. Station. J. Ser. 1967, 2824, 98–104. [Google Scholar]
  7. Holt, J.S. Implications of reduced availability of seasonal agricultural workers on the labor intensive sector of US agriculture. In Proceedings of ASAE Annual International Meeting, Toronto, Canada, 18–22 July 1999.
  8. Sanders, K.F. Orange harvesting system review. Biosyst. Eng. 2005, 90, 115–125. [Google Scholar] [CrossRef]
  9. Pal, N.; Pal, K. A review on image segmentation techniques. Pattern Recogn. 1993, 26, 1277–1294. [Google Scholar] [CrossRef]
  10. Jimenez, R.; Jain, A.K.; Ceres, R.; Pons, J.L. Automatic fruit recognition: A survey and new results using range/attenuation images. Pattern Recogn. 1999, 32, 1719–1736. [Google Scholar] [CrossRef]
  11. Radke, R.J.; Andra, S.; Al-Kofahi, O.; Roysam, B. Image change detection algorithms: A systematic survey. IEEE Trans. Image Process. 2005, 14, 294–307. [Google Scholar] [CrossRef] [PubMed]
  12. Bachche, S.; Oka, K. Distinction of green sweet pepper by using various color space models and computation of 3 dimensional coordinates location of recognized green sweet peppers based on parallel stereovision system. J. Syst. Des. Dyn. 2013, 7, 178–196. [Google Scholar]
  13. Jimenez, A.R.; Ceres, R.; Pons, J.L. A survey of computer vision methods for locating fruit on trees. Trans. ASAE 2000, 43, 1911–1920. [Google Scholar] [CrossRef]
  14. Blanes, C.; Mellado, M.; Ortiz, C.; Valera, A. Technologies for robot grippers in pick and place operations for fresh fruits and vegetables. Span. J. Agric. Res. 2011, 9, 1130–1141. [Google Scholar] [CrossRef]
  15. Montana, D.J. Contact stability for two-fingered grasps. IEEE Trans. Robot. Autom. 1992, 8, 421–430. [Google Scholar] [CrossRef]
  16. Funahashi, Y.; Yamada, T.; Tate, M.; Suzuki, Y. Grasp stability analysis considering the curvatures at contact points. In Proceedings of the International Conference on Robotics and Automation, Minneapolis, MN, USA, 20–28 April 1996; Volume 4, pp. 3040–3046.
  17. Jenmalm, P.; Goodwin, A.W.; Johansson, R.S. Control of grasp stability when humans lift objects with different surface curvatures. J. Neurophysiol. 1998, 79, 1643–1653. [Google Scholar] [PubMed]
  18. Svinin, M.M.; Kaneko, M.; Tsuji, T. Internal forces and stability in multi-finger grasps. Control Eng. Pract. 1999, 7, 413–422. [Google Scholar] [CrossRef]
  19. Morales, A.; Sanz, P.J.; del Pobil, A.P.; Fagg, A.H. Vision-Based three-finger grasp synthesis constrained by hand geometry. Robot. Auton. Syst. 2006, 54, 496–512. [Google Scholar] [CrossRef]
  20. Birglen, L.; Gosselin, C.M. Grasp-state plane analysis of two-phalanx underactuated fingers. Mech. Mach. Theory 2006, 41, 807–822. [Google Scholar] [CrossRef]
  21. Kragten, G.A.; Herder, J.L.; Schwab, A.L. On the influence of contact geometry on grasp stability. In Proceedings of the ASME 2008 IDETC/CIE, New York, NY, USA, 3–6 August 2008.
  22. Aleotti, J.; Caselli, S. Interactive teaching of task-oriented robot grasps. Robot. Auton. Syst. 2010, 58, 539–550. [Google Scholar] [CrossRef]
  23. Noohi, E.; Moradi, H.; Noori, N.; Ahmadabadi, M.N. Manipulation of polygonal objects with two wheeled-tip fingers: Planning in the presence of contact position error. Robot. Auton. Syst. 2011, 55, 44–55. [Google Scholar] [CrossRef]
  24. Daoud, N.; Gazeau, J.P.; Zeqhloul, S.; Arsicault, M. A real-time strategy for dexterous manipulation: Fingertips motion planning, force sensing and grasp stability. Robot. Auton. Syst. 2012, 60, 377–386. [Google Scholar] [CrossRef]
  25. Li, Z.; Li, P.; Yang, H.; Wang, Y. Stability tests of two-finger tomato grasping for harvesting robots. Biosyst. Eng. 2013, 116, 163–170. [Google Scholar] [CrossRef]
  26. Monkman, G.J.; Hesse, S.; Steinmann, R.; Schunk, H. Robot Grippers; Wiley-VCH Verlag GmbH and Co. KGaA Weinheim: Germany, 2007. [Google Scholar]
  27. Edan, Y.; Haghighi, K.; Stroshine, R.; Cardenas-Weber, M. Robot gripper analysis: Finite element modeling and optimization. Appl. Eng. Agric. 1992, 8, 563–570. [Google Scholar] [CrossRef]
  28. Bachche, S.; Oka, K.; Sakamoto, H. Design and modeling of gripper and cutting tool system for sweet pepper harvesting robot hand. In Proceedings of the MAGDA Conference in Pacific Asia, Kaohsiung, Taiwan, 14–16 November 2011.
  29. Bachche, S.; Oka, K. Modeling and performance testing of end-effector for sweet pepper harvesting robot. J. Robot. Mechatron. 2013, 25, 705–717. [Google Scholar]
  30. Monta, M.; Kondo, N.; Ting, K.C. End-effector for tomato harvesting robot. Artif. Intell. Rev. 1998, 12, 11–25. [Google Scholar] [CrossRef]
  31. Sakai, S.; Lida, M.; Umeda, M. Heavy material handling manipulator for agricultural robot. In Proceedings of the IEEE International Conference on Robotics and Automation, Washington, DC, USA, 11–15 May 2002; pp. 1062–1068.
  32. Ling, P.P.; Ehsani, R.; Ting, K.C.; Chi, Y.; Ramalingam, N.; Klingman, M.H.; Draper, C. Sensing and end-effector for a robotic tomato harvester. 2004 ASAE Annu. Meet. 2004. [Google Scholar] [CrossRef]
  33. Liu, J.; Li, P.; Li, Z. A multi-sensory end-effector for spherical fruit harvesting robot. In Proceedings of the IEEE International Conference on Automation and Logistics, Jinan, China, 18–21 August 2012; pp. 258–262.
  34. Bachche, S.; Oka, K.; Sakamoto, H. Development of thermal cutting system for sweet pepper harvesting robot in greenhouse horticulture. In Proceedings of the JSME Conference on Robotics and Mechatronics, Hamamatsu, Japan, 27–29 May 2012.
  35. Bachche, S.; Oka, K.; Sakamoto, H. Development of current based temperature arc thermal cutting system for green pepper harvesting robot. In Proceedings of the Shikoku-section Joint Convention of the Institute of Electrical and related Engineers, Takamatsu, Japan, 29 September 2012.
  36. Bachche, S.; Oka, K. Performance testing of thermal cutting system for sweet pepper harvesting robot in greenhouse horticulture. J. Syst. Des. Dyn. 2013, 7, 36–51. [Google Scholar] [CrossRef]
  37. Tillett, R.D. Image analysis for agricultural processes: A review of potential opportunities. J. Agric. Eng. Res. 1991, 50, 247–258. [Google Scholar] [CrossRef]
  38. Jain, A.K.; Dorai, C. Practicing vision: Integration, evaluation and applications. Pattern Recogn. 1997, 30, 183–196. [Google Scholar] [CrossRef]
  39. Juste, F.; Sevilla, F. Citrus: A european project to study the robotic harvesting of oranges. In Proceedings of the 3rd International Symposium on Fruit, Nut and Vegetable Harvesting Mechanization, Denmark-Sweden-Norway, 5–15 August 1991; pp. 331–338.
  40. Grand D’Esnon, A. Robot Harvesting of Apples. In Proceedings of Agr-Mation, Chicago, IL, USA, 25–28 February 1985.
  41. Kassay, L. Hungarian robotic apple harvester. 1992 ASAE Annu. Meet. 1992, 92-7042, 1–14. [Google Scholar]
  42. Bulanon, D.M.; Kataoka, T.; Zhang, S.; Ota, Y.; Hiroma, T. Optimal thresholding for the automatic recognition of apple fruits. 2001 ASAE Annu. Meet. 2001. [Google Scholar] [CrossRef]
  43. Tabb, A.; Peterson, D.; Park, J. Segmentation of apple fruit from video via background modeling. 2006 ASAE Pap. 2006. [Google Scholar] [CrossRef]
  44. Tanagaki, K.; Fujiura, T.; Akase, A.; Imagawa, I. Cherry harvesting robot. In Proceedings of the International Workshop on Bio-Robotics, Information Technology and Intelligent Control for Bio-Production Systems, Sapporo, Japan, 9–10 September 2006; pp. 254–260.
  45. Arima, S.; Kondo, N.; Monta, M. Development of robotic system for cucumber harvesting. Jpn. Agric. Res. Q. 1996, 30, 233–238. [Google Scholar]
  46. Van Henten, E.J.; Hemming, J.; Van Tuijl, B.A.J.; Kornet, J.G.; Bontsema, J. Collision-free motion planning for a cucumber picking robot. Biosyst. Eng. 2003, 86, 135–144. [Google Scholar]
  47. Harrell, R.C.; Adsit, P.D.; Pool, T.A.; Hoffman, R. The florida robotic grove-lab. Trans. ASABE 1990, 33, 391–399. [Google Scholar] [CrossRef]
  48. Hannan, M.W.; Burks, T.F. Current developments in automated citrus harvesting. 2004 ASAE Annu. Inter. Meet. 2004. [Google Scholar] [CrossRef]
  49. Kondo, N.; Monta, M.; Fujiura, T. Fruit harvesting robots in japan. Adv. Space Res. 1996, 18, 181–184. [Google Scholar] [CrossRef]
  50. Kondo, N.; Yamamoto, K.; Yata, K.; Kurita, M. A machine vision for tomato cluster harvesting robot. ASAE Annu. Inter. Meet. 2008, 5, 3111–3120. [Google Scholar]
  51. Jiang, H.; Peng, Y.; Ying, Y. Measurement of 3-d locations of ripe tomato by binocular stereo vision for tomato harvesting. 2008 ASAE Annu. Inter. Meet. 2008. [Google Scholar] [CrossRef]
  52. Guo, F.; Cao, Q.; Cui, Y.; Masateru, N. Fruit location and stem detection for strawberry harvesting robot. Trans. Chin. Soc. Agric. Eng. 2008, 24, 89–94. [Google Scholar]
  53. Rajendra, P.; Kondo, N.; Ninomoya, K.; Kamata, J.; Kurita, M.; Shiigi, S.; Hayashi, S.; Yoshida, H. Machine vision algorithm for robots to harvest strawberries in tabletop culture greenhouse. Eng. Agric. Environ. Food 2009, 2, 24–30. [Google Scholar] [CrossRef]
  54. Benady, M.; Edan, Y.; Hetzroni, A.; Miles, G.E. Design of a field crops robotic machine. Pap. ASAE 1991, 91-7028, 1–7. [Google Scholar]
  55. Dobrusin, Y.; Edan, Y.; Grinshpun, J.; Peiper, U.M.; Hetzroni, A. Real-time image processing for robotic melon harvesting. Pap. ASAE 1992, 92-3515, 1–16. [Google Scholar]
  56. Kitamura, S.; Oka, K. Recognition and cutting system of sweet pepper for picking robot in greenhouse horticulture. In Proceedings of the IEEE International Conference on Mechatronics and Automation, Niagara Falls, Ontario, Canada, 29 July–1 August 2005; Volume 4, pp. 1807–1812.
  57. Kitamura, S.; Oka, K. Improvement of the ability to recognize sweet peppers for picking robot in greenhouse horticulture. In Proceedings of the International Joint Conference on SICE-ICASE, Busan, Korea, 18–21 October 2006; pp. 353–356.
  58. Bachche, S.; Oka, K.; Ogawa, N. Distinction of green sweet pepper by using various color space models. In Proceedings of the Annual Conference of the Robotics Society of Japan, Sapporo, Japan, 17–20 September 2012.
  59. McCarthy, C.L.; Hancock, N.H.; Raine, S.R. Applied machine vision of plants: A review with implications for field deployment in automated farming operations. Intell. Serv. Robot. 2010, 3, 209–217. [Google Scholar] [CrossRef] [Green Version]
  60. Kapach, K.; Barnea, E.; Mairon, R.; Edan, Y.; Ben-Shahar, O. Computer vision for fruit harvesting robots-state of art and challenges ahead. Int. J. Comput. Vis. Robot. 2012, 3, 4–34. [Google Scholar] [CrossRef]
  61. Baylou, P.; El Hadi Amor, B.; Monsion, M.; Bouvet, C.; Boussau, G. Detection and three-dimensional localization by stereoscopic visual sensor and its application to a robot for picking asparagus. Pattern Recogn. 1984, 17, 377–384. [Google Scholar] [CrossRef]
  62. Humburg, D.S.; Reid, J.F. Field performance for machine vision for selective harvesting of asparagus. Appl. Eng. Agric. 1986, 2, 2–5. [Google Scholar]
  63. Kondo, N.; Nishitsuji, Y.; Ling, P.P.; Ting, K.C. Visual feedback guided robotic cherry tomato harvesting. Trans. ASAE 1996b, 39, 2331–2338. [Google Scholar] [CrossRef]
  64. Edan, Y.; Rogozin, D.; Flash, T.; Miles, G.E. Robotic melon harvesting. IEEE Trans. Robot. Autom. 2000, 16, 831–835. [Google Scholar] [CrossRef]
  65. Edan, Y.; Miles, G.E. Design of an agricultural robot for harvesting melons. Trans. ASAE 1993, 36, 593–603. [Google Scholar] [CrossRef]
  66. Takahashi, T.; Zhang, S.; Fukuchi, H. Measurement of 3-d locations of fruit by binocular stereo vision for apple harvesting in an orchard. 2002 ASAE Annu. Inter. Meet. 2002. [Google Scholar] [CrossRef]
  67. Bulanon, D.M.; Kataoka, T.; Okamoto, H.; Hata, S. Determining the 3-d Location of the Apple Fruit During Harvest. In Proceedings of the Automation Technology for Off-Road Equipment, Kyoto, Japan, 7 October 2004; pp. 91–97.
  68. Tarrio, P.; Bernardos, A.M.; Casar, J.R.; Besada, J.A. A harvesting robot for small fruit in bunches based on 3-d stereoscopic vision. In Proceedings of the World Congress Conference on Computers in Agriculture and Natural Resources, Orlando, FL, USA, 24–26 July 2006; pp. 270–275.
  69. Lak, M.B.; Minaei, S.; Amiriparian, J.; Beheshti, B. Apple fruits recognition under natural luminance using machine vision. Adv. J. Food Sci. Technol. 2010, 2, 325–327. [Google Scholar]
  70. Li, B.; Wang, M.; Wang, N. Development of a real-time fruit recognition system for pineapple harvesting robots. In Proceedings of the Annual Meeting of ASABE, Pittsburgh, PA, USA, 20–23 June 2010.
  71. Ji, W.; Zhao, D.; Cheng, F.; Xu, B.; Zhang, Y.; Wang, J. Automatic recognition vision system guided for apple harvesting robot. Comput. Electr. Eng. 2012, 38, 1186–1195. [Google Scholar] [CrossRef]
  72. Gatica, G.; Best, S.; Ceroni, J.; Lefranc, G. Olive fruits recognition using neural networks. Procedia Comput. Sci. 2013, 17, 412–419. [Google Scholar] [CrossRef]
  73. Osborne, B.G.; Hindle, P.H. Practical NIR Spectroscopy with Applications in Food and Beverage Analysis; Longman Scientific: Harlow, Essex, UK, 1993. [Google Scholar]
  74. Kim, Y.S.; Reid, J.F.; Hansen, A.C.; Zhang, Q. On-field crop stress detection system using multispectral imaging sensor. Agric. Biosyst. Eng. 2000, 1, 88–94. [Google Scholar]
  75. Sui, R.; Wilkerson, J.B.; Hart, W.E.; Wilhelm, L.R.; Howard, D.D. Multi-spectral sensor for detection of nitrogen status in cotton. Appl. Eng. Agric. 2005, 21, 167–172. [Google Scholar] [CrossRef]
  76. Abbott, J. Quality measurement of fruits and vegetables. Postharvest Biol. Technol. 1999, 15, 207–225. [Google Scholar] [CrossRef]
  77. Nicolai, B.M.; Beullens, K.; Bobelyn, E.; Peirs, A.; Saeys, W.; Theron, K.I.; Lammertyn, J. Non-destructive measurement of fruit and vegetable quality by means of nir spectroscopy: A review. Postharvest Biol. Technol. 2007, 46, 99–118. [Google Scholar] [CrossRef]
  78. Czarnowski, M.; Cebula, S. Spectral properties of sweet pepper fruits. Folia Hortic. 1998, 10, 39–51. [Google Scholar]
  79. Van Henten, E.J.; Hemming, J.; Van Tuijl, B.A.J.; Kornet, J.G.; Meuleman, J.; Bontsema, J.; Van Os, E.A. An autonomous robot for harvesting cucumbers in greenhouses. Auton. Robot. 2002, 13, 241–258. [Google Scholar]
  80. Hemming, J.; Wageningen University, Wageningen, the Netherlands; Bachche, S.; Tohoku University, Sendai, Japan. Personal communication. 2003. [Google Scholar]
  81. Safren, O.; Alchanatis, V.; Ostrovsky, V.; Levi, O. Detection of green apples in hyperspectral images of apple-tree foliage using machine vision. Trans. ASABE 2007, 50, 2303–2313. [Google Scholar] [CrossRef]
  82. Rath, T.; Kawollek, M. Robotic harvesting of gerbera jamesonii based on detection and three-dimensional modeling of cut flower pedicels. Comput. Electr. Agric. 2009, 66, 85–92. [Google Scholar] [CrossRef]
  83. Yuan, T.; Li, W.; Feng, Q.; Zhang, J. Spectral Imaging for Greenhouse Cucumber Fruit Detection Based on Binocular Stereovision. 2010 ASAE Annu. Inter. Meet. 2010. [Google Scholar] [CrossRef]
  84. Bulanon, D.M.; Burks, T.F.; Alchanatis, V. A multispectral imaging analysis for enhancing citrus fruit detection. Environ. Control Biol. 2010, 48, 81–91. [Google Scholar] [CrossRef]
  85. Bachche, S. Automatic Harvesting for Sweet Peppers in Greenhouse Horticulture. Ph.D. Dissertation, Kochi University of Technology, Kochi, Japan, 2013. [Google Scholar]
  86. Bac, C.W.; Hemming, J.; van Henten, E.J. Robust pixel-based classification of obstacles for robotic harvesting of sweet-pepper. Comput. Electr. Agric. 2013, 96, 148–162. [Google Scholar] [CrossRef]
  87. Schert, C.E.; Brown, G.K. Basic considerations in mechanizing citrus harvest. Trans. ASAE 1968, 11, 343–346. [Google Scholar]
  88. Kawamura, N.; Namikawa, K.; Fujiura, T.; Ura, M. Study on agricultural robot. J. Jpn. Soc. Agric. Mach. 1984, 46, 353–358. [Google Scholar]
  89. Prussia, S.E. Ergonomics of manual harvesting. Appl. Ergon. 1985, 16, 209–215. [Google Scholar] [CrossRef]
  90. Sistler, F.E. Robotics and intelligent machines in agriculture. IEEE J. Robot. Autom. 1987, 3, 3–6. [Google Scholar] [CrossRef]
  91. Grand D'Esnon, A.; Rabatel, G.; Pellenc, R. Magali: A self-propelled robot to pick apples. Available online: http://agris.fao.org/agris-search/search.do?recordID=US8853733 (accessed on 15 June 2015).
  92. Amaha, K.; Shono, H.; Takakura, T. A harvesting robot of cucumber fruits. Available online: http://agris.fao.org/agris-search/search.do?recordID=US9166423 (accessed on 15 June 2015).
  93. Whitney, J.D.; Harrell, R.C. Status of citrus harvesting in florida. J. Agric. Eng. Res. 1989, 42, 285–299. [Google Scholar] [CrossRef]
  94. Blandini, G.; Levi, P. First approaches to robot utilization for automatic citrus harvesting. In Land and Water Use; Dodd, V.A., Grace, P.M., Eds.; A.A. Balkema: Rotterdam, Netherlands, 1989; pp. 1903–1907. ISBN 9061919800. [Google Scholar]
  95. Sevilla, F.; Sittichareonchai, F.; Fatou, J.M.; Constans, A.; Brons, A.; Davenel, A. A robot to harvest grape: A feasibility study. SAE Pap. No.: 89-7084. 1989. [Google Scholar]
  96. Tillett, R.D. Initial development of a mechatronic mushroom harvester. In Proceedings of the International Conference on Mechatronics: Designing Intelligent Machines, Institution of Mechanical Engineers, Cambridge, UK; 1990; pp. 109–114. [Google Scholar]
  97. Sandini, G.; Buemi, F.; Massa, M.; Zucchini, M. Visually guided operations in greenhouse. In Proceedings of the IEEE International Workshop on Intelligent Robots and Systems, Ibaraki, Japan, 3-6 July 1990; pp. 279–285.
  98. Hayashi, U.; Ueda, Y. Orange harvesting robot; Kubota Co.: Sakai, Japan, 1991. [Google Scholar]
  99. Pool, T.A.; Harrell, R.C. An end-effector for robotic removal of citrus from the tree. Trans. ASABE 1991, 34, 373–378. [Google Scholar] [CrossRef]
  100. Benady, M.; Miles, G.E. Locating melons for robotic harvesting using structured light. ASAE Pap. No.: 92-7021. 1992. [Google Scholar]
  101. Sarig, Y. Robotics of fruit harvesting: A state-of-the-art review. J. Agric. Eng. Res. 1993, 54, 265–280. [Google Scholar] [CrossRef]
  102. Tillett, N.D. Robotic manipulators in horticulture: A review. J. Agric. Eng. Res. 1993, 55, 89–105. [Google Scholar] [CrossRef]
  103. Reed, J.N.; Tillett, R.D. Initial experiments in robotic mushroom harvesting. Mechatronics 1994, 4, 265–279. [Google Scholar] [CrossRef]
  104. Grattoni, P.; Cumani, A.; Guiducci, A.; Pettiti, G. Automatic harvesting of asparagus: An application of robot vision to agriculture. In Proceedings of the SPIE 2058, Mobile Robots VIII, Boston, MA, USA, 1 February 1994; pp. 200–210.
  105. Monta, M.; Kondo, N.; Shibano, Y. Agricultural robot in grape production system. In Proceedings of the IEEE International Conference on Robotics and Automation, Nagoya, Japan, 21–27 May 1995; Volume 3, pp. 2504–2509.
  106. Buemi, F.; Massa, M.; Sandini, G. Agrobot: A robotic system for greenhouse operations. Robot. Agric. Food Ind. 1995, 4, 172–184. [Google Scholar]
  107. Edan, Y. Design of an autonomous agricultural robot. Appl. Intell. 1995, 5, 41–50. [Google Scholar] [CrossRef]
  108. Mandow, A.; Gomez de Gabriel, J.M.; Martinez, J.L.; Munoz, V.F.; Ollero, A.; García-Cerezo, A. The autonomous mobile robot aurora for greenhouse operation. IEEE Robot. Autom. Mag. 1996, 3, 18–28. [Google Scholar] [CrossRef]
  109. Arndt, G.; Rudziejewski, R.; Stewart, V.A. On the future of automated selective asparagus harvesting technology. Comput. Electr. Agric. 1997, 16, 137–145. [Google Scholar] [CrossRef]
  110. Ceres, R.; Pons, J.L.; Jimenez, A.R.; Martin, J.M.; Calderon, L. Design and implementation of an aided fruit-harvesting robot (agribot). Ind. Robot Int. J. 1998, 25, 337–346. [Google Scholar] [CrossRef]
  111. Kondo, N.; Monta, M. Strawberry harvesting robots. ASAE Pap. No.: 99-3071. 1999. [Google Scholar]
  112. Reed, J.N.; Miles, S.J.; Butler, J.; Baldwin, M.; Noble, R. AE—Automation and emerging technologies: Automatic mushroom harvester development. J. Agric. Eng. Res. 2001, 78, 15–23. [Google Scholar] [CrossRef]
  113. Brown, G.K. Mechanical harvesting systems for the florida citrus juice industry. 2002 ASAE Annu. Meet. 2002. [Google Scholar] [CrossRef]
  114. Hayashi, S.; Ganno, K.; Ishii, Y.; Tanaka, I. Robotic harvesting system for eggplants. Jpn. Agric. Res. Q. 2002, 36, 163–168. [Google Scholar] [CrossRef]
  115. Cho, S.I.; Chang, S.J.; Kim, Y.Y. Development of a three degrees-of-freedom robot for harvesting lettuce using machine vision and fuzzy logic control. Biosyst. Eng. 2002, 82, 143–149. [Google Scholar] [CrossRef]
  116. Van Henten, E.J.; Van Tuijl, B.A.J.; Hemming, J.; Kornet, J.G.; Bontsema, J.; Van Os, E.A. Field test of an autonomous cucumber picking robot. Biosyst. Eng. 2003, 86, 305–313. [Google Scholar] [CrossRef]
  117. Arima, S.; Kondo, N.; Monta, M. Strawberry harvesting robot on table-top culture. 2004 ASAE Annu. Meet. 2004. [Google Scholar] [CrossRef]
  118. Burks, T.F.; Villegsa, F.; Hannan, M.; Flood, S.; Sivaraman, B.; Subramanian, V.; Sikes, J. Engineering and horticultural aspects of robotic fruit harvesting: Opportunities and constraints. HortTechnology 2005, 15, 79–87. [Google Scholar]
  119. Kondo, N.; Ninomoya, K.; Hayashi, S.; Tomohiko, O.; Kubota, K. A new challenge of robot for harvesting strawberry grown on table top culture. 2005 ASAE Annu. Meet. 2005. [Google Scholar] [CrossRef]
  120. Muscato, G.; Prestifilippo, M.; Abbate, N.; Rizzuto, I. A prototype of an orange picking robot: Past history, the new robot and experimental result. Ind. Robot Int. J. 2005, 32, 128–138. [Google Scholar]
  121. Foglia, M.; Reina, G. Agricultural robot for radicchio harvesting. J. Field Robot. 2006, 23, 363–377. [Google Scholar] [CrossRef]
  122. Belforte, G.; Deboli, R.; Gay, P.; Piccarolo, P.; Ricauda Aimonino, D. Robot design and testing for greenhouse applications. Biosyst. Eng. 2006, 95, 309–321. [Google Scholar] [CrossRef]
  123. Ota, T.; Bontsema, J.; Hayashi, S.; Kubota, K.; Van Henten, E.J.; Van Os, E.A.; Ajiki, K. Development of a cucumber leaf picking device for greenhouse production. Biosyst. Eng. 2007, 98, 381–391. [Google Scholar] [CrossRef]
  124. Van Henten, E.J.; Van Tuijl, B.A.J.; Hoogakker, G.J.; Van Der Weerd, M.J.; Hemming, J.; Kornet, J.G.; Bontsema, J. An autonomous robot for de-leafing cucumber plants grown in a high-wire cultivation system. Biosyst. Eng. 2006, 94, 317–323. [Google Scholar] [CrossRef]
  125. Kondo, N.; Taniwaki, S.; Tanihara, K.; Yata, K.; Monta, M.; Kurita, M.; Tsutumi, M. An end-effector and manipulator control for tomato cluster harvesting robot. 2007 ASAE Annu. Meet. 2007. [Google Scholar] [CrossRef]
  126. Tanagaki, K.; Fujiura, T.; Akase, A.; Imagawa, I. Cherry-harvesting robot. Comput. Electr. Agric. 2008, 63, 65–72. [Google Scholar] [CrossRef]
  127. Baeten, J.; Donne, K.; Boedrij, S.; Beckers, W.; Claesen, E. Autonomous fruit picking machine: A robotic apple harvester. Field Serv. Robot. 2008, 42, 531–539. [Google Scholar]
  128. Irie, N.; Tagushi, N.; Horie, T.; Ishimatsu, T. Development of asparagus harvester coordinated with 3-d vision sensor. J. Robot. Mechatron. 2009, 21, 583–589. [Google Scholar]
  129. Van Henten, E.J.; van't Slot, D.A.; Hol, C.W.J.; van Willigenburg, L.G. Optimal manipulator design for a cucumber harvesting robot. Comput. Electr. Agric. 2009, 65, 247–257. [Google Scholar] [CrossRef]
  130. Scarfe, A.J.; Flemmer, R.C.; Bakker, H.H.; Flemmer, C.L. Development of an autonomous kiwifruit picking robot. In Proceedings of the International Conference on Autonomous Robots and Agents, Wellington, New Zealand, 10–12 February, 2009; pp. 380–384.
  131. Chatzimichali, A.P.; Georgilas, I.P.; Tourassis, V.D. Design of an advanced prototype robot for white asparagus harvesting. In Proceedings of the IEEE/ASME International Conference on Advanced Intelligent Mechatronics, Singapore, 14–17 July 2009; pp. 887–892.
  132. Aljanobi, A.A.; Al-Hamed, S.A.; Al-Suhaibani, S.A. A setup of mobile robotic unit for fruit harvesting. In Proceedings of the IEEE International Workshop on Robotics in Alpe-Adria-Danube Region, Budapest, 24– June 2010; pp. 105–108.
  133. Hayashi, S.; Shigematsu, K.; Yamamoto, S.; Kobayashi, K.; Kohno, Y.; Kamata, J.; Kurita, M. Evaluation of a strawberry-harvesting robot in a field test. Biosyst. Eng. 2010, 105, 160–171. [Google Scholar] [CrossRef]
  134. Zhao, A.; Lv, J.; Ji, W.; Zhang, Y.; Chen, Y. Design and control of an apple harvesting robot. Biosyst. Eng. 2011, 110, 112–122. [Google Scholar]
  135. Kohan, A.; Borghaee, A.M.; Yazdi, M.; Minaei, S.; Sheykhdavudi, M.J. Robotic harvesting of rosa damascena using stereoscopic machine vision. World Appl. Sci. J. 2011, 12, 231–237. [Google Scholar]
  136. Li, P.; Lee, S.M.; Hsu, H. Review on fruit harvesting method for potential use of automatic fruit harvesting systems. Procedia Eng. 2011, 23, 351–366. [Google Scholar] [CrossRef]
  137. Feng, Q.C.; Wang, X.; Zheng, W.G.; Qui, Q.; Jiang, K. New strawberry harvesting robot for elevated-trough culture. Int. J. Agric. Biol. Eng. 2012, 5, 1–8. [Google Scholar]
  138. Wang, J.; Zhou, Z.; Du, X. Design for tomato harvesting robots. In Proceedings of Chinese Control Conference, Hefei, China, 25–27 July 2012; pp. 5105–5108.
  139. Yang, Z.; Zhang, W.; Zhang, J.; Ji, C.; Li, W. Design and Experiment of Intelligent Monorail Cucumbers Harvester System. In Proceedings of the ASABE Annual International Meeting, Kansas City, MO, USA, 21–24 July 2013.
  140. Hemming, J.; Bac, C.W.; Bart, A.J.; van Tuijl, B.A.J.; Barth, R.; Bontsema, J.; Pekkeriet, E. A robot for harvesting sweet-pepper in greenhouses. In Proceedings of the International Conference of Agricultural Engineering, Zurich, Switzerland, 6–10 July 2014.
  141. Bac, C.W.; van Henten, E.J.; Hemming, J.; Edan, Y. Harvesting robots for high-value crops: State-of-the-art review and challenges ahead. J. Field Robot. 2014, 31, 888–911. [Google Scholar] [CrossRef]
  142. Fernández, R.; Montes, H.; Salinas, C.; Sarria, J.; Armada, M. Combination of rgb and multispectral imagery for discrimination of cabernet sauvignon grapevine elements. Sensors 2013, 13, 7838–7859. [Google Scholar] [CrossRef] [PubMed]
