
Biosystems Engineering

Volume 146, June 2016, Pages 71-84

Special Issue: Robotic Agriculture
Research Paper
Design of an eye-in-hand sensing and servo control framework for harvesting robotics in dense vegetation

https://doi.org/10.1016/j.biosystemseng.2015.12.001
Open access under a Creative Commons license

Highlights

  • A framework design is proposed for eye-in-hand sensing and visual servo control.

  • The framework is primarily intended for harvest robotics in dense crops.

  • An example implementation is demonstrated for a sweet-pepper use-case.

Abstract

A modular software framework design that allows flexible implementation of eye-in-hand sensing and motion control for agricultural robotics in dense vegetation is reported. Harvesting robots in cultivars with dense vegetation require multiple viewpoints and on-line trajectory adjustments to reduce the number of false negatives and to correct for fruit movement. In contrast to specialised software, the proposed framework aims to support a wide variety of agricultural use cases, hardware and extensions. A set of Robot Operating System (ROS) nodes was created to ensure modularity and separation of concerns, implementing functionalities for application control, robot motion control, image acquisition, fruit detection, visual servo control and simultaneous localisation and mapping (SLAM) for monocular relative depth estimation and scene reconstruction. Coordination functionality was implemented by the application control node with a finite state machine. To provide visual servo control and simultaneous localisation and mapping functionalities, the off-the-shelf Visual Servoing Platform (ViSP) and Large-Scale Direct SLAM (LSD-SLAM) libraries were wrapped in ROS nodes. The capabilities of the framework are demonstrated by an example implementation for a sweet-pepper use case, combined with hardware consisting of a Baxter robot and a colour camera mounted on its end-effector. Qualitative tests were performed under laboratory conditions using an artificial, densely vegetated sweet-pepper crop. Results indicated that the framework can be implemented for sensing and robot motion control in sweet-pepper harvesting using visual information from the end-effector. Future research is suggested to apply the framework to other use cases and to validate the performance of its components in servo applications under real greenhouse conditions.
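
To illustrate the coordination role of the application control node described above, the following is a minimal sketch of a ROS node that runs a hand-rolled finite state machine over a harvest cycle. The topic names, message types and state set are illustrative assumptions for this sketch, not the interfaces used in the paper's implementation.

```python
#!/usr/bin/env python
# Minimal sketch: an application-control ROS node coordinating a harvest
# cycle with a finite state machine. All topic names below are assumed
# for illustration only.
import rospy
from std_msgs.msg import Bool, String


class HarvestStateMachine(object):
    STATES = ('SEARCH', 'APPROACH', 'GRASP', 'RETREAT')

    def __init__(self):
        self.state = 'SEARCH'
        self.fruit_detected = False
        self.servo_converged = False
        # Hypothetical status topics from the fruit-detection and
        # visual-servo nodes.
        rospy.Subscriber('/fruit_detection/detected', Bool, self._on_detection)
        rospy.Subscriber('/visual_servo/converged', Bool, self._on_converged)
        # Hypothetical command topic consumed by the motion-control node.
        self.cmd_pub = rospy.Publisher('/app_control/command', String, queue_size=1)

    def _on_detection(self, msg):
        self.fruit_detected = msg.data

    def _on_converged(self, msg):
        self.servo_converged = msg.data

    def step(self):
        # At most one state transition per control tick.
        if self.state == 'SEARCH' and self.fruit_detected:
            self.state = 'APPROACH'
            self.cmd_pub.publish('start_visual_servo')
        elif self.state == 'APPROACH' and self.servo_converged:
            self.state = 'GRASP'
            self.cmd_pub.publish('close_gripper')
        elif self.state == 'GRASP':
            self.state = 'RETREAT'
            self.cmd_pub.publish('retract_arm')
        elif self.state == 'RETREAT':
            self.state = 'SEARCH'
            self.cmd_pub.publish('next_viewpoint')


if __name__ == '__main__':
    rospy.init_node('application_control')
    fsm = HarvestStateMachine()
    rate = rospy.Rate(10)  # 10 Hz coordination loop
    while not rospy.is_shutdown():
        fsm.step()
        rate.sleep()
```

Keeping coordination in a dedicated node like this, with detection, servo control and SLAM in their own nodes, reflects the separation of concerns the framework aims for: any component can be replaced for a different crop or sensor without touching the harvest-cycle logic.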

Keywords

Framework
Harvest robots
Visual servo control
ROS
SLAM
