Article

Autonomous Robotics for Identification and Management of Invasive Aquatic Plant Species

Maharshi Patel, Shaphan Jernigan, Rob Richardson, Scott Ferguson and Gregory Buckner
1 Mechanical and Aerospace Engineering, North Carolina State University, Raleigh, NC 27695, USA
2 Crop and Soil Sciences, North Carolina State University, Raleigh, NC 27695, USA
* Author to whom correspondence should be addressed.
Appl. Sci. 2019, 9(12), 2410; https://doi.org/10.3390/app9122410
Submission received: 29 April 2019 / Revised: 5 June 2019 / Accepted: 6 June 2019 / Published: 13 June 2019

Featured Application

Development of a small fleet of fully autonomous boats capable of subsurface invasive aquatic plant identification and treatment, consequently minimizing manual labor with more efficient, safe, and timely weed management.

Abstract

Invasive aquatic plant species can expand rapidly throughout water bodies and cause severely adverse economic and ecological impacts. While mechanical, chemical, and biological methods exist for the identification and treatment of these invasive species, they are manually intensive, inefficient, costly, and can cause collateral ecological damage. To address current deficiencies in aquatic weed management, this paper details the development of a small fleet of fully autonomous boats capable of subsurface hydroacoustic imaging (to scan aquatic vegetation), machine learning (for automated weed identification), and herbicide deployment (for vegetation control). These capabilities aim to minimize manual labor and provide more efficient, safe (reduced chemical exposure to personnel), and timely weed management. Geotagged hydroacoustic imagery of three aquatic plant varieties (Hydrilla, Cabomba, and Coontail) was collected and used to create a software pipeline for subsurface aquatic weed classification and distribution mapping. Employing deep learning, the novel software achieved a classification accuracy of 99.06% after training.

1. Introduction

1.1. Impact and Treatment of Aquatic Weeds

While native aquatic plants are essential components of aquatic ecosystems, non-native invasive species can expand rapidly throughout water bodies and cause severe economic and ecological impacts (Figure 1) [1,2]. Adverse economic impacts include: impairing recreational activities (fishing, swimming, and boating); flooding caused by reduced drainage; hindering boat navigation; blocking water intakes for hydroelectric turbines, drinking water, and irrigation; and reducing the potability of fresh water due to foul taste and odor [1,3,4,5]. Indirect economic effects include reduced property values and reduced revenue from impacted businesses [2,6]. Aquatic flora consume oxygen at night; thus, excessive weed growth can deprive fish and other aquatic animals of this vital resource, leading to their death. Other ecological effects include overpopulation of small fish, which find shelter in the plants, and the creation of breeding habitats for some mosquito species [3]. Of the many invasive aquatic plant species, Hydrilla verticillata, commonly known by its genus Hydrilla, likely causes the most economic damage in the United States [1]. In Florida alone, state and federal entities spend millions of dollars annually towards Hydrilla management ($14.5 million in 1997 [1] and $66 million total from 2008 to 2015 [7]). The economic impact on a single water body is significant; a detailed study on a single Florida lake estimated potential Hydrilla-related losses of $40 million per year (property values, agricultural support, flood control, recreation, etc.) [6].
Given these adverse impacts, management of invasive aquatic vegetation is of utmost concern. While mechanical control (plucking or raking the plants) is the most direct approach, it can be time-consuming and costly (estimated cost of $1000 per acre in 1996) [8,9]. Biological control methods (e.g., fish which consume large quantities of aquatic weeds) are routinely employed but can be counter-productive. One example is the grass carp, which can consume and control vegetation [9]; however, their body waste compromises water quality, killing sensitive plants and animal species, and fertilizing additional weed growth [3,10]. Biological approaches also introduce the risk of invasive animal species [11]. Seven species of carp native to Asia, introduced to control invasive plant species, have recently spread up the Mississippi River system and have crowded out native fish populations. While not without its drawbacks, chemical control (the application of herbicides) is considered the most attractive aquatic weed management method in terms of minimizing cost, time, and collateral damage [5].
Efficient chemical-based vegetation control in aquatic bodies requires proper identification and quantification of the extent to which vegetation is present, both for optimal chemical selection and treatment strategies [3,4]. The approach of manually identifying weed-infested regions within a water body and applying herbicides at the target location(s) is labor intensive, time-consuming, and involves risk of herbicide exposure to personnel [12]. Spraying herbicide throughout the water body leads to much higher usage, resulting in economic loss, longer decomposition times, and subsequent environmental hazards.
Two unmanned watercraft, the TORMADA (Lake Restoration Inc., Rogers, MN, USA) and the WATER STRIDER (Yamaha Motor Company, Iwata, Shizuoka, Japan), are commercially available for herbicide dispersal in water bodies. These boats offer remote control by a human operator but have limited tank capacities (3.8 and 8.0 L for the TORMADA and WATER STRIDER, respectively) and lack autonomous navigation and weed identification functionality. Hänggi outlined the design, fabrication, and testing of an autonomous boat for monitoring algae blooms in Lake Zurich and included a summary of multiple autonomous watercraft developed by other authors [13]. These prototypes are intended for a variety of measurement and mapping tasks, but none are purposed for aquatic vegetation identification and treatment.

1.2. Image-Based Machine Learning: Application to Aquatic Weed Identification

Automated plant identification, which relies on photographic images, is a well-developed methodology for agricultural applications. With recent advances in artificial intelligence, several machine learning algorithms have been developed for image classification. Current identification/classification methods involve image preprocessing followed by feature extraction. The extracted features, which typically include leaf color, shape, and texture, histograms of oriented gradients (HOG), etc., are then used to train classifiers. This approach was applied to identify certain Ficus plant species by acquiring photographic images of leaves, performing the image preprocessing operations shown in Figure 2, and extracting features from the modified images [14]. These authors separately implemented a support vector machine (SVM) and an artificial neural network (ANN) as classifiers. A similar approach was adopted by another group of researchers to identify aquatic weeds growing on the water surface, namely Eichhornia crassipes, Pistia stratiotes, and Salvinia auriculata [15]. Visible light images captured from unmanned aerial vehicles (UAVs) were utilized to train multiple classifiers. In addition to the SVM, optimum-path forest (OPF) and Bayesian classifiers were investigated.
In similar research, feature learning on high resolution weed images (water hyacinth, tropical soda apple, and serrated tussock) was performed using a filter bank, followed by image classification via the texton approach (K-means clustering) [16]. Feature extraction and OPF classifiers were successfully implemented in another study to identify invasive yellow flag iris plants [17].
Despite the documented success of these feature extraction-based plant classification methods, they cannot be directly applied to subsurface plant identification due to limited visibility and the associated lack of photographic clarity. Although it lacks the resolution and clarity of photography, hydroacoustic imaging is an emerging option for quantifying underwater vegetation. Currently, hydroacoustic data files can be uploaded and processed offline via a web-based service (Navico BioBase), which generates biomass-concentration maps. However, the limited resolution of hydroacoustic data impedes feature extraction and thus typical machine learning approaches to species classification.
Deep learning is a more advanced machine learning technique that could overcome these limitations [18]. It employs deep neural networks (DNNs) to simultaneously accomplish feature extraction and classification tasks [19]. The fundamental component of a DNN is an artificial neural network (ANN). As shown in Figure 3, a standard feedforward ANN consists of multiple fully connected layers (input, hidden, and output layers), each containing a number of nodes (artificial neurons) that varies with the complexity of the network.
The input layer $X = [x_1, x_2, \ldots, x_n]^T$ takes $n$ extracted features, which are multiplied by an adaptable weight matrix $W_1 = [w_{11}^1, w_{12}^1, \ldots, w_{1h}^1; \ldots; w_{n1}^1, \ldots, w_{nh}^1]$, where $h$ is the number of neurons in the hidden layer. The resulting hidden-layer values are multiplied by another adaptable weight matrix $W_2 = [w_{11}^2, w_{12}^2, \ldots, w_{1m}^2; \ldots; w_{h1}^2, \ldots, w_{hm}^2]$, where $m$ is the number of ANN outputs $B_2$. Each ANN output corresponds to a specific classification ($m$ classes in this case). By incorporating nonlinear activation functions in the output layer neurons (e.g., softmax or sigmoid), the output layer values can be transformed to predict the classification for an input image. ANN weights are randomly initialized and optimized through a repetitive training process.
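For illustration, a minimal numerical sketch of this forward pass in MATLAB (bias terms and the hidden-layer activation, ReLU here, are assumptions not specified above; all values are placeholders):

```matlab
% Forward pass through the feedforward ANN of Figure 3 (illustrative only).
n = 4; h = 8; m = 3;                  % input features, hidden neurons, classes
X  = rand(n, 1);                      % extracted feature vector
W1 = randn(n, h);                     % input-to-hidden weights (n x h)
W2 = randn(h, m);                     % hidden-to-output weights (h x m)

H  = max(W1' * X, 0);                 % hidden-layer values (ReLU activation)
Z  = W2' * H;                         % raw output-layer values
B2 = exp(Z) ./ sum(exp(Z));           % softmax: probabilities over m classes
[~, predictedClass] = max(B2);        % index of the predicted classification
```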
Unlike standard ANNs, DNNs integrate feature extraction by incorporating several additional layers [20]. Convolutional neural networks (CNNs, Figure 4), one of the most popular subtypes of DNNs, replace matrix multiplication with convolution in the initial layers. Instead of multiplying all inputs (the pixels of an image) by different weight sets, a small window of inputs is multiplied by a single weight set, and the window is iteratively shifted to cover the entire input matrix (image) while reusing the same weights. This decreases the number of weights and reduces network complexity and associated preprocessing [21]. However, with much deeper structures than ANNs, CNNs and other DNNs have a significantly larger number of learnable parameters. Consequently, DNN training is computationally intensive and has only recently become viable due to advances in computer processing. After training, however, DNNs can be utilized for real-time image classification with minimal processor capabilities.
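The parameter savings from weight sharing can be quantified directly. A short sketch using the dimensions of Alexnet's first convolutional layer (96 filters of 11 × 11 × 3, as visualized later in Figure 22, on a 227 × 227 × 3 input); the fully connected comparison is a hypothetical equivalent:

```matlab
% Parameter counts: shared convolutional filters vs. a fully connected layer.
convParams = 11 * 11 * 3 * 96 + 96;     % filter weights + biases = 34,944
fcParams   = (227 * 227 * 3) * 96;      % one weight per input pixel per unit
fprintf('conv: %d parameters, fully connected: %d\n', convParams, fcParams);
% conv: 34944 parameters, fully connected: 14840352 (roughly a 425x reduction)
```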

1.3. Research Objective

To address current deficiencies in aquatic weed management, this research seeks to develop and demonstrate a small fleet of fully autonomous boats capable of subsurface hydroacoustic imaging (to scan aquatic vegetation), machine learning (for automated weed identification), and herbicide deployment (for vegetation control). These capabilities aim to minimize manual labor and provide more efficient, safe (reduced chemical exposure to personnel), and timely weed management.

2. Methods

2.1. Autonomous Boat Development

The first phase of research and development was the design and fabrication of water vehicles providing the following functionality: fully autonomous navigation, coordination between at least two vehicles, hydroacoustic data collection, variable-rate herbicide application, and at least two hours of continuous operation without refueling or recharging (Figure 5). For practical purposes, a surface vehicle, rather than a submarine platform, was chosen. Other design choices included the use of a 15-gallon (56.8 L) herbicide tank, a capacity recommended by the herbicide manufacturer sponsoring this research for the applications detailed here; electric propulsion; and battery-based energy storage. Additional design considerations and details are presented below.

2.1.1. Hull Design and Fabrication

A multi-hull, V-shaped design was selected for each vehicle to provide optimal stability and drag characteristics in the low-speed operating range (0.5–1.5 m/s) best suited to hydroacoustic mapping. Using hydrostatic analysis, pontoon dimensions were computed to support the geometry and weight of the fully loaded payload (batteries, electronics, herbicide tank filled to its 15-gallon capacity, propulsion system, etc.) with 50% or less pontoon submersion [23]. The resulting geometry of each pontoon was 244 × 26 × 30 cm (96.0 × 10.2 × 11.9 in), not including keel dimensions.
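As a rough check of this sizing, a hydrostatic sketch using a rectangular approximation of each pontoon's cross section (the V-shaped hull displaces somewhat less, so this is an upper bound):

```matlab
% Approximate payload supported at 50% pontoon submersion (fresh water).
L = 2.44; W = 0.26; H = 0.30;          % pontoon length, width, height (m)
rho = 1000;                            % water density (kg/m^3)
Vsub = L * W * (H / 2);                % submerged volume per pontoon (m^3)
supported = rho * 2 * Vsub;            % buoyant support, two pontoons (kg)
fprintf('Supported mass at 50%% draft: %.0f kg\n', supported);   % ~190 kg
```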
Production of fiberglass pontoons was a multi-step process involving the fabrication of polystyrene plugs (Figure 6a,b) and fiberglass molds (Figure 6c) for the left and right halves of each pontoon, bolting the mold halves together (Figure 6d), and forming each pontoon within the mold (Figure 6e). Pontoon caps were fabricated using a separate mold (Figure 6f). A keel was integrated into the base of the pontoons for improved tracking performance. Multiple coats of polyester resin and liquid rubber were applied to the surface of the pontoons and caps for enhanced waterproofing and cosmetics.
Internal and external struts were waterjet-cut from aluminum sheets, bent to final geometry, and bolted between and within the pontoons (Figure 7). The struts provided enhanced lateral support and load-bearing capabilities. Low-density (32 kg/m3 or 2 lbf/ft3), high-buoyancy urethane foam was added to the interior of the pontoons both to cradle batteries and electrical components and prevent sinking following capsizing or hull penetration. Hatch doors with water-resistant draw latches were installed above this region (Figure 5) to provide access to the batteries and electronics for battery recharging and electronics troubleshooting.

2.1.2. Propulsion and Steering

An off-the-shelf electric trolling motor (Figure 8a, Minn Kota Powerdrive 45, Johnson Outdoors, Racine, WI, 200.2 N thrust rating) was selected for propulsion, owing to its relatively high efficiency, thrust capability, and quiet operation. For remote steering, the factory-installed steering motor was interfaced to a motor controller with remote control (RC) input and feedback capabilities (Pololu Jrk 12v12, Pololu Corporation, Las Vegas, NV, USA). Measurement of the shaft position for closed-loop control was implemented through a custom-mounted potentiometer rotationally coupled to the shaft via mechanical gears on each of these components (Figure 8b). A translating, spring-loaded frame and oversized gear teeth accommodated the bow-to-stern shaft wobble inherent in the system, maintaining gear meshing and avoiding damage to the potentiometer. An airboat propulsion system, consisting of a direct current (DC) motor-driven propeller on an RC-servo-actuated rotating platform, was also evaluated. Despite its impressive steering capabilities (0.2 m turning radius) and low entanglement risk, this system was thrust-limited (22.7 N max thrust), inefficient with regard to power consumption, noisy, and highly sensitive to wind and wave disturbances.

2.1.3. Navigation and Control Unit

At a minimum, autonomous navigation requires a GPS receiver and magnetic compass for localization (sensing position and heading) and a processor for maintaining real-time closed-loop control of the steering and propulsion systems. A wireless transmitter/receiver with reasonable transmission range is also needed for communication with a base station and other autonomous vehicles in the fleet. The Pixhawk autopilot module, an open-hardware device originally manufactured by 3DR (Berkeley, CA, USA), was selected from the available alternatives for its widespread usage and support community and its compatibility with a wide range of sensors and software (Figure 9a). Mission Planner open-source software, installed on a standard notebook computer (Dell Latitude 5550, Windows 7 OS) and coupled with USB-based telemetry radios, served as the primary on-shore interface between the human operator and the onboard autopilot module (Figure 9b). User-configurable remote-control transmitters (FrSky Taranis X9D Plus, FrSky Electronic Co., Limited, Jiangsu, China), each paired with an X8R receiver aboard each vehicle, could be used for direct control of a vehicle as desired (launching, object avoidance, etc.). Toggling a transmitter switch alternated between autonomous and manual control modes for the corresponding vehicle.
Hydroacoustic mapping for weed quantification typically involves navigating a watercraft through a series of parallel linear trajectories, commonly known as transects, spread throughout the target region (Figure 9b). Such navigation requires accurate trajectory tracking. Transects, compiled within Mission Planner, were loaded to the autopilot module via telemetry communication. From the Mission Planner interface, the operator can also remotely adjust autonomous navigational control parameters and monitor navigational performance in real time (actual vs. desired trajectories, velocity, heading, etc.; Figure 10). The autopilot module implements a version of L1 trajectory tracking [24,25]. In this method, a reference point "L1_ref" on the desired trajectory is calculated, and the vehicle is directed (through steering inputs) toward that point (Figure 10). The L1 tracking period, a key user-defined parameter, determines the aggressiveness of the vehicle in reaching the L1_ref point; the reference point slides continuously along the desired trajectory as the vehicle advances. Due to trade-offs in parameter tuning (e.g., increasing gains can lead to oversteering and weaving), the parameters (primarily the L1 tracking period and the proportional, integral, and derivative (PID) controller gains) were fine-tuned through extensive experimentation.
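A sketch of the underlying guidance computation from [24] is given below; the variable names and the fixed L1 distance are illustrative (the autopilot derives this distance from the user-set tracking period and groundspeed):

```matlab
% L1 trajectory tracking sketch (after Park, Deyst, and How [24]).
pos = [0; -3];  vel = [1.2; 0];        % vehicle position (m) and velocity (m/s)
wpA = [0; 0];   wpB = [50; 0];         % transect endpoints
L1  = 10;                              % L1 distance (m), set via tracking period

dirAB  = (wpB - wpA) / norm(wpB - wpA);          % unit vector along transect
along  = dot(pos - wpA, dirAB);                  % along-track position
xtrack = norm(pos - (wpA + along * dirAB));      % cross-track error
fwd    = sqrt(max(L1^2 - xtrack^2, 0));          % look-ahead along the path
L1ref  = wpA + (along + fwd) * dirAB;            % sliding reference point

toRef = L1ref - pos;                             % line-of-sight to L1_ref
eta   = atan2(vel(1)*toRef(2) - vel(2)*toRef(1), dot(vel, toRef));
aCmd  = 2 * norm(vel)^2 * sin(eta) / L1;         % lateral acceleration command
```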

2.1.4. Herbicide Dispersal System

For design of the herbicide dispersal system, Hydrilla was chosen as the targeted aquatic weed, and Aquathol-K, a liquid-based chemical effective in the treatment of Hydrilla and other aquatic weeds, was selected as the treatment herbicide. A downstream chemical dilution system, with a drop hose outlet and an RC-controlled (via electronic speed controller, ESC) variable-speed pump, was utilized for dispersal. With this design, herbicide concentrate is stored within an onboard tank, diluted with water drawn from the lake, and dispersed through a hose submerged below the water surface. Dilution occurs at an injector downstream of the pump, which passively combines the concentrate with lake water via the Venturi effect. Fluid analysis of the dispersal system and a series of tests were conducted to select the pump (8000 series SHURflow industrial pump, 100 psi rating) and to characterize chemical dispersal rates as a function of RC input (Figure 17b). These relationships allowed optimal RC pump control inputs to be determined as a function of transect spacing, boat velocity, desired application rate, and lake depth. Dispersal system design and analysis are further detailed in [23].
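The underlying sizing relationship is dimensional: the pump must deliver concentrate at the rate the boat sweeps out treated water volume. A sketch with illustrative values (the mapping from flow rate to RC input itself comes from the measured pump characterization of Figure 17b):

```matlab
% Required concentrate flow rate from application rate, geometry, and speed.
appRate  = 1.5;                   % application rate (gal concentrate per acre-ft)
depth    = 6;                     % water column depth (ft)
spacing  = 20;                    % transect spacing = effective swath width (ft)
velocity = 1.0 * 3.281;           % boat velocity: 1.0 m/s expressed in ft/s

sweptVol = spacing * depth * velocity / 43560;   % treated volume (acre-ft/s)
pumpGPM  = appRate * sweptVol * 60;              % concentrate flow (gal/min)
% pumpGPM is then mapped to an RC pump input using the curve in Figure 17b.
```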
Design optimization methodologies, incorporating Stevin's law of fluid statics, were used to optimize the placement of onboard components (batteries, electronics bin, chemical tank, propulsion system, dispersal pump, etc.). The objective of the algorithm, implemented with Excel Solver, was to minimize the pitch angle of the boat throughout all levels of tank payload (empty to full); a key constraint was the avoidance of forward pitching (bow pitched downward). The methods and results, as detailed in [23], were adapted for later prototype generations as component dimensions, weights, and placement options (e.g., component storage within the pontoons) were altered.
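A minimal sketch of the moment-balance idea behind this optimization (all positions and masses below are hypothetical placeholders; the actual analysis, component list, and constraints are given in [23]):

```matlab
% Longitudinal center-of-gravity shift as the herbicide tank drains.
x = [0.3, 1.0, 1.6, 2.0];          % component positions from stern (m):
m = [40,  10,  0,   15 ];          % batteries, electronics, tank (empty), motor
tankMass = 57;                     % full 15-gal (56.8 L) herbicide payload (kg)
for fill = 0:0.25:1
    mi = m;  mi(3) = fill * tankMass;            % tank contents at this level
    xcg = sum(mi .* x) / sum(mi);                % center of gravity location
    fprintf('fill %3.0f%%: xcg = %.2f m\n', 100*fill, xcg);
end
% Placement is chosen so xcg stays near the center of buoyancy at every level.
```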

2.1.5. Hydroacoustic Imaging

An off-the-shelf hydroacoustic imaging system (Lowrance Elite 4 Chirp, Lowrance Electronics, Tulsa, OK) was purchased and integrated within each boat. Its transducer was silicone-mounted within the base of a plastic container at the fore region of the boat. The unit allowed geo-tagged transducer data to be stored on a micro secure digital (microSD) card in .sl2 file format.

2.2. Machine Learning for Aquatic Vegetation Classification

The second phase of the project involved the development of automated methods for classifying and quantifying aquatic weeds, and subsequently utilizing these data to determine locations for targeted herbicide treatment.

2.2.1. Data Preprocessing

The acquired .sl2 data, stored in a binary format with no manufacturer-provided tools or guidelines for reading its contents, could not be used directly by machine learning classification algorithms. To convert the .sl2 data into a series of standard digital images (.jpg format) with geotagging, a two-step approach was used. First, the .sl2 file was opened using Reefmaster Sonar Viewer (ReefMaster Software Ltd., West Sussex, UK), a software program that displays sonar data (hydroacoustic imagery and corresponding geographical location) as a continuous video. Figure 11 is a screenshot of hydroacoustic data recorded at Lake Raleigh viewed in Reefmaster Sonar Viewer. Primary and DownScan™ images are displayed alongside a map indicating the scanning location. Both images show the bottom surface with aquatic plants. DownScan™ (bottom right in Figure 11) was selected because it provides a clearer view of the bottom and vegetation than the Primary scan. Imagery display options, including contrast and brightness, horizontal scaling, and color scheme, were selected within Reefmaster for maximum visual clarity.
Next, still hydroacoustic images were acquired at fixed time intervals using the MATLAB toolbox 'Screen Record' by Nassim Khaled [26], which captures PC monitor contents in real time. Playback of imagery at nine times the original speed expedited the capture process. The image capture rate was selected to achieve 20–30% spatial overlap between consecutive images. The toolbox code was modified to use clock time instead of CPU time, yielding images with consistent overlap by eliminating the dependence on CPU load. Images were digitally masked and cropped to remove unnecessary information and facilitate the machine learning process (Figure 12). The top left portion of each image contains the GPS coordinates, while the DownScan™ imagery covers the remaining portion.
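A sketch of this masking/cropping step (the rectangle coordinates are hypothetical and depend on the Reefmaster display layout):

```matlab
% Separate the GPS overlay from the DownScan imagery in a captured frame.
img = imread('capture_0001.jpg');
gpsRect  = [1, 1, 220, 40];                         % [x y w h] of GPS text (assumed)
scanRect = [1, 41, size(img,2), size(img,1) - 40];  % remaining DownScan region

gpsCrop = imcrop(img, gpsRect);      % retained for coordinate extraction (Sec. 2.2.6)
scan    = imcrop(img, scanRect);     % vegetation imagery passed to the classifier
imwrite(scan, 'scan_0001.jpg');
```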

2.2.2. Hardware and Software Configuration

As noted earlier, DNNs, being computationally intensive, require powerful processing units to train in a reasonable timeframe. DNNs have millions of learnable parameters which undergo hundreds or thousands of optimization cycles during training. This precludes the use of standard CPUs, which have a limited number of cores (eight in typical desktop computers), for DNN training. However, modern graphics processing units (GPUs), developed for gaming purposes, have thousands of cores which facilitate parallel computation via the Nvidia CUDA platform. For this research, a single Dell Precision T7500 workstation was configured with the following hardware: Intel Xeon CPU (2.13 GHz, 2 processors), 12 GB RAM, and an Nvidia GTX 1070 Ti GPU with 8 GB memory. The Nvidia CUDA deep neural network (cuDNN) library and other supporting software were installed to enable MATLAB, Python, and Google Colab to take advantage of the GPU.

2.2.3. DNN Training

“Alexnet,” an advanced CNN supported by MATLAB’s neural network toolbox, was used in the algorithm for plant species classification. Figure 13 shows the layer-wise structure of Alexnet. Due to the limited availability of preliminary data, transfer learning was implemented on the pretrained CNN. Initial algorithm training focused on only two training classes: the target species (Hydrilla: 466 images; Figure 14a) and non-target species (Other: 1751 images, including some with no vegetation; Figure 14b). Images of each class were randomly divided into training and validation sets (420 and 46 images, respectively). The DNN achieved 100% training accuracy, indicating sufficient network complexity; however, the significant difference between training and validation accuracies indicated overfitting.
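A sketch of this transfer-learning setup using MATLAB's neural network toolbox (the folder layout and learning rate are illustrative):

```matlab
% Transfer learning on pretrained Alexnet for 2-class classification.
net = alexnet;                                  % pretrained CNN (Figure 13)
layers = [ net.Layers(1:end-3)                  % retain learned feature layers
           fullyConnectedLayer(2)               % new output: Hydrilla vs. Other
           softmaxLayer
           classificationLayer ];

imds = imageDatastore('sonarImages', 'IncludeSubfolders', true, ...
                      'LabelSource', 'foldernames');
[trainDs, valDs] = splitEachLabel(imds, 420, 46, 'randomized');

trainRs = augmentedImageDatastore([227 227], trainDs);  % resize to Alexnet input
valRs   = augmentedImageDatastore([227 227], valDs);
opts = trainingOptions('sgdm', 'InitialLearnRate', 1e-4, ...
                       'ValidationData', valRs, 'Plots', 'training-progress');
trainedNet = trainNetwork(trainRs, layers, opts);
```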

2.2.4. Reducing Overfitting

The primary causes of overfitting are insufficient training data for generalization [27] and excessive model complexity. Since the network structure already contained two dropout layers, data augmentation and model training with additional data were implemented to reduce overfitting [28,29]. Additional data were collected, and Hydrilla images were generated with an increased overlap of approximately 50%, expanding the training and validation sets to 720 and 89 images, respectively. To implement data augmentation, artificial training data were generated by modifying the original images via horizontal reflection, translation, and scaling, as sketched below. Furthermore, iterative parameter tuning was performed on optimization functions, learning rates, learning rate drop schedules, batch sizes, and data augmentation parameters to increase classification accuracy.
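A sketch of the augmentation configuration (the translation and scaling ranges are assumptions; the transform types themselves, horizontal reflection, translation, and scaling, are those stated above):

```matlab
% Random horizontal reflection, translation, and scaling during training.
augmenter = imageDataAugmenter( ...
    'RandXReflection',  true, ...
    'RandXTranslation', [-15 15], 'RandYTranslation', [-15 15], ...  % pixels
    'RandXScale',       [0.9 1.1], 'RandYScale',      [0.9 1.1]);

augTrainDs = augmentedImageDatastore([227 227], trainDs, ...
                                     'DataAugmentation', augmenter);
trainedNet = trainNetwork(augTrainDs, layers, opts);   % layers/opts as above
```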

2.2.5. Generalizing Over Multiple Species

Although image classification based on two classes (Hydrilla and Other) achieved satisfactory accuracy, classification across multiple plant species was pursued to increase the system's utility. To enable treatment of multiple species and enhance model generalization, hydroacoustic imagery of Cabomba (Figure 14c) and Coontail (Figure 14d, Ceratophyllum demersum), two common aquatic plant varieties, was collected and the DNN was trained on four classes. The data set was divided into training, validation, and test sets with 657, 80, and 80 images per class, respectively. Using equal numbers of images per class during training prevented sample bias; significant class imbalances were found to skew the classification probabilities. Cross-verification on the test set following parameter tuning confirmed adequate model generalization.
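A sketch of the balanced four-class split (folder names are placeholders):

```matlab
% Balanced per-class split: 657 training, 80 validation, 80 test images.
imds4 = imageDatastore('sonarImages4', 'IncludeSubfolders', true, ...
                       'LabelSource', 'foldernames');  % Hydrilla/Cabomba/Coontail/Other
[trainDs, valDs, testDs] = splitEachLabel(imds4, 657, 80, 80, 'randomized');
countEachLabel(trainDs)                                % verify 657 per class
```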

2.2.6. Extracting GPS Coordinates from Images Post-Classification

In conjunction with image classification, precise locations of identified plant species are necessary for efficient treatment and recordkeeping. To create the required location database, the GPS coordinates superimposed on each hydroacoustic image (Figure 12) were extracted using the optical character recognition (OCR) functionality of MATLAB. Image preprocessing, namely cropping, resizing, grayscaling, and binarizing, enabled extraction of accurate GPS coordinates from the classified images (Figure 15).
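A sketch of this extraction step in MATLAB (the crop rectangle and character set are assumptions based on the overlay format of Figure 12):

```matlab
% Extract GPS coordinates from the overlay text of a classified image.
img  = imread('capture_0001.jpg');
crop = imcrop(img, [1, 1, 220, 40]);       % GPS text region (assumed coordinates)
gray = rgb2gray(imresize(crop, 3));        % upscale and grayscale for cleaner OCR
bw   = imbinarize(gray);                   % binarize to isolate the characters
res  = ocr(bw, 'CharacterSet', 'NW0123456789.');   % restrict recognized symbols
coords = strtrim(res.Text);                % e.g., 'N035.46.033 W078.40.734'
```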

3. Results

3.1. Autonomous Vehicle Performance

Autonomous water vehicles were deployed on multiple water bodies in a variety of weather conditions spanning mid-summer to early winter. The majority of trials were conducted on Lake Raleigh, a 75-acre lake on North Carolina State University’s Centennial Campus, to evaluate the boats’ manual and autonomous navigational capabilities and to confirm herbicide dispersal system functionality.

3.1.1. Autonomous Navigation

With the marine propulsion system, prototypes achieved speeds up to 2.3 m/s and were able to navigate in the presence of moderate wind (4.5 m/s, 10 mph), light rain, and choppy surface water conditions. An iterative controller tuning process was completed to ensure high-performance tracking in autonomous navigation mode. Figure 16 shows typical autonomous tracking performance. No degradation in tracking performance was observed due to wind gusts or high fluid levels in the herbicide tank, though the maximum speed decreased with heavier herbicide payloads.

3.1.2. Hydraulic Stability and Operational Depth

Since the required submersion depth of the trolling motor exceeds the pontoon draft, the boat can be operated at any depth that accommodates the marine prop. Nonetheless, with the prop motor positioned at minimal depth for shallow operation, the angular travel of the vertical shaft (used for steering) must be software-limited to prevent the blades from colliding with the pontoons during steering maneuvers. Even with this angular limitation, adequate navigation was achieved. Bow-to-stern inclination of the boat was minimal at all tank fill levels; the boat exhibited a slight, almost indiscernible, upward pitch (bow upward) at full tank capacity.

3.1.3. Battery Life

Prototypes were powered by two 12-volt lithium iron phosphate (LFP) batteries (Bioenno Power, Santa Ana, CA, USA) wired in parallel, each rated at 60 amp-hours. While the batteries were not operated to depletion, data from multiple tests exceeding three hours suggest that up to eight hours of operation can be achieved on a single charge while remaining within safe discharge limits (80% depth of discharge). Battery discharge monitoring (Figure 17a) revealed a noticeable decrease in battery voltage over approximately the first 15 min, followed by a relatively steady voltage over the remaining operation period.

3.1.4. Herbicide Dispersal System

Laboratory testing, completed before the lake trials, revealed that the dispersal system could provide moderate application rates of Aquathol-K (1.0–1.8 gal/acre-ft) for typical pond depths, boat velocities, and transect widths. Figure 17b shows the range of chemical dispersal rates (Aquathol concentrate) achievable by varying the RC input.
Functionality of the dispersal system in its chemical dilution mode was also demonstrated on Lake Raleigh; water was substituted for the herbicide to avoid unnecessary release of chemicals into the water body. Herbicide application was completed in a small (approximately 0.29-acre) private pond with a watermeal infestation (Figure 18). Due to the small surface area and volume of the pond, a premixed herbicide/water solution was dispersed directly from the tank, and manual control, rather than autonomous navigation, was utilized.

3.2. Machine Learning Algorithm

Vegetation Classification

The deep learning algorithm, in conjunction with data augmentation, successfully extracted features and accurately classified underwater vegetation using hydroacoustic imagery. Figure 19 shows classification accuracy as a function of training iterations for both training and validation sets; it clearly illustrates how data augmentation reduced overfitting in the validation sets.
Initial parameter tuning readily improved the validation accuracy to approximately 97%; further parameter tuning produced only marginal gains. The optimizers 'sgdm', 'rmsprop', and 'adam' gave similar performance, with 'sgdm' and 'adam' producing marginally better accuracy. Increased smoothness of the training curve and improved accuracy were observed for all optimizers when the learning rate was reduced. A learning rate drop schedule further improved accuracy compared to a constant learning rate. The 'MiniBatchSize' parameter contributed heavily to runtimes and the smoothness of the training curve: larger batch sizes smoothed the curve and reduced runtimes but increased memory requirements. Limiting batch sizes to 256 or less helped avoid the "sharp minimizers" that tend to impede generalization [30].
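A representative tuned configuration reflecting these observations (the specific values are illustrative, not the exact tuned set):

```matlab
% Training options consistent with the tuning findings above.
opts = trainingOptions('sgdm', ...               % 'sgdm'/'adam' performed best
    'InitialLearnRate',    1e-4, ...             % reduced rate smoothed training
    'LearnRateSchedule',   'piecewise', ...      % drop schedule beat constant rate
    'LearnRateDropFactor', 0.5, ...
    'LearnRateDropPeriod', 10, ...               % epochs between drops
    'MiniBatchSize',       64, ...               % kept at 256 or below [30]
    'Shuffle',             'every-epoch', ...
    'ValidationData',      valRs, ...
    'Plots',               'training-progress');
```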
Following parameter tuning, the DNN achieved a classification accuracy of 99.06% for both the validation and test data sets, clearly indicating excellent generalization (Cabomba: precision = 1, recall = 1; Coontail: precision = 1, recall = 1; Hydrilla: precision = 0.9873, recall = 0.975; Other: precision = 0.9753, recall = 0.9875). Analysis of the confusion matrices for different tuning parameters revealed more misclassifications within the Hydrilla and Other classes (and between the two) than within the Cabomba and Coontail classes. Figure 20 shows the confusion matrices for the optimal parameter set.
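For reference, the per-class precision and recall values reported above follow directly from the confusion matrix; a sketch of the computation:

```matlab
% Per-class precision and recall from the test-set confusion matrix (Figure 20).
pred = classify(trainedNet, augmentedImageDatastore([227 227], testDs));
C = confusionmat(testDs.Labels, pred);    % rows: true class; columns: predicted
precision = diag(C) ./ sum(C, 1)';        % true positives / predicted positives
recall    = diag(C) ./ sum(C, 2);         % true positives / actual positives
```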
Subsequent analysis revealed that misclassified Hydrilla images tended to be associated with limited plant growth (Figure 21a). The majority of images containing Cabomba and Coontail were associated with more mature plant growth, which perhaps led to higher classification accuracies. This suggests that timing of data capture within the growing season might play an important role in classification accuracy. Data for Hydrilla was collected from June to November while Cabomba and Coontail were scanned during December, a time of year associated with more mature growth. Other instances of misclassification included images containing floating vegetation or schools of fish (Figure 21b), and vegetation being only partially contained within the image (Figure 21c).
During DNN training, weights of all layers undergo continuous optimization. Successful feature learning creates smooth filters with uniform weight gradients. Figure 22 visualizes the 96 filters (each with 11 × 11 × 3 weights) of the first convolutional layer for two separate cases. Parameter tuning generated smoother, blended (less discretized) gradient patterns, which are characteristics of effective training (Figure 22b) [31], while filters in Figure 22a indicate lack of sufficient training.

4. Discussion

This research was successful in developing the necessary hardware and software for automated identification and treatment of submerged aquatic weeds. A small fleet (two watercraft) with the required systems and capabilities (self-navigation, hydroacoustic data collection, and herbicide dispersal) was developed over three prototype generations. Hydrilla, a highly invasive weed species with severe impacts, was chosen as the targeted aquatic weed. The deep learning model was later extended to multiple species, namely Hydrilla, Cabomba, and Coontail (four classes in total, including "Other"). The high classification accuracy of 99.06% on both the validation and test sets indicates excellent generalization. Training on four classes with a higher volume of data improved the algorithm's accuracy compared to training on two classes. Geotagged information from classified images was reliably extracted for treatment of the target areas with herbicides.
While the integration of autonomous scanning, identification, and treatment of invasive plant species was to some extent limited by software and hardware capabilities, the current practice of manually scanning and treating entire water bodies can be eliminated using this approach. The technologies detailed here clearly have the potential to reduce the cost and increase the effectiveness of aquatic plant management.

Future Work

Because plant maturity was found to affect classification accuracy, it may be advisable to collect data for each plant variety throughout the growing season and at multiple locations to further generalize the model. Future research will likely focus on improved hardware and software integration, enabling synchronous execution of all system tasks: hydroacoustic data collection, weed classification and distribution mapping, and targeted herbicide treatment. The integrated process would involve automated weed classification during hydroacoustic scanning, which could be accomplished through one of several methods. The first option involves performing all classification and location extraction tasks on a central computer located with the operator. In this option, the hydroacoustic data could be transferred to the central computer using existing hydroacoustic wireless technology (e.g., the Navico GoFree WIFI-1 wireless module; Navico Marine Electronics, Egersund, Norway) or a Wi-Fi enabled SD card. To overcome limited transmission range, transects could be configured to periodically direct boats close to the operator for wireless data transmission. Following weed classification and target location identification by the computer, an operator could generate transects covering these locations and transmit them to the vehicles for herbicide application. Multiple boats could be employed simultaneously for weed detection and treatment with a single operator (based on shore or in another watercraft, e.g., a canoe or motorized craft).
The second option involves the use of onboard computer processing systems (e.g., Raspberry Pi units or tablets) for real-time image classification. Live data could be transferred from the terminals of the fish finder to the processor using the Wireshark software [32]. The herbicide application system could be triggered immediately upon detection of targeted weed species. With this methodology, scanning and treatment could be completed in a single traversal of the transects. Despite its benefits, this method would require extensive research to successfully configure real-time communication between software (Wireshark, Reefmaster, MATLAB, Mission Planner) and hardware components (hydroacoustic scanner, autopilot module, etc.).

Author Contributions

Conceptualization, G.B., S.F. and R.R.; methodology, G.B., S.J., S.F. and M.P.; software, S.J. and M.P.; validation, S.J. and M.P.; data collection and processing, R.R., M.P. and S.J.; writing—original draft preparation, S.J. and M.P.; writing—review and editing, G.B. and M.P.; visualization and supervision, G.B., S.F. and R.R.; project administration, G.B. and S.F.; funding acquisition, R.R. and G.B.

Funding

This research was funded by United Phosphorous, Inc. and the North Carolina Policy Collaboratory, grant numbers 2016-1503 and 2019-0236, respectively.

Acknowledgments

The authors would like to acknowledge the generous technical support of Stephen Hoyle and Andrew Howell (Dept. of Crop and Soil Sciences, N.C. State University), Justin Nawrocki and Gerald Adrian (United Phosphorus, Inc.), and Eric Stewart, Gordon Beverley and Esther Lee (Dept. of Mechanical and Aerospace Engineering, N.C. State University).

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Pimentel, D.; Zuniga, R.; Morrison, D. Update on the environmental and economic costs associated with alien-invasive species in the United States. Ecol. Econ. 2005, 52, 273–288. [Google Scholar] [CrossRef]
  2. Rockwell, H.W. Summary of a Survey of the Literature on the Economic Impact of Aquatic Weeds; Aquatic Ecosystem Restoration Foundation: Flint, MI, USA, 2003. [Google Scholar]
  3. Gettys, L.A.; Haller, W.T.; Petty, D.G. Biology and Control of Aquatic Plants: A Best Management Practices Handbook; Aquatic Ecosystem Restoration Foundation: Marietta, GA, USA, 2019. [Google Scholar]
  4. McComas, S. Lake and Pond Management Guidebook; CRC Press: Boca Raton, FL, USA, 2003. [Google Scholar]
  5. Lembi, C.A. Identifying and Managing Aquatic Vegetation; Formerly Purdue Extension Publication WS-21-W; Purdue University Cooperative Extension Service: West Lafayette, IN, USA, 2009. [Google Scholar]
  6. Bell, F.W.; Bonn, M.A. Economic Sectors at Risk from Invasive Aquatic Weeds at Lake Istokpoga, Florida. 2004. Available online: http://www.aquatics.org/pubs/economics.htm (accessed on 7 June 2019).
  7. Buck, B. UF/IFAS Researchers Try to Cut Costs to Control Aquatic Invasive Plants in Florida; University of Florida Institute of Food and Agricultural Sciences IFAS Blogs: Gainesville, FL, USA, 2016. [Google Scholar]
  8. Langeland, K.A. Hydrilla verticillata (L. F.) Royle (Hydrocharitaceae), The Perfect Aquatic Weed. South. Appalach. Bot. Soc. 1996, 61, 293–304. [Google Scholar]
  9. Langeland, K.A.; Enloe, S.F.; Gettys, L. Hydrilla Management in Florida Lakes; U.S. Department of Agriculture UF/IFAS Extension: Gainesville, FL, USA, 2012. [Google Scholar]
  10. Bain, M.B. Assessing impacts of introduced aquatic species: Grass carp in large systems. Environ. Manag. 1993, 17, 211–224. [Google Scholar] [CrossRef]
  11. Helfrich, L.; Neves, R.; Libey, G.; Newcomb, T. Control Methods for Aquatic Plants in Ponds and Lakes. Available online: https://vtechworks.lib.vt.edu/handle/10919/48945 (accessed on 1 April 2019).
  12. Blanco, A.; Qu, J.J.; Roper, W.E. Spectral signatures of hydrilla from a tank and field setting. Front. Earth Sci. 2012, 6, 453–460. [Google Scholar] [CrossRef]
  13. Hänggi, T. Design of an Autonomous Sampling Boat for the Study of Algae Bloom in Lake Zurich. Master’s Thesis, Swiss Federal Institute of Technology Zurich, Zurich, Switzerland, 2009. [Google Scholar]
  14. Kho, S.J.; Manickam, S.; Malek, S.; Mosleh, M.; Dhillon, S.K. Automated plant identification using artificial neural network and support vector machine. Front. Life Sci. 2017, 10, 98–107. [Google Scholar] [CrossRef] [Green Version]
  15. Pereira, L.A.M.; Nakamura, R.Y.M.; de Souza, G.F.S.; Martins, D.; Papa, J.P. Aquatic weed automatic classification using machine learning techniques. Comput. Electron. Agric. 2012, 87, 56–63. [Google Scholar] [CrossRef]
  16. Hung, C.; Xu, Z.; Sukkarieh, S. Feature learning based approach for weed classification using high resolution aerial images from a digital camera mounted on a UAV. Remote Sens. 2014, 6, 12037–12054. [Google Scholar] [CrossRef]
  17. Baron, J.; Hill, D.J.; Elmiligi, H. Combining image processing and machine learning to identify invasive plants in high-resolution images. Int. J. Remote Sens. 2018, 39, 5099–5118. [Google Scholar] [CrossRef]
  18. Goodfellow, I.; Bengio, Y.; Courville, A. Deep Learning; MIT Press: Cambridge, MA, USA, 2016. [Google Scholar]
  19. Nielsen, M. Neural Networks and Deep Learning; Determination Press: San Francisco, CA, USA, 2015. [Google Scholar]
  20. Sun, Y.; Liu, Y.; Wang, G.; Zhang, H. Deep Learning for Plant Identification in Natural Environment. Comput. Intell. Neurosci. 2017, 2017, 7361042. [Google Scholar] [CrossRef] [PubMed]
  21. Liu, W.; Wang, Z.; Liu, X.; Zeng, N.; Liu, Y.; Alsaadi, F.E. A survey of deep neural network architectures and their applications. Neurocomputing 2017, 234, 11–26. [Google Scholar] [CrossRef]
  22. Convolutional Neural Network: 3 Things You Need to Know. Available online: https://www.mathworks.com/solutions/deep-learning/convolutional-neural-network.html (accessed on 6 June 2019).
  23. Beverly, G.T. Development and Experimentation of an Herbicide Dispersal System for an Autonomous Aquatic Weed Management System. Master’s Thesis, North Carolina State University, Raleigh, NC, USA, 2017. [Google Scholar]
  24. Park, S.; Deyst, J.; How, J. A New Nonlinear Guidance Logic for Trajectory Tracking. In Proceedings of the AIAA Guidance, Navigation, and Control Conference and Exhibit, Providence, RI, USA, 16–19 August 2004; pp. 1–16. [Google Scholar]
  25. Jones, B. Plane: L1 Control for Straight and Curved Path Following. 2013. Available online: https://github.com/ArduPilot/ardupilot/pull/101 (accessed on 7 June 2019).
  26. Khaled, N. Screen Record. MathWorks File Exchange. 2008. Available online: https://www.mathworks.com/matlabcentral/fileexchange/21216-screen-record (accessed on 8 November 2018).
  27. Domingos, P. A Few Useful Things to Know About Machine Learning. Commun. ACM 2012, 55, 78–87. [Google Scholar] [CrossRef]
  28. Srivastava, N.; Hinton, G.; Krizhevsky, A.; Sutskever, I.; Salakhutdinov, R. Dropout: A Simple Way to Prevent Neural Networks from Overfitting. J. Mach. Learn. Res. 2014, 15, 1929–1958. [Google Scholar]
  29. Krizhevsky, A.; Sutskever, I.; Hinton, G.E. ImageNet Classification with Deep Convolutional Neural Networks. Adv. Neural Inf. Process. Syst. 2012, 25, 1–9. [Google Scholar] [CrossRef]
  30. Keskar, N.; Mudigere, D.; Nocedal, J.; Smelyanskiy, M.; Tang, P. On large-batch training for deep learning: Generalization gap and sharp minima. In Proceedings of the International Conference on Learning Representations, Toulon, France, 24–26 April 2017; pp. 1–16. [Google Scholar]
  31. Understanding and Visualizing Convolutional Neural Networks. Available online: http://cs231n.github.io/understanding-cnn/ (accessed on 7 June 2019).
  32. Dabrowski, A.; Stelzer, R. A Digital Interface for Imagery and Control of a Navico/Lowrance Broadband Radar. In Breizh Spirit, a Reliable Boat for Crossing the Atlantic Ocean; Springer: Berlin/Heidelberg, Germany, 2011; pp. 169–181. [Google Scholar]
Figure 1. Watermeal infestation in a central North Carolina pond: (a) aerial view; (b) ground view.
Figure 2. Extraction from an image of a leaf (adapted from [14]): (a) original photograph; (b) conversion from RGB to HSV format to serve as a guide for subsequent edge detection; (c) grayscale conversion to improve image contrast; and (d) feature extraction after identifying leaf boundaries.
Figure 3. Architecture of a feedforward artificial neural network.
Figure 4. Architecture of a convolutional neural network (CNN) [22].
Figure 5. Autonomous boat prototype for identification and chemical treatment of invasive aquatic plant species.
Figure 6. Fabrication of fiberglass pontoons: (a) cutting, sanding, and bonding polystyrene plug sections; (b) epoxy coating, sanding, and gel-coating of plug; (c) fiberglass lay-up of mold over polystyrene plug; (d) assembly of mold halves (shown with keel upward); (e) fiberglass pontoon after lay-up inside mold and application of multiple finish coats; and (f) separately cast pontoon cap installed.
Figure 7. Aluminum struts bolted within the pontoons for enhanced lateral support: (a) before foam filling; (b) urethane foam filling process.
Figure 8. Propulsion and steering systems: (a) Minn Kota marine propulsion unit and (b) rotational potentiometer used for shaft position feedback.
Figure 9. Features utilized for autonomous vehicle control: (a) Pixhawk autopilot module as installed within a prototype (receivers and other electronics in background); (b) Mission Planner interface showing generation of transects.
Figure 10. L1 tracking schematic showing computation of reference point (L1_ref) at various vehicle positions with respect to the desired path.
Figure 11. Screenshot from Reefmaster Sonar Viewer software illustrating hydroacoustic imagery acquired on Lake Raleigh: left—map location corresponding with imagery (indicated by boat icon) and traversed path (with increasing depth, path color changes from red to blue); top right—Primary scan sonar; bottom right—DownScan™ sonar.
Figure 12. Geo-tagged DownScan™ image with GPS coordinates in the top left corner.
Figure 13. Layer-wise structure of Alexnet from MATLAB neural network toolbox.
Figure 14. Example hydroacoustic images of each of the weed classes: (a) Hydrilla, (b) Other, (c) Cabomba, and (d) Coontail.
Figure 15. GPS coordinate extraction with image preprocessing: (a) raw/unprocessed hydroacoustic image and (b) preprocessed image optimized for optical character recognition (OCR). Generated output: N035.46.033↵W078.40.734↵↵.
Figure 16. Tracking performance of a prototype at Lake Raleigh: (a) desired and actual trajectories as displayed from Mission Planner software and (b) corresponding boat deviations from intended trajectories (180° turns excluded).
Figure 17. (a) Battery voltage measurements for multiple test runs and (b) characterization of chemical dispersal rates as a function of RC input. Voltage fluctuations were a function of power supplied to the propulsion system.
Figure 18. Treatment of watermeal infestation in a small pond using manual navigation and minimal prop submersion.
Figure 19. Training progress (a) before and (b) following data augmentation. Overfitting in (a) is evident through high variation between training and validation accuracy.
Figure 20. Confusion matrices of classified vegetation species for (a) validation and (b) test images.
Figure 21. Misclassified images due to (a) limited vegetation growth, (b) floating vegetation or schools of fish, and (c) vegetation partially contained within the image.
Figure 22. Deep neural network (DNN) weight visualization for the first convolutional layer (a) before parameter optimization and (b) after training with parameter optimization.
