Article

Mechanical Control with a Deep Learning Method for Precise Weeding on a Farm

Department of Biomechatronics Engineering, National Pingtung University of Science and Technology, Neipu 91201, Taiwan
* Author to whom correspondence should be addressed.
Agriculture 2021, 11(11), 1049; https://doi.org/10.3390/agriculture11111049
Submission received: 25 September 2021 / Revised: 21 October 2021 / Accepted: 23 October 2021 / Published: 26 October 2021
(This article belongs to the Special Issue Design and Application of Agricultural Equipment in Tillage System)

Abstract

This paper presents a mechanical control method for precise weeding based on deep learning. A deep convolutional neural network was used to identify and locate weeds. A special modular weeder was designed that can be installed on the rear of a mobile platform. An inverted pyramid-shaped weeding tool equipped in the modular weeder can shovel out weeds without being contaminated by soil. The weed detection and control method was implemented on an embedded system with a high-speed graphics processing unit and integrated with the weeder. The experimental results showed that even when the speed of the mobile platform reaches 20 cm/s, the system can still accurately detect and locate the weeds. Moreover, the weeding mechanism can successfully shovel out the roots of the weeds. The proposed weeder was tested in the field, and its performance and weed coverage were verified to be precise enough for weeding.

1. Introduction

Crop production and management have moved toward knowledge- and automation-intensive practices that use automated machinery, information and communication technology, and biotechnology for large-scale production. These practices can be combined with precision agriculture techniques to increase productivity, reduce resource waste and production costs, and improve environmental quality [1,2,3]. Among these tasks, weed management is regarded as one of the most challenging in crop production. Effective weed control can increase the productivity per unit area to meet the growing demand for crop production [4]. Improper weed management can lead to a potential yield loss of approximately 32%, and this loss is increasing every year [5]. If weeds are not effectively controlled, most of the fertilizer nutrients applied to the crop are absorbed by the weeds, resulting in a 60% reduction in crop yield in organic farming [6].
Although weeds exhibit an uneven spatial distribution [7], the traditional weed management method is to apply herbicides uniformly across the field. Most herbicides are released into the environment through runoff and drift, which affects the ecological environment and human health [8]. Hand-weeding is a common weed management practice, but it is time-consuming, costly, labor-intensive, and increasingly difficult due to the labor shortage in agriculture. This practice may also expose farmers to risks from infected weeds, and some countries have even abandoned it [9,10]. Fortunately, some smart agricultural machines that use physical or chemical methods to address weed management have been investigated recently [11,12,13,14,15,16,17]. It can be expected that machines will replace humans or assist operators to achieve smart production management [18].
Weeding machines can be divided into passive and active types based on whether there is a power source [19]. Active weeding machines can avoid seedlings while weeding simultaneously, and their weeding behavior can be divided into swing, rotary, hybrid, and other types [18,20,21,22,23]. Swing behaviors are mainly powered by a ground-driven system that drives the hoe to reciprocate. The rotary type is divided into vertical-axis and horizontal-axis rotation according to the position of the rotation axis; notched hoe knives, claw-tooth cycloidal hoe knives, and similar tools rotate around the vertical axis of the machine's transmission mechanism. The hybrid type combines swinging and rotating, and its motion has a high degree of spatial freedom; the design challenge, however, is to optimize the transmission mechanism and reduce the number of components. Weeding machines are usually mounted behind a tractor; as the tractor moves, the weeding machine continuously shovels the soil to remove weeds. In practice, full-coverage mechanized shoveling affects the organic matter content in the soil, which in turn affects the nutrient absorption of the crop roots.
Generally, a detector is installed on an automated weeding machine to detect whether there is a crop in the interrow, while an actuator controls the knives or hoe knives under the soil. Based on the detection results, the actuator moves the knives into or out of the rows to fork over the soil and remove weeds [9,24]. The performance of the end effector (actuator) of weeding machinery directly affects the efficiency of weeding; this kind of variable rate technology is rarely used in actual operation, and cost is also a key factor that needs to be considered [25]. To achieve precise weeding, some weeding machines combine computer vision technology with a mobile robot capable of autonomous navigation [26]. The mobile manipulator must accurately locate the weeds in real time, and the actuator must operate the weeding tools at the right moment to remove each weed.
In a previous study, a machine vision-based smart weeder was proposed that uses image processing methods to identify crops and weeds and an inference-based control method to drive three direct current (DC) motors through gears and chains [27]. The three-claw harrow weeding tool on the connecting rod is inserted into the soil, and the soil is then moved backward to remove the weeds. However, due to the type of claw harrow and the torque limitation of the actuator, this machine is only suitable for soft soil and the removal of small weeds. McCool et al. [28] described mechanical methods as an alternative for weed management. They proposed different types of weeding tools, including arrow- and tine-shaped tools, which can be mounted on a guided vehicle to perform weeding operations; statistical analysis proved the effectiveness of these tools and emphasized the importance of early intervention. Other types of weeding tools, such as intrarow plowshares, comb harrows, spring harrows, and specific plowshares for in-row weeding, are also used for weeding operations [29]. Fennimore and Cutulle [30] developed and implemented machine vision technology in an autonomous weeder in which two robotic arms cooperate with weed actuators to spray herbicides directly on each weed. Raja et al. [31] proposed a weeding system based on a 3D geometry detection algorithm for robot vision, together with a corresponding mechanical weeding device for automated weeding in tomato and lettuce fields, which can efficiently remove weeds in a high-density environment. Kumar et al. [32] proposed a mechatronic prototype for interrow weeding and crop damage control, which initiates weeding operations based on plant sensing, soil, and plantation parameters; the method combines the different conditions of soil, forward speed, and plant spacing to calculate the dynamic lateral movement speed. In practical applications, however, image processing is still easily affected by vibration or other uncontrolled movements, resulting in blurry images that impair recognition and positioning performance. Meanwhile, this mechanism is complicated and lacks a modular design.
Implementing machine vision for weeding tasks first requires image processing methods to extract features such as the color, texture, and shape of the image, which are then combined with machine learning algorithms such as clustering or classification to detect and classify weeds [33,34,35,36]. Among these, shape or feature extraction based on the support vector machine is most commonly used to distinguish crops and weeds [37,38,39]. After this, it is necessary to determine the features of the target object and use morphology or color space conversion methods to extract the features and positions of the weeds [40,41]. When a machine vision system is used to detect and locate weeds, its performance is limited by environmental uncertainty, including light conditions and the color variance of leaves or soil, which degrades weed control performance. Some weed detection technologies integrate images taken from multiple perspectives and multiple feature marks to improve the accuracy of weed recognition and location [42,43], but their complex system design, time consumption, and maintenance costs need to be considered. Other methods include the use of controlled light-emitting diode (LED) lighting in a dark box and a camera-lighting module that records the reflection spectrum of the object; combined with the size information of the desired object, such a system can distinguish crops, weeds, and soil in horticultural crops and locate the weeds [44]. To date, however, this method has not been integrated with weeding equipment to implement precise weeding operations.
With the improvement of computing performance and the increase in the number of available images, deep learning can now provide enhanced data representation capabilities for target objects in images. These methods can be used to extract multiscale and multidimensional spatial semantic feature information of objects [5,45,46,47].
In many cases, the detection and classification results obtained using convolutional neural networks are better than those produced by the machine learning methods commonly used earlier [48,49,50,51,52,53,54,55]. However, deep learning relies on large data sets for training, and crop and weed images are not easy to collect [56]. Redmon et al. [57] proposed a fast target detection algorithm called YOLO that can support real-time applications. Its later version is based on the Darknet-53 network architecture and has been revised many times to greatly improve the accuracy of target identification with only a small number of data samples.
This study proposes a weed identification technology and weeding tool control method based on the YOLOv3 model [58] and implements them in an innovative weeding mechanism. In an earlier study, an artificial intelligence-enabled shovel weeder was designed and implemented [59]. Nevertheless, that weeder was only tested in a simulated field, and its weeding performance was limited by the torque of the actuator and an unstable transmission mechanism, which required further design and testing. The earlier mechanism was therefore modified, rebuilt, and reassembled. The modular weeding tool is attached to an unpowered machine, and the motion of the weeding tool combines swinging and rotating. The design concept of the transmission mechanism is derived from the power transmission of a bicycle, and an inverted triangle weeding knife was designed. The weeding machine is equipped with a camera module that obtains top-view images in real time. This weeding tool was used to test and evaluate the effectiveness of the deep learning method. The weeding machine was then used in the field to test the weeding performance of the method in the presence and absence of crops, and the results of different types of knives for weed removal were analyzed and compared. When the trailer is moving, the proposed weeding machine can automatically remove weeds in the farmland.
The purposes of this study are as follows. First, a weeder is implemented that can replace manual weeding. Second, deep learning methods are used to achieve precise removal of individual weeds and thereby improve on existing mechanized weeding. Third, the weeder is modularized: multiple modules can be attached to the back of the vehicle, solving the problems of difficult disassembly and spacing adjustment of large weeders. Fourth, the proposed weeder weeds and shovels soil simultaneously, which can reduce the probability of weed regrowth. The proposed weeder is particularly suitable for smallholders and farmers who want to carry out organic cultivation and weeding operations in small fields.
The rest of this paper is organized as follows. Section 2 describes the design of the weeding machine and the mobile platform, including the weeding mechanism and transmission mode, the software and hardware construction of the weeding system, and the performance evaluation metrics; the flow of the weed detection program is also explained in this section. Section 3 explains how the performance of the weeding machine was tested, including the evaluation of weed detection performance and weeding efficiency. The last section summarizes the characteristics and applications of the proposed weeding methods and outlines future work.

2. Materials and Methods

The weeding machine developed in this research can be attached to a simple four-wheeled trailer with no power source to perform weeding operations; a battery supplies power to the machine. The appearance of the entire mechanism is shown in Figure 1. At most, two sets of weeding machines are attached to the vehicle, mounted on its left and right sides. On the right is the advanced intelligent weeding machine (Weeder #1), equipped with an inverted triangle weeding tool. On the left is the first-generation weeding machine (Weeder #2), equipped with a claw rake-type weeding tool [59].
A deep learning model based on the YOLOv3 network is used for weed detection. The network model is trained on multiple feature objects, and the trained model and the weeding tool control algorithm are integrated and implemented in the embedded system. Through the execution of the program, the weeding tool can swing up and down and back and forth for weeding operations. The following subsections describe the design of the drive mechanism of the weeding machine and the weed detection and control system, including the weed recognition algorithm, the hardware construction, and the software program flow.

2.1. Mechanism Design

The design and development of weeding equipment must take into account the agronomical requirements of crops, soil conditions, and weed characteristics for field management operations. For example, fields differ in the height, width, and density of cultivated crop ridges. In addition, the height of the crops; the root length and leaf branching; and the soil type, water content, bulk density, and strength also need to be considered. The mechanical design of weeding tools needs to be simple so that farmers or craftsmen can repair them quickly at low maintenance cost. Based on these ideas, a DC-driven weeding machine was developed. Its components include a DC motor (model: SWG-24-1800, Xajong), a transmission mechanism, a height-adjustable weeding handle, and a protective case (see Figure 2).
The transmission mechanism consists of an upper sprocket (Model: RS35-B-16, New Sheylee Co., Ltd., Taichung City, Taiwan), a lower sprocket (Model: RS35-B-32, New Sheylee Co., Ltd., Taichung City, Taiwan), a drive chain (Model: RS35, Prelead Industrial Co., Ltd., Taiwan), left and right discs, a coupler, a ball bearing seat, and a cylindrical rod (16 mm × 200 mm (diameter [D] × length [L])). The size of the case is 216 mm × 180 mm × 278 mm (L × width (W) × height (H)), and the weight of the whole machine is 6 kg. In terms of tool design, traditional weeding tools are mostly designed to imitate blade geometry, and different types of soil require differently shaped cutters to shovel the soil [60]. This type of tool set is installed on a rotating mechanism, which drives the vertical cutting surface of the blade downward through rotational torque to shovel the soil. However, this tool is suitable for fields with a low cultivation density. In contrast, weeding tools such as the disc, round-head, and sawtooth types are more suitable for fields with a higher planting density and can effectively treat weeds on the soil surface. In addition, the rake-type cutter can dig out weeds with shallow roots [27,59], but its material is more likely to stick to the soil.
Therefore, a new type of tool was designed, made of aluminum alloy. The shape of the weeding tool is an inverted triangle (90 mm × 47 mm × 80 mm (L × W × H)) with a sharp end, which is suitable for hard soil. In addition, the bottom of the cutter is wider, which can cover the size of a single weed and shovel out the roots of the weeds. A combination of multiple iron plates is used as the mechanism case. The upper part of the front and rear sides is locked with a pull handle, and a proximity switch (Model: TG1-X3010E1, Prosensor Phototech Co., Ltd., Taoyuan City, Taiwan) is installed inside the upper part of the iron plate and used to stop the motor. The digital camera (Model: Logitech BRIO, Logitech International S.A., Lausanne, Switzerland) is installed under the case.
The control box is installed on the back side of the case, and it contains an embedded control board (Model: Jetson Nano, NVIDIA Co., Santa Clara, CA, USA) and peripheral circuit boards. A DC 24 V lead-acid battery (Model: GP1272 F2, CSB Energy Technology Co., Ltd., Taipei City, Taiwan) is the power source for the entire system. The specifications of the weeding system are shown in Table 1.
To minimize transmission loss, a double-gear chain transmission mechanism was designed. This design concept was derived from the mechanical transmission principle of the bicycle. The transmission components adopt sprockets made of medium carbon steel.
First, the DC motor rotates to drive the upper gear, and the chain of the upper gear drives the lower gear. The lower gear is fixed in the case on the left side and is connected to the left and right discs by a coaxial connector. Close to the center point of the two discs, a square seat is locked, and a cylindrical rod is installed in it, which is inserted into the square coupler and connected to a ball bearing seat inside the casing. There are holes in different positions on the end of the cylindrical rod, and the user can select a suitable hole position and lock the weeding tool on the cylindrical rod to adjust the distance between the weeding tool and the ground. When the motor rotates, the weeding handle exhibits a reciprocating swinging behavior (Figure 3). This operation mode is like a farmer holding a hoe for weeding. The sequence of this motion involves extending the weeding tool, digging down, turning up the roots of the weeds, throwing away the weeds, and retracting the weeding tool.
Assume that the torque, speed, and radius of the upper sprocket are Ta, na, and ra, respectively. The chain connects the upper and lower gears. Without considering transmission and mechanical friction, the torque Tb and speed nb of the lower gear are:
$T_b = \dfrac{G_b}{G_a} \times T_a$  (1)
$n_b = \dfrac{G_a}{G_b} \times n_a$  (2)
where Ga and Gb represent the number of teeth of the upper and lower gears, respectively. Since the lower gear and the two discs are on the same axis, the disc rotation speed is nc = nb, and the tangential torque Tc at the fixed point of the cylindrical rod on the disc is:
$T_c = \dfrac{r_b}{r_c} \times T_b$  (3)
where rb represents the radius of the lower gear and rc is the distance between the center of the disc and the center of the square seat. Assuming that point o is a fixed point, the distance from point o to the ground is defined as h and the depth of weeding as d. When the center point of the square seat is at positions ➊, ➋, ➌, and ➍ in Figure 3, the length l from point p to the end of the cylindrical rod can be defined as:
$l = \dfrac{h}{\cos\theta}$  (4)
where θ depicts the angle of weeding. When θ = 0° (position ➋), the length reaches the maximum value lmax:
$l_{max} = h + d$  (5)
When the center point of the square seat is at position ➍ (the origin position), l has its minimum length lmin.
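To make the transmission relations concrete, the following minimal Python sketch evaluates Equations (1)-(5). The tooth counts (16 and 32) follow the sprocket models listed above, and h and d follow the values reported in Section 3.2; the motor torque, speed, and radii are illustrative assumptions, not measured values.

    import math

    G_a, G_b = 16, 32        # teeth of upper (RS35-B-16) and lower (RS35-B-32) sprockets
    T_a, n_a = 1.2, 60.0     # assumed motor torque (N*m) and speed (rpm) at the upper sprocket

    T_b = (G_b / G_a) * T_a  # Equation (1): torque at the lower gear
    n_b = (G_a / G_b) * n_a  # Equation (2): speed at the lower gear

    r_b, r_c = 0.049, 0.030  # assumed lower-gear radius and disc-to-square-seat distance (m)
    T_c = (r_b / r_c) * T_b  # Equation (3): tangential torque at the cylindrical rod

    h, d = 0.169, 0.030      # pivot height above ground and weeding depth (m)
    theta = math.radians(20) # an example weeding angle
    l = h / math.cos(theta)  # Equation (4): rod length from point p at angle theta
    l_max = h + d            # Equation (5): maximum length at theta = 0
    print(f"T_b={T_b:.2f} N*m, n_b={n_b:.1f} rpm, T_c={T_c:.2f} N*m")
    print(f"l={l:.3f} m, l_max={l_max:.3f} m")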
During the weeding process, a digital camera captures an image of the planted area, and the YOLOv3-based deep learning method is used to detect and locate weeds (see Figure 4a). Suppose v is the moving speed of the vehicle and s is the operating distance of the weeder between the center point p (xp, yp) of the weed detection frame and the point q (xq, yq) below the weed cutter, as shown in Figure 4b. The orange frame represents the detection results, the green arrow indicates the heading of the trailer, and the dashed box indicates the ground truth. The light gray area represents the weeding range, w is the width of the weeding, and the white lines represent the upper and lower boundaries of the weeding range. When two weeds are detected in the gray area, the object with the largest frame area is selected; weeds that are too small are ignored because they have little effect on the growth of the crop. Since the trailer covers the distance s in t = s/v seconds, once the weeding system detects a weed, it must activate the weeding tool within t seconds to remove it.
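The timing rule above can be summarized in a short sketch: the trailer covers the camera-to-tool distance s in t = s/v seconds, so a detected weed must trigger the tool within that window. The bounding-box format, the weeding-band limits, and the helper select_target are hypothetical illustrations, not the actual program.

    def select_target(detections, band_top, band_bottom):
        """Pick the largest detection whose center lies inside the weeding band."""
        in_band = [d for d in detections if band_top <= d["cy"] <= band_bottom]
        return max(in_band, key=lambda d: d["w"] * d["h"], default=None)

    v = 0.15   # trailer speed (m/s)
    s = 0.20   # camera-to-tool distance (m)
    t = s / v  # time budget before the detected weed passes under the tool

    detections = [{"cx": 0.4, "cy": 0.50, "w": 0.10, "h": 0.12},   # larger weed
                  {"cx": 0.6, "cy": 0.55, "w": 0.04, "h": 0.05}]   # smaller weed (ignored)
    target = select_target(detections, band_top=0.3, band_bottom=0.7)
    if target is not None:
        print(f"activate tool within {t:.2f} s")   # 1.33 s at 15 cm/s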

2.2. System Description

2.2.1. Hardware

The sensing and control circuit components of the weeding system include a main control board, relays (JQC-3FF-S-Z, Tongling), DC motors, digital cameras, DC/DC conversion modules (model: XL4005, XLSEMI Company, Shanghai, China), proximity switches, and automatic voltage regulators (AVRs). The circuit system architecture is depicted in Figure 5a. The function of the main control unit is to execute the weed detection algorithm and to make motor drive and control decisions. The main control board receives the images taken by the digital camera via the Universal Serial Bus (USB) port and stores them in memory. Two sets of relays are connected to the general-purpose input/output (GPIO) port of the main control board and receive the driving signals output from the main control board to start and stop the motor.
The proximity switch (type: normally open (NO)) is used to detect whether the square seat in the weeding mechanism has returned to the origin position, and its detection signal is input into the main control unit through the GPIO interface. The 24 V battery provides power for the circuit components, including the motors and proximity switches; the negative output terminal "−" of the battery is connected to the ground (GND) terminal of the circuit board. The DC/DC module converts 24 V to 5 V for the embedded control board. These components and the control board are integrated in a waterproof control box, as shown in Figure 5b: the upper layer is a circuit board that integrates the DC/DC conversion modules, relays, and other electronic components, and the lower layer holds the embedded control board.
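As a rough illustration of this wiring, the sketch below drives the relay and reads the proximity switch through the GPIO interface, using the Jetson.GPIO library available on the Jetson Nano; the pin numbers are assumptions, not the actual assignments.

    import Jetson.GPIO as GPIO

    RELAY_PIN = 12   # assumed header pin driving the motor relay
    PROX_PIN = 16    # assumed header pin reading the normally open proximity switch

    GPIO.setmode(GPIO.BOARD)
    GPIO.setup(RELAY_PIN, GPIO.OUT, initial=GPIO.LOW)
    GPIO.setup(PROX_PIN, GPIO.IN)

    def start_motor():
        GPIO.output(RELAY_PIN, GPIO.HIGH)   # energize the relay to start the motor

    def stop_motor_at_origin():
        # Stop only when the square seat has returned to the origin position.
        if GPIO.input(PROX_PIN):
            GPIO.output(RELAY_PIN, GPIO.LOW)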

2.2.2. Software

The YOLOv3 tool [57,58] is a common deep learning model used to quickly detect objects and is executed in the Darknet environment. A residual neural network (ResNet) [61] and feature pyramid networks (FPN) are its main architectural components, which improve the prediction of small objects. This network was used to detect weed objects. A desktop computer with a high-speed processor (Model: Intel i5-8400, Intel Co., Santa Clara, CA, USA) paired with a high-speed graphics processing unit (GPU) (Model: GTX 1070, NVIDIA Co., Santa Clara, CA, USA) was used to train the YOLOv3 network model. The training was configured as follows: batch size of 64, image size resized to 416 × 416 pixels, subdivision of 32, momentum of 0.9, decay of 0.0005, and learning rate of 0.001. After that, image preprocessing was performed, including image cropping, white balance, and noise filtering, and the images were then annotated by trained technicians and used for model training and evaluation. Of these images, 80% were used for training and 20% for testing. The bounding box of each region of interest was drawn and exported to YOLO format for model development.
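For reference, the listed hyperparameters map onto the [net] section of a Darknet configuration file; the sketch below writes such a section from Python. The file name and the channels entry are assumptions (Darknet defaults), not values reported by the study.

    import textwrap

    net_cfg = textwrap.dedent("""\
        [net]
        batch=64
        subdivisions=32
        width=416
        height=416
        channels=3
        momentum=0.9
        decay=0.0005
        learning_rate=0.001
        """)
    with open("yolov3-weed.cfg", "w") as f:   # hypothetical cfg file name
        f.write(net_cfg)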
During training, the training loss of each epoch was recorded to evaluate the performance of the model in real time. Once the loss stabilized and showed no significant change, the training process was stopped, and the corresponding weights of the model were saved for further evaluation of the weed detection performance. The trained YOLOv3 model was integrated with the weeding control program and embedded in the weeding system. Figure 6 shows the program execution flow, which is written in Python. First, the function libraries are imported, including the external function loader (ctypes.cdll), the multi-threading module, and the open source computer vision library (cv2). Then the GPIO pins, data types, classes, structures, and subfunctions are defined. The next step is to set, import, and load the environment variables of Darknet, including defining the frame selection parameters and their storage file paths.
The program executes a while loop in which an image is read and converted from the blue (B)-green (G)-red (R) color layout to RGB, and the weed detection operation is then performed. Once a weed object is detected, the value "1" is written to a text file; otherwise, the value "0" is written. The detection results, including bounding boxes and labels, are displayed in the image (see Figure 6a). During program execution, the multi-threaded module is activated and the motor control program is executed synchronously (Figure 6b). In its while loop, the text file is opened and its value is read. When the value is 1, the system outputs a signal to start the motor; otherwise, it stops the motor. A function Delay() with a delay time is inserted into the program for starting and stopping the motor.
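A condensed sketch of this two-thread flow is given below: one thread runs detection and writes a 0/1 flag to the shared text file, while the other reads the flag and switches the motor. The flag path, the detect_fn callable, and the motor functions are placeholders for the actual darknet bindings and GPIO routines, not the program itself.

    import threading
    import time
    import cv2

    FLAG_PATH = "weed_flag.txt"   # assumed path of the shared text file

    def detection_loop(detect_fn, camera_index=0):
        cap = cv2.VideoCapture(camera_index)
        while True:
            ok, frame = cap.read()
            if not ok:
                continue
            rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)   # BGR to RGB, as in the text
            detections = detect_fn(rgb)                    # trained YOLOv3 model
            with open(FLAG_PATH, "w") as f:
                f.write("1" if detections else "0")

    def motor_loop(start_motor, stop_motor, delay=0.2):
        while True:
            try:
                with open(FLAG_PATH) as f:
                    flag = f.read().strip()
            except FileNotFoundError:
                flag = "0"
            start_motor() if flag == "1" else stop_motor()
            time.sleep(delay)   # plays the role of the Delay() function

    # threading.Thread(target=detection_loop, args=(my_detector,), daemon=True).start()
    # threading.Thread(target=motor_loop, args=(start_motor, stop_motor), daemon=True).start()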

2.3. Performance Evaluation Metrics

The performance indicators for detecting weeds are defined in this section, including the precision, recall, and F1-score, together with the weeding efficiency and plant damage rate.

2.3.1. Weed Detection

The detection performance metrics used to evaluate YOLOv3 include the precision, recall, and F1-score [61]. The precision index is given by Equation (6):
$\delta_P = \dfrac{TP}{TP + FP}$  (6)
A true positive (TP) is a detection result that identifies the condition when the condition is present; a false positive (FP) is the opposite result.
Ideally, the FP should be as small as possible in order to ensure the accuracy of the network in identifying each object. The intersection-over-union (IoU) is a method to define whether the detected object is a positive solution, as shown in Equation (7):
$u = \dfrac{U_d \cap U_y}{U_d \cup U_y}$  (7)
where Ud and Uy indicate the ground truth and predicted boxes of the deep neural network, respectively, and the symbols "∩" and "∪" denote the intersection and union operators, respectively. If u is larger than the threshold uT, the prediction result is regarded as a TP; otherwise, it is regarded as an FP.
The recall rate is a metric that quantifies the number of correct positive predictions made from all possible positive predictions, and its definition is shown in Equation (8).
$\delta_R = \dfrac{TP}{TP + FN}$  (8)
where FN denotes a false negative test result. The sum of TP and FN in Equation (8) equals the number of ground truths, so FN does not need to be counted separately. The F1-score (δf) is a weighted average of the precision and recall that trades off δR against δP to demonstrate the comprehensive performance of the trained models:
$\delta_f = \dfrac{2\,\delta_P\,\delta_R}{\delta_P + \delta_R}$  (9)
The value of δf ranges from 0 to 1, where 1 means the highest accuracy. By setting the threshold uT for the confidence score at various recall levels, different pairs of precision and recall are generated; with recall on the x-axis and precision on the y-axis, these can be drawn as a precision-recall (PR) curve, which indicates their association and can be employed to measure the performance of weed detection.
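The metrics in Equations (6)-(9) translate directly into code; the short sketch below computes them, assuming axis-aligned boxes in (x1, y1, x2, y2) format.

    def iou(box_a, box_b):
        """Intersection-over-union of two axis-aligned boxes, Equation (7)."""
        x1, y1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
        x2, y2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
        inter = max(0, x2 - x1) * max(0, y2 - y1)
        area = lambda b: (b[2] - b[0]) * (b[3] - b[1])
        union = area(box_a) + area(box_b) - inter
        return inter / union if union else 0.0

    def precision(tp, fp):   # Equation (6)
        return tp / (tp + fp)

    def recall(tp, fn):      # Equation (8)
        return tp / (tp + fn)

    def f1_score(p, r):      # Equation (9)
        return 2 * p * r / (p + r)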

2.3.2. Weeding Efficiency

We conducted field tests to evaluate the performance of the weeding machine. The evaluation metrics include the weeding efficiency and plant damage rate, which are shown in Equations (10) and (11):
$\eta = (W - \overline{W})\,/\,W$  (10)
$D = \overline{d}\,/\,d$  (11)
where W and $\overline{W}$ represent the number of weeds before and after weeding, respectively, and $\overline{d}$ and d represent the number of damaged crops and the total number of crops, respectively.
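As a worked example of Equations (10) and (11), the following snippet uses illustrative counts (not measured data):

    W, W_after = 25, 3            # weeds before and after weeding (assumed counts)
    d_total, d_damaged = 36, 2    # total and damaged crops (assumed counts)
    eta = (W - W_after) / W       # weeding efficiency, Equation (10)
    damage = d_damaged / d_total  # plant damage rate, Equation (11)
    print(f"efficiency = {eta:.1%}, damage = {damage:.1%}")   # 88.0%, 5.6%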

3. Experimental Results

This section explains the data collection and model training methods. In addition, two test scenarios were used to evaluate the weed detection performance and weeding efficiency.

3.1. Data Collection and Model Training

Images were collected in the field under different weather conditions and at different times. A digital camera was used to take a total of 140 images of weeds in the experimental field. Image processing techniques, including geometric transformations (resizing, cropping, rotation, horizontal flipping, etc.) and intensity transformations (such as contrast and brightness enhancement, color and noise adjustment), were used to modify the original images, thereby adding 60 image samples.
Then, the image size was adjusted from 1920 × 1080 to 416 × 416 pixels to fit the YOLOv3 network, and each weed in each image was marked with an object box for model training. There were 160 images in the training set, 30 in the validation set, and 10 in the test set. When the number of iterations reached 20,000 and the loss function approached 0.135, training was stopped and the weights of the network were saved. Finally, the trained model was used to evaluate the performance of weed detection.
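The augmentation steps listed above can be sketched with OpenCV as follows; the exact parameters used in the study are not reported, so the values here are assumptions.

    import cv2
    import numpy as np

    def augment(img):
        """Return a few augmented variants of a BGR image."""
        out = [cv2.resize(img, (416, 416))]                       # resize for YOLOv3
        out.append(cv2.flip(img, 1))                              # horizontal flip
        h, w = img.shape[:2]
        M = cv2.getRotationMatrix2D((w / 2, h / 2), 15, 1.0)
        out.append(cv2.warpAffine(img, M, (w, h)))                # small rotation
        out.append(cv2.convertScaleAbs(img, alpha=1.2, beta=20))  # contrast/brightness
        noise = np.random.normal(0, 8, img.shape).astype(np.int16)
        noisy = np.clip(img.astype(np.int16) + noise, 0, 255).astype(np.uint8)
        out.append(noisy)                                         # noise adjustment
        return out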

3.2. Experimental Test

The experiment site is located in front of the Department of Biomechatronics Engineering of National Pingtung University of Science and Technology (longitude: 120.6059°; latitude: 22.6467°). The experiment period was from 5 August to 15 September 2021. Vegetable crops had been grown for 20 days on the cropland ridges. Each cropland ridge in the field was approximately 20 m long and 25 cm wide, and the spacing between plants was 50 cm. The number and location of the weeds within the cropland were recorded in advance, and these data were used for comparison with the experimental results. In addition, we set up a hoist machine at the end of the field and hooked the trailer with a steel shackle; the user was able to adjust the speed of the hoist machine to maintain the forward speed of the trailer.
Two experiments were used to verify the performance of the weeding system. Experiment 1 mainly tested the weed removal performance of the weeding machines on both sides of the crop. Two weeding machines were used: the proposed weeder (Weeder #1) was mounted on the right side of the vehicle, and the first-generation weeder (Weeder #2) on the left side. An inverted triangle-shaped weeding tool was installed on the right machine, and a claw-shaped weeding tool on the left machine. Experiment 2 mainly tested the weeding performance of the weeder proposed in this study (Weeder #1) in the intrarow area of the crops; Weeder #2 was mounted at the center of the rear of the trailer.
The test scenarios of Experiments 1 and 2 are shown in Figure 7, where the dashed border represents the area of weed detection. The mechanical design parameters and specifications of the modified weeder (Weeder #1), based on previous research results [59], are given in Table 2. When the weeding tool was at the origin of the mechanism, the distance between the coupler in the mechanism and the ground surface was h = 16.9 cm. When the weeding tool was activated, its excavation depth was d = 3 cm. The maximum and minimum lengths of the cylindrical rod were lmax = 26 cm and lmin = 15 cm, respectively.
When the weeding operation was completed, the number of weeds that had not been removed and the number of damaged crops on the cropland ridges were recorded manually. Weeds that were too small were ignored. When the roots of a weed were exposed on the soil surface, the weed was considered to have been successfully removed.

3.3. Results and Discussion

3.3.1. Performance of Weed Detection Using the YOLOv3 Model

The trained YOLOv3 model was verified by detecting weeds under different climatic conditions. During the experiment, the weather was cloudy in the morning, at noon, and in the afternoon. When the vehicle was moving, the weeding tool was not activated; only the digital camera under the weeding tool was used to capture images of the cropland, and image samples were taken every 2 h and stored on a memory card. The numbers of framed (and unframed) weed objects in each image were counted, and Equations (6), (8), and (9) were then used to evaluate the detection performance of the model.
Table 3 shows the results of weed detection using the YOLOv3 model in different time periods. The F1-score was between 74.3% and 92.8%; in particular, during the period from 10:00 to 13:00, the precision reached 95.6% and the F1-score was also the highest. It is worth noting that, due to the low light intensity during 18:00–19:00, the precision and recall were reduced.
Figure 8 shows the weed detection results of each time interval, where the green frame represents the area where weeds are detected. It can be seen from these figures that most of the weed objects were framed, and only a few weeds were not framed between 18:00 and 19:00.
Then, weed detection experiments were carried out on different days under variable climatic conditions, including cloudy, sunny, and rainy weather. Figure 9 shows the average detection performance obtained at different time intervals in the same field using the YOLOv3 network model. The evaluation metrics at each time interval include the precision, recall, and F1-score, each representing a ten-day average.

3.3.2. Performance of Weeder

The weeder tests were conducted from 10:00 to 12:00 under sunny conditions. Due to the limited area of the site, two experiments were carried out in a single day and repeated on three different days; the data obtained from the three trials were then averaged. Figure 10 shows the actuation behavior of the weeding tool. In Figure 10a, "➊" and "➋" in the white frames indicate the visible ranges of the cameras on the left and right weeding tools, and the orange line indicates the position of the weeding tool at the origin of the mechanism. When the vehicle was moving and weeds were detected, the weeder was activated (the weeding tool on the right side of Figure 10b). In contrast, the weeder remained at the origin of the mechanism when no weeds were detected (as shown by the left weeder, Weeder #2, in Figure 10b).
The effective cutting width of the two weeding machines is 20 cm. The data in Table 4 show that in Scenario 1, when the vehicle speed was 10 and 15 cm/s, the weeding efficiency was between 84% and 90.9%, which corresponds to hourly working areas of up to 72 and 108 m², respectively. The average F1-scores of the deep learning networks in the left and right weeders were between 0.841 and 0.901. When the trailer speed increased to 20 cm/s, the weeding efficiency decreased significantly, although the F1-score still reached approximately 0.867.
In Scenario 2, when the vehicle speed was 10 and 15 cm/s, the weeding efficiency using Weeder #1 was 92.3% and 82.6%, respectively; the crop damage rate was 5.5% and 11.1%; and the F1-score was at least 0.890. The weeding efficiency using Weeder #2 was 87.0% (10 cm/s) and 78.6% (15 cm/s), the crop damage rate was 8.33% and 13.8%, and the F1-score was above 0.878. Once the vehicle speed increased to 20 cm/s, the weeding efficiency of Weeder #1 and Weeder #2 dropped to 64% and 56%, respectively, and the crop damage rate increased to 44.4% and 52.7%. However, the F1-scores were still 0.833 and 0.848, respectively.
Figure 11 shows images of weeds being removed by the two weeding tools and of the damage to the crops. Most of the weed roots were turned up to the soil surface (Figure 11a,d), and some of the weeds on the edge of the weeding tool's coverage area were also turned up (Figure 11b,e). Some crops were slightly shifted from their original position or damaged by the activation of the weeding tools (Figure 11c,f).

3.3.3. Discussion

There were three types of weeds in the experimental field, namely gramineous weeds, sedges (Cyperaceae), and broadleaf weeds, of which the sedges and broadleaf weeds accounted for the higher proportion. At the end of each weeding experiment, we recorded the number of weeds remaining in the field; most of these weed objects had been detected, but some were not actually turned up, and the roots of some weeds were not removed because the weeds were positioned on either side of the cutting width of the weeding tools. In addition, differently shaped weeding tools have different effects on different types of weeds. The claw rake-type weeding tool is suitable for shallow-rooted weeds, whereas the weeding tool used in this study is more suitable for removing weeds with deep roots, such as the tuberous roots of Cyperaceae.
Second, the speed of the vehicle needs to match the weeding time. When the speed is greater than 20 cm/s, the weeding tool cannot accurately turn up the weeds; in particular, under high weed density, some weeds cannot be removed in time. The experimental results showed that the vehicle achieved a 92.6% weeding success rate when the moving speed was lower than 15 cm/s. The cutter can shovel 3 cm below the ground, the height of the camera above the ground is 10 cm, and the distance between the camera and the weeding tool is 20 cm. However, when the vehicle moves at 20 cm/s, the highest success rate is only 64%, with a 44.4% crop damage rate. The loop speed of the weeding machine is set to one cycle per second; if the moving speed of the trailer exceeds 20 cm/s, the probability of crop damage increases and the efficiency of weeding decreases. A relatively slow speed is therefore required to achieve a high weeding success rate without damaging the crop. It is worth noting that when multiple weeds appear in the image simultaneously, the system selects the weed object with the largest area to maximize the weeding efficiency. In addition, before using these weeding tools, one should make sure that there are no large stones or bricks in the soil, to avoid damage to the weeder. Because a steel cable is used by the hoist to pull the vehicle and the ground is relatively uneven, there were several short speed changes during the movement of the vehicle, resulting in a time deviation; however, the weeding deviation remained within an acceptable range.
The frame rate of YOLOv3 was set to 5 frames per second (fps), which meets the requirements of real-time detection. Only a small number of weed samples were provided to the YOLOv3 model for training, yet the network model was able to detect weed objects effectively, with a precision of up to 95.6%. As far as we know, no other studies have used the YOLOv3 model to detect individual weeds in the field and used weeding tools to remove them. The number of image samples affects model detection performance, and too few samples will reduce recognition performance [62].
In this study, the images were taken by mobile phone, and some images were generated using data augmentation. With a limited number of images, the weed detection model still shows varying detection performance due to differences in the brightness of the image background. In Scenario 1, the brightness of the images captured by the cameras on the two sides differed because of the shading of the body frame and the asymmetric position of the weeding equipment, resulting in different model detection results (F1-scores) for the two modules. The F1-score of the deep learning model designed in this research reached above 0.83. Image processing techniques can achieve a recognition rate of more than 90% in identifying individual weeds and crops [27]; however, under unstable light, the recognition rate fluctuates greatly. Using the YOLOv3 model to detect weeds in low-light conditions, the precision dropped slightly but remained at 83.2%. On the other hand, when a deep learning model was used to detect eggs, its detection results were not affected by light [63], which differs slightly from the results of this study. The reason may be that the appearance of the object detected here is more complicated, so the performance of the model is still affected in weak light. This result needs further study.
The advantage of using the YOLOv3 model based on the Darknet-53 architecture is that it can quickly learn the main characteristics of a weed or crop, even features outside human visual perception [55]. It can be observed from Figure 11 that tiny weeds still remain on the soil surface. This result is acceptable: when the composition of the weed community is changed, the dynamic balance of farmland agroecosystems and the biodiversity of the farmland can be improved [64].
The weeder is equipped with only one camera, and its weeding system can detect all the weeds in the image; the proposed system does not require multiple cameras or a complex detection system with lighting control [42]. Meanwhile, the YOLOv3 model can also overcome the limitation of identifying crops and weeds of the same size [44]. This study thus offers an alternative strategy of single-weed removal, replacing the traditional blanket weeding (chemical or physical) method, while small weeds in the field are deliberately neglected.
Finally, the use of the new-generation YOLOv4 network can shorten the time for object recognition [65]. If there are multiple types of objects or complex backgrounds in the image, this method should be explored and studied.

4. Conclusions

The proposed weeder uses deep learning technology to detect weeds in the field and a special weeding tool to remove them. The experimental results confirmed the effectiveness of the machine for weeding. At vehicle travel speeds below 15 cm/s, the weeding system, running YOLOv3 at a detection speed of 5 fps, achieves an average weeding efficiency of 88.6%, with an average detection precision of 90.7%, an F1-score of 89.5%, and a recall of 90.1%. These results come from field trials on vegetables under different climatic conditions and various weed densities. In most existing systems, the deep learning model is only used to detect objects in the image, and the operating conditions of the weeder depend on the detection results of contact or non-contact sensors on the machine. In this study, a smart farming method combining deep learning and weeding control was proposed. Its advantage lies in reducing the number of sensors used and the cost of maintenance. In addition, the powerful deep learning method can also identify different types of crops and weeds, offering high flexibility.
The proposed weeder can be installed on the pylon behind a tractor, and multiple units can be produced for farmland of different scales and areas. The weeder is suitable for low-density weeds, early weed germination, or farming environments with deep-rooted weeds, such as rice in wetlands or weeding in fields that have already been prepared. The proposed weeder can indeed destroy the growth conditions of weeds while reducing chemical inputs to the environment. In addition, the weeder adopts a DC power supply and has a low production cost (approximately 1000 US dollars) and power consumption (approximately 500 W), which is of great significance for energy saving and environmental protection.
Future work will focus on improving the performance of the weeder, including reducing its weight and adjusting the rotation speed of the weeding tool in real time to adapt to different vehicle speeds. The deep learning method will also be tested on distinguishing crops and weeds of the same size but different colors. Finally, the weeder will be installed on a large tractor for verification in tillage farming.

Author Contributions

Conceptualization, C.-L.C. and B.-X.X.; methodology, C.-L.C.; software, B.-X.X. and S.-C.C.; validation, C.-L.C., B.-X.X. and S.-C.C.; data curation, B.-X.X.; writing—original draft preparation, C.-L.C. and B.-X.X.; writing—review and editing, C.-L.C.; visualization, C.-L.C. and B.-X.X.; supervision, C.-L.C.; project administration, C.-L.C.; funding acquisition, C.-L.C. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Ministry of Science and Technology (MOST), Taiwan, grant numbers MOST 109-2321-B-020-004 and MOST 110-2221-E-020-019.

Data Availability Statement

The datasets presented in this study are available from the corresponding author on reasonable request.

Acknowledgments

Many thanks to all the anonymous reviewers for their constructive comments on this manuscript. We also sincerely thank Wen-Chung Li, Director of the Department of Biomechatronics Engineering, National Pingtung University of Science and Technology, for providing administrative support, and Wei-Cheng Chen for assisting in the maintenance of the experimental site.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Zambon, I.; Cecchini, M.; Egidi, G.; Saporito, M.G.; Colantoni, A. Revolution 4.0: Industry vs. agriculture in a future development for SMEs. Processes 2019, 7, 36.
2. Pierce, F.J.; Nowak, P. Aspects of Precision Agriculture. Adv. Agron. 1999, 67, 1–85.
3. McBratney, A.; Whelan, B.; Ancev, T.; Bouma, J. Future Directions of Precision Agriculture. Precis. Agric. 2005, 6, 7–23.
4. Oerke, E.C.; Dehne, H.W. Safeguarding production—Losses in major crops and the role of crop protection. Crop Prot. 2004, 23, 275–285.
5. Wang, A.; Zhang, W.; Wei, X. A review on weed detection using ground-based machine vision and image processing techniques. Comput. Electron. Agric. 2019, 158, 226–240.
6. Singh, G. Development and fabrication techniques of improved grubber. AMA-Agric. Mech. Asia Afr. 1988, 19, 42–46.
7. Bah, M.D.; Hafiane, A.; Canals, R. Deep Learning with Unsupervised Data Labeling for Weed Detection in Line Crops in UAV Images. Remote Sens. 2018, 10, 1690.
8. Rani, L.; Thapa, K.; Kanojia, N.; Sharma, N.; Singh, S.; Grewal, A.S.; Srivastav, A.L.; Kaushal, J. An extensive review on the consequences of chemical pesticides on human health and environment. J. Clean. Prod. 2021, 283, 124657.
9. Van Der Weide, R.Y.; Bleeker, P.O.; Achten, V.T.J.M.; Lotz, L.A.P.; Fogelberg, F.; Melander, B. Innovation in mechanical weed control in crop rows. Weed Res. 2008, 48, 215–224.
10. Chandel, A.K.; Tewari, V.K.; Kumar, S.P.; Nare, B.; Agarwal, A. On-the-go position sensing and controller predicated contact-type weed eradicator. Curr. Sci. 2018, 114, 1485–1494.
11. Steward, B.L.; Tian, L.F.; Tang, L. Distance-based control system for machine vision-based selective spraying. Trans. ASAE 2002, 45, 1255–1262.
12. Zaman, Q.U.; Esau, T.J.; Schumann, A.W.; Percival, D.C.; Chang, Y.K.; Read, S.M.; Farooque, A.A. Development of prototype automated variable rate sprayer for real-time spot-application of agrochemicals in wild blueberry fields. Comput. Electron. Agric. 2011, 76, 175–182.
13. Ahmad, M.T. Development of an Automated Mechanical Intra-Row Weeder for Vegetable Crops. Master's Dissertation, Iowa State University, Ames, IA, USA, 2012.
14. Bawden, O.; Ball, D.; Kulk, J.; Perez, T.; Russell, R. A lightweight, modular robotic vehicle for the sustainable intensification of agriculture. In Proceedings of the Australasian Conference on Robotics and Automation, Melbourne, Australia, 2–4 December 2014; pp. 1–9.
15. Cordill, C.; Grift, T.E. Design and testing of an intra-row mechanical weeding machine for corn. Biosyst. Eng. 2011, 110, 247–252.
16. Perez-Ruiz, M.; Slaughter, D.C.; Fathallah, F.A.; Gliever, C.J.; Miller, B.J. Co-robotic intra-row weed control system. Biosyst. Eng. 2014, 126, 45–55.
17. Utstumo, T.; Urdal, F.; Brevik, A.; Dørum, J.; Netland, J.; Overskeid, Ø.; Berge, T.W.; Gravdahl, J.T. Robotic in-row weed control in vegetables. Comput. Electron. Agric. 2018, 154, 36–45.
18. Gobor, Z. Mechatronic system for mechanical weed control of the intra-row area in row crops. KI—Künstliche Intell. 2013, 27, 379–383.
19. Rask, A.M. A review of non-chemical weed control on hard surfaces. Weed Res. 2007, 47, 370–380.
20. Astrand, B.; Baerveldt, A.J. An agricultural mobile robot with vision-based perception for mechanical weed control. Auton. Robot. 2002, 13, 21–35.
21. Griepentrog, H.; Nørremark, M.; Nielsen, J. Autonomous intra-row rotor weeding based on GPS. In Proceedings of the CIGR World Congress Agricultural Engineering for a Better World, Bonn, Germany, 3–7 September 2006.
22. Tillett, N.D.; Hague, T.; Grundy, A.C.; Dedousis, A.P. Mechanical within row weed control for transplanted crops using computer vision. Biosyst. Eng. 2008, 99, 171–178.
23. Nørremark, M.; Griepentrog, H.W.; Nielsen, J.; Søgaard, H.T. Evaluation of an autonomous GPS-based system for intra-row weed control by assessing the tilled area. Precis. Agric. 2012, 13, 149–162.
24. Peruzzi, A.; Martelloni, L.; Frasconi, C.; Fontanelli, M.; Pirchio, M.; Raffaelli, M. Machines for non-chemical intra-row weed control in narrow and wide-row crops: A review. J. Agric. Eng. 2017, 48, 57–70.
25. Schimmelpfennig, D. Farm Profits and Adoption of Precision Agriculture; USDA: Washington, DC, USA, 2016; Volume 217, pp. 1–46.
26. Michaels, A.; Haug, S.; Albert, A. Vision-based high-speed manipulation for robotic ultra-precise weed control. In Proceedings of the 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Hamburg, Germany, 28 September–2 October 2015; pp. 5498–5505.
27. Chang, C.L.; Lin, K.M. Smart agricultural machine with a computer vision-based weeding and variable-rate irrigation scheme. Robotics 2018, 7, 38.
28. McCool, C.; Beattie, J.; Firn, J.; Lehnert, C.; Kulk, J.; Bawden, O.; Russell, R. Efficacy of mechanical weeding tools: A study into alternative weed management strategies enabled by robotics. IEEE Robot. Autom. Lett. 2018, 3, 1184–1190.
29. Naïo Technologies. Autonomous Vegetable Weeding Robot—Dino. Available online: https://www.naio-technologies.com/en/dino/ (accessed on 3 August 2021).
30. Fennimore, S.A.; Cutulle, M. Robotic weeders can improve weed control options for specialty crops. Pest Manag. Sci. 2019, 75, 1767–1774.
31. Raja, R.; Nguyen, T.T.; Slaughter, D.; Fennimore, S. Real-time robotic weed knife control system for tomato and lettuce based on geometric appearance of plant labels. Biosyst. Eng. 2020, 194, 152–164.
32. Kumar, S.P.; Tewari, V.K.; Chandel, A.K.; Mehta, C.R.; Nare, B.; Chethan, C.R.; Mundhada, K.; Shrivastava, P.; Gupta, C.; Hota, S. A fuzzy logic algorithm derived mechatronic concept prototype for crop damage avoidance during eco-friendly eradication of intra-row weeds. Artif. Intell. Agric. 2020, 4, 116–126.
33. Sujaritha, M.; Annadurai, S.; Satheeshkumar, J.; Sharan, S.K.; Mahesh, L. Weed detecting robot in sugarcane fields using fuzzy real time classifier. Comput. Electron. Agric. 2017, 134, 160–171.
34. Wu, Z.; Chen, Y.; Zhao, B.; Kang, X.; Ding, Y. Review of weed detection methods based on computer vision. Sensors 2021, 21, 3647.
35. Dadashzadeh, M.; Abbaspour-Gilandeh, Y.; Mesri-Gundoshmian, T.; Sabzi, S.; Hernández-Hernández, J.L.; Hernández-Hernández, M.; Arribas, J.I. Weed classification for site-specific weed management using an automated stereo computer-vision machine-learning system in rice fields. Plants 2020, 9, 559.
36. Liakos, K.G.; Busato, P.; Moshou, D.; Pearson, S.; Bochtis, D. Machine learning in agriculture: A review. Sensors 2018, 18, 2674.
37. Tellaeche, A.; Pajares, G.; Burgos-Artizzu, X.P.; Ribeiro, A. A computer vision approach for weeds identification through Support Vector Machines. Appl. Soft Comput. 2011, 11, 908–915.
38. Bakhshipour, A.; Jafari, A. Evaluation of support vector machine and artificial neural networks in weed detection using shape features. Comput. Electron. Agric. 2018, 145, 153–160.
39. Ahmed, F.; Al-Mamun, H.A.; Bari, A.S.; Hossain, E.; Kwan, P. Classification of crops and weeds from digital images: A support vector machine approach. Crop Prot. 2012, 40, 98–104.
40. Tang, J.; Chen, X.; Miao, R.; Wang, D. Weed detection using image processing under different illumination for site-specific areas spraying. Comput. Electron. Agric. 2016, 122, 103–111.
41. Mahajan, S.; Raina, A.; Gao, X.Z.; Pandit, A.K. Plant recognition using morphological feature extraction and transfer learning over SVM and AdaBoost. Symmetry 2021, 13, 356.
42. Raja, R.; Nguyen, T.T.; Vuong, V.L.; Slaughter, D.; Fennimore, S.A. RTD-SEPs: Real-time detection of stem emerging points and classification of crop-weed for robotic weed control in producing tomato. Biosyst. Eng. 2020, 195, 152–171.
43. Chen, Y.; Wu, Z.; Zhao, B.; Fan, C.; Shi, S. Weed and corn seedling detection in field based on multi feature fusion and support vector machine. Sensors 2021, 21, 212.
44. Elstone, L.; How, K.Y.; Brodie, S.; Ghazali, M.Z.; Heath, W.P.; Grieve, B. High Speed Crop and Weed Identification in Lettuce Fields for Precision Weeding. Sensors 2020, 20, 455.
45. McCool, C.; Perez, T.; Upcroft, B. Mixtures of lightweight deep convolutional neural networks: Applied to agricultural robotics. IEEE Robot. Autom. Lett. 2017, 2, 1344–1351.
46. Yu, J.; Sharpe, S.; Schumann, A.; Boyd, N. Deep learning for image-based weed detection in turfgrass. Eur. J. Agron. 2019, 104, 78–84.
47. Jiang, S.; Li, X.; Xing, Y. Repair method of data loss in weld surface defect detection based on light intensity and 3D geometry. IEEE Access 2020, 8, 205814–205820.
48. Dyrmann, M.; Karstoft, H.; Midtiby, H.S. Plant species classification using deep convolutional neural network. Biosyst. Eng. 2016, 151, 72–80.
49. Kamilaris, A.; Prenafeta-Boldú, F.X. Deep learning in agriculture: A survey. Comput. Electron. Agric. 2018, 147, 70–90.
50. Kamilaris, A.; Prenafeta-Boldú, F.X. A review of the use of convolutional neural networks in agriculture. J. Agric. Sci. 2019, 156, 312–322.
51. Koirala, A.; Walsh, B.; Wang, Z.; McCarthy, C. Deep learning for real-time fruit detection and orchard fruit load estimation: Benchmarking of 'MangoYOLO'. Precis. Agric. 2019, 20, 1107–1135.
52. Hasan, A.S.M.M.; Sohel, F.; Diepeveen, D.; Laga, H.; Jones, M.G.K. A survey of deep learning techniques for weed detection from images. Comput. Electron. Agric. 2021, 184, 106067.
53. Kounalakis, T.; Triantafyllidis, G.; Nalpantidis, L. Deep learning-based visual recognition of rumex for robotic precision farming. Comput. Electron. Agric. 2019, 165, 104973.
54. Sun, J.; He, X.; Ge, X.; Wu, X.; Shen, J.; Song, Y. Detection of key organs in tomato based on deep migration learning in a complex background. Agriculture 2018, 8, 196.
55. Osorio, K.; Puerto, A.; Pedraza, C.; Jamaica, D.; Rodríguez, L. A deep learning approach for weed detection in lettuce crops using multispectral images. AgriEngineering 2020, 2, 471–488.
56. Partel, V.; Kakarla, S.C.; Ampatzidis, Y. Development and evaluation of a low-cost and smart technology for precision weed management utilizing artificial intelligence. Comput. Electron. Agric. 2019, 157, 339–350.
57. Redmon, J.; Divvala, S.; Girshick, R. You only look once: Unified, real-time object detection. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA, 27–30 June 2016; pp. 779–788.
58. Redmon, J.; Farhadi, A. YOLOv3: An incremental improvement. arXiv 2018, arXiv:1804.02767.
59. Xie, B.X.; Chung, S.C.; Chang, C.L. Design and implementation of a modular AI-enabled shovel weeder. In Proceedings of the IEEE International Symposium on Computer, Consumer and Control (IS3C 2020), Taichung, Taiwan, 13–16 November 2020.
60. Bernacki, H.; Haman, J.; Kanafojski, C. Agricultural Machines, Theory and Construction; U.S. Department of Agriculture and the National Science Foundation: Washington, DC, USA, 1972; Volume 1.
61. He, K.; Zhang, X.; Ren, S.; Sun, J. Deep residual learning for image recognition. arXiv 2015, arXiv:1512.03385.
62. Ahmad, A.; Saraswat, D.; Aggarwal, V.; Etienne, A.; Hancock, B. Performance of deep learning models for classifying and detecting common weeds in corn and soybean production systems. Comput. Electron. Agric. 2021, 184, 106081.
63. Li, G.; Xu, Y.; Zhao, Y.; Du, Q.; Huang, Y. Evaluating convolutional neural networks for cage-free floor egg detection. Sensors 2020, 20, 332.
64. MacLaren, C.; Storkey, J.; Menegat, A.; Metcalfe, H.; Dehnen-Schmutz, K. An ecological future for weed science to sustain crop production and the environment. A review. Agron. Sustain. Dev. 2020, 40, 24.
65. Bochkovskiy, A.; Wang, C.-Y.; Liao, H.-Y.M. YOLOv4: Optimal speed and accuracy of object detection. arXiv 2020, arXiv:2004.10934.
  65. Bochkovskiy, A.; Wang, C.Y.; Liao, H.Y.M. YOLOv4: Optimal speed and accuracy of object detection. arXiv Preprint 2020, arXiv:2004.10934. [Google Scholar]
Figure 1. Smart weeding machine and its four-wheel trailer.
Figure 2. Design drawing of the weeding mechanism prototype.
Figure 3. Transmission of the weeding mechanism.
Figure 4. Operating concept of the weeding process. (a) The two-dimensional coordinate relationship among the weeding tool, the camera, and the weeds; (b) frame selection of weed objects in the snapshot image and the corresponding weeding range.
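To make the geometry of Figure 4a concrete, the sketch below converts a detected weed's bounding box (in image pixels) into a position relative to the weeding tool. This is a minimal sketch only: the pixel-to-ground scale and the camera-to-tool offset are illustrative assumptions, not the calibration used in this work.

```python
# Minimal sketch of the Figure 4a geometry: mapping a weed's bounding box
# (pixels) to ground coordinates relative to the weeding tool. All constants
# below are illustrative assumptions, not the authors' calibration.

PX_PER_CM = 8.0          # assumed ground resolution at the soil plane
CAM_TO_TOOL_CM = 25.0    # assumed offset between camera center and tool tip

def weed_position_cm(bbox, img_w, img_h):
    """Return (lateral_cm, ahead_cm) of a weed relative to the tool.

    bbox is (x_min, y_min, x_max, y_max) in pixels; with a downward-facing
    camera, the box center approximates the weed's root position."""
    x_min, y_min, x_max, y_max = bbox
    cx = (x_min + x_max) / 2.0           # weed center column (pixels)
    cy = (y_min + y_max) / 2.0           # weed center row (pixels)
    lateral_cm = (cx - img_w / 2.0) / PX_PER_CM
    ahead_cm = (img_h - cy) / PX_PER_CM + CAM_TO_TOOL_CM
    return lateral_cm, ahead_cm
```

Under these assumptions, a weed 30 cm ahead of the tool on a platform moving at 20 cm/s must be shoveled 1.5 s after detection, which bounds the time budget available for inference and actuation.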
Figure 5. Sensing and circuit architecture: (a) block diagram of the electronic circuit; (b) peripheral electronic component board (upper layer) and main control board (lower layer) in the control box.
Figure 6. Software program flow of the weeding system: (a) program flow for weed detection; (b) program flow for the weeding operation.
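The two flows of Figure 6 can be summarized as a detect-then-actuate loop. The outline below is a rough sketch under the assumption of hypothetical camera, detector, and weeder interfaces; it is not the authors' implementation.

```python
# Rough outline of the Figure 6 flows. The camera, detector, and weeder
# objects are hypothetical stand-ins for the platform hardware, not an API
# described in this paper.
import time

def weeding_loop(camera, detector, weeder, speed_cm_s,
                 px_per_cm=8.0, cam_to_tool_cm=25.0):
    while True:
        frame = camera.read()                              # (a) weed detection
        for (x0, y0, x1, y1) in detector.detect(frame):    # YOLO-style boxes
            cy = (y0 + y1) / 2.0                           # weed center row
            ahead_cm = (frame.shape[0] - cy) / px_per_cm + cam_to_tool_cm
            time.sleep(max(ahead_cm / speed_cm_s, 0.0))    # wait for tool arrival
            weeder.shovel()                                # (b) weeding operation
            weeder.wait_until_home()                       # proximity switch at origin
```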
Figure 7. Illustration of two scenarios for testing weeding performance: (a) using two weeders (Weeder #1 and Weeder #2) to weed the areas on both sides of the cropland ridges (gray areas); (b) using one weeder (Weeder #1) for intra-row weeding (the area within the dashed frame).
Figure 8. Weed identification results at different time intervals.
Figure 9. Ten-day average detection results at different time intervals.
Figure 10. Operation of the weeder. The orange lines outline the claw-rake (left) and inverted-triangle (right) weeding tools; the white dotted lines mark the fields of view of the cameras on the left and right weeders. (a) A weed object is framed in area ➊ (detection result, upper right corner), while no weed is detected in area ➋ (detection result, upper left corner); (b) the right weeding tool is activated, while the left weeding tool remains at the origin of the mechanism.
Figure 11. Snapshots of the field soil after weeding. (a) Weeds completely removed by Weeder #2; (b) weeds partially removed; (c) damaged crop; (d) weeds completely removed by Weeder #1; (e) weeds partially removed; (f) damaged crop. The red circles and orange arrows indicate the positions of the crop roots and the weed roots, respectively.
Table 1. The specifications of the weeding machine.

Description | Value or Other Details
Mechanism body
  Size (L × W × H) | 216 mm × 180 mm × 278 mm
  Weight | 6 kg
Weeding body
  Upper sprocket (number of teeth (T); outer diameter (Ø)) | 16 T; Ø 54 mm
  Lower sprocket (T; Ø) | 32 T; Ø 102 mm
  Roller chain (model; tensile strength) | RS35-1; 1150 kgf
  Cylindrical rod (D × L) | 16 mm × 200 mm
  Weeding tools (L × W × H) | 90 mm × 47 mm × 80 mm
  Disc (D × W) | 140 mm × 3 mm
Electronic components
  Main control board (speed; memory) | 1.43 GHz; 4 GB 64-bit LPDDR4
  DC motor (voltage; gear ratio; torque; speed) | 24 V; 1:15; 26.7 kg·cm; 120 rpm
  Proximity switches (voltage; sensing distance; output) | 24 V; 10 mm; normally open
  Digital camera (resolution; focus type) | 4K Ultra HD; autofocus
  Battery (voltage; capacity; weight) | DC 24 V; 7.2 Ah; 2.4 kg
Table 2. The parameters and specifications of the modified weeder (Weeder #1).

Parameter | Value | Parameter | Value
n_a | 120 rpm | l | 185 mm
n_b | 60 rpm | l_min | 150 mm
G_a | 16 | l_max | 260 mm
G_b | 32 | h | 169 mm
T_a | 27 kg·cm | d | 30 mm
T_b | 54 kg·cm | θ | 24°
T_c | 50 kg·cm | r_b | 51 mm
r_a | 27 mm | r_c | 55 mm
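The speed and torque entries in Table 2 are consistent with the standard chain-drive relations: the 16 T upper sprocket driving the 32 T lower sprocket halves the speed and doubles the torque. A quick check (values taken directly from the table):

```python
# Chain-drive relations implied by Table 2.
n_a, T_a = 120.0, 27.0   # input speed (rpm) and input torque (kg·cm)
G_a, G_b = 16, 32        # teeth on the upper and lower sprockets

n_b = n_a * G_a / G_b    # 60.0 rpm   -> matches n_b in Table 2
T_b = T_a * G_b / G_a    # 54.0 kg·cm -> matches T_b in Table 2
print(n_b, T_b)
```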
Table 3. Using deep learning models to detect weeds during the daytime.

Weather | Time | Precision | Recall | F1-score
Cloudy and sunny | 08:00–09:00 | 0.902 | 0.829 | 0.864
 | 10:00–11:00 | 0.956 | 0.901 | 0.928
 | 12:00–13:00 | 0.936 | 0.885 | 0.910
 | 14:00–15:00 | 0.918 | 0.854 | 0.885
Cloudy | 16:00–17:00 | 0.903 | 0.833 | 0.867
 | 18:00–19:00 | 0.832 | 0.701 | 0.761
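For reference, the metrics in Table 3 follow the usual definitions computed from detection counts; the counts in the example below are made up for illustration, not the experimental data.

```python
# Standard precision/recall/F1 from detection counts (illustrative numbers).
def precision_recall_f1(tp, fp, fn):
    precision = tp / (tp + fp)   # correct detections / all detections
    recall = tp / (tp + fn)      # correct detections / all weeds present
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

print(precision_recall_f1(tp=90, fp=10, fn=19))  # ≈ (0.900, 0.826, 0.861)
```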
Table 4. Performance evaluation results of the weeding system.

Experiment | Type of Weeder | v (cm/s) | W | W̄ | η (%) | Damaged crops (d̄) | D (%) | F1-score
Scenario 1 | Weeder #1/Weeder #2 | 10 | 25 */22 ** | 4/2 | 84.0/90.9 | – | – | 0.852/0.901
 | | 15 | 26/27 | 4/3 | 84.6/88.8 | – | – | 0.841/0.889
 | | 20 | 24/21 | 11/8 | 54.2/61.9 | – | – | 0.851/0.867
Scenario 2 | Weeder #1 | 10 | 26 | 2 | 92.3 | 2 | 5.5 | 0.910
 | | 15 | 23 | 4 | 82.6 | 4 | 11.1 | 0.890
 | | 20 | 25 | 9 | 64.0 | 16 | 44.4 | 0.833
 | Weeder #2 | 10 | 23 | 3 | 87.0 | 3 | 8.33 | 0.903
 | | 15 | 28 | 6 | 78.6 | 5 | 13.8 | 0.878
 | | 20 | 25 | 11 | 56.0 | 19 | 52.7 | 0.848
* and **: number of weeds on the left (*) and right (**) sides of the cropland.
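The derived columns of Table 4 can be reproduced from the raw counts. Interpreting W̄ as the weeds left unremoved (which is consistent with every tabulated η value), η is the fraction of the W weeds actually removed; D relates damaged crops to the crops in the test plot, where the plot size of 36 crops used below is inferred from the tabulated percentages rather than stated in the table.

```python
# Reproducing the derived columns of Table 4. The plot size of 36 crops is
# inferred from the tabulated D values, not stated in the table itself.
def weeding_rate(W, W_bar):
    """η (%): share of the W weeds removed, with W_bar left standing."""
    return (W - W_bar) / W * 100.0

def damage_rate(d_bar, crops_in_plot=36):
    """D (%): damaged crops relative to the crops in the test plot."""
    return d_bar / crops_in_plot * 100.0

print(weeding_rate(26, 2))   # 92.3 -> Scenario 2, Weeder #1, 10 cm/s row
print(damage_rate(16))       # 44.4 -> Scenario 2, Weeder #1, 20 cm/s row
```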
Publisher's Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.