Automatic monitoring of flying vegetable insect pests using an RGB camera and YOLO-SIP detector

Published in Precision Agriculture 24, 436–457 (2023)

Abstract

Pests cause heavy crop losses, so early pest management and control are vital in precision agriculture, and pest monitoring is their foundation. Conventional pest monitoring based on manual sampling and detection is time-consuming and labour-intensive, and many studies have therefore explored automatic pest monitoring. However, few works have focused on the automatic monitoring of flying vegetable insect pests. To close this gap, this study developed an automatic monitoring scheme for flying vegetable insect pests based on two hypotheses: (1) yellow sticky traps can provide reliable information for assessing the population density of flying vegetable insect pests, and (2) a computer-vision-based detector can accurately detect the pests in images. Specifically, yellow sticky traps were used to sample flying vegetable insect pests, an RGB camera was adopted to capture yellow-sticky-trap images, and a computer-vision-based detector called "YOLO for Small Insect Pests" (YOLO-SIP) was applied to detect pests in the captured images. The hypotheses were tested by following the heuristic engineering method: installing yellow sticky traps and RGB cameras in vegetable fields, constructing a manually labelled image dataset, and evaluating YOLO-SIP on the constructed dataset with the mean average precision (mAP), average mean absolute error (aMAE), and average mean square error (aMSE) metrics. Experiments showed that the proposed scheme captured yellow-sticky-trap images automatically and achieved an mAP of 84.22%, an aMAE of 0.422, and an aMSE of 1.126. The proposed scheme is thus promising for the automatic monitoring of flying vegetable insect pests.
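The counting metrics in the abstract can be made concrete with a short sketch. Assuming that aMAE and aMSE are the per-category mean absolute and mean squared errors between ground-truth and estimated pest counts, averaged over the pest categories (consistent with the symbol list below), a minimal illustration in Python might look as follows; the function name, array layout, and toy counts are illustrative, not the authors' implementation.

```python
import numpy as np

def counting_errors(gt_counts: np.ndarray, pred_counts: np.ndarray):
    """Return (aMAE, aMSE) for per-image insect pest counts.

    gt_counts, pred_counts: arrays of shape (N, T), where N is the
    number of pest categories and T the number of test images.
    """
    diff = gt_counts - pred_counts
    mae_per_class = np.abs(diff).mean(axis=1)   # MAE of each category
    mse_per_class = (diff ** 2).mean(axis=1)    # MSE of each category
    # Average the per-category errors to obtain aMAE and aMSE.
    return mae_per_class.mean(), mse_per_class.mean()

# Toy example: two categories, three test images (fabricated counts).
gt = np.array([[3, 5, 0], [1, 2, 4]], dtype=float)
pred = np.array([[2, 5, 1], [1, 3, 4]], dtype=float)
amae, amse = counting_errors(gt, pred)
print(f"aMAE = {amae:.3f}, aMSE = {amse:.3f}")
```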


Abbreviations

\(iw, ih\): Width and height of an image

\(H, W, C\): Height, width, and channel number of a feature map

\(M\): Number of predicted detections

\(c\): Insect pest category

\(\widehat{b}_j, \widehat{s}_j\): \(j\)-th predicted bounding box and its confidence for a specific insect pest category

\(\boldsymbol{I}\): An image

\(K\): Number of ground-truth bounding boxes

\(b_k\): \(k\)-th ground-truth bounding box

\(\varepsilon, \beta\): Two thresholds

\(\boldsymbol{v}\): A binary vector

\(\boldsymbol{a}, \boldsymbol{b}\): Two element sets

\(t\): A temporary variable

\(N\): Number of insect pest categories

\(T\): Number of test images

\(z_i, \widehat{z}_i\): Numbers of ground-truth and estimated insect pests in the \(i\)-th image

\(\mathcal{N}\): Normal distribution

\(\nearrow, \searrow\): Increase and decline, respectively

\(+, -\): Employment and elimination of a certain part, respectively

CNN: Convolutional neural network

i.e.: That is

YOLO-SIP: The proposed insect pest detector "YOLO for Small Insect Pests"

AP, mAP: Average precision, mean average precision

MAE, aMAE: Mean absolute error, average mean absolute error

MSE, aMSE: Mean square error, average mean square error

GPU: Graphics processing unit

MBConv: Mobile inverted residual bottleneck convolution

fps: Frames per second

P, R: Precision, recall

TP, FP, FN: True positive, false positive, and false negative

IoU: Intersection over union (the standard formulas behind these evaluation metrics are sketched after this list)

Px L, Bt G, Fo P, Ps F, Bc C: Plutella xylostella (Linnaeus), Bemisia tabaci (Gennadius), Frankliniella occidentalis (Pergande), Phyllotreta striolata (Fabricius), and Bactrocera cucurbitae (Coquillett)

SE, CBAM, BAM, CA: Squeeze-and-excitation module, convolutional block attention module, bottleneck attention module, and coordinate attention module

CSPDarknet53, EfficientNetV2: The convolutional neural networks of Wang et al. (2020) and Tan and Le (2021)

SSD, Faster R-CNN, R-FCN, CenterNet, EfficientDet, YOLOv4: The generic object detection algorithms of Liu et al. (2015), Ren et al. (2017), Dai et al. (2016), Zhou et al. (2019), Tan et al. (2020), and Bochkovskiy et al. (2020)

csp53, eff2-s, eff2-m, eff2-l: CSPDarknet53, EfficientNetV2-S, EfficientNetV2-M, EfficientNetV2-L
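For reference, a brief sketch of the standard definitions behind these metrics, written with the symbols above; the paper's exact matching rules (e.g. the IoU threshold for declaring a true positive) are not reproduced here:

\[ \mathrm{IoU}(b_k, \widehat{b}_j) = \frac{\mathrm{area}(b_k \cap \widehat{b}_j)}{\mathrm{area}(b_k \cup \widehat{b}_j)}, \qquad P = \frac{TP}{TP + FP}, \qquad R = \frac{TP}{TP + FN} \]

\[ \mathrm{mAP} = \frac{1}{N}\sum_{c=1}^{N} AP_c, \qquad \mathrm{aMAE} = \frac{1}{N}\sum_{c=1}^{N}\frac{1}{T}\sum_{i=1}^{T}\left|z_i - \widehat{z}_i\right|, \qquad \mathrm{aMSE} = \frac{1}{N}\sum_{c=1}^{N}\frac{1}{T}\sum_{i=1}^{T}\left(z_i - \widehat{z}_i\right)^2 \]

where \(AP_c\) is the area under the precision-recall curve of category \(c\), and the counts \(z_i, \widehat{z}_i\) are taken per category.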

References

  • Bochkovskiy, A., Wang, C.-Y., & Liao, H.-Y. M. (2020). YOLOv4: Optimal speed and accuracy of object detection. arXiv preprint arXiv:2004.10934.

  • Bubbliiiing (2021). yolov4-pytorch. GitHub repository. https://github.com/bubbliiiing/yolov4-pytorch (accessed 15 June 2021).

  • Böckmann, E. (2015). Combined monitoring of pest and beneficial insects with sticky traps, as basis for decision making in greenhouse pest control: A proof of concept study. Doctoral dissertation, Gottfried Wilhelm Leibniz Universität Hannover, Hannover.

  • Dai, J., Li, Y., He, K., & Sun, J. (2016). R-FCN: Object detection via region-based fully convolutional networks. In Proceedings of the 30th International Conference on Neural Information Processing Systems, NIPS'16 (pp. 379–387). Red Hook, NY, USA: Curran Associates Inc.

  • Ding, W., & Taylor, G. (2016). Automatic moth detection from trap images for pest management. Computers and Electronics in Agriculture, 123, 17–28.

  • Ge, Z., Liu, S., Wang, F., Li, Z., & Sun, J. (2021). YOLOX: Exceeding YOLO series in 2021. arXiv preprint arXiv:2107.08430.

  • He, Y., Zeng, H., Fan, Y., Ji, S., & Wu, J. (2019). Application of deep learning in integrated pest management: A real-time system for detection and diagnosis of oilseed rape pests. Mobile Information Systems, 2019, 1–14.

  • He, Y., Zhou, Z., Tian, L., Liu, Y., & Luo, X. (2020). Brown rice planthopper (Nilaparvata lugens Stal) detection based on deep learning. Precision Agriculture, 21, 1385–1402.

  • Hou, Q., Zhou, D., & Feng, J. (2021). Coordinate attention for efficient mobile network design. In 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) (pp. 13713–13722). Los Alamitos, CA, USA: IEEE Computer Society.

  • Koen, B. V. (1988). Toward a definition of the engineering method. European Journal of Engineering Education, 13(3), 307–315.

  • Köksal, Ö., & Tekinerdogan, B. (2019). Architecture design approach for IoT-based farm management information systems. Precision Agriculture, 20, 926–958.

  • Li, D. (2021). efficientnetv2.pytorch. GitHub repository. https://github.com/d-li14/efficientnetv2.pytorch (accessed 5 May 2021).

  • Li, W., Wang, D., Li, M., Gao, Y., Wu, J., & Yang, X. (2021). Field detection of tiny pests from sticky trap images using deep learning in agricultural greenhouse. Computers and Electronics in Agriculture, 183, 106048.

  • Liu, H., Lee, S.-H., & Chahl, J. S. (2017). A review of recent sensing technologies to detect invertebrates on crops. Precision Agriculture, 18, 635–666.

  • Liu, L., Ouyang, W., Wang, X., Fieguth, P., Chen, J., Liu, X., & Pietikäinen, M. (2020). Deep learning for generic object detection: A survey. International Journal of Computer Vision, 128, 261–318.

  • Liu, W., Anguelov, D., Erhan, D., Szegedy, C., Reed, S., Fu, C.-Y., & Berg, A. C. (2015). SSD: Single shot multibox detector. arXiv preprint arXiv:1512.02325.

  • Macfadyen, S., Moradi-Vajargah, M., Umina, P., Hoffmann, A., Nash, M., Holloway, J., Severtson, D., Hill, M., Van Helden, M., & Barton, M. (2019). Identifying critical research gaps that limit control options for invertebrate pests in Australian grain production systems. Austral Entomology, 58(1), 9–26.

  • Park, J., Woo, S., & Lee, J. (2019). BAM: Bottleneck attention module. arXiv preprint arXiv:1807.06514.

  • Partel, V., Nunes, L., Stansly, P., & Ampatzidis, Y. (2019). Automated vision-based system for monitoring Asian citrus psyllid in orchards utilizing artificial intelligence. Computers and Electronics in Agriculture, 162, 328–336.

  • Ren, S., He, K., Girshick, R., & Sun, J. (2017). Faster R-CNN: Towards real-time object detection with region proposal networks. IEEE Transactions on Pattern Analysis and Machine Intelligence, 39(6), 1137–1149.

  • Saleem, M. H., Potgieter, J., & Arif, K. M. (2021). Automation in agriculture by machine and deep learning techniques: A review of recent developments. Precision Agriculture, 22, 2053–2091.

  • Sermanet, P., Eigen, D., Zhang, X., Mathieu, M., Fergus, R., & LeCun, Y. (2013). OverFeat: Integrated recognition, localization and detection using convolutional networks. arXiv preprint arXiv:1312.6229.

  • Shen, Y., Zhou, H., Li, J., Jian, F., & Jayas, D. S. (2018). Detection of stored grain insects using deep learning. Computers and Electronics in Agriculture, 145, 319–325.

  • Stern, V. M., Smith, R. F., van den Bosch, R., & Hagen, K. S. (1959). The integration of chemical and biological control of the spotted alfalfa aphid: The integrated control concept. Hilgardia, 29(2), 81–101.

  • Tan, M., & Le, Q. V. (2021). EfficientNetV2: Smaller models and faster training. arXiv preprint arXiv:2104.00298.

  • Tan, M., Pang, R., & Le, Q. V. (2020). EfficientDet: Scalable and efficient object detection. In 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) (pp. 10778–10787). Los Alamitos, CA, USA: IEEE Computer Society.

  • Tzutalin (2015). LabelImg. GitHub repository. https://github.com/tzutalin/labelImg (accessed 5 May 2021).

  • Wang, C.-Y., Liao, H.-Y. M., Wu, Y.-H., Chen, P.-Y., Hsieh, J.-W., & Yeh, I.-H. (2020). CSPNet: A new backbone that can enhance learning capability of CNN. In 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW) (pp. 1571–1580). Los Alamitos, CA, USA: IEEE Computer Society.

  • Wang, R., Liu, L., Xie, C., Yang, P., Li, R., & Zhou, M. (2021). AgriPest: A large-scale domain-specific benchmark dataset for practical agricultural pest detection in the wild. Sensors, 21(5), 1601.

  • Woo, S., Park, J., Lee, J.-Y., & Kweon, I. S. (2018). CBAM: Convolutional block attention module. In Computer Vision – ECCV 2018, 15th European Conference, Proceedings (pp. 3–19). Cham, Switzerland: Springer.

  • Yao, Q., Chen, G., Wang, Z., Zhang, C., Yang, B., & Tang, J. (2017). Automated detection and identification of white-backed planthoppers in paddy fields using image processing. Journal of Integrative Agriculture, 16(7), 1547–1557.

  • Yao, Q., Xian, D., Liu, Q., Yang, B., Diao, G., & Tang, J. (2014). Automated counting of rice planthoppers in paddy fields based on image processing. Journal of Integrative Agriculture, 13(8), 1736–1745.

  • Zhang, C., Cai, J., Xiao, D., Ye, Y., & Chehelamirani, M. (2018). Research on vegetable pest warning system based on multidimensional big data. Insects, 9(2), 66.

  • Zhang, Y., Zhou, D., Chen, S., Gao, S., & Ma, Y. (2016). Single-image crowd counting via multi-column convolutional neural network. In Proceedings of the 29th IEEE Conference on Computer Vision and Pattern Recognition (CVPR) (pp. 589–597). Los Alamitos, CA, USA: IEEE Computer Society.

  • Zhou, X., Wang, D., & Krähenbühl, P. (2019). Objects as points. arXiv preprint arXiv:1904.07850.

Acknowledgements

This work was supported in part by the Key-Area Research and Development Program of Guangdong Province under Grants 2019B020214002 and 2019B020217003, in part by the Science and Technology Planning Project of Guangdong Province under Grant 2021B1212040009, in part by the National Natural Science Foundation of China under contracts 62172165 and 61672242, in part by the Natural Science Foundation of Guangdong Province under Grant 2022A1515010325, and in part by the Science and Technology Program of Guangzhou under Grants 201902010081 and 107126242281.

Funding

Key-Area Research and Development Program of Guangdong Province under Grants 2019B020214002 and 2019B020217003, Science and Technology Planning Project of Guangdong Province under Grant 2021B1212040009, National Natural Science Foundation of China under contracts 62172165 and 61672242, Science and Technology Program of Guangzhou under Grants 201902010081 and 107126242281, Natural Science Foundation of Guangdong Province under Grant 2022A1515010325.

Author information

Contributions

QG: investigation, conceptualization, methodology, software, simulation, writing – original draft. CW: supervision, investigation, conceptualization, validation, writing – reviewing and editing. DX: investigation, experimental site building, monitoring device development, data collection and curation, validation. QH: investigation, validation. All authors have read and agreed to the published version of the manuscript.

Corresponding author

Correspondence to Chuntao Wang.

Ethics declarations

Conflict of interest

The authors declare no conflicts of interest.

Data Availability

The datasets generated during and/or analysed during the current study are available from the corresponding author on reasonable request.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Cite this article

Guo, Q., Wang, C., Xiao, D. et al. Automatic monitoring of flying vegetable insect pests using an RGB camera and YOLO-SIP detector. Precision Agric 24, 436–457 (2023). https://doi.org/10.1007/s11119-022-09952-w

