Deep Learning Based Automatic Multiclass Wild Pest Monitoring Approach Using Hybrid Global and Local Activated Features


Authors

Liu Liu, University of Science and Technology of China, Hefei, China

Chengjun Xie, Institute of Intelligent Machines, Chinese Academy of Sciences, Hefei, China

Rujing Wang, Institute of Intelligent Machines, Chinese Academy of Sciences, Hefei, China

Po Yang, Department of Computer Science, University of Sheffield, Sheffield, U.K.

Sud Sudirman, Department of Computer Science, Liverpool John Moores University, Liverpool, U.K.

Jie Zhang, Institute of Intelligent Machines, Chinese Academy of Sciences, Hefei, China

Rui Li, Institute of Intelligent Machines, Chinese Academy of Sciences, Hefei, China

Fangyuan Wang, University of Science and Technology of China, Hefei, China



What is this paper about?

Specialised pest control has been a high-priority issue for the agriculture industry in many countries. A popular solution is the use of artificial intelligence (AI) techniques for automated, image-based identification of pests. However, these solutions suffer from reduced accuracy and robustness in real-world applications due to the multiplicity of crops and the variety of pests. To tackle this problem, this article proposes a novel deep learning based automatic approach using hybrid global and local activated features for pest monitoring. In the presented method, we exploit the global information from feature maps to build a global activated feature pyramid network that extracts highly discriminative pest features across various scales at both the depth and position levels. The experimental results show that our solution achieves 75.03% mean average precision (mAP) in industrial circumstances, outperforming two other state-of-the-art methods: Faster R-CNN (mAP up to 70%) and the feature pyramid network (mAP up to 72%).
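
To make the idea concrete, the sketch below illustrates how a globally pooled activation can re-weight a feature pyramid level at both the depth (channel) level and the position (spatial) level before it is passed to the detection head. This is not the published implementation; the module structure, names, and hyperparameters are illustrative assumptions.

```python
# Illustrative sketch only (not the authors' code): a "global activation" block
# that re-weights an FPN feature map along channels (depth) and positions (space).
import torch
import torch.nn as nn

class GlobalActivation(nn.Module):
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        # Depth-level activation: squeeze spatial dims, then re-weight channels.
        self.channel_gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1),
            nn.Sigmoid(),
        )
        # Position-level activation: squeeze channels, then re-weight locations.
        self.spatial_gate = nn.Sequential(
            nn.Conv2d(channels, 1, kernel_size=7, padding=3),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = x * self.channel_gate(x)   # emphasise informative feature channels
        x = x * self.spatial_gate(x)   # emphasise likely pest positions
        return x

# Hypothetical usage on FPN outputs with 256 channels per pyramid level:
# activated_levels = [GlobalActivation(256)(p) for p in fpn_outputs]
```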

Why is the research important and/or novel?

To the best of our knowledge, the two-stage CNN-based pest monitoring approach using hybrid global and local activated features proposed in our article is one of the best automatic wheat pest recognition models in the world. One key contribution is the design of a global activated feature pyramid network that can identify features of tiny pests under rotation, scale and translation, and can extract intuitive features of pests from complex backgrounds. The model was also trained and evaluated on one of the largest multi-class wheat pest datasets in the world, containing 88.6K images of 16 pest types with 582K labelled pest objects. This demonstrates that a deep learning-based pest recognition model can serve as a cost-effective solution for practical pest control applications. Researchers from the Wadhwani Institute for Artificial Intelligence in India extended our model to support cotton pest management, and received a $2M USD grant from Google to create technologies that help reduce crop losses in cotton farming through integrated pest management.
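
As a hedged illustration of how a mean average precision figure can be computed for such a multi-class detection dataset, the snippet below uses the MeanAveragePrecision metric from the torchmetrics library on made-up predictions and ground truth for a single image. The paper's exact evaluation protocol (IoU threshold, interpolation scheme) may differ.

```python
# Minimal mAP evaluation sketch, assuming torchmetrics (with pycocotools installed).
# All box values and labels below are placeholders, not data from the paper.
import torch
from torchmetrics.detection import MeanAveragePrecision

# Hypothetical detector output for one image: boxes as (x1, y1, x2, y2).
preds = [{
    "boxes": torch.tensor([[50.0, 40.0, 120.0, 110.0]]),
    "scores": torch.tensor([0.92]),
    "labels": torch.tensor([3]),      # one of the 16 pest classes
}]
# Hypothetical ground-truth annotations for the same image.
target = [{
    "boxes": torch.tensor([[48.0, 42.0, 118.0, 108.0]]),
    "labels": torch.tensor([3]),
}]

metric = MeanAveragePrecision(box_format="xyxy")
metric.update(preds, target)
results = metric.compute()
print(float(results["map_50"]))       # mAP at IoU threshold 0.5
```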

Anything else that you would like to highlight about the paper? 

Its early conference publication received the Best Paper Award at the 19th IEEE International Conference on Industrial Informatics (No. 1 out of 428 papers). Through this work, the University of Sheffield has successfully secured two InnovateUK farming innovation projects: "Integrating Visual and Context Information into a Mobile Intelligence Solution for Sustainable Management of Wheat Pests and Soil Health" with RSK ADAS and Mutus-Tech, and "Farmer-centred Interoperable Mobile-Cloud System: Integrating Data from Farming Activities and Environmental Information for Sustainable Fertiliser Management" with Velcourt and AntData.
