Wang Chan, Zhu Jinsong, Zhou Jun. A lightweight rice pest and disease recognition model based on improved YOLO v8[J]. Jiangsu Agricultural Sciences, 2025, 53(20): 232-242.

A lightweight rice pest and disease recognition model based on improved YOLO v8

Jiangsu Agricultural Sciences [ISSN: 1002-1302 / CN: 32-1214/S]

Volume:
Vol. 53
Issue:
2025, No. 20
Pages:
232-242
Column:
Intelligent detection of diseases and pests
Publication date:
2025-10-20

Article Info

Title:
A lightweight rice pest and disease recognition model based on improved YOLO v8
Authors:
Wang Chan 1, Zhu Jinsong 1, Zhou Jun 2
1. School of Economics and Management, Yangtze University, Jingzhou 434023, Hubei, China; 2. Nanning University of Vocational Technology, Nanning 530008, Guangxi, China
Author(s):
Wang Chan, et al.
Keywords:
MobileNet v4; rice pests and diseases; YOLO v8; coordinate attention mechanism; SIoU
CLC number:
S126; TP391.41
DOI:
-
Document code:
A
Abstract:
Detection of rice diseases and pests is an important task in agriculture, but traditional manual methods and existing machine-learning models are limited in both efficiency and accuracy. To address these problems, an efficient detection model based on an improved YOLO v8 algorithm, YOLO v8-MobileNet v4-CA, is proposed. The model replaces the original backbone with the lightweight MobileNet v4 network, reducing the parameter count from 37.223 M to 6.596 M, a reduction of about 82.3%; computational complexity also drops markedly, and inference speed under the same hardware conditions increases by about 1.5 times. The coordinate attention (CA) mechanism is incorporated to strengthen the extraction of key features, and the SIoU loss function is used to improve target localization accuracy, so that performance and efficiency improve together. The study is based on a dataset of 3,500 precisely annotated rice disease and pest images covering three major diseases and three common pests. Data were annotated on the Makesense AI platform, and offline augmentation with geometric and content transformations was applied to improve the model's robustness and generalization. In the model design, the CA module is integrated into the Neck, and the stochastic gradient descent (SGD) optimizer ensures training efficiency and stability. Results show that on the test set the YOLO v8-MobileNet v4-CA model achieves an accuracy of 92.3%, an mAP@0.5 of 93.7%, and an F1 score of 91%, and reaches 78.13 FPS on a GeForce RTX 3060 Ti GPU, outperforming the original YOLO v8 and its improved variants (e.g., YOLO v8-MobileNet v4, YOLO v8-ShuffleNet v2) in detection accuracy, efficiency, and resource consumption. The model performs especially well in complex-background and multi-target scenes and is suitable for deployment on resource-constrained mobile devices. In summary, the YOLO v8-MobileNet v4-CA model provides a low-cost, high-precision technical solution for rice disease and pest detection; its markedly improved performance and efficiency offer important technical support for intelligent pest detection and crop protection in agriculture and help advance sustainable agricultural development.
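As a quick sanity check on the parameter figures quoted in the abstract (37.223 M reduced to 6.596 M), the stated reduction of about 82.3% follows directly:

```python
# Parameter counts quoted in the abstract, in millions.
before, after = 37.223, 6.596

# Relative reduction achieved by swapping in the MobileNet v4 backbone.
reduction = (before - after) / before * 100
print(f"{reduction:.1f}% fewer parameters")  # prints "82.3% fewer parameters"
```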
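The coordinate attention (CA) module integrated into the Neck follows Hou et al. (reference [22]). The sketch below is a minimal NumPy illustration of the information flow only (direction-aware pooling, shared channel reduction, per-axis sigmoid gates); random matrices stand in for the learned 1x1 convolutions, so it shows the structure, not trained behaviour:

```python
import numpy as np

def coordinate_attention(x, reduction=8, rng=None):
    """Structural sketch of coordinate attention for one feature map.

    x: array of shape (C, H, W). The projection matrices W1, W_h, W_w
    are random stand-ins for the module's learned 1x1 convolutions.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    C, H, W = x.shape
    Cr = max(C // reduction, 1)

    # 1. Direction-aware pooling: average over width gives a (C, H)
    #    descriptor, average over height gives a (C, W) descriptor.
    pool_h = x.mean(axis=2)                              # (C, H)
    pool_w = x.mean(axis=1)                              # (C, W)

    # 2. Concatenate along the spatial axis and reduce channels
    #    (stand-in for the shared 1x1 conv followed by a nonlinearity).
    W1 = rng.standard_normal((Cr, C)) / np.sqrt(C)
    y = W1 @ np.concatenate([pool_h, pool_w], axis=1)    # (Cr, H + W)
    y = np.maximum(y, 0)                                 # ReLU

    # 3. Split back into the two directions, restore channels, and
    #    squash to (0, 1) attention gates with a sigmoid.
    W_h = rng.standard_normal((C, Cr)) / np.sqrt(Cr)
    W_w = rng.standard_normal((C, Cr)) / np.sqrt(Cr)
    a_h = 1.0 / (1.0 + np.exp(-(W_h @ y[:, :H])))        # (C, H)
    a_w = 1.0 / (1.0 + np.exp(-(W_w @ y[:, H:])))        # (C, W)

    # 4. Re-weight the input with the two directional attention maps.
    return x * a_h[:, :, None] * a_w[:, None, :]
```

The key design point preserved here is that, unlike channel attention alone, the two pooled descriptors retain positional information along one axis each, which helps localize small lesions and insects in cluttered paddy scenes.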
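The SIoU bounding-box loss follows Gevorgyan (reference [23]). A minimal single-pair sketch of its angle, distance, and shape terms might look like this (variable names are illustrative, not taken from any released code; boxes are (x1, y1, x2, y2)):

```python
import math

def siou_loss(pred, gt, theta=4.0):
    """Sketch of the SIoU loss for one predicted/ground-truth box pair."""
    px1, py1, px2, py2 = pred
    gx1, gy1, gx2, gy2 = gt
    pw, ph = px2 - px1, py2 - py1
    gw, gh = gx2 - gx1, gy2 - gy1

    # Plain IoU term.
    iw = max(0.0, min(px2, gx2) - max(px1, gx1))
    ih = max(0.0, min(py2, gy2) - max(py1, gy1))
    inter = iw * ih
    iou = inter / (pw * ph + gw * gh - inter)

    # Smallest enclosing box and centre offsets.
    cw = max(px2, gx2) - min(px1, gx1)
    ch = max(py2, gy2) - min(py1, gy1)
    dx = (gx1 + gx2 - px1 - px2) / 2
    dy = (gy1 + gy2 - py1 - py2) / 2
    sigma = math.hypot(dx, dy)

    # Angle cost Lambda = sin(2*alpha), with alpha folded into [0, pi/4];
    # it vanishes when the centre line is axis-aligned.
    if sigma == 0:
        angle = 0.0
    else:
        angle = math.sin(2 * math.asin(min(abs(dx), abs(dy)) / sigma))

    # Distance cost, attenuated by the angle cost (gamma = 2 - Lambda).
    gamma = 2 - angle
    dist = (1 - math.exp(-gamma * (dx / cw) ** 2)
            + 1 - math.exp(-gamma * (dy / ch) ** 2))

    # Shape cost penalizes width/height mismatch.
    ow = abs(pw - gw) / max(pw, gw)
    oh = abs(ph - gh) / max(ph, gh)
    shape = (1 - math.exp(-ow)) ** theta + (1 - math.exp(-oh)) ** theta

    return 1 - iou + (dist + shape) / 2
```

Because the angle term rewards moving the predicted centre onto the nearest axis of the ground-truth centre before closing the distance, SIoU tends to converge faster than plain IoU-based losses, which is why it is paired here with the lightweight backbone.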

References:

[1]Wen G H,Li M,Luo Y H,et al. The improved YOLO v8 algorithm based on EMSPConv and SPE-head modules[J]. Multimedia Tools and Applications,2024,83(21):61007-61023.
[2]Wang B W,Liu P,Tian H,et al. A lightweight particle detection algorithm based on an improved YOLO v8[C]. 2024 4th International Conference on Artificial Intelligence and Industrial Technology Applications. Guangzhou:Journal of Physics,2024.
[3]Asefpour Vakilian K,Massah J. Performance evaluation of a machine vision system for insect pests identification of field crops using artificial neural networks[J]. Archives of Phytopathology and Plant Protection,2013,46(11):1262-1269.
[4]Yao Q,Xian D X,Liu Q J,et al. Automated counting of rice planthoppers in paddy fields based on image processing[J]. Journal of Integrative Agriculture,2014,13(8):1736-1745.
[5]Sekulska-Nalewajko J,Goclawski J. A semi-automatic method for the discrimination of diseased regions in detached leaf images using fuzzy c-means clustering[C]//IEEE.Perspective Technologies and Methods in MEMS Design.Polyana,2011:172-175.
[6]Zhou Z Y,Zang Y,Li Y F,et al. Rice plant-hopper infestation detection and classification algorithms based on fractal dimension values and fuzzy C-means[J]. Mathematical and Computer Modelling,2013,58(3/4):701-709.
[7]Liu Y J,Wang Q J,Zhang H Y,et al. Real-time defect detection of metal surface based on improved YOLO v4[J]. International Journal of Innovative Computing,Information and Control,2022,18(4):1329-1338.
[8]Tan L J,Lu J Z,Jiang H Y. Tomato leaf diseases classification based on leaf images:a comparison between classical machine learning and deep learning methods[J]. AgriEngineering,2021,3(3):542-558.
[9]Karar M E,Alsunaydi F,Albusaymi S,et al. A new mobile application of agricultural pests recognition using deep learning in cloud computing system[J]. Alexandria Engineering Journal,2021,60(5):4423-4432.
[10]Hao Kun,Wang Kuo,Wang Beibei. A lightweight underwater biological detection algorithm based on improved Mobilenet-YOLO v3[J]. Journal of Zhejiang University (Engineering Science),2022,56(8):1622-1632.
[11]Zhang Jiamin,Yan Ke,Wang Yifei,et al. Classification and recognition of crop pests based on an improved Mask-RCNN algorithm[J]. Transactions of the Chinese Society of Agricultural Engineering,2024,40(7):202-209.
[12]Li Shanjun,Liang Qianyue,Yu Yonghua,et al. YOLO v8-MC recognition algorithm for citrus psyllids and a remote pest-monitoring system[J]. Transactions of the Chinese Society for Agricultural Machinery,2024,55(6):210-218.
[13]Xu W Y,Cui C,Ji Y C,et al. YOLO v8-MPEB small target detection algorithm based on UAV images[J]. Heliyon,2024,10(8):e29501.
[14]Caldera S,Madushika V,Herath S,et al. VisionPal:visual assistant system for the visually impaired people[C]//IEEE.2023 International Conference on Innovative Computing,Intelligent Communication and Smart Electrical Systems(ICSES).Chennai,2023:1-8.
[15]Liang Chenye,Zhang Xuanxiong. A multi-category garbage classification algorithm based on an improved MobileNet network[J]. Electronic Science and Technology,2024,37(4):38-46.
[16]Koppad D,Suma K V,Nagarajappa N. Automated seed classification using state-of-the-art techniques[J]. SN Computer Science,2024,5(5):511.
[17]Li J F,Kang X J. Mobile-YOLO:an accurate and efficient three-stage cascaded network for online fiberglass fabric defect detection[J]. Engineering Applications of Artificial Intelligence,2024,134:108690.
[18]Chakraborty S,Zahir S,Orchi N T,et al. Violence detection:a multi-model approach towards automated video surveillance and public safety[C]//IEEE.2024 International Conference on Advances in Computing,Communication,Electrical,and Smart Systems(iCACCESS).Dhaka,2024:1-6.
[19]Inbavalli A,Jarshini T,Muralikrishnaa M. Efficient aggressive behaviour detection and alert system employing deep learning techniques[C]//IEEE.2024 International Conference on Automation and Computation(AUTOCOM).Dehradun,2024:404-410.
[20]Anzum H,Sammo M N S,Akhter S. Leveraging data efficient image transformer(DeIT) for road crack detection and classification[C]//IEEE.2024 International Conference on Advances in Computing,Communication,Electrical,and Smart Systems(iCACCESS). Dhaka,2024:1-6.
[21]Naftali M G,Hugo G,Suharjito. Palm oil counter:state-of-the-art deep learning models for detection and counting in plantations[J]. IEEE Access,2024,12:90395-90417.
[22]Hou Q B,Zhou D Q,Feng J S. Coordinate attention for efficient mobile network design[C]//IEEE.2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition(CVPR).Nashville,2021:13708-13717.
[23]Gevorgyan Z. SIoU Loss:more powerful learning for bounding box regression[EB/OL]. (2022-05-25)[2024-05-08]. https://arxiv.org/abs/2205.12740v1.
[24]Cai Teng,Chen Cifa,Dong Fangmin. Low-illumination object detection combining Transformer and dynamic feature fusion[J]. Computer Engineering and Applications,2024,60(9):135-141.
[25]Wu Xiao,Zhao Hongquan,Yang Peng,et al. Pointer meter reading recognition based on YOLO v8 and DeepLabv3+[J]. Mechanical & Electrical Engineering Technology,2024,53(6):240-244.
[26]Nasim M,Mumtaz R,Ahmad M,et al. Fabric defect detection in real world manufacturing using deep learning[J]. Information,2024,15(8):476.
[27]Zheng Wenxuan,Yang Ying. Detection of mature fragrant pears based on frequency-domain data augmentation and a lightweight YOLO v7 model[J]. Transactions of the Chinese Society for Agricultural Machinery,2024,55(5):244-253.
[28]Hu Jianwei,Ma Huimin,Ning Xiaomei,et al. Maize seedling detection in UAV images based on improved YOLO v8[J]. Jiangsu Journal of Agricultural Sciences,2025,41(6):1179-1187.
[29]Pang Chao,Wang Chuan'an,Su Yu,et al. A rice disease detection method based on improved YOLO v8[J]. Journal of Inner Mongolia Agricultural University (Natural Science Edition),2024,45(2):62-68.
[30]Chen Xueshen,Wu Changpeng,Dang Peina,et al. Weed recognition in paddy fields based on ViT-improved YOLO v7[J]. Transactions of the Chinese Society of Agricultural Engineering,2024,40(10):185-193.
[31]Gitau S N,Abd El-Malek A H,Sayed M S,et al. Automatic surgical items tracking in laparoscopic surgery utilizing deep learning for retained items detection and classification[C]//IEEE.2024 IEEE 30th International Conference on Telecommunications(ICT).Amman,2024:1-6.
[32]Nawarathne U M M P K,Kumari H M L S,Kumari H M N S. Comparative analysis of jellyfish classification:a study using YOLO v8 and pre-trained models[C]//IEEE.2024 International Research Conference on Smart Computing and Systems Engineering(SCSE).Colombo,2024:1-6.
[33]Svyatov K,Khairullin I. Obstacle avoidance method with local semantic map generation for self-driving cars[C]//IEEE.2024 International Russian Smart Industry Conference(SmartIndustryCon).Sochi,2024:68-74.
[34]Shi Y,Qing S H,Zhao L,et al. YOLO-peach:a high-performance lightweight YOLO v8s-based model for accurate recognition and enumeration of peach seedling fruits[J]. Agronomy,2024,14(8):1628.
[35]Li Zhiliang,Li Mengxia,Dong Yong,et al. A lightweight maize pest recognition method based on improved YOLO v8[J]. Jiangsu Agricultural Sciences,2024,52(14):196-206.
[36]Alkady G I. A deep learning-powered web service for optimal restaurant recommendations based on customers food preferences[C]//IEEE.2024 16th International Conference on Electronics,Computers and Artificial Intelligence(ECAI).Iasi,2024:1-4.

Similar References:

[1]Zhang Junxi,Chen Yongming,Cheng Xiaosong,et al. Research and integrated application of green prevention and control technology for rice diseases and pests[J]. Jiangsu Agricultural Sciences,2017,45(21):94.
[2]Wang Yuanyuan,Lin Jian,Wang Shan. An early rice disease recognition method based on the YOLOv4-tiny model[J]. Jiangsu Agricultural Sciences,2023,51(16):147.


Memo:
Received: 2024-09-11
Funding: National Natural Science Foundation of China (Grant No. 62077018).
First author: Wang Chan (1989- ), female, from Jingzhou, Hubei; Master's degree; research interest: sustainable agricultural development. E-mail: 2522270787@qq.com.
Corresponding authors: Zhu Jinsong, PhD, associate professor, engaged in smart-agriculture research, E-mail: 66305541@qq.com; Zhou Jun, PhD, senior engineer, engaged in e-commerce and smart-agriculture teaching, E-mail: zhoujungx@qq.com.
Last update: 2025-10-20