[1]Fraiwan M,Faouri E,Khasawneh N. Multiclass classification of grape diseases using deep artificial intelligence[J]. Agriculture,2022,12(10):1542.
[2]Li Z,Paul R,Tis T B,et al. Non-invasive plant disease diagnostics enabled by smartphone-based fingerprinting of leaf volatiles[J]. Nature Plants,2019,5(8):856-866.
[3]Savary S,Ficke A,Aubertot J N,et al. Crop losses due to diseases and their implications for global food production losses and food security[J]. Food Security,2012,4(4):519-537.
[4]Faithpraise F,Birch P,Young R,et al. Automatic plant pest detection and recognition using k-means clustering algorithm and correspondence filters[J]. International Journal of Advanced Biotechnology and Research,2013,4(2):189-199.
[5]Arnal Barbedo J G. Digital image processing techniques for detecting,quantifying and classifying plant diseases[J]. SpringerPlus,2013,2(1):660.
[6]Zhang F,Chen Z J,Ali S,et al. Multi-class detection of cherry tomatoes using improved YOLO v4-Tiny[J]. International Journal of Agricultural and Biological Engineering,2023,16(2):225-231.
[7]Peng H X,Xu H M,Liu H N. Grape disease and pest recognition model integrating dual-branch features and attention mechanism[J]. Transactions of the Chinese Society of Agricultural Engineering,2022,38(10):156-165.
[8]Zhang P,Yang L,Li D L. EfficientNet-B4-ranger:a novel method for greenhouse cucumber disease recognition under natural complex environment[J]. Computers and Electronics in Agriculture,2020,176:105652.
[9]Guo W J,Feng Q,Li X Z,et al. Grape leaf disease detection based on attention mechanisms[J]. International Journal of Agricultural and Biological Engineering,2022,15(5):205-212.
[10]Wang R P,Chen F J,Zhu X Y,et al. Recognizing apple leaf diseases using an improved EfficientNet[J]. Transactions of the Chinese Society of Agricultural Engineering,2023,39(18):201-210.
[11]Jia L,Ye Z H. Grape disease recognition model based on attention mechanism and feature fusion[J]. Transactions of the Chinese Society for Agricultural Machinery,2023,54(7):223-233.
[12]Zhang L X,Bayintana,Zeng Q S. Early disease detection method for grape leaves based on StyleGAN2-ADA and improved YOLO v7[J]. Transactions of the Chinese Society for Agricultural Machinery,2024,55(1):241-252.
[13]Verma S,Chug A,Singh A P,et al. PDS-MCNet:a hybrid framework using MobileNet v2 with SiLU6 activation function and capsule networks for disease severity estimation in plants[J]. Neural Computing and Applications,2023,35(25):18641-18664.
[14]Guo Y F,Lan Y T,Chen X D. CST:Convolutional Swin Transformer for detecting the degree and types of plant diseases[J]. Computers and Electronics in Agriculture,2022,202:107407.
[15]Liu X Y,Peng H W,Zheng N X,et al. EfficientViT:memory efficient vision transformer with cascaded group attention[C]//2023 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR).Vancouver,BC,Canada.IEEE,2023:14420-14430.
[16]Shi D. TransNeXt:robust foveal visual perception for vision transformers[C]//2024 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR).Seattle,WA,USA.IEEE,2024:17773-17783.
[17]Yuan Y,Chen L. IDADP:image dataset for grape disease recognition research[J]. China Scientific Data,2022,7(1):86-90.
[18]Wu K,Peng H W,Chen M H,et al. Rethinking and improving relative position encoding for vision transformer[C]//2021 IEEE/CVF International Conference on Computer Vision (ICCV).Montreal,QC,Canada.IEEE,2021:10013-10021.
[19]Chen J R,Kao S H,He H,et al. Run,don't walk:chasing higher FLOPS for faster neural networks[C]//2023 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR).Vancouver,BC,Canada.IEEE,2023:12021-12031.
[20]Amirul Islam M,Kowal M,Jia S,et al. Global pooling,more than meets the eye:position information is encoded channel-wise in CNNs[C]//2021 IEEE/CVF International Conference on Computer Vision (ICCV).Montreal,QC,Canada.IEEE,2021:773-781.
[21]Dauphin Y N,Fan A,Auli M,et al. Language modeling with gated convolutional networks[C]//Proceedings of the 34th International Conference on Machine Learning,PMLR 70.Sydney,NSW,Australia,2017:933-941.
[22]Nayar S K,Narasimhan S G. Vision in bad weather[C]//Proceedings of the Seventh IEEE International Conference on Computer Vision.Kerkyra,Greece.IEEE,1999:820.
[23]Krizhevsky A,Sutskever I,Hinton G E. ImageNet classification with deep convolutional neural networks[J]. Communications of the ACM,2017,60(6):84-90.
[24]He K M,Zhang X Y,Ren S Q,et al. Deep residual learning for image recognition[C]//2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR).Las Vegas,NV,USA.IEEE,2016:770-778.
[25]Howard A,Sandler M,Chen B,et al. Searching for MobileNetV3[C]//2019 IEEE/CVF International Conference on Computer Vision (ICCV).Seoul,Korea (South).IEEE,2019:1314-1324.
[26]Tan M X,Le Q V. EfficientNet:rethinking model scaling for convolutional neural networks[C]//Proceedings of the 36th International Conference on Machine Learning,PMLR 97.Long Beach,CA,USA,2019:6105-6114.
[27]Zhou D Q,Kang B Y,Jin X J,et al. DeepViT:towards deeper vision transformer[EB/OL]. (2021-04-19)[2024-05-29].https://arxiv.org/abs/2103.11886v4.
[28]Graham B,El-Nouby A,Touvron H,et al. LeViT:a vision transformer in ConvNets clothing for faster inference[C]//2021 IEEE/CVF International Conference on Computer Vision (ICCV).Montreal,QC,Canada.IEEE,2021:12259-12269.
[29]Dosovitskiy A,Beyer L,Kolesnikov A,et al. An image is worth 16×16 words:Transformers for image recognition at scale[EB/OL]. (2021-06-03)[2024-05-29].https://arxiv.org/abs/2010.11929.
[30]Liu Z,Lin Y T,Cao Y,et al. Swin transformer:hierarchical vision transformer using shifted windows[C]//2021 IEEE/CVF International Conference on Computer Vision (ICCV).Montreal,QC,Canada.IEEE,2021:9992-10002.
[31]Chen Y P,Dai X Y,Chen D D,et al. Mobile-former:bridging MobileNet and transformer[C]//2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR).New Orleans,LA,USA.IEEE,2022:5270-5279.
[1]Wang L W,Xu X H,Su Y M,et al. Study on recognition of grape leaf diseases based on computer vision[J]. Jiangsu Agricultural Sciences,2017,45(23):222.
[2]Dai J J,Ma Y H,Wu J,et al. Grape leaf disease identification based on improved residual network[J]. Jiangsu Agricultural Sciences,2023,51(5):208.
[3]Xu W Y. Development of a grape leaf disease detection device based on a lightweight neural network[J]. Jiangsu Agricultural Sciences,2023,51(19):181.
[4]Zhang L Q,Wu L M,Jiang L L,et al. Detection of grape leaf diseases based on improved YOLO v8s[J]. Jiangsu Agricultural Sciences,2024,52(21):221.