An improved Faster R-CNN integrating a dynamic mechanism for recognizing cotton terminal buds in the field
Citation: Chen Keyi, Zhu Longfu, Song Peng, Tian Xiaomin, Huang Chenglong, Nie Xinhui, Xiao Ailing, He Liangrong. An improved Faster R-CNN integrating a dynamic mechanism for recognizing cotton terminal buds in the field[J]. Transactions of the Chinese Society of Agricultural Engineering, 2021, 37(16): 161-168.
Authors: Chen Keyi  Zhu Longfu  Song Peng  Tian Xiaomin  Huang Chenglong  Nie Xinhui  Xiao Ailing  He Liangrong
Affiliations: 1. College of Plant Science and Technology, Huazhong Agricultural University, Wuhan 430070, China; 2. College of Agronomy, Shihezi University, Xinjiang 832003, China; 3. College of Engineering, Huazhong Agricultural University, Wuhan 430070, China; 4. College of Mechanical and Electronic Engineering, Tarim University, Xinjiang 843300, China; 5. College of Plant Sciences, Tarim University, Xinjiang 843300, China
Abstract: During precision topping in densely planted cotton fields, the small size of the cotton terminal bud makes it difficult to recognize. To address this, this study proposes an improved Faster Region Convolutional Neural Network (Faster R-CNN) object detection algorithm for recognizing cotton terminal buds in the field. With Faster R-CNN as the base framework, RegNetX-6.4GF was used as the backbone network to improve image feature extraction. A Feature Pyramid Network (FPN) was combined with the Guided Anchoring (GA) mechanism so that anchors are generated dynamically and adaptively. The Dynamic Region Convolutional Neural Network (Dynamic R-CNN) method was then integrated so that, during training, the detection model adapts to the dynamically changing distribution of proposals, improving the training effect. Finally, the Generic ROI Extractor (GROIE) was introduced at the Region of Interest (ROI) stage to improve feature fusion. A total of 4 819 images of seven cotton materials were collected under natural conditions and organized into a cotton terminal bud dataset in Microsoft Common Objects in Context 2017 (MS COCO 2017) format for the experiments. The results show that the proposed method achieves a Mean Average Precision (MAP) of 98.1% at a processing speed of 10.3 frames per second (FPS). At an Intersection over Union (IOU) of 0.5, its MAP is 7.3%, 78.9%, 10.1%, and 8.3% higher than Faster R-CNN, RetinaNet, Cascade R-CNN, and RepPoints, respectively. The proposed algorithm recognizes cotton terminal buds in the field with high robustness and accuracy, laying a foundation for precision cotton topping operations.
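The architecture described above assembles several off-the-shelf detection components. For orientation only, the following is a minimal sketch of how such a model might be declared in an MMDetection-style config; the paper does not state which framework or hyperparameters were used, so every value below is an illustrative assumption rather than the authors' setting, and the required sub-configurations of the Guided Anchoring head, the train/test settings, and the data pipelines are omitted.

```python
# Hypothetical MMDetection-style config sketch. The paper does not name its
# framework; component names follow MMDetection's registry and all values are
# illustrative, not the authors' settings.
model = dict(
    type='FasterRCNN',
    # RegNetX-6.4GF backbone in place of the default ResNet.
    backbone=dict(
        type='RegNet',
        arch='regnetx_6.4gf',
        out_indices=(0, 1, 2, 3),
        frozen_stages=1,
        norm_cfg=dict(type='BN', requires_grad=True),
        style='pytorch'),
    # Feature Pyramid Network over the four backbone stages.
    neck=dict(
        type='FPN',
        in_channels=[168, 392, 784, 1624],  # stage widths of RegNetX-6.4GF
        out_channels=256,
        num_outs=5),
    # Guided Anchoring RPN: anchor locations and shapes are predicted rather
    # than enumerated on a fixed grid (anchor generators/losses omitted here).
    rpn_head=dict(
        type='GARPNHead',
        in_channels=256,
        feat_channels=256),
    # Dynamic R-CNN head: the IoU threshold for label assignment and the
    # SmoothL1 beta adapt to proposal quality as training progresses.
    roi_head=dict(
        type='DynamicRoIHead',
        # GRoIE: fuse RoI features from all FPN levels instead of one level.
        bbox_roi_extractor=dict(
            type='GenericRoIExtractor',
            aggregation='sum',
            roi_layer=dict(type='RoIAlign', output_size=7, sampling_ratio=2),
            out_channels=256,
            featmap_strides=[4, 8, 16, 32]),
        bbox_head=dict(
            type='Shared2FCBBoxHead',
            in_channels=256,
            fc_out_channels=1024,
            roi_feat_size=7,
            num_classes=1,  # single class: cotton terminal bud
            bbox_coder=dict(
                type='DeltaXYWHBBoxCoder',
                target_means=[0., 0., 0., 0.],
                target_stds=[0.1, 0.1, 0.2, 0.2]),
            reg_class_agnostic=False,
            loss_cls=dict(type='CrossEntropyLoss', use_sigmoid=False, loss_weight=1.0),
            loss_bbox=dict(type='SmoothL1Loss', beta=1.0, loss_weight=1.0))))
```

In this layout, Guided Anchoring replaces the fixed anchor grid of a standard RPN, the Dynamic R-CNN head raises its IoU threshold as proposal quality improves during training, and the generic RoI extractor pools features from all FPN levels rather than a single assigned level.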

Keywords: deep learning  algorithm  cotton  mechanism fusion  dynamic adaptation  terminal bud recognition  Faster R-CNN
Received: 2021-08-14
Revised: 2021-08-14

Recognition of cotton terminal bud in field using improved Faster R-CNN by integrating dynamic mechanism
Chen Keyi,Zhu Longfu,Song Peng,Tian Xiaomin,Huang Chenglong,Nie Xinhui,Xiao Ailing,He Liangrong.Recognition of cotton terminal bud in field using improved Faster R-CNN by integrating dynamic mechanism[J].Transactions of the Chinese Society of Agricultural Engineering,2021,37(16):161-168.
Authors:Chen Keyi  Zhu Longfu  Song Peng  Tian Xiaomin  Huang Chenglong  Nie Xinhui  Xiao Ailing  He Liangrong
Institution:1. College of Plant Science and Technology, Huazhong Agricultural University, Wuhan 430070, China; 2. College of Agronomy, Shihezi University, Xinjiang 832003, China; 3. College of Engineering, Huazhong Agricultural University, Wuhan 430070, China; 4. College of Mechanical and Electronic Engineering, Tarim University, Xinjiang 843300, China; 5. College of Plant Sciences, Tarim University, Xinjiang 843300, China
Abstract: Accurate identification of the cotton terminal bud is important for cotton topping. To detect terminal buds accurately in the field, a recognition method based on an improved Faster Region Convolutional Neural Network (Faster R-CNN) integrating a dynamic mechanism was proposed to overcome the recognition difficulties caused by the small size of the terminal bud during topping in densely planted fields. The RegNetX-6.4GF model was used as the backbone network to improve image feature extraction. Because both the number of proposals at higher Intersection over Union (IOU) thresholds and the match between anchor and target shape affect detector performance, the proposed method replaced the original anchor generation mechanism by combining a Feature Pyramid Network (FPN) with Guided Anchoring in the Region Proposal Network (RPN), which causes the distribution of the proposals generated by the RPN at different IOU thresholds to change dynamically during training. To adapt to this dynamic change in proposal distribution, Dynamic Region Convolutional Neural Networks (Dynamic R-CNN) was integrated into Faster R-CNN to adjust the IOU threshold dynamically and obtain high-quality proposals. In addition, the Generic ROI Extractor (GROIE) mechanism was introduced at the Region of Interest (ROI) stage to improve feature fusion. A total of 4 819 images of a Gossypium hirsutum population containing seven leaf types were taken from above the cotton plants at distances of 30-50 cm (medium distance) and 50-100 cm (long distance) under uniform light, oblique strong light, direct strong light, and shadow. The images were organized into an MS COCO 2017 format dataset and split into training, validation, and test sets containing 2 815, 704, and 1 300 images, respectively. The experimental results showed that the proposed model ran at 10.3 frames per second (FPS) and that the Mean Average Precision (MAP) of bud identification reached 98.1%, 3.2 percentage points higher than the original Faster R-CNN model. The validation set was used to compare the proposed method with mainstream recognition algorithms: at an IOU of 0.5, the MAP of the improved Faster R-CNN was 7.3% higher than the original Faster R-CNN, and 78.9%, 10.1%, and 8.3% higher than RetinaNet, Cascade Region Convolutional Neural Networks (Cascade R-CNN), and RepPoints, respectively. The improved Faster R-CNN meets the accuracy and real-time requirements of cotton topping operations.
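Since the dataset follows the MS COCO 2017 format and the comparison above is reported at an IOU of 0.5, standard COCO evaluation tooling can compute this metric. Below is a minimal sketch using pycocotools with hypothetical file paths; it is not the authors' evaluation script, and the annotation and result file names are assumptions.

```python
# Minimal sketch of COCO-style box evaluation at IoU = 0.5 (pycocotools assumed).
from pycocotools.coco import COCO
from pycocotools.cocoeval import COCOeval

# Ground-truth annotations of the test split (MS COCO 2017 format) and the
# detector's results (a JSON list of {image_id, category_id, bbox, score}).
coco_gt = COCO('annotations/instances_test.json')          # hypothetical path
coco_dt = coco_gt.loadRes('results/bud_detections.json')   # hypothetical path

evaluator = COCOeval(coco_gt, coco_dt, iouType='bbox')
evaluator.evaluate()
evaluator.accumulate()
evaluator.summarize()

# stats[1] is AP at IoU = 0.50 (the threshold used in the comparison above);
# stats[0] is AP averaged over IoU = 0.50:0.95.
ap50 = evaluator.stats[1]
print(f'AP@0.5 = {ap50:.3f}')
```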
Keywords:deep learning  algorithm  cotton  mechanism fusion  dynamic adaptation  terminal bud recognition  Faster R-CNN