3D LiDAR sensing method and experiment of plant row information extraction
Citation: ZHAO Runmao, ZHU Zheng, CHEN Jianneng, FAN Guoshuai, WANG Qicheng, HUANG Peikui. 3D LiDAR sensing method and experiment of plant row information extraction[J]. Journal of South China Agricultural University, 2023, 44(4): 628-637.
Authors: ZHAO Runmao  ZHU Zheng  CHEN Jianneng  FAN Guoshuai  WANG Qicheng  HUANG Peikui
Affiliation: Faculty of Mechanical Engineering and Automation, Zhejiang Sci-Tech University, Hangzhou 310018, China; Zhejiang Key Laboratory of Planting Equipment Technology, Hangzhou 310018, China; College of Engineering, South China Agricultural University, Guangzhou 510642, China
Foundation item: National Natural Science Foundation of China (52105284); Scientific Research Start-up Fund of Zhejiang Sci-Tech University (20022307-Y)
Received: 2022-08-03

Abstract: Objective A low-cost 3D light detection and ranging (LiDAR) point cloud processing and plant row estimation method for environment perception in agricultural robot navigation is proposed for areas where the satellite signal is severely occluded, such as forests or under the canopy. Method First, a pass-through filter was used to remove irrelevant points outside the region of interest. Second, mean shift clustering with an adaptive scanning region was proposed to segment the trunk of each plant, and the vertical projection of the trunk point cloud was used to estimate its center point. Finally, the plant rows were estimated by fitting the trunk centers with the least squares method. A simulation experiment and a field experiment were carried out in a simulated orchard on open ground and in a metasequoia forest, respectively. The angle between the plant row vector and due east was used as the index, and the angle error between the plant row identified by the proposed method and the true plant row measured by GNSS antenna positioning was calculated. Result Using the proposed 3D LiDAR point cloud processing and plant row estimation method, the average plant row identification errors in the simulation and field experiments were 0.79° and 1.48°, the minimum errors were 0.12° and 0.88°, and the maximum errors were 1.49° and 2.33°, respectively. Conclusion The vehicle-mounted 3D LiDAR can effectively estimate metasequoia plant rows. This research enriches the ideas and methods of crop identification and provides a theoretical basis for map-free navigation of agricultural robots in areas without satellite signal coverage.
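The Method described above follows a filter, cluster, project pipeline. The following is a minimal sketch of those first stages, not the authors' implementation: a pass-through filter keeps only points inside a region of interest, mean shift clustering separates the remaining trunk points, and a vertical projection gives one estimated center per trunk. The coordinate layout, limits and bandwidth are illustrative assumptions, and the paper's adaptive scanning-region step is omitted.

```python
import numpy as np
from sklearn.cluster import MeanShift

def pass_through_filter(points, x_lim, y_lim, z_lim):
    """Keep only points whose (x, y, z) coordinates fall inside the given (min, max) limits."""
    keep = ((points[:, 0] >= x_lim[0]) & (points[:, 0] <= x_lim[1]) &
            (points[:, 1] >= y_lim[0]) & (points[:, 1] <= y_lim[1]) &
            (points[:, 2] >= z_lim[0]) & (points[:, 2] <= z_lim[1]))
    return points[keep]

def trunk_centers(points, bandwidth=0.3):
    """Cluster trunk points with mean shift in the horizontal plane and
    return one estimated (x, y) center per trunk."""
    xy = points[:, :2]                              # vertical projection onto the ground plane
    labels = MeanShift(bandwidth=bandwidth).fit_predict(xy)
    return np.array([xy[labels == k].mean(axis=0) for k in np.unique(labels)])

# Example with a random stand-in for one LiDAR frame (N x 3 array of x, y, z in meters).
cloud = np.random.rand(2000, 3) * np.array([10.0, 4.0, 2.0])
roi = pass_through_filter(cloud, x_lim=(0.0, 8.0), y_lim=(0.0, 4.0), z_lim=(0.2, 1.0))
centers = trunk_centers(roi)
```

Mean shift is used here because it does not require the number of trunks in view to be known in advance, which matches the clustering role described in the abstract.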
Keywords: Agricultural robot  3D LiDAR  Point cloud processing  Plant row recognition  Field of view adaptation  Plant trunk segmentation
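A similarly hedged sketch of the row-estimation and evaluation step: the trunk centers are fitted with a least-squares line, the angle between the row direction and due east (assumed here to be the +x axis) is taken as the index, and the absolute difference from a reference angle, such as one surveyed with GNSS, gives the error reported in the abstract.

```python
import numpy as np

def row_angle_deg(centers):
    """Fit a least-squares line y = a*x + b through the trunk centers and
    return the angle (degrees) between the row direction and due east (+x)."""
    a, _b = np.polyfit(centers[:, 0], centers[:, 1], 1)
    return np.degrees(np.arctan(a))

# Hypothetical trunk centers of one row (meters, east-north coordinates).
centers = np.array([[0.0, 0.03], [1.5, 0.06], [3.1, 0.08], [4.4, 0.12]])
estimated = row_angle_deg(centers)
reference = 1.20                     # hypothetical GNSS-surveyed row angle in degrees
print(f"estimated: {estimated:.2f} deg, error: {abs(estimated - reference):.2f} deg")
```

Fitting y on x is used only for brevity; a principal-axis or total least-squares fit would behave better for rows running nearly north-south.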