
【泡泡一分钟】Deep Learning-Based Mutual Detection and Collaborative Localization for Mobile Robot Fleets Using Only 2D LIDAR Sensors


One minute a day, taking you through papers from top robotics conferences.

标题:Deep Learning-Based Mutual Detection and Collaborative Localization for Mobile Robot Fleets Using Solely 2D LIDAR Sensors

Authors: Robin Dietrich and Stefan Dörr

Source: 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)

Compiled by: 孙钦

Reviewed by: 黄思宇, 孙钦

This is the 574th article in the Paopao Minute series. Individuals are welcome to share it to their WeChat Moments; other organizations or self-media outlets that wish to repost should leave a message in the account backend to request authorization.

Summary

Localizing mobile robots in dynamic, large-scale environments is a challenging task, especially when relying solely on odometry and 2D LIDAR data. When operating in a fleet, mutual detection and the exchange of localization information are highly valuable. However, detecting and classifying the different robot types of a heterogeneous fleet in 2D LIDAR data is non-trivial because the observations are sparse.

In this paper, the authors propose a novel approach for mutual robot detection, classification, and relative pose estimation based on a combination of convolutional and ConvLSTM layers. The algorithm learns end-to-end classification and pose estimation of robot shapes from 2D LIDAR data converted into a grid map. A mixture model representing the probability distribution of the pose measurement for each robot type is then extracted from the network's heatmap output. This output is fed into a cloud-based collaborative localization system to improve the localization of the individual robots.
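The paper does not publish code, but the pipeline described above (scan sequence → grid maps → convolutional encoder → ConvLSTM → per-class heatmaps) can be sketched compactly. The PyTorch snippet below is only an illustrative approximation: the layer widths, the 64×64 grid, the three robot classes, and the names ConvLSTMCell and DetectionNet are assumptions for this sketch, not the authors' architecture.

```python
# Minimal sketch (not the authors' code): a CNN + ConvLSTM stack that maps a
# sequence of 2D LIDAR occupancy grids to per-class detection heatmaps.
# Layer sizes, grid size, and class count are illustrative assumptions.
import torch
import torch.nn as nn

class ConvLSTMCell(nn.Module):
    """Single ConvLSTM cell: convolutional transforms with LSTM gating."""
    def __init__(self, in_ch, hidden_ch, kernel=3):
        super().__init__()
        # One convolution produces all four gates (input, forget, output, cell).
        self.gates = nn.Conv2d(in_ch + hidden_ch, 4 * hidden_ch, kernel, padding=kernel // 2)

    def forward(self, x, state):
        h, c = state
        i, f, o, g = torch.chunk(self.gates(torch.cat([x, h], dim=1)), 4, dim=1)
        i, f, o, g = torch.sigmoid(i), torch.sigmoid(f), torch.sigmoid(o), torch.tanh(g)
        c = f * c + i * g
        h = o * torch.tanh(c)
        return h, c

class DetectionNet(nn.Module):
    """Conv feature extractor + ConvLSTM over time + 1x1 head -> heatmaps."""
    def __init__(self, num_classes=3, hidden_ch=32):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
        )
        self.convlstm = ConvLSTMCell(32, hidden_ch)
        # One heatmap channel per robot class (per-cell detection probability).
        self.head = nn.Conv2d(hidden_ch, num_classes, 1)
        self.hidden_ch = hidden_ch

    def forward(self, grids):                  # grids: (B, T, 1, H, W)
        B, T, _, H, W = grids.shape
        h = grids.new_zeros(B, self.hidden_ch, H, W)
        c = grids.new_zeros(B, self.hidden_ch, H, W)
        for t in range(T):                     # integrate the scan sequence over time
            h, c = self.convlstm(self.encoder(grids[:, t]), (h, c))
        return torch.sigmoid(self.head(h))     # (B, num_classes, H, W) heatmaps

scans = torch.rand(2, 5, 1, 64, 64)            # batch of 5-step grid-map sequences
heatmaps = DetectionNet()(scans)
print(heatmaps.shape)                          # torch.Size([2, 3, 64, 64])
```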

The effectiveness of the approach is demonstrated in both simulation and real-world experiments. The evaluation shows that the classification network achieves a precision of 90% on real-world data, with an average position estimation error of 14 cm. Moreover, the collaborative localization system increases the localization accuracy of a robot equipped with low-cost sensors by 63%.


Figure 1: Architecture and inputs/outputs of the developed CNN/ConvLSTM network with end-to-end object classification and position estimation.


Figure 2: Schematic of collaborative localization with unknown association.

Abstract

Localization for mobile robots in dynamic, largescale environments is a challenging task, especially when relying solely on odometry and 2D LIDAR data. When operating in fleets, mutual detection and the exchange of localization information can be highly valuable. Detecting and classifying different robot types in a heterogeneous fleet, however, is nontrivial with 2D LIDAR data due to the sparse observation information.

In this paper a novel approach for mutual robot detection, classification and relative pose estimation based on a combination of convolutional and ConvLSTM layers is presented in order to solve this issue. The algorithm learns an end-to-end classification and pose estimation of robot shapes using 2D LIDAR information transformed into a grid-map. Subsequently a mixture model representing the probability distribution of the pose measurement for each robot type is extracted out of the heatmap output of the network. The output is then used in a cloud-based collaborative localization system in order to increase the localization of the individual robots.

The effectiveness of our approach is demonstrated in both, simulation and real-world experiments. The results of our evaluation show that the classification network is able to achieve a precision of 90% on real-world data with an average position estimation error of 14 cm. Moreover, the collaborative localization system is able to increase the localization accuracy of a robot equipped with low-cost sensors by 63%.
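The abstract states that a mixture model over pose measurements is extracted from the network's heatmap output. As a rough, position-only sketch of how such an extraction could look (this is not the authors' implementation; the function name heatmap_to_mixture, the 0.5 threshold, and the 5 cm grid resolution are assumptions), one can cluster above-threshold cells and fit a weighted mean and covariance per cluster:

```python
# Minimal sketch (assumptions, not the authors' code): turn a per-class heatmap
# into a Gaussian mixture of position measurements by clustering above-threshold
# cells and taking a probability-weighted mean/covariance per cluster.
import numpy as np
from scipy import ndimage

def heatmap_to_mixture(heatmap, resolution=0.05, threshold=0.5):
    """heatmap: (H, W) cell probabilities for one robot class.
    Returns a list of (weight, mean_xy [m], 2x2 covariance) components."""
    labels, n = ndimage.label(heatmap > threshold)   # connected blobs = candidate detections
    components = []
    for k in range(1, n + 1):
        ys, xs = np.nonzero(labels == k)
        w = heatmap[ys, xs]                          # cell probabilities as weights
        pts = np.stack([xs, ys], axis=1) * resolution
        mean = np.average(pts, axis=0, weights=w)
        cov = np.cov(pts.T, aweights=w) + 1e-6 * np.eye(2)
        components.append((float(w.sum()), mean, cov))
    total = sum(c[0] for c in components) or 1.0
    return [(w / total, m, c) for w, m, c in components]  # normalised mixture weights

# Example: one synthetic detection blob around grid cell (x=40, y=20)
hm = np.zeros((64, 64))
hm[18:23, 38:43] = 0.9
print(heatmap_to_mixture(hm))
```

Each returned (weight, mean, covariance) triple could then serve as one Gaussian component of the measurement fed to the collaborative localization back end; orientation estimation, which the paper also covers, is omitted here for brevity.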


If you are interested in this paper, click "Read the original" (阅读原文) to download the full article. For more articles, follow the 【泡泡机器人SLAM】 WeChat public account (paopaorobot_slam).

Baidu Netdisk extraction code: olgq


Welcome to the Paopao forum, where experienced researchers can answer any of your questions about SLAM.

Whether you have a question to ask or want to browse and answer questions, the Paopao forum welcomes you!

Paopao website: www.paopaorobot.org

Paopao forum: http://paopaorobot.org/bbs/


The original content of 泡泡机器人SLAM is produced with great effort by Paopao Robot members. Please respect our work: reposts must credit the 【泡泡机器人SLAM】 WeChat public account, otherwise infringement will be pursued. We also welcome you to share our posts to your own Moments so that more people can enter the field of SLAM, and so that together we can advance SLAM research in China!

For business cooperation and reposting, please contact liufuqiang_robot@hotmail.com
