Multi-data sensor fusion framework to detect transparent object for the efficient mobile robot mapping




References (27)

Publisher
Emerald Publishing
Copyright
Copyright © Emerald Group Publishing Limited
ISSN
2049-6427
DOI
10.1108/IJIUS-05-2018-0013

Abstract

Purpose
Efficient perception of complex environments is a foremost requirement in mobile robotics. Glass walls and automated transparent doors have become common features of modern buildings, and they cause various range sensors to misperceive the environment. The perception generated by multi-data sensor fusion (MDSF) of sonar and laser is fairly consistent at detecting glass, but it is still affected by issues such as sensor inaccuracies, sensor reliability, scan mismatching due to glass, the sensor model, probabilistic approaches to sensor fusion, and sensor registration. This paper discusses these issues.

Design/methodology/approach
The paper presents a modified framework, the Advanced Laser and Sonar Framework (ALSF), that fuses the sensory information of a laser scanner and sonar to reduce the uncertainty caused by glass in an environment by selecting the optimal range information with respect to a selected threshold value. In the proposed approach, the conventional sonar sensor model is also modified to reduce misperception in sonar arising from diverse range measurements. The laser scan matching algorithm is likewise modified to discard small clusters of laser points (with respect to range information) for more reliable perception.

Findings
With the modified sonar sensor model, the occupancy probabilities of cells remain consistent across diverse sonar range measurements. The modified scan matching technique reduces both the uncertainty caused by glass and the computational load, enabling efficient and fast pose estimation of the laser sensor/mobile robot for robust mapping. These modifications are combined in the proposed ALSF technique to reduce glass-induced uncertainty, inconsistent probabilities, and the computational load of generating occupancy grid maps with MDSF.
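The threshold-based selection described above can be sketched as follows. This is an illustrative assumption of how per-bearing fusion might work, not the paper's implementation: where laser and sonar agree within a threshold the higher-resolution laser reading is kept, and where they disagree sharply (as when the laser beam passes through glass and over-reports range) the shorter return is preferred. The function name and the threshold value are hypothetical.

```python
def fuse_ranges(laser, sonar, threshold=0.5):
    """Pick one range per bearing from co-registered laser and sonar scans."""
    fused = []
    for r_laser, r_sonar in zip(laser, sonar):
        if abs(r_laser - r_sonar) <= threshold:
            # Sensors agree: trust the higher-resolution laser reading.
            fused.append(r_laser)
        else:
            # Large disagreement suggests glass: the laser beam likely
            # passed through, so take the shorter of the two returns.
            fused.append(min(r_laser, r_sonar))
    return fused

# Bearing 2 disagrees by 3.0 m (laser sees through glass); sonar wins there.
print(fuse_ranges([4.8, 5.0, 2.1], [4.9, 2.0, 2.2]))  # [4.8, 2.0, 2.1]
```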
Real-world experiments were performed with the proposed approach implemented on a mobile robot fitted with a laser scanner and sonar, and the results are compared qualitatively and quantitatively with conventional approaches.

Originality/value
The proposed ALSF approach generates an efficient perception of complex environments containing glass and can be applied to various robotics applications.
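The abstract also mentions discarding small clusters of laser points before scan matching. A minimal sketch of that idea, under the assumption that contiguous readings within a small range gap form a cluster and that clusters below a minimum size (e.g. stray returns off glass edges) are dropped; the gap and size parameters are illustrative:

```python
def filter_small_clusters(ranges, gap=0.3, min_size=3):
    """Keep only readings belonging to clusters of at least min_size points."""
    clusters, current = [], [0]
    for i in range(1, len(ranges)):
        if abs(ranges[i] - ranges[i - 1]) <= gap:
            current.append(i)          # contiguous in range: same cluster
        else:
            clusters.append(current)   # gap exceeded: close the cluster
            current = [i]
    clusters.append(current)
    keep = [i for c in clusters if len(c) >= min_size for i in c]
    return [ranges[i] for i in keep]

# The isolated 6.0 m reading forms a size-1 cluster and is discarded.
print(filter_small_clusters([2.0, 2.1, 2.05, 6.0, 2.1, 2.2, 2.15]))
```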

Journal

International Journal of Intelligent Unmanned Systems (Emerald Publishing)

Published: Jan 7, 2019
