
Monocular odometry in country roads based on phase‐derived optical flow and 4‐DOF ego‐motion model



References (25)

Publisher
Emerald Publishing
Copyright
Copyright © 2011 Emerald Group Publishing Limited. All rights reserved.
ISSN
0143-991X
DOI
10.1108/01439911111154081

Abstract

Purpose – Positioning is a key task in most field robotics applications but can be very challenging in GPS‐denied or high‐slip environments. The purpose of this paper is to describe a visual odometry strategy that uses only one camera on country roads.

Design/methodology/approach – This monocular odometry system takes as input only the images provided by a single camera mounted on the roof of the vehicle. The framework is composed of three main parts: image motion estimation, ego‐motion computation and visual odometry. The image motion is estimated from a hyper‐complex wavelet phase‐derived optical flow field. The ego‐motion of the vehicle is computed by a blocked RANdom SAmple Consensus (RANSAC) algorithm and a maximum likelihood estimator based on a 4‐degrees‐of‐freedom motion model. These instantaneous ego‐motion measurements are used to update the vehicle trajectory according to a dead‐reckoning model and an unscented Kalman filter.

Findings – The authors' proposed framework and algorithms are validated on videos from a real automotive platform. Furthermore, the recovered trajectory is superimposed onto a digital map, and the localization results of this method are compared to ground truth measured with a joint GPS/INS system. The experimental results indicate that the framework and algorithms are effective.

Originality/value – This paper introduces an effective framework and algorithms for visual odometry using only one camera on country roads.
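The dead-reckoning step in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes the four degrees of freedom are a forward translation increment plus yaw, pitch and roll increments per frame (the abstract does not spell out the parameterization), and it omits the unscented Kalman filter smoothing stage.

```python
import numpy as np

def rotation_zyx(yaw, pitch, roll):
    """Rotation matrix from yaw-pitch-roll (Z-Y-X) Euler angle increments."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    return Rz @ Ry @ Rx

def dead_reckon(measurements):
    """Integrate per-frame 4-DOF ego-motion measurements
    (forward step, yaw, pitch, roll) into a world-frame trajectory."""
    R = np.eye(3)               # current vehicle orientation
    p = np.zeros(3)             # current vehicle position
    trajectory = [p.copy()]
    for step, yaw, pitch, roll in measurements:
        R = R @ rotation_zyx(yaw, pitch, roll)   # compose orientation
        p = p + R @ np.array([step, 0.0, 0.0])   # advance along body x-axis
        trajectory.append(p.copy())
    return np.array(trajectory)

# Example: two 1 m forward steps with no rotation give a straight line.
traj = dead_reckon([(1.0, 0, 0, 0), (1.0, 0, 0, 0)])
```

In the paper's pipeline, each `(step, yaw, pitch, roll)` tuple would come from the blocked-RANSAC/maximum-likelihood ego-motion stage, and the accumulated pose would additionally be filtered by the UKF rather than integrated raw as here.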

Journal

Industrial Robot: An International Journal, Emerald Publishing

Published: Aug 23, 2011

Keywords: Monocular odometry; Ego‐motion estimation; 4‐DOF ego‐motion model; Phase‐derived optical flow; Blocked RANSAC; Road vehicles; Motion; Robotics
