Two years of Visual Odometry on the Mars Exploration Rovers


References (31)

Publisher
Wiley
Copyright
Copyright © 2007 Wiley Periodicals, Inc., A Wiley Company
ISSN
1556-4959
eISSN
1556-4967
DOI
10.1002/rob.20184

Abstract

NASA's two Mars Exploration Rovers (MER) have successfully demonstrated a robotic Visual Odometry capability on another world for the first time. This provides each rover with accurate knowledge of its position, allowing it to autonomously detect and compensate for any unforeseen slip encountered during a drive. It has enabled the rovers to drive safely and more effectively in highly sloped and sandy terrains and has resulted in increased mission science return by reducing the number of days required to drive into interesting areas. The MER Visual Odometry system comprises onboard software for comparing stereo pairs taken by the pointable mast‐mounted 45 deg FOV Navigation cameras (NAVCAMs). The system computes an update to the 6 degree of freedom rover pose (x, y, z, roll, pitch, yaw) by tracking the motion of autonomously selected terrain features between two pairs of 256×256 stereo images. It has demonstrated good performance with high rates of successful convergence (97% on Spirit, 95% on Opportunity), successfully detected slip ratios as high as 125%, and measured changes as small as 2 mm, even while driving on slopes as high as 31 deg. Visual Odometry was used over 14% of the first 10.7 km driven by both rovers. During the first 2 years of operations, Visual Odometry evolved from an “extra credit” capability into a critical vehicle safety system. In this paper we describe our Visual Odometry algorithm, discuss several driving strategies that rely on it (including Slip Checks, Keep‐out Zones, and Wheel Dragging), and summarize its results from the first 2 years of operations on Mars. © 2006 Wiley Periodicals, Inc.
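The abstract describes the core of feature-based visual odometry: 3-D terrain features are triangulated from each stereo pair, matched across consecutive pairs, and the rigid motion that best aligns them gives the update to the 6-DOF rover pose; comparing visually measured motion against commanded motion yields the slip. The sketch below illustrates that idea under stated assumptions. The least-squares SVD (Kabsch) alignment is a standard technique for this step, not necessarily the exact estimator used onboard MER, and the `slip_ratio` definition (commanded minus actual, over commanded, so values above 100% mean net backward motion) is an assumed convention consistent with the reported 125% figure.

```python
import numpy as np

def estimate_motion(p_prev, p_curr):
    """Least-squares rigid transform (R, t) mapping p_prev onto p_curr.

    p_prev, p_curr: (N, 3) arrays of the same terrain features,
    triangulated from stereo before and after a motion step.
    Uses the standard Kabsch/SVD method; the actual MER estimator
    may differ (e.g. robust outlier rejection around this core).
    """
    c_prev = p_prev.mean(axis=0)
    c_curr = p_curr.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (p_prev - c_prev).T @ (p_curr - c_curr)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection (det = -1) in the recovered rotation.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_curr - R @ c_prev
    return R, t

def slip_ratio(commanded, actual):
    """Assumed slip convention: (commanded - actual) / commanded.

    0.0 = no slip; 1.0 = no net progress; >1.0 = net motion opposite
    the command (e.g. sliding backward down a sandy slope).
    """
    return (commanded - actual) / commanded
```

For example, a rover commanded to drive 1 m forward that visual odometry shows actually moved 0.25 m backward has a slip ratio of (1.0 − (−0.25)) / 1.0 = 1.25, i.e. the 125% figure quoted in the abstract.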

Journal

Journal of Field Robotics, Wiley

Published: Mar 1, 2007
