Mapping uncertainty: sensitivity of wildlife habitat ratings to expert opinion

Publisher
Wiley
Copyright
Copyright © 2004 Wiley Subscription Services, Inc., A Wiley Company
ISSN
0021-8901
eISSN
1365-2664
DOI
10.1111/j.0021-8901.2004.00975.x

Abstract

Summary

1. Expert opinion is frequently called upon by natural resource and conservation professionals to aid decision making. Where species are difficult or expensive to monitor, expert knowledge often serves as the foundation for habitat suitability models and resulting maps. Despite the long history and widespread use of expert‐based models, there has been little recognition or assessment of uncertainty in predictions.

2. Across British Columbia, Canada, expert‐based habitat suitability models help guide resource planning and development. We used Monte Carlo simulations to identify the most sensitive parameters in a wildlife habitat ratings model, the precision of ratings for a number of ecosystem units, and the variation in the total area of high‐quality habitats due to uncertainty in expert opinion.

3. The greatest uncertainty in habitat ratings resulted from simulations conducted using a uniform distribution and a standard deviation calculated from the range of possible scores for the model attributes. For most ecological units, the mean score after 1000 simulations varied considerably from the reported value. When applied across the study area, assumed variation in expert opinion resulted in dramatic decreases in the geographical area of high‐quality (−85%) and moderately high‐quality (−68%) habitats. The majority of habitat polygons could vary by up to one class (85%), with smaller percentages varying by up to two classes (9%) or retaining their original rank (7%). Our model was based on only four parameters, but no single variable consistently accounted for the majority of uncertainty across the study area.

4. Synthesis and applications. We illustrated the power of uncertainty and sensitivity analyses to improve or assess the reliability of predictive species distribution models. Results from our case study suggest that even simple expert‐based predictive models can be sensitive to variation in opinion. The magnitude of uncertainty that is tolerable to decision making, however, will vary depending on the application of the model. When presented as error bounds for individual predictions or as maps of uncertainty across landscapes, estimates of uncertainty allow managers and conservation professionals to determine whether the model and input data reliably support their particular decision‐making process.
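
To make the simulation approach concrete, the Python sketch below runs a minimal Monte Carlo sensitivity analysis on a toy expert‐based rating model in the spirit of the design summarized above: 1000 runs, four attributes, and uniform noise whose standard deviation is derived from the range of possible scores. The attribute scores, weights, weighted‐sum rating rule, score scale, and class breaks are illustrative assumptions, not the authors' published model.

```python
# Minimal sketch of a Monte Carlo sensitivity analysis for an
# expert-based habitat rating model. All numbers below (scores,
# weights, class breaks) are hypothetical, not from the paper.
import numpy as np

rng = np.random.default_rng(42)

# Expert scores for one ecosystem unit on four model attributes (0-1 scale).
expert_scores = np.array([0.8, 0.6, 0.9, 0.5])
weights = np.array([0.4, 0.3, 0.2, 0.1])   # assumed attribute weights
score_range = 1.0                          # width of the possible score range

# One reading of "SD calculated from the range of possible scores":
# sd of a uniform distribution over the full range is range / sqrt(12);
# a uniform perturbation with that sd spans half_width = sd * sqrt(3).
sd = score_range / np.sqrt(12)
half_width = sd * np.sqrt(3)

n_sims = 1000
noise = rng.uniform(-half_width, half_width, size=(n_sims, expert_scores.size))
perturbed = np.clip(expert_scores + noise, 0.0, 1.0)

# Weighted-sum habitat rating for each simulated set of scores.
ratings = perturbed @ weights

# Map continuous ratings to habitat classes (breaks are assumptions).
class_breaks = [0.25, 0.5, 0.75]           # low / moderate / mod-high / high
classes = np.digitize(ratings, class_breaks)

print(f"mean rating: {ratings.mean():.3f}  sd: {ratings.std():.3f}")
print("class frequencies:", np.bincount(classes, minlength=4) / n_sims)

# Crude sensitivity index: correlation between each perturbed attribute
# and the final rating across simulations.
for name, col in zip(["attr1", "attr2", "attr3", "attr4"], perturbed.T):
    print(f"{name}: r = {np.corrcoef(col, ratings)[0, 1]:.2f}")
```

In a full analysis of this kind, the class frequencies and per‐attribute sensitivity indices would be computed for every ecosystem unit and then mapped across the study area, which is what allows uncertainty to be presented as error bounds or uncertainty maps for decision makers.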

Journal

Journal of Applied Ecology, Wiley

Published: Dec 1, 2004
