Climate change and the detection of trends in annual runoff

Climate Research

Abstract

This study examines the statistical likelihood of detecting a trend in annual runoff given an assumed change in mean annual runoff, the underlying year-to-year variability in runoff, and the serial correlation of annual runoff. Means, standard deviations, and lag-1 serial correlations of annual runoff were computed for 585 stream gages in the conterminous United States, and these statistics were used to compute the probability of detecting a prescribed trend in annual runoff. Assuming a linear 20% change in mean annual runoff over a 100 yr period and a 95% confidence level (0.05 significance level), the average probability of detecting a significant trend across the 585 stream gages was 28%. The probability of detecting a trend was largest in the northwestern U.S., the Great Lakes region, the northeastern U.S., the Appalachian Mountains, and parts of the northern Rocky Mountains. The probability was smallest in the central and southwestern U.S. and in Florida. Low probabilities of trend detection were associated with low ratios of mean annual runoff to the standard deviation of annual runoff and with high lag-1 serial correlation in the data.
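As a rough illustration of the kind of calculation the abstract describes, the sketch below estimates the probability of detecting a trend by Monte Carlo: a prescribed linear change is superimposed on AR(1) noise whose standard deviation and lag-1 serial correlation match a gage's annual-runoff statistics, and an ordinary least-squares trend test is applied to each simulated record. The function name, the AR(1) noise model, the least-squares test, and the example runoff statistics are assumptions chosen for illustration, not the authors' published method.

import numpy as np
from scipy import stats

def detection_probability(mean, sd, rho, pct_change=0.20, n_years=100,
                          alpha=0.05, n_sims=2000, seed=0):
    """Monte Carlo estimate of the probability that a least-squares
    trend test detects a prescribed linear change in mean annual
    runoff, given interannual variability (sd) and lag-1 serial
    correlation (rho) modeled as AR(1) noise."""
    rng = np.random.default_rng(seed)
    t = np.arange(n_years)
    # Linear change totaling pct_change * mean over the record length.
    trend = pct_change * mean * t / (n_years - 1)
    # Innovation scale chosen so the AR(1) noise has marginal std = sd.
    innov_sd = sd * np.sqrt(1.0 - rho ** 2)
    detections = 0
    for _ in range(n_sims):
        noise = np.empty(n_years)
        noise[0] = rng.normal(0.0, sd)  # draw from the stationary distribution
        for i in range(1, n_years):
            noise[i] = rho * noise[i - 1] + rng.normal(0.0, innov_sd)
        y = mean + trend + noise
        # Two-sided t-test on the regression slope at level alpha.
        if stats.linregress(t, y).pvalue < alpha:
            detections += 1
    return detections / n_sims

# Example: a gage with mean runoff 500 mm/yr, sd 150 mm/yr, rho = 0.2
# (illustrative values, not taken from the paper).
print(detection_probability(mean=500.0, sd=150.0, rho=0.2))

Consistent with the abstract's findings, lowering the mean-to-standard-deviation ratio or raising rho in this sketch reduces the estimated detection probability.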

Publication type: Article
Publication subtype: Journal Article
Title: Climate change and the detection of trends in annual runoff
Series title: Climate Research
DOI: 10.3354/cr008129
Volume: 8
Issue: 2
Year published: 1997
Language: English
Publisher: Inter-Research Science Publisher
Description: 6 p.
First page: 129
Last page: 134
Country: United States
Other geospatial: conterminous United States