This report presents the results of a study conducted by the U.S. Geological Survey, in cooperation with the North Dakota Department of Health, to analyze historical water-quality trends in selected dissolved major ions, nutrients, and dissolved trace metals for 10 streams in southwestern and eastern North Dakota and to develop an efficient sampling design to monitor future water-quality trends. A time-series model for daily streamflow and constituent concentration was used to identify significant concentration trends, separate natural hydroclimatic variability in concentration from variability that could have resulted from anthropogenic causes, and evaluate various sampling designs to monitor future water-quality trends.
The interannual variability in concentration as a result of variability in streamflow, referred to as the annual concentration anomaly, generally was high for all constituents and streams used in the trend analysis and was particularly sensitive to the severe drought that occurred in the late 1980's and the very wet period that began in 1993 and has persisted to the present (2002). Although climatic conditions were similar across North Dakota during the trend-analysis period (1971-2000), significant differences occurred in the annual concentration anomalies from constituent to constituent and location to location, especially during the drought and the wet period.
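The idea of separating the streamflow-driven part of concentration variability can be sketched with a simple flow-adjustment regression. The report's actual time-series model of daily streamflow and concentration is more elaborate; the sketch below uses ordinary least squares on synthetic, illustrative data (none of the numbers come from the study), and treats the fitted flow-related component as the concentration anomaly.

```python
# Simplified sketch of flow adjustment: regress log concentration on log
# streamflow and treat the fitted, flow-related part as the "concentration
# anomaly"; the residual carries the trend and other (possibly anthropogenic)
# variability. Data are synthetic and illustrative, not from the study.
import numpy as np

rng = np.random.default_rng(42)

# Synthetic daily streamflow and dissolved-solids concentration for 10 years:
# concentration tends to fall as flow rises (dilution), plus noise.
flow = rng.lognormal(mean=4.0, sigma=1.0, size=3650)
conc = 1200.0 * flow**-0.25 * rng.lognormal(0.0, 0.1, size=3650)

log_q = np.log(flow)
log_c = np.log(conc)

# Ordinary least squares: log C = b0 + b1 * log Q
b1, b0 = np.polyfit(log_q, log_c, 1)
fitted = b0 + b1 * log_q

# Flow-related anomaly: fitted deviation from its long-term mean.
anomaly = fitted - fitted.mean()
residual = log_c - fitted

# Annual concentration anomaly: mean of the daily anomalies within each year.
years = np.repeat(np.arange(1971, 1981), 365)
annual_anomaly = {yr: anomaly[years == yr].mean() for yr in np.unique(years)}
```

In this toy setup, years dominated by drought (low flow) or wet conditions (high flow) produce large annual anomalies, mirroring the sensitivity to the late-1980's drought and the post-1993 wet period described above.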
Numerous trends were detected in the historical constituent concentrations after the annual concentration anomalies were removed. The trends within each of the constituent groups (major ions, nutrients, and trace metals) showed general agreement among the streams. For most locations, the largest dissolved major-ion concentrations occurred during the late 1970's and concentrations in the mid- to late 1990's were smaller than concentrations during the late 1970's. However, the largest concentrations for three of the Missouri River tributaries and one of the Red River of the North tributaries occurred during the mid- to late 1990's.
Concentration trends for total ammonia plus organic nitrogen showed close agreement among the streams for which that constituent was evaluated. The largest concentrations occurred during the early 1980's, and the smallest concentrations occurred during the early 1990's. Nutrient data were not available for the early 1970's or late 1990's. Although a detailed analysis of the causes of the trends was beyond the scope of this report, a preliminary analysis of cropland, livestock-inventory, and oil-production data for 1971-2000 indicated that the concentration trends may be related to livestock-inventory and oil-production activities in the basins.
Dissolved iron and manganese concentrations for the southwestern North Dakota streams generally remained stable during 1971-2000. However, many of the recorded concentrations for those streams were less than the detection limit, and trends that were masked by censoring may have occurred. Several significant trends were detected in dissolved iron and manganese concentrations for the eastern North Dakota streams. Concentrations for those streams either remained stable or increased during most of the 1970's and then decreased rapidly for about 2 years beginning in the late 1970's. The concentrations were relatively stable from the early 1980's to 2000 except at two locations where dissolved iron concentrations increased during the early 1990's.
The most efficient overall sampling designs for the detection of annual trends (that is, trends that occur uniformly during the entire year) consisted of balanced designs in which the sampling dates and the number of samples collected remained fixed from year to year and in which the samples were collected throughout the year rather than in a short timespan. The best overall design for the detection of annual trends consisted of three samples per year, with samples collected near the beginning of December, April, and August. That design had acceptable sensitivity for the detection of trends in most constituents at all locations. Little improvement in sensitivity was achieved by collecting more than three samples per year.
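Why extra samples beyond about three per year add little sensitivity can be illustrated with a small Monte Carlo comparison. This is a rough illustration under assumed noise levels, not the report's evaluation method: the key assumption is that year-level (hydroclimatic) noise is shared by all samples collected in the same year, so additional within-year samples average out only the smaller within-year noise.

```python
# Rough illustration (not the report's method) of diminishing returns from
# adding within-year samples: year-level noise is shared by all samples in a
# year, so it cannot be averaged away by sampling more often. All noise
# magnitudes and the trend slope below are assumed for illustration.
import numpy as np

rng = np.random.default_rng(1)
n_years = 30
year_sd = 0.2     # assumed year-level (hydroclimatic) noise, log units
within_sd = 0.3   # assumed within-year sampling/measurement noise
true_slope = 0.01 # assumed annual trend in log concentration

def slope_se(samples_per_year, n_sim=2000):
    """Empirical standard error of the fitted linear trend slope for a
    balanced design with evenly spaced sampling dates each year."""
    frac = np.linspace(0.1, 0.9, samples_per_year)       # fixed sampling dates
    t = (np.arange(n_years)[:, None] + frac[None, :]).ravel()
    slopes = np.empty(n_sim)
    for i in range(n_sim):
        year_noise = np.repeat(rng.normal(0.0, year_sd, n_years),
                               samples_per_year)
        y = true_slope * t + year_noise + rng.normal(0.0, within_sd, t.size)
        slopes[i] = np.polyfit(t, y, 1)[0]
    return slopes.std()

# Standard error of the trend estimate for 1, 3, and 12 samples per year:
se = {n: slope_se(n) for n in (1, 3, 12)}
```

Under these assumptions the standard error drops noticeably from one to three samples per year but shrinks much more slowly after that, because the irreducible year-level noise floor dominates.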
The sampling designs evaluated for annual trends also were evaluated for their sensitivity to detect seasonal trends, that is, trends that occurred during one of three seasons: April through August, August through December, and December through April. Design results indicated that an average of one extra sample per station per year resulted in an efficient design for detecting seasonal trends. However, allocation of the extra samples varied depending on the station, month, and constituent group (major ions, nutrients, and trace metals).
Publication Subtype: USGS Numbered Series
Title: Water-quality trend analysis and sampling design for streams in North Dakota, 1971-2000
Series title: Water-Resources Investigations Report
Publisher: U.S. Geological Survey
Publisher location: Reston, VA
Contributing office(s): North Dakota Water Science Center, Dakota Water Science Center
Description: v, 73 p.