Ground water is the primary source of drinking water for towns in the upper Charles River Basin, an area of 105 square miles in eastern Massachusetts that is undergoing rapid growth. The stratified-glacial aquifers in the basin are high yield, but also are thin, discontinuous, and in close hydraulic connection with streams, ponds, and wetlands. Water withdrawals averaged 10.1 million gallons per day in 1989–98 and are likely to increase in response to rapid growth. These withdrawals deplete streamflow and lower pond levels. A study was conducted to develop tools for evaluating water-management alternatives at the regional scale in the basin. Geologic and hydrologic data were compiled and collected to characterize the ground- and surface-water systems. Numerical flow modeling techniques were applied to evaluate the effects of increased withdrawals and altered recharge on ground-water levels, pond levels, and stream base flow. Simulation-optimization methods also were applied to test their efficacy for management of multiple water-supply and water-resource needs.
Steady-state and transient ground-water-flow models were developed using the numerical modeling code MODFLOW-2000. The models were calibrated to 1989–98 average annual conditions of water withdrawals, water levels, and stream base flow. Model recharge rates were varied spatially with land use, surficial geology, and septic-tank return flow. Recharge was adjusted during model calibration by means of parameter-estimation techniques to better match the estimated average annual base flow; area-weighted rates averaged 22.5 inches per year for the basin. Water withdrawals accounted for about 7 percent of total simulated flows through the stream-aquifer system and were about equal in magnitude to model-calculated rates of ground-water evapotranspiration from wetlands and ponds in aquifer areas. Water withdrawals as percentages of total flow varied spatially and temporally within an average year; maximum values were 12 to 13 percent of total annual flow in some subbasins and of total monthly flow throughout the basin in summer and early fall.
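The basin-average recharge rate cited above is an area-weighted mean over spatially varying recharge zones. A minimal sketch of that averaging follows; the zone names, rates, and areas are hypothetical placeholders for illustration only, not values from the study.

```python
# Area-weighted average recharge over spatially varying zones.
# Zone names, rates (inches/year), and areas (square miles) are invented
# for this sketch; only the weighting method reflects the text above.
recharge_zones = [
    # (zone description, recharge rate in in/yr, area in mi^2)
    ("forested till uplands", 18.0, 40.0),
    ("sand-and-gravel aquifer, unsewered residential", 27.0, 35.0),
    ("commercial and impervious land", 14.0, 30.0),
]

total_area = sum(area for _, _, area in recharge_zones)
area_weighted_rate = (
    sum(rate * area for _, rate, area in recharge_zones) / total_area
)
print(round(area_weighted_rate, 1))  # basin-average recharge, in/yr
```

Each zone contributes in proportion to its area, so a large low-recharge zone pulls the basin average down even when smaller zones recharge rapidly.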
Water-management alternatives were evaluated by simulating hypothetical scenarios of increased withdrawals and altered recharge for average 1989–98 conditions with the flow models. Increased withdrawals to maximum State-permitted levels would result in withdrawals of about 15 million gallons per day, or about 50 percent more than current withdrawals. Model-calculated effects of these increased withdrawals included reductions in stream base flow that were greatest (as a percentage of total flow) in late summer and early fall. These reductions ranged from less than 5 percent to more than 60 percent of model-calculated 1989–98 base flow along reaches of the Charles River and major tributaries during low-flow periods. Reductions in base flow generally were comparable to upstream increases in withdrawals, but were slightly less than upstream withdrawals in areas where septic-system return flow was simulated. Increased withdrawals also increased the proportion of wastewater in the Charles River downstream of treatment facilities. The wastewater component increased downstream from a treatment facility in Milford from 80 percent of September base flow under 1989–98 conditions to 90 percent of base flow, and from 18 to 27 percent of September base flow downstream of a treatment facility in Medway. In another set of hypothetical scenarios, additional recharge equal to the transfer of water out of a typical subbasin by sewers was found to increase model-calculated base flows by about 12 percent. Addition of recharge equal to that available from artificial recharge of residential rooftop runoff had smaller effects, augmenting simulated September base flow by about 3 percent.
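The rise in the wastewater fraction follows directly from the arithmetic: the treatment-plant discharge stays roughly fixed, so when increased withdrawals deplete the natural component of base flow, the same wastewater volume makes up a larger share of the total. A minimal sketch of that relation follows; the flow values are hypothetical, chosen only to reproduce the 80 to 90 percent shift reported for the Milford reach, and are not discharges from the study.

```python
def wastewater_fraction(wastewater_flow, natural_base_flow):
    """Fraction of total stream base flow composed of treated wastewater."""
    return wastewater_flow / (wastewater_flow + natural_base_flow)

# Hypothetical flows in arbitrary units: a fixed plant discharge of 8
# against a natural base flow that withdrawals reduce from 2.0 to ~0.89.
wastewater = 8.0
natural_baseline = 2.0                    # natural September base flow, baseline
natural_depleted = 8.0 / 0.9 - 8.0        # natural base flow after increased withdrawals

print(round(wastewater_fraction(wastewater, natural_baseline), 2))  # 0.8
print(round(wastewater_fraction(wastewater, natural_depleted), 2))  # 0.9
```

The nonlinearity matters for management: each additional unit of withdrawal raises the wastewater fraction faster as the natural component shrinks.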
Simulation-optimization methods were applied to an area near Populatic Pond and the confluence of the Mill and Charles Rivers in Franklin,