Complete transformations of land cover from prairie, wetlands, and
hardwood forests to row crop agriculture and urban centers are thought to
have caused profound changes in hydrology in the Upper Midwestern US since
the 1800s. In this study, we investigate four large
(23 000–69 000 km²) river basins.
The magnitude, frequency, duration, and timing of streamflows strongly influence the water quality, sediment and nutrient transport, channel morphology, and habitat conditions of a river channel. While streamflows fluctuate naturally over event to millennial timescales, humans have also altered rainfall–runoff processes in pervasive and profound ways (Vörösmarty et al., 2004). For example, humans have substantially altered the timing and magnitude of evapotranspiration, have dammed, channelized, and leveed waterways, and have installed artificial drainage networks in former wetlands (Boucher et al., 2004; Dumanski et al., 2015; Rockström et al., 2014; Schottler et al., 2014; Vörösmarty et al., 2004). While it is inevitable that wetland removal and artificial drainage will change rainfall–runoff processes, the effects of drainage on the hydrologic cycle may be subtle and difficult to discern, and may manifest differently at different spatial scales and times of year (e.g., Bullock and Acreman, 2003; Foufoula-Georgiou et al., 2016; Irwin and Whiteley, 1983; O'Connell et al., 2007).
Systematic increases in peak, mean, total, and base flows are widely reported
in the Midwestern USA. Such increases have been attributed to changes in
climate, such as increasing precipitation and earlier snowmelt, and land use,
including widespread conversion from perennial vegetation, such as grasses,
to annual row crops, primarily corn and soybean, and the addition of
artificial drainage (e.g., Foufoula-Georgiou et al., 2015; Frans et al.,
2013; Gebert and Krug, 1996; Juckem et al., 2008; Novotny and Stefan, 2007;
Schilling and Libra, 2003; Schottler et al., 2014; Xu et al., 2013; Zhang and
Schilling, 2006). Furthermore, large-scale, land use land cover (LULC)
changes influence surface energy fluxes and thus have feedbacks on climate
and water balances. As a result of the Green Revolution, net primary
production increased during the twentieth century in the Midwestern US, which
subsequently increased evapotranspiration (ET) demands, especially during the
peak growing season (Mueller et al., 2015). Corn yields (bushels per acre)
tripled in the US between 1949 and 1989 (US Department of Agriculture Bureau
of Agricultural Economics Crop Reporting Board, 1949; US Department of
Agriculture National Agricultural Statistics Service Agricultural Statistics
Board, 1990). However, any increase in ET demand due to crop yield increases
may have been offset during this time by the addition and replacement of
agricultural drainage. Regional studies have reported increases in Midwestern
crop yields and yet simultaneously decreases in ET for artificially drained
agricultural basins, where streamflows have subsequently increased during the
twentieth century (Frans et al., 2013; Schottler et al., 2014). Therefore,
the question remains: how have combined climate and land use changes affected
streamflows in very large basins?
Many basins across the Midwestern Corn Belt and around the world are experiencing greater runoff, higher sediment and nutrient loads, and more rapid loss of habitat than in the past (Blann et al., 2009). Linkages between artificial agricultural drainage and increased nutrient export have been well documented (David et al., 1997; Goolsby et al., 1999; Kreiling and Houser, 2016; Letey et al., 1977; Randall and Mulla, 2001; Royer et al., 2006; Schilling et al., 2017; Sims et al., 1998). Less research has focused on the implications of hydrologic change for sediment loads in agricultural landscapes. For waters impaired by sediment under the US Clean Water Act (CWA), EU Water Framework Directive, and similar regulations around the world, loads often consist of both natural and human-derived sediment sources (Belmont et al., 2011; Gran et al., 2011; Belmont and Foufoula-Georgiou, 2017). Differentiating between these two sources is often very difficult, and yet is essential for identifying and achieving water quality standards (Belmont et al., 2014; Trimble and Crosson, 2000; Wilcock, 2009). Sediment sources derived from near or within the channel itself (e.g., bank erosion from channel widening) are particularly sensitive to changes in streamflows (Lauer et al., 2017; Schottler et al., 2014; Lenhart et al., 2013). Bank erosion is a significant sediment source in many alluvial rivers, contributing as much as 80–96 % of a river's total sediment load (Kronvang et al., 2013; Palmer et al., 2014; Schaffrath et al., 2015; Simon et al., 1996; Stout et al., 2014; Willett et al., 2012). For some agricultural basins, erosion of near-channel sources contributes more fine sediment than does agricultural field erosion (Belmont et al., 2011; Lenhart et al., 2012; Trimble, 1999). However, if artificial drainage practices act to amplify streamflows, then the source of accelerated bank erosion may still be linked to agriculture.
Artificial drainage is currently unregulated at the federal level in the US and many countries around the world. Therefore, in stark contrast to urban hydrology, progress in understanding the effects of agricultural drainage has been hindered by the fact that accurate data regarding the location, size, depth, efficiency, and connectivity of sub-surface drainage systems are rarely available.
The United States is the largest producer of corn and soybeans in the world (Boyd and McNevin, 2015; Guanter et al., 2014). Exceptionally high agricultural productivity over the past century and a half required massive conversion of grasslands, wetlands, and forests to agricultural lands (Dahl, 1990; Dahl and Allord, 1996; Marschner, 1974). Although many advances in cropping practices have led to the modern-day prosperity of the Corn Belt, artificial drainage has played a critical role for agriculture in the Midwestern USA. Throughout this paper “artificial drainage” is used as a general term that refers to both human-installed surface ditches and subsurface tile drainage. Tile drains and ditch networks are installed to ameliorate water-logged soils, which are known to limit crop growth (Hillel, 1998; Sullivan et al., 2001; Wuebker et al., 2001). Modern tile drains are composed of corrugated plastic tubing and are typically installed at depths of 1–2 m to control the elevation of the water table below the soil surface (Hillel, 1998).
The economic benefits of artificial drainage are well understood by Midwestern farmers, who have invested heavily in drainage systems to reduce soil moisture, surface overland flow, and soil erosion, and increase land value, ease of equipment operation, and production of first class crops such as corn and soy (Burns, 1954; Fausey et al., 1987; Hewes and Frandson, 1952; Johnston, 2013; McCorvie and Lant, 1993). Installation or enhancement of tile drainage systems often occurs simultaneously with land conversion from wild hay and small grains to soybeans, as Fig. S1 demonstrates in the Supplement (Blann et al., 2009; Burns, 1954; Hewes and Frandson, 1952). Conversion of perennial grasses to corn and soybean rotations does not necessarily lead to a reduction in ET over the course of an entire growing season, at least for well-drained soils (Hamilton et al., 2015). However, several studies report a reduction of ET early in the growing season (Hickman et al., 2010; McIsaac et al., 2010; Schottler et al., 2014; Zeri et al., 2013) and greater evapotranspiration rates than native prairie during the peak growing season (Wolf and Market, 2007; Zeri et al., 2013). Thus changes in land cover (and ET) and drainage expansion have been found to alter watershed hydrology and increase mean annual flows (Harrigan et al., 2014; Kibria et al., 2016), base flows (Juckem et al., 2008; Robinson, 1990; Schilling and Libra, 2003; Xu et al., 2013), annual peak flows (Dumanski et al., 2015; Magner et al., 2004; Skaggs et al., 1980, 1994), and total flow volumes (Dumanski et al., 2015; Frans et al., 2013; Lenhart et al., 2011). 
While it seems inevitable that altering ET and subsurface drainage efficiency should have measurable effects on streamflow, the combined effects have proven difficult to isolate empirically, especially across scales, due to measurement uncertainties, high temporal and spatial variability in antecedent moisture conditions and runoff processes, a shift towards a wetter climate today than in the historical past, as well as limited documentation of artificial drainage installation in the US.
In this paper we couple analysis of historical patterns in climate, land use, and streamflow across four large Midwestern basins with a water budget approach.
We acknowledge that the conversion of precipitation to streamflow occurs by a complex suite of physical processes. Inevitably, we lack temporal and spatial coverage/resolution of all of the relevant hydrologic fluxes (e.g., groundwater, actual evapotranspiration, infiltration, soil water flux rates) to characterize the system completely and have limited ability to ascribe subtle changes to any given physical process, especially at large scales. Yet, with increasing concerns about water quality and aquatic biota, disentangling the effects of artificial drainage and changing precipitation patterns is important for evaluating economic costs, benefits and risks, predicting the effects of future land and water management, and informing future policy.
We analyze hydrologic and land use change in four large Midwestern watersheds
during 1935–2013. We selected these basins for the following reasons: all
are agricultural, to various degrees, primarily producing corn and soybeans;
all are located mainly within the Central Lowland physiographic province and
were affected by continental glaciation resulting in mostly flat, poorly
drained uplands and incised river valleys (Arnold et al., 1999; Barnes, 1997;
Belmont et al., 2011; Day et al., 2013; Gran et al., 2009; Groschen et al.,
2000; Rosenberg et al., 2005; Stark et al., 1996); and all are characterized
by a humid, temperate climate (Kottek et al., 2006). Additionally, all four
basins also contain waters impaired for excessive sediment under the US Clean
Water Act. Therefore, deconvolving climate and land use effects on basin
hydrology is essential for developing and attaining sediment- and
nutrient-related water quality standards. Despite the broad similarities
between basins, we have intentionally selected watersheds that span a
gradient of climate and land use change. From northwest to southeast, these
include the Red River of the North basin (RRB), upstream of Grand Forks, ND
(67 005 km²), the Minnesota River basin (MRB), the Chippewa River basin (CRB), and the Illinois River basin (IRB).
2013 relative proportion of each land cover class for the four study watersheds, Red River of the North basin (RRB), Minnesota River basin (MRB), Chippewa River basin (CRB), and Illinois River basin (IRB). Data from USDA National Agricultural Statistics Service Cropland Data Layer (2013).
Soils in the Minnesota River basin consist of organic-rich but poorly drained mollisols with a very small area consisting of alfisols and entisols (Stark et al., 1996). The Illinois River basin is generally dominated by mollisols, containing around 1 % organic matter and generally of low to very low permeability, with some presence of more permeable alfisols and entisols (Arnold et al., 1999; Groschen et al., 2000). The dominant soil orders found in the Red River of the North basin include mollisols and alfisols with some areas underlain by entisols and histosols (Stoner et al., 1993). In the Chippewa River basin, alfisols and spodosols are most prevalent, with occasional pockets of entisols, mollisols, and histosols (Hartemink et al., 2012; Soil Survey Staff, NRCS).
There is a broad northwest to southeast precipitation and temperature gradient across the region (Fig. S2). The RRB is the coldest and driest of all four study basins, although the last 2 decades (1990s and 2000s) have been the wettest in historical times. Precipitation records, lake level elevations, and paleoclimate studies indicate that the basin is prone to extreme climate variability (Fritz et al., 2000; Miller and Frink, 1984). Much like the RRB, the adjacent MRB is uniquely situated at a “climatic triple junction” where warm moist air from the Gulf of Mexico, cold dry air from the Arctic, and dry Pacific air dominate at different times of the year and have varied in relative dominance in the past (Dean and Schwalb, 2000; Fritz et al., 2000). Temperature and humidity in the CRB are more strongly influenced by the Great Lakes than in the other basins. The southwest IRB generally receives more precipitation than the northeast in all months. On average each basin from northwest to southeast receives 589, 716, 822, and 960 mm annually, with 59–68 % of the annual precipitation falling in the spring (MAM) and summer (JJA) months based on annual long term means, 1981–2010 (Fig. S2). Recent increases in precipitation and streamflows have been reported across the region during the last few decades (Foufoula-Georgiou et al., 2015; Frans et al., 2013; Gebert and Krug, 1996; Groisman et al., 2001; Juckem et al., 2008; Novotny and Stefan, 2007; Schottler et al., 2014).
Settlement, agricultural intensification, and development differ in timing and intensity among basins, but are generally similar. During the early to mid-nineteenth century, permanent occupation of the Midwest was difficult without the aid of artificial drainage (Beauchamp, 1987). Beginning in the mid-1800s, organized drainage districts and enterprises installed ditches and tile to drain many permanently or seasonally wet areas and create more arable land (Beauchamp, 1987; Skaggs et al., 1994). Between 1850 and 1930 Illinois, Minnesota, and Wisconsin lost an estimated 90, 53, and 32 % of state wetlands, respectively (McCorvie and Lant, 1993). Enormous tracts of wetlands and tall grass prairie (millions of acres) were levelled and drained, mainly by surface ditches and canals, in the RRB during this same time (Miller and Frink, 1984). Artificial drainage increased property value, and whenever corn and soybean commodity prices rose, as they did following WWII, in the mid-1970s, and most recently with a tripling of prices between 2002 and 2012 (Glaser, 2016; Johnston, 2013), lands previously cultivated for small grains or left as wet meadows were drained and converted to soybean and corn fields (Blann et al., 2009; Burns, 1954; Wright and Wimberly, 2013). Today the RRB, MRB, CRB, and IRB, respectively, contain 45, 78, 12, and 60 % of land cultivated for corn and soybeans, yet estimates of tile drainage in these basins remain poorly constrained (Fig. 1). Within the Bois de Sioux watershed, a sub-basin of the RRB where permits are required for drain tile installation, annual installation has increased from 5 km in 1999 to 3096 km in 2015 for a cumulative total of 24 304 km of new tile installed since 1999 (Bois de Sioux Watershed District, 2015).
Tile drainage installation in all basins continues to this day.
The other major anthropogenic impact that affects all basins is dams installed for hydropower, navigation, water resources, and recreation. Most of the dams in our study basins are small and were constructed in the late 1800s and early 1900s (Barnes, 1997; Delong, 2005; Graf, 1999; Hyden, 2010; Lian et al., 2012; Martin, 1965; Stoner et al., 1993; United States Army Corps of Engineers, 2016). Therefore, the effects of these dams would have been established well before our study period. For example, in the Illinois River basin all major dams had been completed by 1939. Based on work by Lian et al. (2012), streamflow changes post 1938, specifically peak flows, have been influenced more by climate than by dam operations, though they did not consider the effects of drain tile. One exception might be the uppermost Illinois River basin, which has been influenced by expansion of the Chicago metropolitan area. Though historical and present water withdrawals are largely unknown, increased water use for industry, agriculture, and public drinking supply may offset some of the climate impacts of increased precipitation. Urban and suburban detention basins may also limit how much precipitation is converted to runoff. We expect that other water development projects in each basin have minimally affected streamflows at the basin outlet. Conversion of hay and small grains to corn and soybeans accompanied by artificial drainage expansion were likely the largest LULC changes in these basins from the early to mid-twentieth century.
We explain our methods for addressing how LULC, climate, and streamflows have changed during the twentieth and twenty-first centuries in Sect. 3.1 through 3.3. In Sect. 3.4 we explain how the timing and timescales of prominent change were determined. We use a water budget to determine whether precipitation and evapotranspiration alone can explain runoff trends in Sect. 3.5.
We compiled county level US Census of Agriculture drainage data from 1940, 1950, 1960, 1978, and 2012 for each study watershed, weighting partial counties by area (US Bureau of the Census, 1942, 1952, 1961, 1981; US Department of Agriculture, 2014a). Tabulations of drainage enterprises exclude lands draining less than 500 acres in all years except 1940 (US Bureau of the Census, 1922, 1952). In 1940 and 2012, acres drained by ditches and tile were reported individually. To normalize the land area across basins of different sizes, we report the percentage of watershed area drained. While the uncertainties in these data are high, they are the best data available on a national scale for our study period. Some studies (e.g., David et al., 2010) have taken advantage of other drainage estimates, such as those from Sugg (2007). However, the Sugg (2007) method was calibrated and validated using data from 1987 and 1992 drainage census reports. Therefore it is unclear whether this approach could be used to estimate historical or current drainage extents. Furthermore, the drainage estimates are based on soil type, class, and crop type and assume that state percentages of average cropland area drained are uniform for every county in each state and have remained static through time (Sugg, 2007). Although coarse, the US Census of Agriculture drainage data are the best available proxy for relative drainage extent and expansion through time in each of the four large study basins, the smallest of which is still larger than 20 counties.
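The area-weighting of county census tabulations can be sketched as follows; the county names, acreages, and overlap fractions below are hypothetical illustrations, not census values.

```python
# Apportion county-level drained acreage to a watershed by the fraction of
# each county's area lying inside the basin (all numbers are illustrative).
counties = {
    # county: (acres_drained, fraction_of_county_inside_basin)
    "County A": (120_000, 1.00),  # fully inside the watershed
    "County B": (80_000, 0.40),   # 40 % of the county overlaps the basin
    "County C": (50_000, 0.15),
}

basin_acres = 1_500_000  # illustrative watershed area

drained_in_basin = sum(acres * frac for acres, frac in counties.values())
pct_drained = 100.0 * drained_in_basin / basin_acres
print(f"Estimated {pct_drained:.1f} % of watershed area drained")
```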
County level agricultural census drainage data are only available for 5 census years. Therefore, we also compiled annual USDA National Agricultural Statistics Service (NASS) crop acreage harvested in each basin following the methods of Foufoula-Georgiou et al. (2015). We report the percentage of corn, soybeans, and hay and small grains grown in each watershed from 1915 to 2015. Artificial drainage installation has typically coincided with the replacement of hay and small grains for soybeans as shown in Fig. S1 in the Supplement (Burns, 1954; Hewes and Frandson, 1952). Therefore we use these annual crop data as another indication of LULC changes.
Monthly Parameter elevation Regression on Independent Slopes Model (PRISM)
precipitation rasters produced by the PRISM Climate Group (2004) and modeled
actual evapotranspiration (ET) rasters from Livneh et al. (2013) were used to characterize climate in each basin.
Livneh et al. (2013) evapotranspiration was produced for the continental
United States using the Variable Infiltration Capacity (VIC) model run at
3 h time steps in energy balance mode, consistent with methods of Maurer et
al. (2002). Hereafter we refer to Livneh et al. (2013) and Maurer et
al. (2002) as L13 and M02. We have chosen L13 data over other available
estimates of evapotranspiration because they cover a large spatial and
temporal domain necessary for the study, i.e., the contiguous US from 1915 to
2011, at reasonable spatial and temporal resolution.
Although the precipitation input used to generate the ET data differs from the PRISM precipitation data used here, basin-averaged values from the two products are similar.
United States Geological Survey (USGS) stream gauge stations listed by study basin.
We evaluated annual (seasonal), monthly, and daily flow metrics for each of the four river basins. Using multiple gauges for a single basin, we compiled seven annual flow metrics: mean annual flow, 7-day average annual low flow winter (November–April), 7-day average annual low flow summer (May–October), peak mean daily flow spring (March–May), peak mean daily flow summer and fall (June–November), high flow days, and extreme flow days using mean daily flow data from USGS gauges within each basin (Fig. S2; Table 1) following the methods of Novotny and Stefan (2007). The number of high and extreme flow days refers to the number of days in a given year that are 1 and 2 standard deviations above the 1950–2010 mean. For each gauge, we normalized the annual flow metric by the 1950–2010 mean to facilitate comparisons among basins and to observe similarities in trends among metrics. Each gauge record included a minimum of 62 years, and of the 63 gauges analyzed, 53 gauges had continuous records. Of the 10 non-continuous records, 4, 2, 2, 1, and 1 gauges were missing for 2, 4, 6, 8, and 14 years of data, respectively, during the period 1929–2013 (Table 1).
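As an illustration, several of the annual metrics described above might be computed from a mean daily flow record as sketched below; the function and variable names are our own (pandas-based), not the original analysis code.

```python
import numpy as np
import pandas as pd

def annual_flow_metrics(daily_q: pd.Series, ref=("1950", "2010")):
    """Sketch of a few of the annual metrics for one gauge, assuming
    `daily_q` is a mean daily flow series indexed by date."""
    ref_q = daily_q.loc[ref[0]:ref[1]]
    mu, sd = ref_q.mean(), ref_q.std()

    by_year = daily_q.groupby(daily_q.index.year)
    metrics = pd.DataFrame({
        "mean_annual": by_year.mean(),
        # days per year exceeding 1 and 2 standard deviations above the mean
        "high_flow_days": by_year.apply(lambda q: int((q > mu + sd).sum())),
        "extreme_flow_days": by_year.apply(lambda q: int((q > mu + 2 * sd).sum())),
        # annual minimum of the 7-day moving-average flow (7-day low flow)
        "low_7day": daily_q.rolling(7).mean().groupby(daily_q.index.year).min(),
    })
    # normalize each metric by its 1950-2010 mean to compare among gauges
    return metrics / metrics.loc[1950:2010].mean()
```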
For the downstream outlet gauge in each basin (Table 1) we computed annual
and monthly streamflow average depths (cm month⁻¹) by normalizing flow volumes by contributing drainage area.
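The volume-to-depth normalization can be illustrated as follows, assuming discharge in m³ s⁻¹ and drainage area in km²; the numbers are hypothetical.

```python
# Convert mean monthly discharge to an area-averaged runoff depth.
def monthly_depth_cm(q_m3s: float, days: int, area_km2: float) -> float:
    volume_m3 = q_m3s * days * 86_400       # total monthly flow volume
    depth_m = volume_m3 / (area_km2 * 1e6)  # spread over the basin area
    return depth_m * 100                    # meters -> centimeters

# e.g., 500 m^3/s sustained for a 30-day month over a 67 005 km^2 basin
print(round(monthly_depth_cm(500, 30, 67_005), 2))
```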
In order to determine whether observed changes in climate and streamflow are statistically meaningful and potentially coincident with LULC change, we first determined the timing of climate, streamflow, and LULC change. Annual crop data reveal the timing of a rapid expansion of soybean acreage and indicate land use land cover transitions (LCTs) when soybean acreage exceeds hay and small grains (Foufoula-Georgiou et al., 2015). We identified the timing of precipitation and streamflow change using wavelets and by fitting a piecewise linear regression (PwLR) using a least-squares approach to the monthly streamflow and precipitation volume time series in each basin (Liu et al., 2010; Tomé and Miranda, 2004; Verbesselt et al., 2010; Zeileis et al., 2003).
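A minimal sketch of breakpoint selection by piecewise linear regression is given below: two independent least-squares segments are fit at every candidate split, and the split minimizing the total squared error is kept. This is a simplification of the cited PwLR methods, intended only to show the idea.

```python
import numpy as np

def pwlr_breakpoint(t, y, min_seg=5):
    """Return the candidate breakpoint (value of t) that minimizes the
    combined sum of squared errors of two linear fits."""
    def sse(tt, yy):
        coef = np.polyfit(tt, yy, 1)               # slope and intercept
        resid = yy - np.polyval(coef, tt)
        return float(resid @ resid)

    best = (np.inf, None)
    for k in range(min_seg, len(t) - min_seg):     # keep segments non-trivial
        total = sse(t[:k], y[:k]) + sse(t[k:], y[k:])
        if total < best[0]:
            best = (total, t[k])
    return best[1]
```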
A common method for detecting and quantifying changes in the
magnitude/frequency content of a time series is via a localized
time-frequency analysis using wavelets. The continuous wavelet transform
(CWT) of a signal x(t) is W(s, τ) = s^(−1/2) ∫ x(t) ψ*((t − τ)/s) dt, where s is the wavelet scale, τ is the translation in time, and ψ* is the complex conjugate of the mother wavelet; peaks in the wavelet energy |W(s, τ)|² indicate when, and over what timescales, the variance of the series changes.
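A minimal Morlet-wavelet sketch (our own, with an assumed center frequency w0 = 6, and direct convolution for clarity rather than speed) shows how wavelet energy concentrates at the scale of a dominant periodicity, such as the annual cycle in monthly data:

```python
import numpy as np

def morlet_cwt_energy(x, scales, w0=6.0):
    """Return |W(s, t)|^2 for a 1-D signal; the signal should be longer
    than the widest wavelet (8 * max(scales) samples)."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    power = np.empty((len(scales), len(x)))
    for i, s in enumerate(scales):
        t = np.arange(-4 * s, 4 * s + 1)
        # Morlet wavelet with the 1/sqrt(s) normalization of the CWT
        psi = np.exp(1j * w0 * t / s - 0.5 * (t / s) ** 2) / np.sqrt(s)
        w = np.convolve(x, np.conj(psi)[::-1], mode="same")
        power[i] = np.abs(w) ** 2
    return power
```

For a sinusoid of period 12 (an annual cycle in monthly data), energy at the matching scale (≈ 11.6 for w0 = 6) dominates energy at much smaller or larger scales.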
We also evaluated precipitation and streamflow change using two statistical tests and three breakpoints. We selected 1974/1975 as a breakpoint for the pre-period and post-period because it lumps the time series data into two roughly equal periods (40/39 years) and coincides with the timing of widespread acceptance of cheaper and easier to install corrugated plastic tile (Fouss and Reeve, 1987), and other studies in the MRB and IRB have identified hydrologic change occurring around that time (e.g., Foufoula-Georgiou et al., 2015; Lian et al., 2012; Schottler et al., 2014). Acknowledging that 1974/1975 may not be the hydrologically relevant breakpoint in all basins at this large scale, we ran statistical tests using 1974/1975 as well as the breakpoints identified for each basin from the PwLR and LCT.
We performed one-tailed Student's t tests to assess whether monthly precipitation and streamflow volumes in the post-breakpoint period are significantly greater than those in the pre-breakpoint period.
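The pre/post comparison can be illustrated with synthetic data; the annual values below are invented solely to demonstrate a one-tailed test with SciPy.

```python
import numpy as np
from scipy import stats

# Hypothetical annual flow depths (cm) split at the 1974/1975 breakpoint;
# the values are synthetic, not gauge data.
rng = np.random.default_rng(42)
pre = rng.normal(loc=8.0, scale=2.0, size=40)    # stand-in for 1935-1974
post = rng.normal(loc=11.0, scale=2.0, size=39)  # stand-in for 1975-2013

# One-tailed test of whether post-breakpoint flows are greater
t_stat, p_one = stats.ttest_ind(post, pre, alternative="greater")
print(f"t = {t_stat:.2f}, one-tailed p = {p_one:.4f}")
```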
For a given watershed over a specified time period of integration, water inputs minus water outputs are equal to the change in storage per unit time: P − ET − Q = ΔS/Δt, where P is precipitation, ET is actual evapotranspiration, Q is streamflow (each expressed as a basin-average depth), and ΔS/Δt is the change in storage.
We have computed average annual water budgets for each basin by accumulating
monthly precipitation, ET, and streamflow depths over the period of record.
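The accumulation step can be sketched with illustrative monthly depths (not basin data):

```python
import numpy as np

# Hypothetical monthly basin-average depths (cm): precipitation (P),
# actual evapotranspiration (ET), and streamflow (Q), January-December.
P = np.array([1.5, 1.8, 4.0, 7.0, 9.0, 11.0, 10.0, 9.5, 7.5, 5.5, 3.0, 1.8])
ET = np.array([0.5, 0.7, 1.5, 3.5, 7.0, 10.5, 11.0, 9.0, 5.5, 2.5, 1.0, 0.5])
Q = np.array([0.3, 0.4, 2.0, 2.5, 1.8, 1.5, 1.0, 0.8, 0.7, 0.6, 0.5, 0.4])

annual_P, annual_ET, annual_Q = P.sum(), ET.sum(), Q.sum()
dS = annual_P - annual_ET - annual_Q  # storage residual (cm/yr)
print(f"P={annual_P:.1f} ET={annual_ET:.1f} Q={annual_Q:.1f} dS={dS:.1f} cm")
```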
Livneh et al. (2013) did not incorporate land use land cover changes, such as tile drainage expansion or crop changes, into the VIC model. The fact that LULC change is not included in the model is what allows us to test, external to the ET predictions, whether or not a LULC effect exists. There is no evidence of regional groundwater change and the effects of dams and urbanization on streamflows are likely minimal as discussed in Sect. 2. Comparing these data to other estimates of evapotranspiration including four AmeriFlux towers, two of which are in corn–soy agricultural areas, we demonstrate that they are sufficiently reliable modern estimates for our purposes (Table 2; Figs. S4 and S5).
Site details for AmeriFlux sites used for comparison with Livneh et al. (2013) evapotranspiration data (L13), where L13(JJA) represents a 17 % reduction in ET during the summer months June, July, and August. The average annual difference is positive when L13/L13(JJA) ET is greater than AmeriFlux ET and negative when less. The nearest study watersheds are abbreviated: Chippewa River basin (CRB), Illinois River basin (IRB), Minnesota River basin (MRB), and Red River of the North basin (RRB).
We acknowledge that there is uncertainty in all of the input data and
understand that the magnitude of the storage term is sensitive to estimates
of ET. Livneh et al. (2013) reported a 17 % overestimation of ET relative to observations during the summer months; we therefore also computed each water budget with L13 ET reduced by 17 % during June–August (the L13(JJA) case; Table 2).
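The JJA sensitivity case amounts to scaling the summer months before accumulation; a sketch with illustrative monthly ET depths (cm) follows.

```python
import numpy as np

# Hypothetical monthly ET depths (cm), January-December (not basin data).
ET = np.array([0.5, 0.7, 1.5, 3.5, 7.0, 10.5, 11.0, 9.0, 5.5, 2.5, 1.0, 0.5])

ET_jja = ET.copy()
ET_jja[5:8] *= 1 - 0.17  # indices 5-7 are June, July, August
print(f"annual ET: {ET.sum():.3f} -> {ET_jja.sum():.3f} cm")
```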
We present records of land use land cover in Sect. 4.1 and discuss the timing of notable change in Sect. 4.2. In Sect. 4.2 we also present the timing, timescales and times of year when changes in precipitation, evapotranspiration, and streamflow magnitude are most prominent. Finally, we present the results of a water budget in Sect. 4.3 to address whether change in climate variables alone can explain runoff trends. Discussion of how the combined results address our three research questions can be found in Sect. 5.
Across the Upper Midwest, the percent of land drained by tiles and ditches and cultivated for corn and soybeans has increased since the early twentieth century, while land cultivated for hay and small grains has declined. Figure 2 shows the percent of each watershed drained by tiles and ditches from the Census of Agriculture data, as well as the percent of each county drained by tile in 1940 and 2012. Total drainage and tile drainage have increased in the MRB and IRB, while they remained relatively unchanged from 1940 to 2012 in the CRB and RRB (Fig. 2). The drainage census data show that the MRB has the greatest percentage of the watershed area drained by tile, 19 % in 1940 and 35 % in 2012, and ditches, 7 % in 1940 and 10 % in 2012, followed closely by the IRB (Fig. 2). The Red River of the North basin has experienced very little increase in total drainage since 1940. Most artificial drainage in the RRB is ditches rather than tile drains. Although a dramatic increase in tile installation has been reported in the Red River Valley since the 1990s, the area of this expansion appears small relative to the watershed area. Acres reported to be drained by tile in 2012 represent only 2 % of the total watershed area. The CRB has very little agricultural land and thus the 2012 census reports less than 1.5 % of the watershed area drained by tile and ditches (Fig. 2).
The 1978 census data illustrate the uncertainty associated with reporting, as it is unlikely for total drainage to have decreased between 1960 and 1978 in the RRB and MRB (Fig. 2). Most county ditches and tile in Blue Earth County, Minnesota, were installed during the 1910s and 1920s, with a noticeable drop off during WWII and a resurgence of drainage enterprises starting in the 1960s (Blue Earth County Minnesota, 2016). Burns (1954) reported that the 1940 census data underestimated drainage enterprises in Blue Earth County by 8.5 %, simply due to inaccuracies in reporting. According to one report, it was estimated that 27 % of drained land in the United States was not included in the 1960 drainage census due to private drainage operations on lands of less than 500 acres (Gain, 1967). Furthermore, 82, 80, 51, and 91 % of all farms in Minnesota, Illinois, North Dakota, and Wisconsin, respectively, were less than 500 acres in 2012, and therefore were not included in survey results (US Department of Agriculture, 2014b). Therefore these estimates are likely to underestimate the area drained by tile and ditches. Although the 2012 census attempts to correct for incomplete and missing responses, because drainage enterprise records have traditionally been so poorly documented, it is difficult to know how much reported acreage underestimates the actual acreage.
We also note that acres drained by tile and ditches do not directly translate
to effectiveness of artificial drainage. Several factors influence the flow
rate from soils, including the hydraulic conductivity of the soil,
macropores, depth of the water table, depths of the tile lines, tile
diameter, slope of the tile or ditch, horizontal spacing, as well as
precipitation intensity and duration and antecedent soil conditions (Hillel,
1998). We simply do not have this level of information regarding artificial
drainage in the Midwestern USA and suspect that the spatial variability in
drainage management practices may be high. For example, Naz et al. (2009)
mapped tile drains in a 202 km² watershed.
While we expect that the drainage trends observed are relatively correct, we
are cautious about drawing any definitive conclusions from the Census of
Agriculture data regarding the actual extent of tile drainage and changes
over time. It is clear that these estimates tend to underestimate the amount
of drainage. Nevertheless, total drainage and tile drainage in the Minnesota
River basin and Illinois River basin have increased considerably since 1940.
It is known anecdotally, but is not included in these data, that tile
drainage spacing has decreased and drainage intensity (drainage rate in mm h⁻¹) has increased.
Acres harvested of corn, soybeans, and hay and small grains (barley, oats, wheat) expressed as percent watershed area for each of the basins based on county level data from USDA NASS. The sum of these three commodity groups is shown as a total in black and the percent of this total area in corn and soybeans is plotted in blue. Vertical dashed lines indicate when the percent of basin area harvested for soybeans exceeds hay and small grains. Horizontal dashed lines indicate when the percent of total area harvested for corn and soybeans exceeds 60 % in the Red River of the North basin and Chippewa River basin and 75 % in the Minnesota River basin and Illinois River basin.
Conversion from small grains to soybeans is often accompanied by increased sub-surface drainage installation (Foufoula-Georgiou et al., 2015). Figure 3 displays the percent of each basin harvested for corn, soybean, and hay and small grains from 1915 to 2015. There has been a decline in hay and small grains and an increase in soybeans in all four of the watersheds over the period of record. The RRB is the only basin containing a significantly higher percentage of soybean acreage relative to corn; on average since 1995, soybean acreage in the RRB has been more than twice that of corn.
Overall, changes in crop type occurred gradually in the MRB and IRB, and much more rapidly and recently in the RRB (Fig. 3). The CRB is largely non-agricultural: only 9 % of the basin grew corn, soy, and hay and small grains in 2015, and the changes in the basin have been small during the period of record (Fig. 3). While we cannot directly ascribe these changes in crop type to changes in drainage practices or vice versa, they provide a relatively detailed history of LULC and whether the changes occurred gradually or rapidly and recently or long ago in each basin.
Summary of the breakpoint years identified from land cover
transition (LCT) (Fig. 3), piecewise linear regression (PwLR) of
precipitation and streamflow, and continuous wavelet transform (CWT) of monthly precipitation and streamflow, for each study basin.
The land cover transition (LCT), precipitation, and streamflow breakpoints of change identified using piecewise linear regression (PwLR) and continuous wavelet transform (CWT) reveal that the timing of precipitation and streamflow change generally preceded LCT change (Table 3). This was true for all tests in the RRB and CRB. However, there are some chronological differences in the order of precipitation, streamflow, and LCT breakpoints. In the IRB, the timing of LCT precedes precipitation and streamflow breakpoints identified using PwLR and CWT by between 13 and 20 years (Table 3). In the MRB, LCT follows precipitation by 20 years and streamflow by 11 years as identified using PwLR, but precedes the streamflow breakpoint by 1 year identified using CWT (Table 3).
Land cover transition breakpoints shown in Fig. 3 are not exact; land cover change occurs gradually, and therefore LCT breakpoints represent when a large portion of each watershed was converted from hay and small grains to soybeans. Land cover transition breakpoints are indicated in two ways: (1) when the percent watershed area harvested for soybeans exceeds hay and small grains, and (2) when the proportion of the total acreage harvested for the three commodity groups is dominated by corn and soybeans. The second criterion varies from basin to basin, as some basins may have historically grown more hay and small grains, and others more corn and soybeans. In the CRB and RRB, hay and small grains exceeded 50 % of the total area harvested for corn, soybeans, and hay and small grains from 1915 until the year 2000 or later. However, in the MRB and IRB, hay and small grains only exceeded 50 % of the total area harvested for the three commodity groups from 1915 until 1950 or earlier. The LCT breakpoints, indicated by the vertical dashed lines in Fig. 3, approximately coincide with the horizontal dashed lines, which represent a time when the percent of the total acres harvested for the three commodity groups exceeded 60 % in the RRB and CRB, where hay and small grains have historically dominated, and 75 % in the MRB and IRB, where corn and soybeans have historically dominated. We acknowledge that these breakpoints do not consider the actual extent of soybeans, which is assumed to be a surrogate approximation for the area of drained croplands. Soybean coverage is much higher for both the MRB and IRB compared to the RRB and CRB, even before 1955. Considering the large proportion of the MRB and IRB watersheds cultivated for soybeans in the early 1950s combined with extensive (20–25 %) drainage by 1940 and 1950 (Fig. 2), this suggests streamflow changes generally occurred after both precipitation and LCT changes.
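Criterion (1) above, the first year in which soybean acreage exceeds that of hay and small grains, can be expressed directly. A minimal sketch with illustrative names (inputs are percent of watershed area harvested, as in Fig. 3):

```python
import numpy as np

def lct_breakpoint(years, soy_pct, hay_grain_pct):
    """First year in which the harvested area of soybeans exceeds that
    of hay and small grains -- criterion (1) for the land cover
    transition breakpoint (a minimal sketch; names are illustrative)."""
    years = np.asarray(years)
    crossed = np.asarray(soy_pct, float) > np.asarray(hay_grain_pct, float)
    if not crossed.any():
        return None
    return int(years[np.argmax(crossed)])  # argmax gives the first True
```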
Continuous wavelet transform (CWT) energies for monthly volumetric streamflow and precipitation in each basin.
We observe minimal changes in the energy of the annual and inter-annual precipitation signal for any basins during the period of record, and therefore could not identify the timing of precipitation change in any basin using CWT (Fig. 4). However, Fig. 4 displays significant increases in the annual and inter-annual energy of the basin outlet streamflow signal around 1975, 1980, and 1995 for the IRB, MRB, and RRB, respectively, while the CRB does not exhibit any striking changes in energy throughout the period of record. All decadal energy shifts in the precipitation signals are clearly translated into the decadal energy of the streamflow signals for all four basins (Fig. 4). The observed correlation between the decadal energy changes in streamflow and precipitation signals together with the lack of any significant correlation between their energies at the annual scale may signal the importance of factors other than precipitation, here artificial drainage, to streamflows in the MRB, RRB, and IRB at the annual scale.
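A continuous wavelet transform of this kind can be sketched with a Morlet mother wavelet. The implementation below is an illustrative, self-contained approximation (direct convolution with an L2-normalized Morlet), not the exact CWT configuration used in the study:

```python
import numpy as np

def morlet_cwt_power(x, scales, w0=6.0):
    """Wavelet power of a 1-D signal using a Morlet mother wavelet,
    computed by direct convolution (a minimal sketch; w0 = 6 is a
    common choice, not necessarily the study's)."""
    x = np.asarray(x, float) - np.mean(x)
    power = np.empty((len(scales), len(x)))
    for i, s in enumerate(scales):
        # Discretize the Morlet wavelet on a grid spanning +/- 4 scales
        t = np.arange(-4 * s, 4 * s + 1)
        psi = np.pi**-0.25 * np.exp(1j * w0 * t / s) * np.exp(-0.5 * (t / s)**2)
        psi /= np.sqrt(s)  # L2 normalization across scales
        coef = np.convolve(x, np.conj(psi)[::-1], mode="same")
        power[i] = np.abs(coef)**2
    return power
```

Applied to a monthly series, the scale of maximum time-averaged power identifies the dominant periodicity; for a pure 12-month (annual) sinusoid the power peaks near a scale of 11–12 with w0 = 6.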
In all basins, the timing of precipitation change coincided with or preceded streamflow breakpoints based on PwLR (Table 3). Similar temporal coincidence of precipitation and streamflow breakpoints in contrast to the LCT and streamflow breakpoints may suggest that streamflow changes are tightly coupled with precipitation changes. However, that interpretation fails to account for the potential effects of drainage, which could amplify the streamflow response to precipitation.
The raw time series of spatially averaged annual precipitation and streamflow depths (cm), reported in the Supplement, show an increasing trend in precipitation and streamflow in the RRB, MRB, and IRB and no trend in the CRB (Fig. S6). The magnitudes of the precipitation and streamflow trends are on the order of 120–150 and 90–170 mm century−1, respectively.
Unlike the Chippewa, flow metrics in the Minnesota, Red, and Illinois river basins have systematically increased in recent decades, with nearly a 2-fold increase or greater in almost all flow metrics since 1975 (Fig. 5). Seven-day low flows in summer and winter (i.e., the lowest annual flows) have increased most in these basins, where mean conditions have increased by 67–275 %.
All seven flow statistics in the Red River of the North basin increase dramatically after the mid-1990s (Fig. 5a). Low flows have increased 3.5–4-fold.
The MRB and RRB exhibit an increase not only in the magnitude, but also in the cyclicity and synchronicity of these metrics after about 1980 (Fig. 5a). Cyclicity could imply that climate is playing a role in the observed increase in flows. However, the extent to which agricultural land and water management practices may be amplifying this climate effect cannot be ascertained from this figure alone. The Illinois River basin exhibits the most change in summer and winter 7-day low flows, which increase after 1970, and this trend is even more pronounced when only examining gauges within predominantly agricultural sub-basins that are unaffected by large dams (Fig. 5c). However, the changes in the RRB and MRB are much more obvious and statistically significant than those in the IRB.
Statistical results for annual changes in streamflow and precipitation for
all breakpoints can be found in Table S1 in the Supplement. The following
results are based on the 1974/1975 breakpoint. Overall, average annual
streamflow, precipitation, and evapotranspiration depths have increased
significantly in the MRB and RRB, while only streamflow has increased
significantly in the IRB; no significant changes are reported in the CRB.
Average annual runoff depth at the outlet gauge of the MRB has increased by 5.9 cm.
The MRB and RRB exhibit the greatest change in the annual runoff ratio, followed by the IRB, with negligible change in the CRB. These findings are consistent with the fact that the MRB and RRB have relatively low runoff ratios compared to the CRB and IRB, and are the only two basins where annual precipitation and evapotranspiration increases were statistically significant. On average, the fraction of annual precipitation partitioned to ET has decreased by 1.0–2.4 % in all four study basins, which is smaller in magnitude but consistent in direction of change with the findings of Schottler et al. (2014).
Cumulative monthly precipitation (blue) and streamflow (red) depths (cm) for each river basin. Breakpoints, where the streamflow–precipitation relationship starts to change, are hard to detect from the time series alone, but can be clearly seen from the cumulative plots of the monthly data (i.e., when similar increments of monthly precipitation are translated into larger amounts of monthly streamflow).
Cumulative monthly precipitation, plotted in Fig. 6, indicates no systematic change in cumulative precipitation with time (i.e., constant slope) for any basin. However, cumulative monthly streamflow (1935–2013) plotted in Fig. 6 indicates a sudden change in slope around 1973 in the IRB, 1980 in the MRB, and 1995 in the RRB, without a distinct change in slope in the CRB. The visually identified change points are consistent with those identified from the CWT (Fig. 4).
Statistical tests of monthly streamflow and precipitation resulted in the same interpretations for 95 % of the tests regardless of the breakpoint (Table S1 in the Supplement); therefore Fig. 7 summarizes the results of these statistical tests for flow and precipitation in all basins using the 1974/1975 breakpoint. Figure 7a illustrates the kernel density estimation, or non-parametric estimation of the probability density function, during the pre-period and post-period for June and September flows in each basin. Figure 7b reports 192 test results (48 per basin).
In stark contrast to the CRB, the streamflow color wheels for the MRB and RRB show significant changes in the mean and distribution of monthly streamflow for nearly all months (22 out of 24 for the MRB and 21 out of 24 for the RRB) (Fig. 7b). In the RRB, mean precipitation in October has increased, and the precipitation distributions have shifted to the right for September and October (Fig. 7b). In the MRB, there has been a significant increase in mean March precipitation (Fig. 7b). The IRB exhibits fewer overall changes in streamflow than the RRB and MRB, with significant changes in monthly streamflow volumes for September, October, November, December, and March, and significant changes in August and November precipitation (Fig. 7b).
We acknowledge that due to high variability and small sample sizes, we may not have sufficient power to detect small but real changes in precipitation and streamflow using these statistical tests, and thus may be prone to Type II error (Belmont et al., 2016). However, these results are consistent with the qualitative assessment of CWT, results of the seven annual flow statistics, and cumulative precipitation and streamflow trends, which indicate only slight changes in total precipitation across all basins, large increases in total flow in the MRB and RRB, moderate flow increases in the IRB, and no streamflow changes in the CRB (Figs. 4, 5, and 6).
To understand whether the cause-and-effect relationship between precipitation and streamflow has changed over time, we examine empirical quantiles of the joint probability density function (PDF) of monthly streamflow and precipitation before and after the breakpoint (Fig. 8).
Log–log empirical quantiles of joint PDF plots of monthly streamflow and precipitation for each basin, before and after the breakpoint.
There is a shift toward a larger monthly streamflow volume for the same
volume of precipitation at each 10 and 60 % quantile in the MRB and 60 and
90 % quantile in the RRB (Fig. 8). However, it appears the 90 %
exceedance contour for the MRB and 10 % exceedance contour for the RRB have
shifted up and to the right, indicating that an increase in precipitation in
the driest months in the MRB and wettest months in the RRB could also be
driving some of the change in flow (Fig. 8). Certainly the largest observable
change in the MRB and RRB during this time is a shift from small grains to
soybeans and an increase in the density and efficiency of drain tile
networks. While the analyses shown above documented significant changes in
the streamflow of the IRB (Figs. 4, 5, 6, and 7b), this change is not as
obvious in these joint PDF contours, which indicate only a slight vertical
shift in all quantiles (Fig. 8). Consistent with other analyses, the CRB does not demonstrate any shift in the joint quantiles of monthly precipitation and streamflow (Fig. 8).
At the daily scale, we found an increase in the magnitude of streamflow change (hydrograph slopes) for both the daily rising limbs (dQ/dt > 0) and the daily falling limbs (dQ/dt < 0).
Daily streamflow change exceedance probabilities, where daily dQ/dt is the day-to-day change in streamflow.
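Empirical exceedance probabilities of daily streamflow changes can be computed directly from day-to-day differences. A minimal sketch using the Weibull plotting position (the study's exact plotting-position formula may differ):

```python
import numpy as np

def rising_limb_exceedance(q):
    """Empirical exceedance probabilities for daily streamflow increases.
    dQ/dt is approximated by the day-to-day difference; only rising-limb
    days (dq > 0) are kept (a minimal sketch; falling limbs work the
    same way with dq < 0)."""
    dq = np.diff(np.asarray(q, float))   # day-to-day change in flow
    rising = np.sort(dq[dq > 0])[::-1]   # rising-limb changes, descending
    n = rising.size
    # Weibull plotting position: P(dQ/dt >= x_i) = i / (n + 1)
    prob = np.arange(1, n + 1) / (n + 1)
    return rising, prob
```

Plotting `rising` against `prob` on log axes reproduces the form of the exceedance-probability curves for the pre- and post-periods.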
While time series and statistical analyses reveal useful insights regarding the timing, magnitude, and significance of precipitation and streamflow changes, as well as provide a qualitative indication of whether or not changes in precipitation and streamflow may be correlated and proportional, they cannot fully deconvolve or attribute the influence of artificial drainage and climate on streamflows (Harrigan et al., 2014). Therefore, we calculate water budgets for each basin as a tool to understand whether the observed changes in precipitation are large enough to account for the changes in streamflow, and whether there is more or less watershed storage in recent times than in the past (Healy et al., 2007).
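The water-budget closure described above treats the change in storage as the residual of precipitation, streamflow, and evapotranspiration. A minimal sketch with illustrative names (depths in cm; the paper's 1974/1975 breakpoint is used as the default):

```python
import numpy as np

def period_budget(years, p, q, et, breakpoint=1975):
    """Average annual change in storage, dS = P - Q - ET, before and
    after a breakpoint year (a minimal sketch; variable names are
    illustrative, not from the paper). A positive dS means the basin
    gained storage on average in that period."""
    years, p, q, et = (np.asarray(a, float) for a in (years, p, q, et))
    pre = years < breakpoint
    ds_pre = p[pre].mean() - q[pre].mean() - et[pre].mean()
    ds_post = p[~pre].mean() - q[~pre].mean() - et[~pre].mean()
    return ds_pre, ds_post
```

Comparing `ds_pre` and `ds_post` indicates whether a basin has gained or lost storage between the periods, which is the question Table 4 addresses.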
Observed average annual precipitation, streamflow, evapotranspiration, and change in storage for each basin.
Table 4 reports the calculated average annual water budget terms – precipitation, streamflow, evapotranspiration, and change in storage – during the periods before and after the 1974/1975 and LCT breakpoints, using raw and conservative (reduced by 17 % in JJA) estimates of ET.
The CRB, which is not intensively drained (Fig. 2) and has experienced little change in crop type (Fig. 3), has been subject to an increase in precipitation, but does not exhibit an increase in runoff (Table 4), consistent with Figs. 8 and 9b. The overall trends in the CRB water budget indicate that water storage may have actually increased slightly between the pre-period and post-period, which could be accomplished through increased soil moisture, groundwater recharge, or reservoir storage in recent times.
Using conservative estimates of summer ET
Average monthly (January–December) change in basin soil moisture, groundwater, and/or reservoir storage (dS) for each basin.
The Red River of the North and Minnesota River basins have some of the most poorly drained soils of the Upper Midwest and historically grew more hay and small grains than the other basins (Fig. 3). The introduction of artificial drainage, combined with the replacement of hay and small grains with soybeans and the lack of major dams and municipal and industrial water use, has resulted in pronounced streamflow amplification in response to land use and climate changes in the RRB and MRB relative to the IRB and CRB (Fig. 4). Additionally, these two basins have seen greater changes in annual and even monthly precipitation (Figs. 7 and 8). However, the extensively drained Minnesota River basin has seen the largest increases in flow and the largest decrease in watershed storage for climatic change relatively similar to that in the IRB and RRB, likely because of its high degree of hydrologic alteration and connectivity from drainage and the lack of other anthropogenic water uses.
In this paper we address three research questions: (1) how have LULC,
climate, and streamflows changed during the twentieth and twenty-first
centuries; (2) what are the timing, timescales, and times of year when
changes are most prominent; and (3) can changes in climate alone explain
changes in streamflow? The combined results of this study lead us to several
main conclusions. First, widespread drainage expansion and intensification,
especially of tile drainage, coupled with conversion of hay and small grains
to corn and soybeans is evident and continues to occur in agricultural river
basins. Annual precipitation and evapotranspiration totals have increased
since 1975, though we found these changes to only be statistically
significant in the MRB and RRB. Monthly precipitation increases are generally
not significant except in fall months for all basins. Additionally, across
multiple scales (daily, monthly, annual) and for a range of flows (low, mean,
extreme), streamflows have increased at all times of the year in intensively
managed agricultural watersheds (IRB, MRB, and RRB) and have remained
stationary in the more forested CRB. The magnitude and timing of
precipitation increases in each watershed suggest that precipitation
contributes to recently observed increases in streamflow, consistent with
other findings in the Midwestern USA (Frans et al., 2013; Xu et al., 2013).
Despite this apparent correlation, the magnitude of precipitation increases
alone cannot explain the observed increases in flow for agricultural basins
according to the water balances. Therefore, it appears that the pervasive and extensive artificial drainage in agricultural basins has contributed to increased streamflow, not only at the field and small-watershed scales documented previously, but also at the scale of the large basins studied here.
Harrigan et al. (2014) recognize that multiple drivers often explain hydrologic change. These drivers are not mutually exclusive and may even act synergistically to produce observed streamflow trends. In the Midwestern USA, possible explanations for substantial streamflow increases include (1) changes in storm duration and intensity, or in the amount of precipitation falling as rain versus snow, that alter runoff generation while changing monthly or annual precipitation magnitudes very little; (2) increases in precipitation that raise soil moisture, which contributes to amplified flows; and (3) artificial drainage routing sub-surface flow to streams more efficiently, an effect that could be amplified by increased precipitation. First, it is theoretically possible to observe changes in streamflow with no change in monthly or annual precipitation magnitudes: high-intensity, short-duration events yield higher runoff ratios in poorly drained soils. Additionally, warmer winter temperatures, earlier snowmelt, and more days when winter precipitation falls as rain instead of snow should increase winter baseflows, advance the timing of ice break-up, and affect the magnitude of snowmelt floods. Several studies have documented such hydroclimate changes in the Midwestern USA (Feng and Hu, 2007; Groisman et al., 2001; Higgins and Kousky, 2012), and the role of these changes could be explored in future investigations.
Second, increased soil moisture is known to cause a nonlinear increase in runoff generation for similar precipitation events. Meyles et al. (2003) and Penna et al. (2011) report a threshold response in runoff generation when antecedent soil moisture exceeds 65 % of the soil porosity. It is possible that soil moisture has increased throughout the Midwestern US. However, no theory exists to predict how large this effect could be at landscape scales.
Third, several previous studies have demonstrated that artificial drainage increases streamflow in moderate-sized watersheds.
Surface and subsurface drainage remains largely unregulated throughout the Midwestern USA and Canada (Cortus et al., 2011). Drainage census data are prone to reporting inconsistencies and errors, underestimate total drainage by excluding farms smaller than 500 acres, and do not provide the information necessary for modeling basin hydrology in large agricultural watersheds (such as drain size, depth, spacing, and extent). Nevertheless, they remain the most comprehensive inventory of drainage in the United States. This raises the question: why is such a widespread practice, with such potentially profound and pervasive impacts on watershed hydrology and water quality, so poorly documented and regulated? Until we have the information necessary to calibrate and validate watershed models, it will be difficult to deconvolve more precisely the proportional impacts of climate and artificial drainage on flows at large spatial scales.
Decreased residence time of water in the soil has substantially increased nutrient export from agricultural landscapes (Randall and Mulla, 2001; Kreiling and Houser, 2016; Schilling et al., 2017). Though artificial drainage reduces field erosion by reducing surface runoff, it has essentially shifted the sediment source from fields to channels (Belmont, 2011; Belmont and Foufoula-Georgiou, 2017). Basins experiencing increases in streamflow due to natural (climate) and anthropogenic (drainage) factors have increased stream power available to erode and transport more sediment and sediment-bound nutrients and contaminants. Improved runoff management, specifically increased residence time and damped peak flows, is most needed in spring and early summer, when tiles are actively draining soils and precipitation events are large. Thus, substantial gains in water quality might only be achieved if some of the lost water storage capacity is reintroduced (e.g., wetlands, detention basins) into these agricultural watersheds.
Precipitation and streamflow data are publicly available and were accessed from the PRISM Climate Group.
The authors declare that they have no conflict of interest.
This material is based upon work supported by the National Science Foundation (grant no. EAR-1209402) under the Water Sustainability and Climate Program (WSC): REACH (REsilience under Accelerated CHange), and by the National Science Foundation Graduate Research Fellowship Program under grant no. 1147384. This research was supported by the Utah Agricultural Experiment Station, Utah State University, and approved as journal paper number 8938. The authors would like to thank Jon Czuba at Virginia Tech, and Karthik Kumarasamy, Eden Furtak-Cole, and Mitchell Donovan at Utah State University for their input. Thank you to Alexander Bryan at the Northeast Climate Science Center for generously providing evapotranspiration data from Bryan et al. (2015). Funding for AmeriFlux data resources was provided by the US Department of Energy's Office of Science. Edited by: Nunzio Romano. Reviewed by: Ben Livneh, Boris Ochoa-Tocachi, and two anonymous referees.