Assessing winter cover crop nutrient uptake efficiency using a water quality simulation model

Winter cover crops are an effective conservation management practice with potential to improve water quality. Throughout the Chesapeake Bay watershed (CBW), located in the mid-Atlantic US, winter cover crop use has been emphasized, and federal and state cost-share programs are available to farmers to subsidize the cost of cover crop establishment. The objective of this study was to assess the long-term effect of planting winter cover crops on water quality at the watershed scale (∼ 50 km2) and to identify critical source areas of high nitrate export. A physically based watershed simulation model, the Soil and Water Assessment Tool (SWAT), was calibrated and validated using water quality monitoring data to simulate hydrological processes and agricultural nutrient cycling over the period of 1990-2000. To accurately simulate winter cover crop biomass in relation to growing conditions, a new approach was developed to further calibrate plant growth parameters that control the leaf area development curve, using multitemporal satellite-based measurements of species-specific winter cover crop performance. Multiple SWAT scenarios were developed to obtain baseline information on nitrate loading without winter cover crops and to investigate how nitrate loading could change under different winter cover crop planting scenarios, including different species, planting dates, and implementation areas. The simulation results indicate that winter cover crops have a negligible impact on the water budget but significantly reduce nitrate leaching to groundwater and delivery to the waterways. Without winter cover crops, annual nitrate loading from agricultural lands was approximately 14 kg ha−1, but it decreased to 4.6-10.1 kg ha−1 with cover crops, a reduction of 27-67 % at the watershed scale. Rye was the most effective species, with a potential to reduce nitrate leaching by up to 93 % with early planting at the field scale.
Early planting of cover crops (∼ 30 additional growing days) was crucial, as it lowered nitrate export by an additional ∼ 2 kg ha−1 compared to late planting scenarios. The effectiveness of cover cropping increased with the extent of cover crop implementation. Agricultural fields with well-drained soils, and those more frequently used to grow corn, had a higher potential for nitrate leaching and export to the waterways. This study supports the effective implementation of cover crop programs, in part by helping to target critical pollution source areas for cover crop implementation.


Introduction
The Chesapeake Bay (CB) is the largest and most productive estuary in the US, supporting more than 3600 species of plants and animals (CEC, 2000). It is an international as well as a national asset. The importance of CB has been recognized by its designation as a Ramsar site of international importance (Gardner and Davidson, 2011). However, the bay's ecosystems have been greatly degraded. The Chesapeake Bay watershed (CBW) extends over 165 759 km2 and covers parts of New York, Pennsylvania, Maryland, Delaware, West Virginia, Virginia and the District of Columbia. Nearly 16 million people reside in the CBW, and its population is increasing rapidly, leading to accelerated land use and land cover change. The high ratio of watershed area to estuary water surface (14 : 1) amplifies the influence of human modifications, and excessive nutrient and sediment runoff has led to eutrophication (Kemp et al., 2005; Cerco and Noel, 2007). High nitrogen (N) input to the bay is the foremost water quality concern (Boesch et al., 2001). In the CBW, groundwater contributes more than half of total annual streamflow, and groundwater nitrate loads account for approximately half of the total annual N load of streams entering the bay (Phillips et al., 1999). Nitrate leached to the groundwater has a substantial residence time, on the order of 5-40 years (McCarty et al., 2008; Meals et al., 2010).
It is particularly important to implement best management practices (BMPs) on agricultural lands in the coastal plain in order to improve water quality in the Chesapeake Bay. Nitrogen exports from agricultural lands are significantly higher than those from other land uses in the coastal plain of the CBW (Jordan et al., 1997; Fisher et al., 2010; Reckhow et al., 2011). Fisher et al. (2010) reported that N export increases by a factor of ∼ 10 as agriculture increases from 40 to 90 % of land use within coastal plain watersheds. Jordan et al. (1997) showed that N was exported from cropland at a rate of 18 kg N ha−1 year−1, 7 times higher than the rate from other land uses in the coastal plain. High nitrate exports from coastal plain watersheds have intensified CB water quality problems, due in part to short hydraulic distances (Reckhow et al., 2011).
The implementation of winter cover crops as a best management practice on agricultural lands has been recognized as one of the most important conservation practices being used in the CBW (Chesapeake Bay Commission, 2004). Winter cover crops can sequester residual N after the harvest of summer crops, reducing nitrate leaching to groundwater and delivery to waterways by surface runoff (Hively et al., 2009), and can also reduce the loss of sediment and phosphorus from agricultural lands. Therefore, federal and state governments have established cost-share programs to promote winter cover cropping practices (MDA, 2012). However, the overall efficiency of cover crops for reducing nitrate loadings has not been fully evaluated. The influence of BMPs, such as winter cover crops, on nitrate flux to streams has not been measured in situ at scales larger than the field, because of the substantial residence time of leached N in groundwater and the difficulty of monitoring over long time periods (McCarty et al., 2008). A few field studies have demonstrated cover crop nitrate reduction efficiencies at the field scale (e.g., Shipley et al., 1992; Staver and Brinsfield, 2000). Hively et al. (2009) used satellite remote sensing images and field sampling data to estimate winter cover crop biomass production and N uptake efficiency at the landscape scale. However, the catchment-scale benefits of winter cover crops for improving water quality are not fully understood. As the nutrient uptake and nitrate reduction efficiencies of winter cover crops are primarily dependent upon cover crop biomass (Malhi et al., 2006; Hively et al., 2009), it is crucial to simulate plant growth accurately. Accurate simulation of plant growth requires field-based information and an improved calibration method that carefully accounts for climate, soil characteristics, and site-specific nutrient management. Furthermore, the effectiveness of nutrient management practices, such as winter cover crops, has not been fully explored for coastal agricultural watersheds in the study region due to the challenge of accurately simulating hydrologic and nutrient cycling in lowland areas with high groundwater-surface water interaction (Lee et al., 2000; Sadeghi et al., 2007; Sexton et al., 2010; Lam et al., 2012).
This study utilized a physically based watershed model, the Soil and Water Assessment Tool (SWAT) (Arnold and Fohrer, 2005), to simulate hydrological processes and nitrogen cycling for an agricultural watershed in the coastal plain of the CBW. We examined the long-term impact (∼ 10 years) of winter cover crops on the water budget and nitrate loadings under multiple cover crop implementation scenarios (e.g., species, timing and area planted). To accurately simulate the growth of winter cover crops and their nutrient uptake and nitrate reduction efficiencies, we developed a new approach to calibrate model parameters that control winter cover crop biomass, resulting in model estimates that closely approximate observed values. This study provides important information for decision making to effectively implement winter cover crop programs and to target critical pollution source areas for future BMP implementation.

Description of the study site
This study was undertaken in the German Branch (GB) watershed, located within the CBW. The GB is a third-order coastal plain stream within the non-tidal zone of the Choptank River basin (Fig. 1). Its drainage area is approximately 50 km2 and its land use is dominated by agriculture (∼ 72 %) and forest (∼ 27 %) (Fig. 2). Agricultural lands are evenly split between corn and soybean cropping. The study site is relatively flat, with elevations ranging from 1 to 26 m above sea level. Most of the soils are moderately well drained (hydrologic soil group (HSG) B) or moderately poorly drained (HSG C). Soil groups B and C cover 52 and 35 % of the study area, respectively. Well-drained (HSG A) and poorly drained (HSG D) soils account for less than 1 and 14 %, respectively, of the study area. Figure 2 presents information on land use, hydrologic soil types, and topography of the study site. The area is characterized by a temperate, humid climate with an average annual precipitation of 120 cm year−1 (Ator et al., 2005). Precipitation is evenly distributed throughout the year, and approximately 50 % of annual precipitation recharges groundwater or enters streams via surface flow, while the remainder is lost to the atmosphere via evapotranspiration (Ator et al., 2005).
The Choptank River watershed has been identified as an "impaired" water body by the US Environmental Protection Agency (US EPA) under Section 303(d) of the Clean Water Act due to excessive nutrients and sediments, and nutrient runoff from agricultural land has been identified as the main contributor of water pollution (McCarty et al., 2008). Since 1980, substantial efforts have been made to monitor water quality in the Choptank River watershed to establish baseline information on nutrient loadings from agricultural watersheds. Water quality in the GB watershed was intensively monitored between 1990 and 1995 as part of the Targeted Watershed project, a multiagency state initiative (Jordan et al., 1997; Primrose et al., 1997). In 2004, the Choptank River watershed was selected to become part of the US Department of Agriculture (USDA) Conservation Effects Assessment Project (CEAP), which evaluates the effectiveness of various agricultural conservation practices.

SWAT was used to simulate the effects of winter cover crops on nitrate uptake with multiple cover crop scenarios over the period of 1990-2000. The model simulation was run for the entire watershed (including forested, row cropland, and non-row cropland areas), and changes in both water budgets and nitrate loads to receiving waters under multiple scenarios were compared with baseline conditions (no cover crops) at the field and/or watershed scales. The overall modeling approach is presented in Fig. 3. Since cover crop N reduction efficiency is controlled by winter cover crop biomass (Malhi et al., 2006), we developed a new method to calibrate plant growth parameters that control leaf area development to produce simulation outputs close to observed values (discussed in Sect. 2.2.4).

Description of SWAT model
SWAT is a continuous, physically based, semidistributed watershed process model that runs on a daily time step. SWAT includes and enhances the modeling capabilities of a number of different models previously developed by the USDA Agricultural Research Service (ARS) and the US EPA. Arnold and Fohrer (2005) discuss the capabilities of SWAT in detail. Technical documents on the physical processes implemented in SWAT, input requirements, and explanations of output variables are available online (Neitsch et al., 2011).
The key physical processes in SWAT relevant to this research are briefly discussed below.
The main components of SWAT include weather, hydrology, sedimentation, soil temperature, crop growth, nutrients, pesticides, pathogens, and land management (Neitsch et al., 2011). In SWAT, a watershed is subdivided into smaller spatial modeling units: subwatersheds and hydrologic response units (HRUs). An HRU is the smallest spatial unit used for field-scale processes within the model and is characterized by homogeneous land cover, soil type, and slope. The overall hydrologic balance as well as nutrient cycling is simulated for each HRU, summed to the subwatershed level, and then routed through stream channels to the watershed outlet. In the SWAT model, a modification of the Soil Conservation Service (SCS) curve number (CN) method is used to simulate surface runoff for all land cover types, including row crops, forests, and non-row croplands. The CN method determines runoff based on land use, the soil's permeability, and antecedent soil water conditions. The transformation and transport of nitrogen between several organic and inorganic pools are simulated within an HRU as a function of nutrient cycles. Simulated loss of N can occur by surface runoff in solution and by eroded sediment and crop uptake. It can also take place in percolation below the root zone, in lateral subsurface flow, and by volatilization to the atmosphere.
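The core of the SCS curve number method can be sketched as follows. This is a minimal illustration of the standard CN runoff equations only, not SWAT's full implementation, which also adjusts the curve number daily for antecedent soil moisture; the CN value used below is an illustrative assumption.

```python
def scs_runoff_mm(precip_mm: float, cn: float) -> float:
    """Surface runoff (mm) from the SCS curve number method.

    S is the retention parameter and the initial abstraction Ia is
    commonly taken as 0.2 * S. Runoff is zero until precipitation
    exceeds the initial abstraction.
    """
    s = 25400.0 / cn - 254.0          # retention parameter (mm)
    ia = 0.2 * s                      # initial abstraction (mm)
    if precip_mm <= ia:
        return 0.0
    return (precip_mm - ia) ** 2 / (precip_mm - ia + s)

# Illustrative example: 50 mm of rain on a field with CN = 80
# yields roughly 13.8 mm of surface runoff.
runoff = scs_runoff_mm(50.0, 80.0)
```

Higher curve numbers (less permeable soils, more impervious cover) shift a larger fraction of a given rainfall event into surface runoff, which is how land use and hydrologic soil group enter the water balance at the HRU scale.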

Data and input preparation
Table 1 presents the list of data and other relevant information used in this study. Daily climate records on precipitation and temperature were obtained from the National Oceanic and Atmospheric Administration (NOAA) National Climatic Data Center (NCDC) (Royal Oak, Station ID: USC00187806). Daily solar radiation, relative humidity, wind speed, and missing precipitation and temperature information were derived using SWAT's built-in weather generator (Neitsch et al., 2011). Monthly streamflow and water quality information over the period of 1990-1995 was obtained from Jordan et al. (1997). Annual estimates of nitrate loads by subwatershed areas within the GB watershed were provided by Primrose et al. (1997).
The geospatial data set needed to run SWAT simulations includes digital elevation models (DEMs), hydrologic soil types, and land cover/land use. A lidar-based 2 m DEM, processed to add artificial drainage ditches by the USDA ARS at Beltsville, Maryland (Lang et al., 2012), was used to extract topographic information. The DEM was used to delineate the drainage area, subdivide the study area into smaller modeling units, and define the stream network. Soil information was obtained from the Soil Survey Geographic Database (SSURGO) available from the USDA Natural Resources Conservation Service (NRCS).
A map of land use was prepared based on a comprehensive analysis of existing land use maps, including the US Geological Survey's National Land Cover Database of 1992, 2001, and 2006, the USDA National Agricultural Statistics Service (NASS) National Cropland Data Layer (NCDL) of 2002, 2008, 2009, and 2010 (Boryan et al., 2011), and a high-resolution land use map developed from 1998 National Aerial Photography Program (NAPP) digital orthophoto quad imagery (Sexton et al., 2010). These maps indicated a consistent pattern of land use distribution over the last 2 decades, with little change. The spatial distribution of major croplands (e.g., soybean and corn) (Fig. 2) was determined using the 2008 NCDL. As 2-year rotations of corn-soybean or soybean-corn were common practice and agricultural lands were used evenly for both crops, the placement of the crop rotations was simplified to alternate the locations of corn and soybean croplands every year, using the 2008 NCDL as a base map. While the actual placement of crop rotations varied from year to year, it was not possible to obtain the spatial distribution of major croplands for each simulation year. In addition, time series cropland patterns observed in recent NCDL maps support this generalized crop rotation pattern of interchanging the locations of corn and soybean fields.
Detailed agronomic management information was collected in the field, as well as through literature reviews and interviews with farmers and extension agents. Modeled agricultural practices and management reflect actual practices in the study region (i.e., no winter cover crop practice, utilizing conservation tillage without irrigation) during the time of water quality monitoring (Sadeghi et al., 2007), and the modeled winter cover crop implementation practices follow the guidelines developed by the Maryland Department of Agriculture (MDA) cover crop program.
The GB watershed was subdivided into 29 sub-basins based on tributary drainage areas. Within each sub-basin, the overlay of similar land uses and soil types generated a total of 402 HRUs, 283 of which were classified as agricultural HRUs. HRU sizes ranged from 0.2 to 118.6 ha, with an average of 11.8 ha and a standard deviation of 13.0 ha.

Calibration and validation of SWAT model
Although SWAT simulations were calculated on a daily basis, calibration and validation were performed using the monthly water quality record available from the monitoring station located at the study watershed outlet. The calibration was performed manually under the baseline scenario with the 2-year crop rotations, following the standard procedure outlined in the SWAT user's manual (Winchell et al., 2011). The key parameters and their allowable ranges were identified using the sensitivity analysis performed by Sexton et al. (2010) and previous studies (Table 2). The simulations included a 2-year warm-up period (1990-1991) to establish the initial conditions. Model calibration was done using the next 2 years of water quality records (1992-1993), and the remaining records were used for validation (1994-1995). This short period of spin-up and calibration could limit the model's capability to capture the effects of interannual variability of weather on streamflow and nitrate. The calibration was done as follows. We first adjusted the parameters related to streamflow and then those related to nitrate, making small changes within their allowable ranges (Table 2). The parameters were calibrated sequentially in order of their sensitivity as reported by Sexton et al. (2010). The calibration was run in a batch, and the model performance statistics (discussed below) were computed for each run. We chose the parameter values that produced the best statistical outputs while meeting the model performance criteria discussed by Moriasi et al. (2007). To assess longer-term effects, the model simulations were performed over the period of 1992-2000. We used ArcSWAT 2009 with the 582 version of the executable file in the ArcGIS 9.3.1 interface.
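The sequential, one-parameter-at-a-time procedure described above can be sketched as below. The `run_model` callable and the toy parameters are hypothetical stand-ins for SWAT runs; the loop sweeps each parameter over its allowable range in sensitivity order and fixes the value that maximizes the Nash-Sutcliffe efficiency before moving on.

```python
def nse(obs, sim):
    """Nash-Sutcliffe efficiency (1 = perfect agreement)."""
    mean_o = sum(obs) / len(obs)
    num = sum((o - s) ** 2 for o, s in zip(obs, sim))
    den = sum((o - mean_o) ** 2 for o in obs)
    return 1.0 - num / den

def calibrate(run_model, obs, param_ranges, n_steps=41):
    """Sweep each parameter over its allowable range (in sensitivity
    order), keeping the best value before tuning the next parameter."""
    params = {name: (lo + hi) / 2 for name, (lo, hi) in param_ranges.items()}
    for name, (lo, hi) in param_ranges.items():
        candidates = [lo + i * (hi - lo) / (n_steps - 1) for i in range(n_steps)]
        best = max(candidates,
                   key=lambda v: nse(obs, run_model({**params, name: v})))
        params[name] = best
    return params, nse(obs, run_model(params))

# Toy "model" standing in for a SWAT run: flow = a * rain + b.
rain = list(range(10))
obs = [2.0 * x + 1.0 for x in rain]
toy = lambda p: [p["a"] * x + p["b"] for x in rain]
params, score = calibrate(toy, obs, {"a": (0.0, 4.0), "b": (0.0, 3.0)})
```

Note that a single pass of one-at-a-time sweeps does not guarantee the global optimum when parameters interact, which is one reason manual SWAT calibration is typically iterative and guided by performance criteria rather than fully automated.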
Accuracy of the model calibration was assessed with three statistical model performance measures: the Nash-Sutcliffe efficiency coefficient (NSE), the root mean squared error (RMSE)-standard deviation ratio (RSR), and percent bias (PBIAS) (Moriasi et al., 2007). They are defined as follows:

$$\mathrm{NSE} = 1 - \frac{\sum_{i=1}^{n} (O_i - S_i)^2}{\sum_{i=1}^{n} (O_i - \bar{O})^2}$$

$$\mathrm{RSR} = \frac{\mathrm{RMSE}}{\mathrm{STDEV}_{\mathrm{obs}}} = \frac{\sqrt{\sum_{i=1}^{n} (O_i - S_i)^2}}{\sqrt{\sum_{i=1}^{n} (O_i - \bar{O})^2}}$$

$$\mathrm{PBIAS} = \frac{\sum_{i=1}^{n} (O_i - S_i)}{\sum_{i=1}^{n} O_i} \times 100,$$

where O_i are observed and S_i are simulated data, \bar{O} is the observed mean, and n is the number of observations. The values of these statistical measures were compared to the model evaluation criteria set for various water quality parameters (Moriasi et al., 2007).

Note (Table 2): the ranges of parameters were adapted from existing literature (noted as Reference*). LAIMX1 and LAIMX2 were estimated using the regression method based on biomass estimates reported in Hively et al. (2009) and the simulation outputs from the crop growth module of SWAT (see details in Sect. 2.2.3).
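The three performance measures can be computed directly from paired monthly observations and simulations; a minimal sketch of the definitions above:

```python
from math import sqrt

def nse(obs, sim):
    """Nash-Sutcliffe efficiency; 1 is perfect, and values <= 0 mean
    the observed mean predicts better than the model."""
    mean_o = sum(obs) / len(obs)
    num = sum((o - s) ** 2 for o, s in zip(obs, sim))
    den = sum((o - mean_o) ** 2 for o in obs)
    return 1.0 - num / den

def rsr(obs, sim):
    """RMSE divided by the standard deviation of the observations
    (the common factor of n cancels, leaving a ratio of root sums)."""
    mean_o = sum(obs) / len(obs)
    rmse_part = sqrt(sum((o - s) ** 2 for o, s in zip(obs, sim)))
    stdev_part = sqrt(sum((o - mean_o) ** 2 for o in obs))
    return rmse_part / stdev_part

def pbias(obs, sim):
    """Percent bias; positive values indicate model underestimation."""
    return 100.0 * sum(o - s for o, s in zip(obs, sim)) / sum(obs)
```

For example, with obs = [1, 2, 3] and sim = [1, 2, 4], NSE = 0.5, RSR ≈ 0.71, and PBIAS ≈ −16.7 %.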
The prediction uncertainty of the model was assessed using the 95 % prediction uncertainty (95 PPU) band, the P factor, and the R factor (Singh et al., 2014). They were computed using all simulation outputs obtained during the manual calibration process. The 95 PPU band is bounded by the 2.5th and 97.5th percentiles of the cumulative distribution of simulation outputs. The P factor indicates the percentage of observed data falling within the 95 PPU band, and the R factor is the average thickness of the 95 PPU band divided by the standard deviation of the observed data. The R factor can vary between 0 (i.e., achievement of a small uncertainty bound) and infinity, while the P factor can vary from 0 to 100 % (i.e., all observations bracketed by the prediction uncertainty) (Singh et al., 2014).
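A sketch of the 95 PPU band and the two summary factors, assuming an ensemble of simulated series (one per calibration run) aligned time step by time step with the observations (the percentile interpolation scheme is an assumption; published implementations may differ in detail):

```python
from statistics import stdev

def percentile(values, q):
    """Linear-interpolation percentile, q in [0, 100]."""
    v = sorted(values)
    rank = q / 100.0 * (len(v) - 1)
    lo = int(rank)
    frac = rank - lo
    return v[lo] if frac == 0 else v[lo] + frac * (v[lo + 1] - v[lo])

def ppu95(ensemble):
    """Lower/upper 95 PPU bounds at each time step.

    `ensemble` is a list of simulated series, one per model run."""
    by_time = list(zip(*ensemble))
    lower = [percentile(vals, 2.5) for vals in by_time]
    upper = [percentile(vals, 97.5) for vals in by_time]
    return lower, upper

def p_and_r_factor(obs, lower, upper):
    """P factor: % of observations inside the band.
    R factor: mean band thickness divided by the observations' st. dev."""
    inside = sum(1 for o, l, u in zip(obs, lower, upper) if l <= o <= u)
    p = 100.0 * inside / len(obs)
    mean_width = sum(u - l for l, u in zip(lower, upper)) / len(obs)
    return p, mean_width / stdev(obs)
```

A high P factor with a low R factor indicates that the prediction band brackets most observations without being trivially wide.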

Calibration of plant growth parameters
Cover crop plant growth parameters were calibrated to more realistically simulate cover crop growth during winter at the field scale. Specifically, we modified the parameters that control the leaf area development curve using biomass estimates provided by Hively et al. (2009). Their study reported landscape-level biomass estimates for three commonly used winter cover crops, categorized by planting date, over the period of 2005-2006 in the Choptank River region. This information was analyzed to associate winter cover crop biomass estimates with heat units. Heat units were computed based on the potential heat unit (PHU) theory as implemented in SWAT, using the daily climate record over the cover crop monitoring period (2005-2006). The crop growth module of SWAT was then run with average daily climate data over 1992-2000, using the default parameter values, to provide estimates of biomass and leaf area index (LAI) by growing degree days. Using average climate from a different period should not have a significant effect on plant growth simulation, even if there is some interannual variability in weather conditions between the two periods, because the plant growth cycle in SWAT is simulated using heat unit theory, and there was little difference in heat units counted during the two time periods. Heat units are based on the accumulated number of growing days with a daily temperature above the base temperature; below the base temperature, no plant growth is assumed to occur.
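Heat unit accumulation under this scheme can be sketched simply. The base temperature is a species-specific SWAT parameter; the 0 °C value used in the example is an illustrative assumption for a cool-season cereal.

```python
def accumulated_heat_units(daily_tmax, daily_tmin, t_base):
    """Sum the daily mean temperature excess above the base temperature.

    Days whose mean temperature falls at or below t_base contribute
    nothing, since no growth is assumed to occur on those days."""
    total = 0.0
    for tmax, tmin in zip(daily_tmax, daily_tmin):
        t_mean = (tmax + tmin) / 2.0
        total += max(0.0, t_mean - t_base)
    return total

# Three days with mean temperatures of 5, 15, and -2 degC against a
# 0 degC base accumulate 20 heat units (the cold day contributes 0).
hu = accumulated_heat_units([10, 20, 0], [0, 10, -4], 0.0)
```

Because growth stage is indexed to accumulated heat units rather than calendar dates, two periods with similar heat unit totals produce similar simulated phenology even if their daily weather differs.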
Using this information, we were then able to relate simulated LAI values to the reported biomass estimates and heat units. These LAI values and the corresponding heat units were then normalized by the maximum LAI and the total potential heat units required for plant maturity, and the relationship between the two normalized values (fractional LAI and fractional heat units) was fitted using a simple regression model. This fitted model was extrapolated to identify the two LAI parameter values (Table 2) required to adjust the leaf area development curve in the SWAT model.
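In SWAT, the leaf area development curve has the documented form fr_LAI = fr_PHU / (fr_PHU + exp(l1 − l2 · fr_PHU)), and the two shape coefficients are back-calculated from two points on the curve (the role played by the LAIMX1/LAIMX2 parameter pairs). A sketch of that back-calculation; the two anchor points used in the example are hypothetical, not the values fitted in this study:

```python
from math import exp, log

def lai_shape_coefficients(x1, y1, x2, y2):
    """Solve for l1, l2 so that the curve
        fr_LAI = fr_PHU / (fr_PHU + exp(l1 - l2 * fr_PHU))
    passes through two points (x, y) = (fraction of PHU, fraction of
    maximum LAI). Rearranging gives l1 - l2*x = ln(x * (1 - y) / y),
    a linear system in l1 and l2."""
    c1 = log(x1 * (1.0 - y1) / y1)
    c2 = log(x2 * (1.0 - y2) / y2)
    l2 = (c1 - c2) / (x2 - x1)
    l1 = c1 + l2 * x1
    return l1, l2

def fractional_lai(fr_phu, l1, l2):
    """Fraction of maximum LAI reached at a given fraction of PHU."""
    return fr_phu / (fr_phu + exp(l1 - l2 * fr_phu))

# Hypothetical anchor points: 5 % of maximum LAI at 15 % of PHU,
# and 95 % of maximum LAI at 50 % of PHU.
l1, l2 = lai_shape_coefficients(0.15, 0.05, 0.50, 0.95)
```

Shifting the anchor points toward slower early development flattens the front of the sigmoid, which is the lever used here to make simulated winter growth match the observed biomass trajectory.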

Assessing the effectiveness of winter cover crops with multiple scenarios
We assessed the potential effects of winter cover crops on nitrate removal at the field and watershed scales under multiple implementation scenarios. Details of these scenarios are presented in Table 3. The MDA Cover Crop Program offers a varying cost share according to winter cover crop planting species and cutoff planting dates. Following the program guidelines and county-level statistics of winter cover crop implementation (MDA, 2012), we constructed multiple scenarios relevant to regional cover crop practices with three major cover crop species, i.e., barley (Hordeum vulgare L.), rye (Secale cereale L.), and wheat (Triticum aestivum L.), and two planting date categories (early/late). Additional cover crop scenarios were developed to assess effectiveness under varying extents of cover crop implementation. The average nitrate export was assessed at the field scale based on the simulation output over the period of 1992-2000 under the baseline scenario (i.e., no cover crop). Then, all agricultural HRUs were sorted by nitrate loading and equally subdivided into five groups. Each group was then introduced incrementally for cover crop implementation, in order from the highest to the lowest nitrate loading. Table 4 summarizes agricultural practices and scheduling used for the different scenarios. There was no difference between baseline and cover crop scenarios during the growing season. The croplands were managed with the typical 2-year corn-soybean or soybean-corn rotation, and fertilizer was applied only to corn at the beginning of the growing season, due to its high demand for nutrients to support growth and yield. Instead of winter fallow, cover crop scenarios assumed placement of winter cover crops. The cover crops were planted after the harvest of summer crops, either at the beginning of October (early planting) or November (late planting), and were chemically killed at the beginning of the following growing season (early April). The specific planting dates (3 October and 1 November) were set according to MDA guidelines, with slight adjustments over the course of the simulation period to avoid days with substantial precipitation immediately prior to winter cover planting. Note that the harvest date of summer crops under the baseline was set to 15 October, between the early and late planting dates, to make the baseline results more comparable to the early and late cover crop scenarios. Actual practices and historical statistics indicate that early planting was generally possible for corn only, as soybean requires later harvest in the Choptank River region. MDA's county-level statistics over 2006-2011 showed that winter cover crops were generally planted later following soybean (in general, after mid-October), while two-thirds of cover crop implementation after corn occurred prior to mid-October. This difference could be due to late harvesting to allow for double-cropped soybean. In this study, early planting scenarios were considered a more active conservation practice than late planting scenarios. Therefore, early planting scenarios applied the early planting date to 100 % of the fields where it was applicable (i.e., corn fields), while the remaining fields (i.e., soybean fields) were assumed to be treated with 100 % late planting. As a result, these scenarios include 50 % of cover cropping with early planting on corn fields and the remaining 50 % with late planting on soybean fields, as both crop types have roughly an equal share of total croplands. Due to this mixed effect, the nitrate removal efficiency for different planting dates could not be fully assessed at the watershed scale, but was evaluated at the field scale.
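The incremental implementation scenarios can be sketched as below: fields are ranked by baseline nitrate export, split into five equal groups, and cover crops are applied to a growing share of fields, worst first. The field identifiers and load values are purely illustrative.

```python
def incremental_groups(field_loads, n_groups=5):
    """Sort fields by baseline nitrate load (highest first) and return
    cumulative implementation sets: the first scenario treats the worst
    1/n of fields, the second the worst 2/n, and so on."""
    ranked = sorted(field_loads, key=field_loads.get, reverse=True)
    size = len(ranked) // n_groups
    return [ranked[: size * (i + 1)] for i in range(n_groups)]

# Ten hypothetical fields with baseline loads (kg N / ha / yr).
loads = {f"hru{i}": load
         for i, load in enumerate([3, 18, 7, 12, 5, 21, 9, 14, 2, 16])}
scenarios = incremental_groups(loads)
```

Targeting the highest-loading fields first means each added increment of cover-cropped area yields a diminishing marginal nitrate reduction, which is the behavior the implementation-extent scenarios are designed to quantify.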

SWAT calibration and validation
The simulated monthly streamflow and nitrate results were compared with the observed data for both the calibration and validation periods. Table 2 provides the list of adjusted parameter values after model calibration. Overall, Fig. 4 shows good agreement between measured and simulated monthly streamflow and nitrate. It illustrates the 95 PPU (the shaded region) of the SWAT simulation with the monthly observed and best simulated streamflow and nitrate. The 95 PPU of streamflow appears to quantify most uncertainties, as the interval includes most of the measured data. However, the 95 PPU of nitrate does not seem to represent all the uncertainty, particularly for the low-flow season, when most of the simulated streamflows are not in good agreement with the observed streamflows. This could be caused by the limitations of SWAT itself and the large errors associated with calibration. The calibration was conducted over a short period, which could limit the capability of the calibrated model to capture the effects of weather variability on streamflow and nitrate. In addition, the nitrate load calculated from field sampling of nitrate stream concentration (i.e., the observed nitrate load) could itself be subject to measurement error.

Table 5 also presents a summary of model performance measures and their accuracy ratings based on the statistical evaluation guidelines reported by Moriasi et al. (2007). These performance measures were calculated based on the monthly water quality record. Overall, the model performance rating for streamflow and nitrate loads exceeded the "satisfactory" rating in both the calibration and validation periods. Model simulation results for streamflow were more congruent with the observed values than those for nitrate, but the pattern of simulated nitrate was similar to the trend of simulated streamflow. Also, simulation results for the calibration period were in better agreement with the observed values than those for the validation period. The largest discrepancy between simulated and measured streamflow and nitrate was in 1994. Unlike the simulation output, a high peak in streamflow, and consequently in nitrate loading, was observed in August. This relatively high flow and nitrate load were somewhat unusual, as the weather record for this site did not show any dramatic change in precipitation during August of 1994 compared to the previous years. However, the reported streamflow in August of 1994 was much higher than observations from other years. In addition, the streamflow record from an adjacent watershed, with similar characteristics and size, did not show high peak streamflow values during the same period. This difference could perhaps be explained by unexpected agricultural practices, localized thunderstorms that did not reach the weather station and nearby watershed, or human/measurement errors, although the exact cause could not be determined. The SWAT simulation provided considerably improved results compared to previous studies conducted in the study area (Lee et al., 2000; Sadeghi et al., 2007; Sexton et al., 2010). These improvements may be due to different model choice (Niraula et al., 2013), the recent update of the SWAT model to more accurately predict nitrate in groundwater (USDA-ARS, 2012; Seo et al., 2014), and the use of more accurate, higher spatial resolution DEMs (Chaplot, 2005; Chaubey et al., 2005).
Accurate simulation of winter cover crop growth and biomass at various stages of production is crucial to accurately estimating the potential of winter cover crops to take up residual N and reduce nitrate loading. The winter cover crop program was implemented in 2005 at this site and, therefore, no data were available to validate predicted winter cover crop biomass over the period of 1992-2000. However, we are confident in our biomass simulation, as the simulated 8-year averaged winter cover crop biomass estimates obtained at the HRU scale were comparable to the range of cover crop biomass reported by Hively et al. (2009), who calculated above-ground winter cover crop biomass for a range of planting dates based on field surveys and satellite images acquired over the period of 2005-2006. Note that without calibration, cover crop growth was simulated at a much faster growth rate, and the growth trend over winter months did not match the field data reported in Hively et al. (2009). For example, the modeled growth rate of rye before calibration was substantially lower in the early growth stage, producing much less biomass than observed values. Figure 5 shows the agreement between measured and simulated biomass estimates after calibration at the field (HRU) scale. Note that the simulated estimates of cover crop biomass were at the upper end of the reported values, as the simulation output included both above- and below-ground biomass.

Multiple scenarios analysis
Winter cover crops had little impact on catchment hydrology but a profound effect on nitrate exports. Figure 6 presents 9-year average annual mean streamflow, annual evapotranspiration, and annual nitrate loads under the baseline and multiple cover crop scenarios. As reported in previous studies (Kaspar et al., 2007; Islam et al., 2006), the inclusion of a winter cover crop reduced streamflow only slightly (< 10 %). Similarly, our study found streamflow reductions of less than 8 %. Winter cover cropping reduced streamflow from 8.5 to 7.8 m3 s−1 (RE, rye early) and 8.4 m3 s−1 (WL, wheat late), and increased evapotranspiration from 667 to 673 mm (WL) and 710 mm (RE), in comparison to the baseline scenario. While the effects of winter vegetation on evapotranspiration were relatively low, any water loss due to evapotranspiration could be offset, as cover cropping usually increases soil saturation by increasing water infiltration capacity (Dabney, 1998; Islam et al., 2006). Because the study site typically exhibits maximum streamflow during winter with rising groundwater levels (Fisher et al., 2010), the relative difference in streamflow due to winter cover crops remained small. Rye cover crops caused the largest changes to the hydrologic budget, followed by barley and winter wheat. Early planting scenarios produced slightly lower streamflow and higher evapotranspiration compared to those with the later planting date.
In contrast to its small hydrologic effect, winter cover cropping greatly reduced nitrate loads, and nitrate loads differed substantially by planting species and date. Annual nitrate loads under the cover crop scenarios ranged from 4.6 (RE) to 10.1 kg ha−1 (WL). The difference in nitrate loading between cover crop scenarios ranged from 1.3 (RE compared to BE, barley early) to 5.5 kg ha−1 (RE compared to WL). Within species, early cover cropping (3 October) lowered annual nitrate loads by 1.8 (rye and winter wheat) to 2.7 (barley) kg ha−1, compared to late cover cropping (1 November). Compared with the baseline scenario (13.9 kg ha−1), the cover crop scenarios reduced nitrate loads by 27 % (WL) to 67 % (RE) at the watershed scale. This finding compares well with the results of previous studies that reported the importance of an early planting date (Ritter et al., 1998; Feyereisen et al., 2006; Hively et al., 2009). Shorter day lengths and lower temperatures limit the accumulation of cover crop biomass during the winter season; therefore, earlier planting can increase nitrogen uptake by cover crops because of a longer growing season and warmer conditions (Baggs et al., 2000). Similar research in Minnesota demonstrated that winter cover crops planted 45 days earlier removed 6.5 kg N ha−1 more nitrogen than late-planted cover crops (Feyereisen et al., 2006). Our simulated differences are slightly smaller than these published values because the early planting in our scenarios preceded the late planting by only ∼ 30 days, resulting in fewer additional growing days.
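As a quick arithmetic check, the watershed-scale reduction rates quoted above follow directly from the reduction-rate definition used for Fig. 6, RR = (baseline − scenario) / baseline. A minimal sketch, using the annual nitrate loads reported here (variable names are illustrative):

```python
def reduction_rate(baseline, scenario):
    """Reduction rate relative to the baseline scenario:
    RR = (baseline - scenario) / baseline, as defined for Fig. 6."""
    return (baseline - scenario) / baseline

# Annual nitrate loads (kg/ha); the baseline has no winter cover crop.
baseline_load = 13.9
scenario_loads = {"RE (rye, early)": 4.6, "WL (wheat, late)": 10.1}

for name, load in scenario_loads.items():
    print(f"{name}: RR = {reduction_rate(baseline_load, load):.0%}")
# prints "RE (rye, early): RR = 67%" and "WL (wheat, late): RR = 27%"
```

The same formula reproduces the within-species comparisons (e.g., early versus late planting) when applied to the corresponding scenario loads.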
The simulation results indicate that rye is the most effective cover crop at reducing nitrate loads. Rye is well adapted for use as a winter cover crop due to its rapid growth and winter hardiness, and these characteristics enabled rye to take up a larger amount of excess nitrogen than the other crops (Shipley et al., 1992; Clark, 2007; Hively et al., 2009). Barley is a cool-season crop that develops a strong root system during the winter season and exhibits better nutrient uptake capacity than wheat (Malhi et al., 2006; Clark, 2007). Our simulation results were consistent with these previous studies. As shown in Fig. 5, rye grows faster than the other winter cover crops, particularly in the early growth stage, taking up higher levels of nitrate. Compared to the baseline scenario, rye removed more than 67 % of nitrate with early planting and 54 % with late planting (Fig. 6). Barley had a nitrate reduction rate of 57 % and winter wheat 41 % with early planting, but removal efficiency dropped to 38 % for barley and 27 % for winter wheat with late planting (Fig. 6). Figure 6 illustrates that late-planted rye was nearly as effective as early-planted barley and more effective than early-planted winter wheat.
Simulated nitrate removal efficiency was greatly affected by the extent of cover crop implementation, as shown in Fig. 7. As expected, removal efficiency increased with increasing coverage, although the slope of the removal efficiency curve decreased slightly at the 60 % implementation extent. This suggests that the nitrate reduction rate does not increase linearly with increasing coverage, and that its relative efficiency may decline once cover crop implementation exceeds 50 % of the croplands. While this finding seems reasonable, further field-based studies are needed to verify it. Notably, 60 % cover crop coverage with an early planting date would reduce more nitrate than 100 % coverage with late planting, emphasizing the importance of early cover crop planting, as indicated by other studies (Ritter et al., 1998; Hively et al., 2009).
The effects of cover cropping were further assessed by quantifying the amount of nitrate transported from agricultural fields to waterways by different delivery pathways (surface runoff, lateral flow, and shallow groundwater) and the amount of nitrate leached to deep groundwater. Figure 8 presents nitrate loads per unit area leaving agricultural fields during the winter fallow period (October-March). The effectiveness of winter cover cropping in reducing nitrate leaching is particularly noticeable, as reported by earlier studies (McCracken et al., 1994; Brandi-Dohrn et al., 1997; Francis et al., 1998; Bergstrom and Jokela, 2001; Rinnofner et al., 2008). At the field scale, the seasonal average of nitrate leaching (shown as "L" in Fig. 8) over the winter fallow period without cover crops was estimated at 43 kg ha−1. With winter cover crops, nitrate leaching decreased to 3.0-32.0 kg ha−1, depending on planting species and timing, a reduction of 26-93 % compared to baseline values. In addition, the amount of nitrate transported from fields to waterways by surface runoff, lateral flow, or shallow groundwater (referred to as DPs, direct pathways, in Fig. 8) was greatly reduced, to 2.9-10.7 kg ha−1 under the cover crop scenarios, a reduction of 25-80 %. Similar to the watershed-scale analysis, rye with an early planting date produced the most effective result at the field scale, with the highest reduction rates through both direct pathways and leaching.

Geospatial analysis to identify high nitrate loading areas
The 9-year annual and monthly nitrate loads from agricultural fields (HRUs) simulated under the baseline scenario were analyzed to pinpoint areas with a high potential for nitrate loading and to better understand the characteristics and variability of these high-loading zones. We classified all agricultural HRUs into five classes according to their level of nitrate export potential, which was computed by summing the nitrate transported by direct pathways and the nitrate leached to groundwater. We observed consistent spatial patterns in nitrate loading at the interannual and monthly timescales. Figure 9 illustrates the geographical distribution of nutrient loadings from all agricultural HRUs based on the 9-year annual and monthly average simulation results for selected months. These months were chosen considering seasonal characteristics of climate and hydrology as well as the timing of agricultural practices that may produce differences in nitrate loading (e.g., high precipitation and groundwater flow in March/April, cover crop kill and fertilizer application in April, and cover crop planting in November).
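The per-HRU ranking described above can be sketched as follows. The paper sums nitrate moving via direct pathways and via leaching for each HRU; the five-class split below uses quintile breaks, which is an illustrative assumption (the actual class thresholds are not specified), and the per-HRU loads are hypothetical values:

```python
from bisect import bisect_right
from statistics import quantiles

def export_potential(direct_pathways, leaching):
    """Nitrate export potential per HRU = nitrate delivered to streams by
    direct pathways (surface runoff, lateral flow, shallow groundwater)
    plus nitrate leached to groundwater."""
    return [dp + l for dp, l in zip(direct_pathways, leaching)]

def classify_five(values):
    """Bin each HRU into one of five classes (Low .. High).
    Quintile breaks are an assumption for illustration; the paper does
    not list the class thresholds it used."""
    labels = ["Low", "M. Low", "Medium", "M. High", "High"]
    breaks = quantiles(values, n=5, method="inclusive")  # 4 cut points
    return [labels[bisect_right(breaks, v)] for v in values]

# Hypothetical per-HRU loads (kg/ha)
dp = [1.0, 2.5, 3.0, 0.5, 4.0]         # direct pathways
leach = [10.0, 25.0, 40.0, 5.0, 55.0]  # leaching to groundwater
total = export_potential(dp, leach)
print(classify_five(total))  # one class label per HRU
```

Because leaching dominates the totals in the simulations, the resulting map chiefly reflects where leaching losses are largest.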
The locations of high nitrate loading areas were generally associated with moderately well-drained soils and with agricultural fields more frequently used for corn over the simulation period. Nitrate leaching dominated the total nitrate loads from the fields (i.e., the potential for nitrate export), as it outweighed nitrate transport by direct pathways (as shown in Fig. 8). We hypothesize that areas with moderately well-drained soils allowed high nitrate leaching due to their high infiltration capacity (Fig. 2). Because of the high nitrogen demand for corn growth and yield, corn cropping requires a considerable amount of fertilizer during the early growth stage, whereas soybean requires no fertilizer application (Table 4). Consequently, nitrate export from agricultural fields more frequently used for corn was significantly greater than from those used for soybean, as reported by Kaspar et al. (2012). It would therefore be important to prioritize winter cover crop application in areas with well-drained soils used for corn production.

Conclusions
This study demonstrates the effectiveness of winter cover crops in reducing nitrate loads and shows that nitrate removal efficiency varies greatly with the species, timing, and extent of winter cover crop implementation. It also illustrates that nitrate exports vary with the edaphic and agronomic characteristics of the croplands on which cover crops are planted. It is therefore important to develop management guidelines that encourage optimal planting species, timing, and locations to achieve enhanced water quality benefits. This study suggests that early-planted rye is the most effective cover crop practice, with the potential to reduce nitrate loading by 67 % relative to the baseline at the watershed scale. We hypothesize that the relatively high nitrate removal efficiency of early-planted rye is due to its more rapid growth rate, especially in the early growth stage, compared to the other species. As expected, nitrate removal efficiency increased significantly with early planting of all species and with increasing extent of cover crop implementation. The study also illustrates that locations of high nitrate export were generally associated with moderately well-drained soils and agricultural fields more frequently used for corn. It would therefore be important to prioritize early-planted rye cover crops in areas with well-drained soils used for corn production.
This study also provides a new approach to calibrating winter cover crop growth parameters. Growth parameters for winter cover crops need to be carefully calibrated for the shorter day lengths and lower temperatures of winter in order to provide an accurate estimate of the nutrient uptake efficiency of cover crops. Unfortunately, limited data are currently available on winter cover crop growth and biomass at the field or landscape scale. However, this data limitation is expected to ease as the planting of winter cover crops becomes more common and monitoring programs are enhanced through the availability of no- or low-cost time series of remotely sensed data (e.g., Landsat). With multiyear cover crop biomass and growth data, the methodology presented in this paper could be extended to better calibrate growth parameters and validate winter cover crop biomass, improving the accuracy of SWAT in estimating nitrate removal efficiency by winter cover crops.

Figure 1. Geographical location of the study area (German Branch watershed, ∼ 50 km2).

Figure 3. Schematic diagram of the modeling procedure. Note: this figure shows the overall modeling procedure of the present study and summarizes which simulation results are compared at the various spatial scales. HLZ (High Loading Zones) refers to agricultural fields (HRUs) with high nitrate export potential.

Note: for the performance ratings, a indicates satisfactory, b good, and c very good. The performance rating criteria are adapted from Moriasi et al. (2009), and these statistics are computed from the monthly water quality record.

Figure 4. Observed and simulated monthly streamflows and nitrate loads during the monitoring period (1992-1995) at the watershed scale.

Figure 5. Estimation of winter cover crop biomass during the winter fallow period. Note: this figure presents monthly average total biomass (both above- and below-ground) over the simulation period for the three planting species, obtained at the field (HRU) scale. The vertical dotted lines represent the range of above-ground biomass estimates for different growing/planting days from Hively et al. (2009). The simulated total biomass lies at the upper end of the above-ground biomass estimates.

Figure 6. The 9-year average streamflow, actual evapotranspiration (ET), and nitrate loads at the watershed scale under multiple cover crop scenarios. Note: error bars (vertical lines) represent the standard deviation. The numeric value in parentheses indicates the reduction rate (RR), calculated as the relative difference between the baseline and cover crop scenario outputs: RR = (baseline − cover crop scenario) / baseline.

Figure 7. Nitrate reduction rates for varying degrees of cover crop implementation at the field scale.

Figure 8. The 8-year average nitrate leaching and delivery to waterways during the winter fallow period, assessed at the field scale under multiple cover crop scenarios. Note: DPs (direct pathways) refers to the amount of nitrate transported from agricultural fields (HRUs) to waterways by surface flow, lateral flow, and groundwater; L is nitrate leaching to groundwater. The numeric value in parentheses indicates the reduction rate (RR). As the winter cover crop growth period extends from October to March, the results presented here are based on eight years of simulation, from October 1992 to March 2000.

Figure 9. The spatial distribution of nitrate export potential from agricultural fields. Note: nitrate export potential was computed by adding the annual or monthly average amount of nitrate leached to groundwater (L) and delivered to streams by surface runoff, lateral flow, and groundwater (DPs), based on the 9-year simulation results. Estimated nitrate loads from the HRUs were classified into five groups; in the legend, M. High refers to Moderately High and M. Low to Moderately Low. The HRUs within the black circle are outliers with extremely high nitrate loadings; this area is characterized by poorly drained hydric soil ("Urban land") and consistently produces extremely high nitrate loadings across years and seasons. The white area is non-agricultural land, as shown in Fig. 2.

Table 1. List of data used in this study.

Table 2. List of calibrated parameters.

Table 3. List of cover crop scenarios.

Table 4. Agricultural practices and management scheduling for the baseline and cover crop scenarios.
The nitrate simulation had a smaller P factor value than the streamflow simulation, indicating much greater uncertainty. However, the R factor value for nitrate is smaller than that for streamflow, indicating that the 95PPU band for nitrate is narrower (Table 5).

Table 5. Model performance measures for streamflow and nitrate.