Technical Note: Demonstrating a 24/7 solution for monitoring water quality loads in rivers

Introduction
Quantifying river transfers of chemical parameters, especially those relating to suspended sediments and/or higher discharges, remains a challenge to research and regulatory bodies owing to dependencies on storm hydrology (Bowes et al., 2009; Wall et al., 2011). Phosphorus (P) and sediment, for example, tend to be removed from land in rapid diffuse transfer events via surface and near-surface hydrological pathways (Heathwaite and Dils, 2000; Deasy et al., 2009), and the magnitude and rate of change of transfer is particularly influenced by land use (urban and agricultural intensities) and landscape permeability (soil permeability, geology and prevalence of hard surfaces). Highest risk often occurs when these factors coincide with critical source areas for diffuse transfer (Gburek et al., 2002). Transfers of P are also observed in rivers during lower flows, due to municipal and domestic waste water discharges that may exhibit a diurnal pattern linked to the timing of discharge cycles (Palmer-Felgate et al., 2010). Regulatory monitoring of P is considered especially important as P is generally considered the limiting nutrient in freshwater systems. There are exceptions but, for the most part, considerable additional efforts have been made in national policies in many countries to deal with P from municipal and domestic effluents and from diffuse agricultural sources (e.g. OJEC, 1991a, b, 2000). This monitoring is often spatially rich in terms of river network coverage but temporally poor and, as storm-driven diffuse P transfers from agricultural land tend to form very significant contributions to annual loads in rivers (Greene et al., 2011), low-frequency monitoring may be insufficient to capture these transfers. This has potentially serious implications, as competent authorities are charged with implementing agricultural mitigation measures within legislation and monitoring the benefits at catchment scale (Mainstone et al., 2008). Expectations of change from these national programmes, which in many countries are based on descriptive means of (up to) monthly sampling in rivers, may then be unrealised due to sampling strategies that are inadequate to identify change associated with mitigation effects. Johnes (2007) clearly demonstrates the uncertainties in trying to collate sparse data into annual metrics of P load by decimating daily sampling from large catchments in the UK into less frequent datasets and applying interpolation and extrapolation algorithms.
Several methods have been used to overcome issues of sparse data samples, with automatic water sampling used to target storm events, or composite sampling to include a flow-weighted element to account for higher flows (Lennox et al., 1997; Jordan et al., 2001). However, targeted storm and composite sampling tends to miss interim periods where process and chemical-biological interactions may be greatest (Hilton et al., 2006). More recently, automated bankside spectrophotometric equipment has been used to monitor P on a near-continuous basis (Jordan et al., 2007; Palmer-Felgate et al., 2010; Wall et al., 2011) but, while giving temporally rich data, these systems are never likely to be spatially rich owing to capital and maintenance costs.

There is, therefore, a requirement to add complementary monitoring to national programmes that is sufficient to monitor all flow regimes in rivers. Such monitoring should account for the influences of point and diffuse P transfers, provide process information such as diurnal cycling and hysteresis patterns, respectively, and enable trajectories of change for all transfer types to be audited.
Recent work in the Plynlimon experimental catchment has used an alternative strategy based on the use of an automatic water sampler with a 24-bottle configuration set to sample on a 7 h basis (Kirchner and Neal, 2010a, b, 2011). This "24/7" configuration can be used for total chemical species and conservative solutes if the sampler is retrieved on a weekly basis.
In this paper, we evaluate the 24/7 sampling configuration and other configurations that are feasible using standard field autosampling equipment, by decimating and sampling a two-year, sub-hourly time series for discharge and P chemistry. Loads are estimated for all sample sets and the aggregated results compared across all sampling configurations.

Methods
Data were used from a hydrometric and hydrochemical monitoring station in Co. Monaghan, Ireland. The station set-up is described in detail elsewhere (Jordan et al., 2005, 2007). In summary, discharge was monitored at a rated station at 5 km² in a catchment draining grassland agriculture on drumlin soils. This landscape type is predisposed to high P transfers during storm events (Douglas et al., 2007) and high background P concentrations during low flows from scattered point sources (Arnscheidt et al., 2007). These transfers were monitored by a Dr Lange Sigmatax-Phosphax suite of instruments that samples river water and analyses TP on a 20 min time-step (three samples every hour). Data were extracted over two hydrological years, quality controlled and assessed for completeness. During 2006-2007 and 2007-2008, discharge/TP data were 100 %/94 % and 100 %/98 % complete, respectively.

Sampling was simulated by applying a numerical algorithm to generate all possible sample sets from the sub-hourly time series, based on a set of caveats for each sampling strategy, and combining both systematic and Monte Carlo approaches. For the 24 weekly samples, both a fixed 7 h interval and random timing were assessed:

1. 24 samples at 7 h intervals - sampling was initiated between 8 a.m. and 6 p.m., and restricted to Monday to Friday, to correspond to normal working hours during which an autosampler would be deployed and initiated.
2. 24 samples at random times over 7 days - the autosampler is programmed to take samples at 24 random times during the week, with each set of time stamps generated weekly and uploaded to the autosampler.
For comparison, sample sets were also generated for monthly, weekly, daily and random sampling frequencies, as described in Cassidy and Jordan (2011a, b) and summarised in Table 1.
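The decimation of the sub-hourly record into weekly "24/7" and random sample sets can be sketched as below. This is a minimal illustration, not the authors' code; the function and variable names are ours, and the record is represented simply as (minutes-since-start, concentration, discharge) tuples at the 20 min instrument time-step.

```python
import random

STEP_MIN = 20                 # recording interval of the sub-hourly record
WEEK_MIN = 7 * 24 * 60        # minutes in one week

def sample_24_7(series, start_offset_min=0):
    """24 samples per week at fixed 7 h (420 min) intervals."""
    wanted = set()
    t = start_offset_min
    end = series[-1][0]
    while t <= end:
        # snap each nominal sample time to the nearest recorded time step
        wanted.add(round(t / STEP_MIN) * STEP_MIN)
        t += 7 * 60
    return [row for row in series if row[0] in wanted]

def sample_random_weekly(series, n=24, seed=None):
    """24 samples per week at random recorded time steps (Monte Carlo draw)."""
    rng = random.Random(seed)
    picked = []
    week, bucket = 0, []
    for row in series:
        if row[0] >= (week + 1) * WEEK_MIN:
            picked += rng.sample(bucket, min(n, len(bucket)))
            week, bucket = week + 1, []
        bucket.append(row)
    if bucket:
        picked += rng.sample(bucket, min(n, len(bucket)))
    return sorted(picked)
```

In the full simulation, such draws would be repeated over every permissible start time (for the systematic strategies) or over many random seeds (for the Monte Carlo strategies) to build the distributions of load estimates.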
Loads were estimated for each sample set using a standard flux-based approach (Method 5 in Littlewood et al., 1998), also known as the first-choice Paris Commission algorithm (PARCOM, 1992), to estimate load L_E (kg) as:

$$L_E = K \,\bar{Q}_r\, \frac{\sum_{i=1}^{n} C_i Q_i}{\sum_{i=1}^{n} Q_i} \qquad (1)$$

where C_i is the instantaneous TP concentration (mg l⁻¹), Q_i is the instantaneous discharge (m³ s⁻¹), and Q̄_r is the average discharge, based on the higher-frequency discharge record over the sampling duration:

$$\bar{Q}_r = \frac{1}{N}\sum_{k=1}^{N} Q_k \qquad (2)$$

where Q_k is the recorded discharge at 20 min intervals. K is a constant which accounts for the duration of the record.
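A direct transcription of the Method 5 estimator into code may clarify the role of each term. This is our illustrative sketch under the definitions above (names are ours); it takes K as the record duration in seconds, so the result carries units of kg:

```python
# Illustrative Method 5 / PARCOM load estimator (Eq. 1); not the authors' code.

def load_method5(samples, full_record_q, interval_s=20 * 60, duration_s=None):
    """
    samples: list of (C_i in mg/l, Q_i in m3/s) pairs from a sampling strategy.
    full_record_q: the complete 20 min discharge record Q_k (m3/s).
    Returns the estimated load L_E in kg.
    """
    sum_cq = sum(c * q for c, q in samples)
    sum_q = sum(q for _, q in samples)
    q_r = sum(full_record_q) / len(full_record_q)     # mean discharge, Eq. (2)
    if duration_s is None:
        duration_s = interval_s * len(full_record_q)  # K: record duration in s
    # mg/l * m3/s * s = g; divide by 1000 to give kg
    return duration_s * q_r * (sum_cq / sum_q) / 1000.0
```

The ratio term is the flow-weighted mean concentration of the sample set; scaling it by the true mean discharge and record duration converts it to an annual mass flux.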
The load was estimated for each sample set from each sampling strategy and aggregated for comparison with the "true load", L_T, based on the sub-hourly data, which was calculated as:

$$L_T = \sum_{t=t_{\mathrm{start}}}^{t_{\mathrm{end}}} Q(t)\, C(t)\, \Delta t \qquad (3)$$

where, over a sampling period (t_start to t_end), Q(t) is the instantaneous discharge, C(t) the instantaneous concentration at sample time t, and Δt the 20 min recording interval.
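The benchmark load of Eq. (3) is simply a rectangle-rule summation over the full record. A minimal sketch, again with names of our choosing:

```python
# "True load" of Eq. (3): direct summation over the full sub-hourly record.
# The record is assumed regular at the 20 min (1200 s) instrument interval.

def true_load(record, dt_s=20 * 60):
    """record: list of (Q in m3/s, C in mg/l) pairs; returns L_T in kg."""
    # m3/s * mg/l * s = g per step; divide by 1000 for kg
    return sum(q * c * dt_s for q, c in record) / 1000.0
```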

Results and discussion
The several hundred datasets generated by simulation and applied to the flux-based load algorithm (Eq. 1) were plotted as loads in box-whisker plots and compared with the "true load" of the sub-hourly data (Eq. 3) (Fig. 1a and b).
The true load was calculated as 1608 kg in 2006-2007 and 1880 kg in 2007-2008. In general, all sampling strategies gave variable estimates of TP load when used with the algorithm. However, as might be anticipated, the spread of this variation was greater with decreased sample frequency and smaller with increased frequency. Random (10 samples) and monthly sample estimates were highly inaccurate, with 25th to 75th interquartile ranges well below the true load calculations; only the maximum estimates within the datasets were over-estimates, and excessively so in 2006-2007.
Accounting for storm events by triggering sampling when Q < 10th percentile increased the median load and produced an improvement on the standard weekly strategies, with greatest effect in 2006-2007. Random approaches exhibited a greater range in estimated loads, which is most likely an artefact of the clustering inherent in randomly distributed points compared with uniform intervals. Daily sampling, while resulting in a smaller range of estimated loads compared with the lower-frequency approaches, still underestimated load, with the interquartile range (IQR) between 72 and 88 % of the true load for 2006-2007 and 91 to 95 % for 2007-2008.
The improved load estimates using the 24/7 approach are directly attributable to the increased probability of capturing short-term fluctuations in concentration. Storm events and diffuse P transfers are often of hours' rather than days' duration and have a much higher sampling probability at 7 h intervals (Fig. 2a). Also apparent in the 24/7 sub-sample datasets was a representation of non-storm discharge periods, with important diurnal processes relating to point P transfers that were absent in the daily sampling (Fig. 2b). Also inherent in sampling at the same time daily is the risk of coinciding with diurnal discharge from agricultural or wastewater treatment works and therefore either under- or overestimating the low-flow concentration by coinciding with either a peak or trough in the cycle (Fig. 2b).
The scale of the river catchment and landscape type used in this study (flashy hydrology with a low baseflow index, high-magnitude diffuse P transfers and high-frequency point source signals) is typically a challenge for most sampling strategies not based on near-continuous data collection. Most of the total annual load can result from higher-concentration, short-duration events, for which sampling probabilities decrease as a power law with increasing concentration (see Cassidy and Jordan, 2011b, for a discussion). Sampling as much of the concentration range as possible is desirable, and the improved coverage provided by 24/7 sampling compared with daily sampling is demonstrated by examining the proportion of the concentration range covered by the sample sets for each strategy (Fig. 3). It is a scale, however, that is useful in monitoring the influences of changed policy expediencies towards catchment management, such as the Nitrates Directive and other programmes of measures linked to the EU Water Framework Directive (Wall et al., 2011). This landscape type is also a useful benchmark to demonstrate how TP patterns are linked specifically to source, viz. point, diffuse and incidental (Jordan et al., 2007), and where these patterns, independent of annual load, change according to specific mitigation measures. The attractiveness of using the 24/7 approach, for conservative analytes at least, is that it can be achieved using low-technology, off-the-shelf autosampling equipment at existing hydrometric stations; ISCO- or Sigma-type samplers with a 24-bottle configuration are ubiquitous items of equipment. Personnel requirements and laboratory resources are also easily timetabled on a once-per-week basis (C. Neal, personal communication, 2011) and, if TP load is preferred over pattern, the 24 samples can be composited in the laboratory according to flow-weighted volumes using weekly hydrometric data.
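The flow-weighted compositing mentioned above can be sketched as follows: draw an aliquot from each bottle whose volume is proportional to the discharge at that bottle's sample time, so the composite concentration approximates the flow-weighted mean. This is our illustration of the idea, not a documented laboratory protocol; names are ours.

```python
# Flow-weighted compositing sketch: aliquot volumes proportional to the
# discharge recorded at each bottle's sample time.

def composite_aliquots(bottle_discharges, total_volume_ml=1000.0):
    """Return the aliquot volume (ml) to draw from each of the 24 bottles."""
    total_q = sum(bottle_discharges)
    return [total_volume_ml * q / total_q for q in bottle_discharges]
```

Analysing the single composite then yields one flow-weighted concentration per week, which multiplied by the weekly discharge volume gives the weekly load.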
There are, however, some issues related to extreme events and non-conservative analytes. The 24/7 approach only gives scope for three samples every 24 h and this, in some very small catchments, may not be enough to sample extreme storm events that can dominate the annual transfers of P and sediment. It is here, even for less extreme events, that the uncertainty in the box-whisker plots in Fig. 1 is generated during the higher flow events. Again, this may be less of an issue as scale and catchment buffering increase at larger national monitoring sites. Extreme event sample capture can also be accommodated, for example, by once-off passive (rising limb) storm sample collection (e.g. Burton, 2010) set above normal high water stages and integrated into the 24/7 datasets once triggered. The non-conservative nature of some analytes is, however, not so readily accommodated. Total reactive P and other soluble P fractions are possibly not amenable to being left in sample bottles for up to 7 days (Haygarth et al., 1995), and possibly the best that can be achieved is for samples from the last three bottles to be processed immediately on return to the laboratory. Other conservative solutes and sediments are easily accommodated (Kirchner et al., 2011).

Conclusions
- A two-year, sub-hourly dataset of synchronous discharge and P concentration from a flashy 5 km² catchment was decimated into artificial datasets of coarser resolution, using a numerical algorithm to generate sample sets both systematically and using a Monte Carlo approach.
-Sample sets were collated into annual P loads using a standard flux-based algorithm based on metrics of instantaneous discharge and concentration, and average discharge over the sample duration.
- In this catchment, with examples of point and diffuse P transfer, daily sampling tended to underestimate load, with an IQR between 72 and 88 % of the "true load", and failed to reveal important sub-daily transport patterns.
- This sample design is easily implemented and is likely to improve the coverage of all metrics (annual load, sub-daily patterns, etc.) as catchment scale increases, with, for example, hydrological buffering increasing to higher baseflow indices and the interaction of multiple P sources diminishing.
- Implemented as a complementary part of forward national WFD monitoring, the 24/7 solution is likely to be a parsimonious and cost-effective compromise between spatially rich routine monthly sampling and temporally rich fully automated bankside analysis.

Fig. 2.
Fig. 2. Representation of a period of discharge and diffuse P transfer during a storm event (a), with sub-hourly data (solid line), 7-hourly sampling (open circles) and daily sampling (closed triangles). Seven-hourly sampling is more likely to capture important times of hydrograph and chemograph development. Also, (b) important diurnal signals during point source P transfers are better represented using the 7-hourly sampling and missed with daily sampling which, by coinciding with a daily peak in the cycle, may overestimate low-flow concentrations.