Reducing epistemic errors in water quality modelling through high-frequency data and stakeholder collaboration: the case of an industrial spill
Abstract
Catchment management, as driven by legislation such as the EU WFD or grassroots initiatives, requires the apportionment of in-stream pollution to point and diffuse sources so that mitigation measures can be targeted and costs and benefits shared. Source apportionment is typically done via modelling. Given model imperfections and input data errors, it has become state-of-the-art to employ an uncertainty framework. However, what is not easily incorporated in such a framework, and currently much discussed in hydrology, are epistemic uncertainties, i.e. those uncertainties that relate to a lack of knowledge about processes and data. For example, what if an otherwise negligible source suddenly matters because of an accidental pollution incident? In this paper we present such a case of epistemic error, an industrial spill ignored in a water quality model, demonstrate the bias of the resulting model simulations, and show how the error was discovered somewhat incidentally through auxiliary high-frequency data and finally corrected through the collective intelligence of a stakeholder network. We suggest that accidental pollution incidents like this are a widespread, though largely ignored, problem. Hence our discussion will reflect on the practice of catchment monitoring, modelling and management in general. The case itself occurred as part of ongoing modelling support in the Tamar catchment, one of the priority catchments of the UK government's new approach to managing water resources in a more decentralised and collaborative manner. An Extended Export Coefficient Model (ECM+) had been developed with stakeholders to simulate transfers of nutrients (N & P), sediment and faecal coliforms from land to water and down the river network as a function of sewage treatment options, land use, livestock densities and farm management practices.
In the process of updating the model for the hydrological years 2008-2012, an over-prediction of the annual average P concentration by the model was found at one sub-catchment outlet compared to high-frequency measurements at this point that had become available through another UK government initiative, the Demonstration Test Catchments. This discrepancy had gone unnoticed when calibrating the model in a probabilistic framework against the statutory monitoring data, due to the high uncertainties associated with their low-frequency monitoring regime. According to these data, what turned out to be an over-prediction seemed possible, albeit with low probability. It was only through the well-established contacts with the local stakeholders that this anomaly could be connected to an industrial spill elsewhere in the catchment, and the model eventually corrected for this additional source. Failing to account for this source would have resulted in drastic over-estimation of the contributions of other sources, in particular agriculture, and eventually in the wrong targeting of catchment restoration funds and collateral damage to stakeholder relations. The paper will conclude with a discussion of the following general points: the pretence of uncertainty frameworks in the light of epistemic errors; the value of high-frequency data; the value of stakeholder collaboration, particularly in the light of sharing sensitive information; the (somewhat incidental) synergies of various pieces of information and policy initiatives.
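The source-apportionment bias described above can be illustrated with a minimal export coefficient model sketch. All coefficients, areas and loads below are invented for illustration only and do not reflect the actual ECM+ or the Tamar catchment; the sketch only shows how omitting a point source (the spill) inflates the apparent share attributed to diffuse agricultural sources.

```python
# Minimal export coefficient model (ECM) sketch.
# Annual diffuse load = sum_i(E_i * A_i), where E_i is the export
# coefficient (kg P/ha/yr) of land use i and A_i its area (ha).
# All numbers are hypothetical, for illustration only.

export_coeffs = {"arable": 1.2, "grassland": 0.6, "woodland": 0.1}  # kg P/ha/yr
areas = {"arable": 4000, "grassland": 9000, "woodland": 2000}       # ha

diffuse_load = sum(export_coeffs[u] * areas[u] for u in areas)      # kg P/yr

sewage_load = 1500.0   # known point source, kg P/yr (invented)
spill_load = 2000.0    # the initially unaccounted industrial spill (invented)

def shares(loads):
    """Fractional contribution of each source to the total load."""
    total = sum(loads.values())
    return {source: load / total for source, load in loads.items()}

# Apportionment without and with the spill accounted for:
without_spill = shares({"agriculture": diffuse_load, "sewage": sewage_load})
with_spill = shares({"agriculture": diffuse_load, "sewage": sewage_load,
                     "spill": spill_load})

# Agriculture's apparent share is inflated when the spill is ignored:
print(f"agriculture share, spill ignored:   {without_spill['agriculture']:.2f}")
print(f"agriculture share, spill included:  {with_spill['agriculture']:.2f}")
```

With these invented numbers, ignoring the spill attributes a noticeably larger fraction of the in-stream load to agriculture, which is the mechanism by which restoration funds would have been mis-targeted.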
- Publication: EGU General Assembly Conference Abstracts
- Pub Date: May 2014
- Bibcode: 2014EGUGA..16.7240K