Integrated environmental modelling (IEM) is a burgeoning science area that aims to solve complex environmental problems. One of the main ways this is achieved is by linking components, such as process models. To do this, a number of scientific and technical challenges need to be addressed. These include developing standards-based methods to link components as easily as possible, and quantifying uncertainty once models have been linked.
Cultural change is also required: scientists must accept an approach that involves significant collaboration and a more rigorous model-development process. The techniques for linking models also need to be made easier to use. Decision-makers, who could use this approach to improve the assessment of their choices, need to be educated about it. Significant effort therefore needs to be spent on developing and promoting exemplars that demonstrate the utility of IEM and allow interaction with likely users.
The challenges are many and varied, some unique to IEM and some not. They include, but are not limited to:
By the end of March 2014, the BGS integrated modelling and uncertainty team aims to:
There are a number of different sources of uncertainty in any modelling activity: input data, conceptualisation, parameterisation and the representation of processes, to name a few. However, since integrated modelling involves the linking of different models, it is important to quantify the propagation of uncertainty through the model chain. The science behind doing this is, as yet, fairly under-developed. This task is therefore aimed at developing and testing methodologies to quantify uncertainty in linked model systems.
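The idea of propagating uncertainty through a model chain can be sketched with a simple Monte Carlo approach: sample the uncertain inputs and parameters, run the whole chain once per sample, and summarise the spread of the final output. The two "models" below are illustrative placeholders (a toy rainfall-runoff model feeding a toy recharge model), not any specific BGS component.

```python
import random
import statistics

def runoff_model(rainfall_mm, coeff):
    """Toy rainfall-runoff model: runoff is a fixed fraction of rainfall."""
    return coeff * rainfall_mm

def recharge_model(runoff_mm, infiltration_frac):
    """Toy recharge model: a fraction of runoff infiltrates to the aquifer."""
    return infiltration_frac * runoff_mm

def propagate(n_runs=10_000, seed=42):
    """Monte Carlo propagation: sample uncertain quantities, run the
    full model chain for each sample, and collect the final outputs."""
    rng = random.Random(seed)
    outputs = []
    for _ in range(n_runs):
        rainfall = rng.gauss(100.0, 15.0)   # uncertain input data
        coeff = rng.uniform(0.3, 0.5)       # uncertain parameter of model 1
        infil = rng.uniform(0.1, 0.3)       # uncertain parameter of model 2
        outputs.append(recharge_model(runoff_model(rainfall, coeff), infil))
    return outputs

results = propagate()
print(f"mean recharge: {statistics.mean(results):.1f} mm")
print(f"std dev:       {statistics.stdev(results):.1f} mm")
```

The key point the sketch illustrates is that each uncertain quantity is sampled *before* the chain runs, so correlations and non-linear interactions between the linked components are captured in the output distribution, rather than combining per-model uncertainties after the fact.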
The treatment of uncertainty within environmental models is well established, with extensive literature available (e.g. Beven, 2009). Since integrated modelling is a relatively new field, there is not yet a consensus on the approach to be used to quantify uncertainty for a system of linked models. There is, however, a relatively mature literature on the quantification of uncertainty for integrated assessment (IA). Work by Rotmans and van Asselt (2001) suggests an approach that assesses uncertainty taking into account the needs of the policy-maker, using a range of methods (e.g. standard approaches such as sensitivity analysis and probability-based methods, as well as less-used approaches such as hedging-oriented methods). There have also been assessments of uncertainty in a cascade of models (e.g. Pappenberger et al., 2005), and the recognition that increasing the number of models being evaluated results in a huge increase in the number of simulations undertaken. To reduce the number of runs required to assess uncertainty, 'sub-sampling' can be used intelligently. In addition, initiatives have been undertaken to quantify uncertainty in linked modelling systems (e.g. FRAMES-3MRA; Babendreier and Castleton, 2005, and the related supercomputing initiative SuperMUSE).
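One common form of intelligent sub-sampling is Latin hypercube sampling, which stratifies each parameter's range so that far fewer runs are needed than with naive random sampling. A minimal sketch, using only the standard library and working on the unit hypercube (parameter values would then be rescaled to their real ranges):

```python
import random

def latin_hypercube(n_samples, n_params, seed=0):
    """Latin hypercube sample on the unit hypercube: each parameter's
    [0, 1) range is split into n_samples equal strata, one point is
    drawn from each stratum, and strata are paired across parameters
    by random shuffling so the full range of every parameter is
    covered with only n_samples model runs."""
    rng = random.Random(seed)
    columns = []
    for _ in range(n_params):
        # one random point within each of the n_samples strata
        strata = [(i + rng.random()) / n_samples for i in range(n_samples)]
        rng.shuffle(strata)
        columns.append(strata)
    # transpose: one row (parameter set) per model run
    return [tuple(col[i] for col in columns) for i in range(n_samples)]

samples = latin_hypercube(n_samples=50, n_params=3)
```

With 50 runs, every parameter's range is sampled evenly, whereas a full factorial design over 50 levels of 3 parameters would need 125,000 runs — this is the kind of saving that makes uncertainty assessment of linked model chains tractable.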
Beven (2007) proposes an approach to dealing with uncertainty in complex, all-encompassing models to avoid 'uncertainties of model predictions and the consequential risks of potential outcomes'. He recognises the role of the Water Framework Directive in promoting the use of holistic approaches (such as OpenMI), and views data as including model predictions. The suggested approach is NUSAP (numeral, unit, spread, assessment and pedigree), an RIVM initiative developed as a result of a public debate on the failure to understand uncertainty. NUSAP is a quantitative approach to assessing the degree of belief in a predictive method, based on past experience of its use and the data on which it is based. It allows the uncertainties in all parts of the process to be dealt with, especially when the science behind the method is controversial, or the science itself is uncertain. It is an approach that properly engages with stakeholders at all stages, to ensure that the uncertainty in the predictions is fully appreciated by those stakeholders, including policy-makers.
Babendreier, J E and Castleton, K J. 2005. Investigating uncertainty and sensitivity in integrated, multimedia environmental models: tools for FRAMES-3MRA. Environmental Modelling & Software 20: 1043–1055.
Beven, K. 2007. Towards environmental models of everywhere: uncertainty, data and modelling as a learning process. Hydrol. Earth Syst. Sci. 11(1): 460–467.
Beven, K. 2009. Environmental Modelling: An Uncertain Future. Routledge, Abingdon, UK.
Pappenberger, F, Beven, K, Hunter, N M, Bates, P D, Gouweleeuw, B T, Thielen, J, and de Roo, A J P. 2005. Cascading model uncertainty from medium range weather forecasting (10 days) through rainfall-runoff models to flood inundation predictions using European Flood Forecasting System (EFFS). Hydrol. Earth Syst. Sci. 9(4): 381–393.
Rotmans, J and van Asselt M B A. 2001. Uncertainty in integrated assessment modelling: A labyrinthic path, Integrated Assessment 2: 43–55.
Contact Dr Andrew Hughes for more information.