This study proposes a framework to estimate the uncertainty of hydrodynamic models on floodplains.
The traditional floodplain resistance formula of

Flow resistance can be considered as the contribution of four components, according to

There are several approaches available in the literature for determining flow resistance coefficients of vegetated floodplains in numerical models.
These approaches can be broadly divided into four categories, depending on whether the vegetation is treated as rigid or flexible and as emergent or submerged.
They aim to determine the resistance exerted by the vegetation on the flow based on physical properties such as vegetation height and width, stem diameter and density, etc.
Recent research on flow resistance of emergent floodplain vegetation is given in

Even though much work has been done in applying different approaches to include vegetation-induced resistance effects in hydrodynamic calculations, the majority of these studies were verified only under laboratory-scale conditions.
A gap between those results and river engineering projects still exists.
While free surface information on flooded areas can be well approximated from river channel measurements, flow velocity cannot.
Because floodplain measurements are usually not available, model performance in those areas is often neglected.
Consequently, when flood scenarios fall within the scope of a project or study, this issue deserves particular attention.
A way to address this problem is to consider a probabilistic approach and to carry out an uncertainty quantification (UQ) of the floodplain friction.
Uncertainty in the context of fluid dynamics is defined as a potential deficiency of the simulation process, according to

Some studies can be found in the literature that involve the estimation of uncertainty related to floodplains and the resistance coefficient.

In this context, a framework to estimate the uncertainty of hydrodynamic models on floodplains due to vegetation is proposed in the current study. Among the many available floodplain friction methods there is still no generally accepted choice for large-scale applications; the outcomes of the uncertainty quantification can therefore help to identify a well-suited friction method for practical use. A two-dimensional hydrodynamic model is calibrated with floodplain friction formulations, with which input parameter uncertainties are associated based on a practical range of variation. After defining the variation ranges for the sensitive input parameters, the UQ is carried out with different methods for comparison. In other words, the quantification of uncertainties in the simulated flow velocity is addressed by considering uncertainties in the floodplain friction parameters.

In the next section four chosen floodplain resistance formulae are described and analysed. Then the concept of uncertainty quantification is briefly explained and three different methods are presented in the third part. The fourth section provides information on the case study, including a brief description of the hydrodynamic model, the parameters used for model calibration and the uncertainties defined for the selected input parameters needed for carrying out the analysis. In the fifth section the results are presented and discussed, from which conclusions are drawn in the last part of the paper.

Vegetation found on river banks and floodplains strongly influences the flow velocity profile and, therefore, the hydraulic roughness.
Current research aims to relate vegetated floodplain properties to their “hydraulic signatures” and to incorporate the complex nature of vegetation characteristics into floodplain friction models.
According to

For the current study four out of several vegetation friction formulations

The modified formulation from

The approach from

The approach from

The approach from

From now on the presented floodplain friction formulations will be referred to as LIND, BAPT, JAER and BATT, respectively.
The formulae will be analysed in terms of the total Darcy–Weisbach friction factor calculated as

In LIND and BAPT there is a direct dependency between the term

Numerical models represent only an approximation of the observed process.
The measured difference between the model and the observation can be considered either as an error or as an uncertainty.

Uncertainty quantification aims to describe the system reliability by combining the uncertainties in the basic components (variables) of the system.
The framework of the numerical model used to represent the system characterizes the interactions of the basic components.
The overall response of the system is described by the performance function

The analysis yields the combined effect of all input variables that significantly contribute to the performance function.
The results can be represented in terms of

In the current study

Three probabilistic methods are chosen for the UQ: first-order second-moment, Monte Carlo (MC) and metamodelling. The first method is based on the method of moments and requires the calculation of the model sensitivities (first-order derivatives). The MC method requires the simulation of a large number of random experiments and is the most expensive in terms of computing time. The metamodelling method is based on random experiments (MC) with the benefit that it requires far fewer samples. Polynomial chaos (PC) is a type of metamodelling technique, which is chosen for the present study. Further details on each method will be given in the following sections.

Moment method approximations are obtained from the truncated Taylor series expansion about the expected value of the input parameter.
The first-order second-moment (FOSM) method uses the first-order terms of the series and requires up to the second moments of the uncertain input variables for estimating the output variance of a system.
The variance of the performance function

It should be noted that the FOSM method is suitable as long as (a) the input variables are statistically independent and (b) the linearity assumption is valid, i.e. the first-order approximation is sufficient to describe the sensitivity of the system.
If
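The FOSM propagation described above can be sketched as follows, with the model sensitivities approximated by central finite differences. The `velocity` function and its parameter values are hypothetical stand-ins for the hydrodynamic model, not the formulations used in this study.

```python
import numpy as np

def fosm_variance(g, mean, std, rel_step=1e-4):
    """First-order second-moment (FOSM) estimate of the output variance.

    g    : performance function taking a 1-D parameter vector
    mean : expected values of the (statistically independent) inputs
    std  : standard deviations of the inputs
    """
    mean = np.asarray(mean, dtype=float)
    std = np.asarray(std, dtype=float)
    var = 0.0
    for i in range(mean.size):
        h = rel_step * max(abs(mean[i]), 1.0)   # central-difference step
        xp, xm = mean.copy(), mean.copy()
        xp[i] += h
        xm[i] -= h
        dg_dxi = (g(xp) - g(xm)) / (2.0 * h)    # first-order sensitivity
        var += (dg_dxi * std[i]) ** 2           # independent inputs: variances add
    return var

# Hypothetical stand-in for the hydrodynamic model: flow velocity as a
# function of two friction-related parameters (illustration only).
def velocity(x):
    return 2.0 / np.sqrt(x[0] * x[1])

sigma = np.sqrt(fosm_variance(velocity, mean=[0.05, 1.2], std=[0.005, 0.1]))
```

For a linear performance function the estimate is exact, which provides a simple check of the implementation.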

Monte Carlo simulation is a probabilistic method based on a very large number of similar random experiments. Problems that cannot be solved analytically, or only with great difficulty, are tackled with the help of probability theory. The law of large numbers is one of the main pillars of the method. The random experiments can be carried out in computer calculations, in which (pseudo-)random numbers generated with suitable algorithms simulate the random events.

The basic steps of a MC method can be described as follows:

1. Sample the input random variables
2. Calculate the deterministic output
3. Determine the statistics of the distribution of

Step (2) should be repeated
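The steps above can be sketched as follows; the `velocity` function, parameter names and distributions are hypothetical stand-ins, chosen only to illustrate the sampling workflow.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical stand-in for one deterministic model run: flow velocity as a
# function of two friction-related parameters (illustration only).
def velocity(n, d):
    return 2.0 / np.sqrt(n * d)

n_samples = 10_000

# Step 1: sample the input random variables from their assumed distributions.
friction = rng.uniform(0.04, 0.06, n_samples)   # assumed uniform range
diameter = rng.normal(1.2, 0.1, n_samples)      # assumed normal distribution

# Step 2: one deterministic model evaluation per sample (repeated n_samples times).
v = velocity(friction, diameter)

# Step 3: statistics of the output distribution.
mean, std = v.mean(), v.std(ddof=1)
lo, hi = np.percentile(v, [2.5, 97.5])          # 95 % prediction interval
```

In practice each sample requires a full hydrodynamic simulation, which is what makes the plain MC method so expensive.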

Metamodelling attempts to offset the increased cost of probabilistic modelling by replacing the expensive evaluation of model calculations with a cost-effective evaluation of surrogates.
Polynomial chaos is a powerful metamodelling technique that aims to provide a functional approximation of a computational model through its spectral representation of uncertainty based on polynomial functions.
A more detailed introduction to the PC method can be found in

Spectral-based methods allow for an efficient stochastic reduced basis representation of uncertain parameters in numerical modelling.
By discretizing the input random quantities with a truncated expansion, it is possible to reduce the order of complexity of the system.
Let us consider the uncertain variable

In this study, the non-intrusive polynomial chaos (NIPC) method will be considered.
The main objective of this method is to obtain the polynomial coefficients without modifying the original model.
This approach considers the deterministic model as a “black box” and approximates the polynomial coefficients based on model evaluations.
The advantage is that this method requires far fewer evaluations of the original model than MC (typically at least an order of magnitude fewer) to provide reliable results.
The main disadvantage is that it introduces an additional approximation into the modelling framework, thus leading to a further loss of information about the physical process.
The reader is referred to
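A minimal sketch of the non-intrusive (point-collocation) idea, assuming a single uncertain input rescaled to the interval [-1, 1] and a one-dimensional toy model in place of the hydrodynamic code: the coefficients of a Legendre polynomial expansion are fitted by least squares from a handful of black-box evaluations, after which the surrogate can be sampled at negligible cost.

```python
import numpy as np
from numpy.polynomial import legendre

rng = np.random.default_rng(0)

# Black-box stand-in for the hydrodynamic model; the single uncertain input
# is assumed uniform and rescaled to [-1, 1], matching Legendre polynomials.
def model(xi):
    return np.exp(0.5 * xi) + 0.1 * xi**2

degree = 4
n_train = 20                                  # a handful of expensive runs
xi_train = rng.uniform(-1.0, 1.0, n_train)
y_train = model(xi_train)

# Non-intrusive step (point collocation): least-squares fit of the expansion
# coefficients from model evaluations; the model itself is never modified.
coeffs = legendre.legfit(xi_train, y_train, degree)

# The surrogate is now cheap enough to sample exhaustively.
xi_mc = rng.uniform(-1.0, 1.0, 100_000)
y_surrogate = legendre.legval(xi_mc, coeffs)
mean_hat, var_hat = y_surrogate.mean(), y_surrogate.var()
```

The statistics are then obtained from the surrogate samples instead of from repeated runs of the original model.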

The current study focuses on a reach of the river Rhine used for numerical tests by the German Federal Waterways Engineering and Research Institute (BAW).
It is an 11

Lower Rhine river topography and numerical model boundaries (red polygon) nearby Düsseldorf

A numerical model is used to simulate the flood scenario.
In the BAW, studies carried out in large-scale river projects (

The friction coefficient (

The model consists of an unstructured triangular mesh composed of 56 825 points and 112 360 elements.
The resolution varies from about 2.5

This numerical model was extensively investigated from the point of view of sediment transport and morphodynamics

Water level difference along the river channel axis for different floodplain friction values

The calibration of the floodplain friction with water level measurements can be achieved either with (a) the traditionally used Lindner–Pasche friction law (Eq.

The reader may ask which approach should be used.
In this case it is useful to compare the absolute difference of the flow velocity with and without the floodplain friction formulation (see Fig.

An alternative to the deterministic approach in such situations, when there is a potential shortcoming in the modelling process due to a lack of information, is to carry out a UQ.
As explained in Sect.

The next step now is to calibrate the remaining floodplain friction formulations with water level measurements.
In order to make a comparison to the LIND approach, first

Floodplain friction parameter values calibrated under emergent conditions (

Floodplain friction parameter values calibrated under submerged conditions (

For the UQ, all parameters to which the model results are sensitive should be considered for the determination of the prediction intervals (PIs).
Once the parameters are chosen, a very important step follows: an error or deviation should be carefully assigned to each parameter.
This variation should be small enough to be treated as an error, but large enough to cover the actual parameter uncertainty (due to the lack of knowledge).
Unfortunately, there is no general rule for choosing a proper value, since different aspects may contribute, for example measurement accuracy, spatial/temporal variability and the numerical representation of the process.
In the current study, the chosen variations for the parameters related to the vegetation species (

Input parameter deviations and ranges.
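The translation of such deviations into sampling ranges can be sketched as follows; the parameter names, nominal values and relative deviations below are hypothetical placeholders, not the values used in this study.

```python
# Hypothetical nominal values and relative deviations for vegetation-related
# friction parameters (placeholders, not the values used in the study).
params = {
    "stem_diameter_m": (0.03, 0.20),   # (nominal value, relative deviation)
    "stem_spacing_m":  (0.50, 0.20),
    "veg_height_m":    (1.50, 0.30),
}

def uniform_bounds(nominal, rel_dev):
    """Uniform sampling range centred on the nominal value."""
    return nominal * (1.0 - rel_dev), nominal * (1.0 + rel_dev)

ranges = {name: uniform_bounds(*spec) for name, spec in params.items()}
```

These ranges then define the input distributions from which the UQ methods draw their samples.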

Finally, the UQ methods presented in Sect.

The numerical model was evaluated with the four floodplain friction formulations.
A constant discharge of 7870

Scatterplots of flow velocity results against floodplain friction input parameters at node 13470.

The scatterplots, together with the standardized regression coefficients (SRCs), are presented in Fig.
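A minimal sketch of how SRCs can be computed from such a sample, assuming a linear regression of the output on the inputs (SRC_i = b_i σ_xi / σ_y); the toy model below is illustrative only. For a purely linear model the squared SRCs sum to approximately R² = 1.

```python
import numpy as np

def src(X, y):
    """Standardized regression coefficients (SRCs) of outputs y on inputs X.

    X : (n_samples, n_params) sampled input parameters
    y : (n_samples,) corresponding model outputs
    """
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    A = np.column_stack([np.ones(len(y)), X])           # intercept + inputs
    b = np.linalg.lstsq(A, y, rcond=None)[0][1:]        # regression slopes
    return b * X.std(axis=0, ddof=1) / y.std(ddof=1)    # standardize

rng = np.random.default_rng(1)
X = rng.uniform(0.0, 1.0, size=(500, 2))
y = 3.0 * X[:, 0] - 1.0 * X[:, 1]    # toy linear model (illustration only)
s = src(X, y)
# The first input dominates: |s[0]| >> |s[1]|.
```

A large |SRC| identifies the input parameters that contribute most to the output variance, which is how the dominant friction parameters can be ranked.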

Prediction interval of flow velocity under emergent conditions (rows: friction formulations; columns: UQ methods).

In Fig.

Prediction interval of flow velocity under submerged conditions (rows: friction formulations; columns: UQ methods).

As explained in Sect.

Risk analysis of flow velocity for threshold

Figures

When compared to emergent conditions the overall uncertainty of submerged conditions is significantly larger.
This is an expected result in UQ as there is an additional input (vegetation height

An important topic, not only for UQ but for numerical simulation in general, is the definition of the input uncertainty.
When performing a numerical simulation based on physical processes, one will eventually need to validate the calculations against measurements.
Initial and boundary conditions are also usually based on measurements of the original process.
One should therefore know a priori how accurate the available measurements are.
This is usually not a trivial task, since measurement errors may not be easily evaluated

A framework for the estimation of uncertainties of hydrodynamic models on floodplains was presented.
A traditional resistance formula used for river modelling together with three more recent approaches to floodplain friction were considered for carrying out an uncertainty quantification.
The analysis was performed by means of three different methods: traditional MC, FOSM and NIPC (metamodelling).
A two-dimensional model of a 10

From the scatterplots it was identified that the permeability coefficient (

The three UQ methods compared gave similar results, which means that FOSM, as the cheapest of the three, is the most efficient in this case.
Despite being a very simple method to apply, FOSM will only produce good results when the first-order approximation is sufficient to describe the sensitivity of the system.
In the presented study this was the case, probably because all the chosen inputs are directly correlated to the resistance coefficient.
Research on related topics such as floodplain mapping usually relies on Monte Carlo-based methods for the analysis of uncertainties

Model simulation and calibration data are available upon request from the corresponding author. Digital elevation model data are provided by the German Federal Waterways and Shipping Administration and are available under the terms of use for the provision of German federal geodata at

GLD designed the methodology and carried out the investigation with cooperation from all co-authors. TBZ provided the original model input data. RK contributed to the application of uncertainty quantification methods. GLD prepared the first draft of the manuscript, which was then revised and improved by the co-authors.

The authors declare that no competing interests are present.

The authors would like to thank the EDF researcher Cédric Goeury for his help with the metamodel set-up, and Audrey Valentine for her valuable comments in revising this paper. Moreover, we thank Renata Romanowicz and an anonymous referee for their coherent and constructive reviews.

This paper was edited by Giuliano Di Baldassarre and reviewed by Renata Romanowicz and one anonymous referee.