Averaging over an ensemble of model integrations reduces the statistical noise, so that trends in the noisy weather signal can be detected even for short periods (10-20 years) and small regions. At the same time, the large amount of data obtained from an ensemble makes it possible to determine weather extremes, and their possible changes, more accurately.
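The principle can be illustrated with a minimal sketch (hypothetical numbers, not ESSENCE output): synthetic ensemble members share a prescribed trend but carry independent interannual noise, and because the noise in the ensemble mean shrinks roughly as one over the square root of the ensemble size, the trend fitted to the mean converges to the prescribed one.

```python
import numpy as np

# Hypothetical illustration, not ESSENCE output: a weak trend in noisy data.
rng = np.random.default_rng(0)
n_members, n_years = 17, 20      # ensemble size and period length (years)
true_trend = 0.02                # assumed forced trend, K per year
noise_sd = 0.5                   # assumed interannual variability, K

years = np.arange(n_years)
# Each member: the same forced trend plus an independent "weather" realisation.
members = true_trend * years + rng.normal(0.0, noise_sd, (n_members, n_years))

# Least-squares trend of a single member versus that of the ensemble mean;
# the noise in the mean is reduced by roughly sqrt(n_members).
single = np.polyfit(years, members[0], 1)[0]
ens = np.polyfit(years, members.mean(axis=0), 1)[0]
print(f"prescribed trend : {true_trend:.3f} K/yr")
print(f"single member    : {single:.3f} K/yr")
print(f"ensemble mean    : {ens:.3f} K/yr")
```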
Together with the Institute for Marine and Atmospheric Research of Utrecht University (IMAU), KNMI has performed a follow-up to the successful Challenge project1) to address these questions. In ESSENCE the ECHAM5/MPI-OM climate model of the Max-Planck-Institute for Meteorology2) was used, because it performed well on a number of criteria in an intercomparison3) of the models considered in the IPCC Fourth Assessment Report (AR4). All runs were driven by the A1B scenario of 21st-century greenhouse gas concentrations from the IPCC Special Report on Emissions Scenarios (SRES). Besides a basic ensemble of 17 runs, three experimental ensembles were performed.
Figure 1 shows the annual-mean 2m temperature averaged globally (a) and at station De Bilt in the Netherlands (b) for the 17 members of the basic ensemble, together with their mean and the observations (the HadCRUT3 dataset4) for the global mean, station data for De Bilt). Although the ensemble spread of the global-mean temperature is fairly small (about 0.4 K), it encompasses the observations very well. Between 1950 and 2005 the observed global-mean temperature increased by 0.65 K, while the ensemble mean gives an increase of 0.76 K. The discrepancy arises mainly from the period 1950-1965; after that, observed and modelled temperature trends are nearly identical. This gives confidence in the model's sensitivity to changes in greenhouse gas (GHG) concentrations.
On the local scale (Figure 1b) the variability is obviously much larger than for global-mean values. This is reflected both in the large spread of the ensemble and in the large year-to-year variability of the observed temperatures. Both spreads are comparable, as are modelled and observed trends.
The enhanced signal-to-noise ratio achieved by averaging over all members of a large ensemble makes it possible to detect significant trends over short periods (10-20 years). For near-surface temperature the earliest detection times (Figure 2) are found off the equator in the western parts of the tropical oceans, as well as in the Middle East. In these regions the trend is only modest, but the variability is extremely low, so that the signal emerges from the noise as early as around 2005. A second region with such an early detection time is the Arctic, where the trend is very large due to the decrease of the sea ice. Over most of Eurasia and Africa detection is possible before 2020. The longest detection times are found in the equatorial Pacific, where the variability is very high due to El Niño, as well as in the Southern Ocean and the North Atlantic, where the trend is very small (not shown).
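The detection criterion is not spelled out here; a common choice, sketched below under that assumption, is to take the first year from which the ensemble-mean anomaly exceeds, and keeps exceeding, twice the internal-variability standard deviation estimated from the ensemble spread (all numbers hypothetical).

```python
import numpy as np

def detection_year(ens_mean, sigma, years, k=2.0):
    """First year from which the ensemble-mean anomaly stays above k*sigma.

    ens_mean : ensemble-mean anomaly per year
    sigma    : internal-variability standard deviation (from ensemble spread)
    k        : detection threshold in sigmas (assumption: k = 2)
    """
    above = np.abs(ens_mean) > k * sigma
    for i, year in enumerate(years):
        if above[i:].all():          # exceeds the threshold from here onwards
            return year
    return None                      # not detected within the period

# Hypothetical numbers: 0.03 K/yr trend, 0.15 K internal variability.
rng = np.random.default_rng(1)
years = np.arange(1990, 2101)
anomaly = 0.03 * (years - years[0]) + rng.normal(0.0, 0.02, years.size)
print(detection_year(anomaly, sigma=0.15, years=years))
```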
The early detection time of the annual-mean near-surface temperature in the Arctic is mainly due to the decline of the summer sea-ice coverage, which disappears much more rapidly than the winter coverage (Figure 3). During winter it remains cold enough to replace the ice that has melted away during summer. While the winter ice cover shrinks only in the southernmost parts of the Arctic (such as Hudson Bay and the Sea of Okhotsk) and at the margins (along East Greenland and in the Barents Sea), the summer ice cover declines rapidly: about half of the initial volume is already lost around 2020, the coverage is halved by about 2050, and it declines very rapidly thereafter. The rapid decline in Arctic summer ice cover is in line with recent model results5) and observations6).
Practical questions concerning the construction of river dikes and coastal defence systems require an estimate of the probability of occurrence of extremely high river and sea levels. The coastal defence system should be designed such that the risk of flooding by the sea is at most once every ten thousand years. Meteorologists are then asked which meteorological conditions correspond to such a rare flooding event. The problem they face is that only relatively short time series, of the order of 100 years, are available to estimate the one-in-ten-thousand-year event in the present climate. Moreover, the probability of this event most likely changes as the climate changes.
One therefore has to extrapolate from the relatively few available observations to the one-in-a-thousand or even one-in-ten-thousand-year event. The common approach is to fit a Generalized Extreme Value (GEV) distribution to the observed meteorological extremes (of wind, precipitation or temperature, for example). The GEV is then extrapolated to hitherto unobserved extremes and their return periods are determined. The extrapolation is only allowed if the physical mechanism creating the observed and the yet unobserved extremes is the same. One way to check this important but in principle unprovable assumption is to make use of climate simulations such as the ESSENCE ensemble. For the period 1951-2000, a total of 37 simulations is available (the 17 scenario runs plus 20 from a control ensemble), yielding 1850 years of data; for the period 2051-2100, the 17 scenario simulations yield 850 years. This is much more than is available from observations.
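A minimal sketch of this procedure with scipy's genextreme (note that scipy's shape parameter c equals minus the usual GEV shape parameter); the data below are synthetic stand-ins for the model's annual maxima, with arbitrarily chosen parameters.

```python
import numpy as np
from scipy.stats import genextreme

# Synthetic stand-in for 1850 model years of annual-maximum wind speed (m/s).
rng = np.random.default_rng(2)
annual_max = genextreme.rvs(c=0.1, loc=20.0, scale=2.5, size=1850,
                            random_state=rng)

# Fit the GEV to the block maxima (scipy's shape c is minus the usual xi).
c, loc, scale = genextreme.fit(annual_max)

# The T-year return level is the value exceeded with probability 1/T per year.
for T in (100, 1000, 10000):
    level = genextreme.isf(1.0 / T, c, loc, scale)
    print(f"{T:6d}-yr return level: {level:5.1f} m/s")
```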
Figure 4 shows a so-called Gumbel plot of the annual extreme 10-m wind speeds for the grid point (11°W, 36°N). The plot shows the annual maxima as a function of the return period; the stronger the extreme, the longer the return period. The most extreme winds (return periods of 100 years and longer) systematically deviate from the theoretical distribution. This suggests that they are formed by a mechanism different from the one behind the less extreme storms, which do follow the theoretical distribution. The second mechanism leads to rare, over-extreme winds compared to what we could call ‘normal’ extremes. While the average of the annual extremes decreases in the future period (the blue curve lies below the red one), the most extreme events do not change, suggesting that the mechanism leading to the over-extreme storms is not affected by climate change.
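Such a plot can be constructed by sorting the annual maxima and plotting them against the Gumbel reduced variate, as sketched below with hypothetical data; the Weibull plotting position used here is one common convention, not necessarily the one behind Figure 4. On such axes a Gumbel-distributed sample falls on a straight line, so a systematic bend at long return periods is the signature of the over-extreme events.

```python
import numpy as np
import matplotlib.pyplot as plt

def gumbel_plot(annual_max, label):
    """Plot sorted annual maxima against the Gumbel reduced variate."""
    x = np.sort(annual_max)
    # Weibull plotting position: empirical non-exceedance probability p,
    # related to the return period by T = 1 / (1 - p).
    p = np.arange(1, x.size + 1) / (x.size + 1.0)
    plt.plot(-np.log(-np.log(p)), x, ".", label=label)

# Hypothetical annual 10-m wind maxima for the two periods of Figure 4.
rng = np.random.default_rng(3)
gumbel_plot(22.0 + 2.0 * rng.gumbel(size=1850), "1951-2000")
gumbel_plot(21.0 + 2.0 * rng.gumbel(size=850), "2051-2100")
plt.xlabel("Gumbel reduced variate -ln(-ln p)")
plt.ylabel("annual maximum 10-m wind speed (m/s)")
plt.legend()
plt.show()
```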
Analysis of the over-extreme events suggests an important role for the horizontal interaction of cyclones (wave merging). An example is shown in Figure 5: a cyclone merges with a second one, leading to extremely strong winds, very low sea-level pressures and large deepening rates. Research continues in order to quantify the effect of merging and to identify the geographical regions where this wave merging leads to over-extreme winds. Once the interaction is well understood, the mechanism leading to over-extreme storms may be identified in the relatively short observational record as well.
In the KNMI Climate Change Scenarios 2006 for the Netherlands7) a distinction is made between scenarios in which the circulation patterns change (the W+ and G+ scenarios) and scenarios without significant changes in the circulation patterns (the W and G scenarios). This distinction is based on the observation that the majority of climate models simulate an increase in westerly flow over Western Europe in winter and an increase in easterly flow in summer7). The caution in distinguishing between scenarios with and without circulation changes is based partly on the spread among the climate model simulations and partly on the internal variability of the climate system: even in an unchanging climate, the atmospheric circulation can vary from decade to decade. This internal variability can be estimated from a large ensemble of climate simulations. Figure 6 shows the ESSENCE wind climate in the Netherlands for January and August. The ensemble mean shows an increase in westerly flow during winter and an increase in easterly flow during summer, in accordance with the conclusions of Van Ulden and Van Oldenborgh3). However, the spread between the ensemble members (Figure 6) is so large that over the coming century the climate change signal in the circulation over the Netherlands can be masked by the internal variability of the climate. This underscores the caution exercised in the KNMI Climate Change Scenarios 2006 for the Netherlands.
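As a sketch of how such an estimate might work (synthetic numbers, not the Figure 6 data): fit a trend to a circulation index in each member separately and compare the member-to-member spread of the trends with the ensemble-mean trend; if the two are of the same order, internal variability can mask the forced signal.

```python
import numpy as np

# Hypothetical January westerly-flow index, one row per ensemble member.
rng = np.random.default_rng(4)
years = np.arange(2000, 2100)
forced = 0.002 * (years - years[0])            # assumed weak forced trend
members = forced + rng.normal(0.0, 0.3, (17, years.size))

# Trend of each member; their spread measures the internal variability.
trends = np.array([np.polyfit(years, m, 1)[0] for m in members])
print(f"ensemble-mean trend : {trends.mean():+.4f} per yr")
print(f"member-to-member sd : {trends.std(ddof=1):.4f} per yr")
# If the spread is of the same order as the mean trend, the forced
# circulation change can be masked by internal variability.
```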
The examples presented here show the strength of the ensemble approach in separating the climate change signal from the weather noise. However, the ESSENCE runs address only one source of uncertainty, namely that due to internal variability. Another large source of uncertainty stems from the many approximations made in the modelling of the physical processes that play a role in the climate system. Examples include the parameterizations of unresolved processes, such as clouds and small-scale turbulent motions, and the representation of (biogeo)chemical processes leading to changes in the atmospheric composition. To address these issues we will conduct ensemble experiments with the new ECEARTH model that is under development, in which we will systematically investigate the impact of uncertainties in the modelling of physical processes on the simulated climate change.
Acknowledgment: The DEISA-DECI project ESSENCE is led by Wilco Hazeleger (KNMI) and Henk Dijkstra (UU/IMAU) and was carried out with the support of DEISA, HLRS, SARA and NCF (through NCF projects NRG-2006.06, CAVE-06-023 and SG-06-267). The authors thank Camiel Severijns (KNMI) and the HLRS and SARA staff for technical support. The Max-Planck-Institute for Meteorology in Hamburg made its climate model ECHAM5/MPI-OM available and provided valuable advice on the implementation and use of the model.