“The weather forecast looks sunny and particularly hot from Sunday to Friday, with afternoon temperatures above 30°C every day, and likely exceeding 35°C by the middle of the week. One consequence is that the poster sessions (Tuesday and Thursday) have been moved to the morning as they will be held outside under a marquee.”
I have never received a notification like this before a conference. It was then followed by a warning from the Guardian: ‘Hell is coming’: week-long heatwave begins across Europe.
The heatwave arrived as predicted and provided an apt backdrop for the International Meeting on Statistical Climatology (IMSC), which took place in Toulouse, France (June 24-28). France set a new record-high temperature of 45.9°C on June 28th, beating the previous record of 44.1°C from 2003 by a wide margin (1.8°C).
One of the topics of this meeting was indeed heatwaves, and one buzzword was “event attribution”. It is still difficult to say whether a single event has become more likely as a result of climate change, because of model inaccuracies when it comes to local and regional details.
Weather and climate events tend to be limited geographically and involve very local processes. Climate models, however, tend to be designed to reproduce larger-scale features, and their output does not correspond exactly to the observed quantities. Hence, there is often a need to downscale global climate model results in order to explain such events.
A popular strategy for studying event attribution is to run two sets of simulations of the past: ‘factual’ runs (with greenhouse gas forcings) and ‘counterfactual’ runs (without them), and then compare the results. Another question is how to “frame” the event, as different definitions of an event can give different indicators.
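The comparison of the two ensembles often boils down to a probability ratio: how much more likely is the event in the factual world than in the counterfactual one? A minimal sketch of the idea is given below; the Gaussian samples are illustrative stand-ins for model output, and the 40°C threshold is just one arbitrary way to frame the event.

```python
import random

random.seed(42)

# Hypothetical ensembles of annual-maximum temperature (°C). The Gaussian
# distributions are invented for illustration; real studies use large
# ensembles of climate-model simulations.
n = 100_000
factual = [random.gauss(36.0, 2.0) for _ in range(n)]        # with GHG forcings
counterfactual = [random.gauss(34.5, 2.0) for _ in range(n)] # without

threshold = 40.0  # frame the "event" as an annual maximum above 40°C

p1 = sum(x > threshold for x in factual) / n         # probability with forcings
p0 = sum(x > threshold for x in counterfactual) / n  # probability without

risk_ratio = p1 / p0   # how much more likely the event has become
far = 1.0 - p0 / p1    # fraction of attributable risk

print(f"P(factual)={p1:.4f}  P(counterfactual)={p0:.4f}  "
      f"risk ratio={risk_ratio:.1f}  FAR={far:.2f}")
```

Note how sensitive the risk ratio is to the chosen threshold: moving it further into the tail of the distributions changes the ratio considerably, which is one reason the framing question matters.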
Individual heatwaves are still difficult to attribute to global warming because soil moisture may be affected by irrigation, while land surface changes and pollution (aerosols) can also shift the temperature. These factors are tricky to model and thus affect the precision of the analysis.
Nevertheless, there is little doubt that the emerging pattern of more extremes that we see is a result of the ongoing global warming. Indeed, the results presented at the IMSC provide further support for the link between climate change and extremes (see previous post absence of evidence).
I braved the heat inside the marquee to have a look at the IMSC posters. Several of them presented work on seasonal and decadal forecasting, so both seasonal and decadal prediction still seem to be hot topics within the research community.
A major hurdle facing decadal predictions is to design climate models and initialise them with good enough information that they are able to predict how temperature and circulation evolve (see past post on decadal predictions). It is hard enough to predict the global mean temperature (link), but regional scales are even more challenging. One question addressed by the posters was whether advanced statistical methods improve the skill when applied to model output.
A wide range of topics was discussed during the IMSC. For instance, one presentation showed how the rate of new record-breaking events (link) can reveal trends in extreme statistics. There was one talk about ocean wave heights and how they are likely to increase as sea ice retreats. I also learned how severe thunderstorms in the US may be affected by ENSO and climate change.
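The record-breaking idea rests on a simple property: in a stationary (i.i.d.) series, the probability that observation number i sets a new record is 1/i, so the expected number of records in n observations is the harmonic number H_n. A warming trend inflates that count. A minimal Monte Carlo sketch (the series length, trend size, and noise level below are all illustrative assumptions):

```python
import random

random.seed(1)

def count_records(series):
    """Count new record-high values as the series is read in order."""
    records, current_max = 0, float("-inf")
    for x in series:
        if x > current_max:
            records += 1
            current_max = x
    return records

n = 100        # e.g. 100 years of annual temperatures (illustrative)
trials = 2000  # Monte Carlo repetitions

# For i.i.d. data, P(year i is a record) = 1/i, so the expected
# number of records is the harmonic number H_n.
expected_iid = sum(1.0 / i for i in range(1, n + 1))

iid_counts = [count_records([random.gauss(0, 1) for _ in range(n)])
              for _ in range(trials)]

# Same series with a linear warming trend added (0.02 units/year, illustrative).
trend_counts = [count_records([random.gauss(0, 1) + 0.02 * t for t in range(n)])
                for _ in range(trials)]

print(f"expected records (iid): {expected_iid:.2f}")
print(f"simulated iid:          {sum(iid_counts) / trials:.2f}")
print(f"with warming trend:     {sum(trend_counts) / trials:.2f}")
```

An observed record count well above H_n is therefore a hint of a trend in the extremes, without having to model the extremes directly.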
Another interesting observation was that so-called “emergent constraints” (and the Cox et al. (2018) paper) are still debated, in addition to methods for separating internal variability from forced climate change. And there is ongoing work on the reconstruction of temperature over the whole globe, making use of all available information and the best statistical methods.
It is probably not so surprising that the data sample from the ARGO floats shows an ongoing warming trend. However, by filling in the gaps between the floats with temperature estimates, the picture becomes less noisy. It seems that better geographical coverage removes a sampling bias that otherwise leads to an underestimated warming trend.
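A toy example can illustrate the sampling-bias point: if floats oversample slowly warming regions, a naive average of their readings underestimates the mean trend, whereas averaging region by region first (a crude stand-in for gridding and infilling) recovers it. The two regions, their trends, and the float counts below are invented purely for illustration:

```python
import random

random.seed(7)

# Toy ocean: two equal-area regions, one warming fast (0.03 units/yr) and one
# slowly (0.01), so the true area-mean trend is 0.02. Floats oversample the
# slow region (numbers are assumptions for the sketch).
true_trends = {"fast": 0.03, "slow": 0.01}
n_floats = {"fast": 10, "slow": 90}
n_years = 30

def float_series(trend):
    """Noisy temperature series seen by one float."""
    return [trend * t + random.gauss(0, 0.05) for t in range(n_years)]

readings = [(region, float_series(trend))
            for region, trend in true_trends.items()
            for _ in range(n_floats[region])]

def ols_trend(y):
    """Ordinary least-squares slope of y against time index."""
    n = len(y)
    t_mean, y_mean = (n - 1) / 2, sum(y) / n
    num = sum((t - t_mean) * (v - y_mean) for t, v in enumerate(y))
    den = sum((t - t_mean) ** 2 for t in range(n))
    return num / den

# Naive: average all float trends, ignoring where the floats are.
naive = sum(ols_trend(s) for _, s in readings) / len(readings)

# "Gridded": average within each region first, then across regions (equal area).
by_region = {r: [ols_trend(s) for rr, s in readings if rr == r]
             for r in true_trends}
gridded = sum(sum(v) / len(v) for v in by_region.values()) / len(by_region)

print(f"naive trend:   {naive:.3f}  (biased low)")
print(f"gridded trend: {gridded:.3f}  (true area mean is 0.020)")
```

The naive estimate lands near the sampling-weighted value (0.012 here), while the region-averaged estimate sits near the true area mean, which mirrors the bias the gap-filling removes.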
While most talks were based on statistics, there was one, on the transition between weather regimes, that was mostly physics-based. Other topics included multivariate bias adjustment, studies of compound events (which strain emergency services), the connection between drought and crop yields, how extreme weather affects health, snow avalanches, precipitation from tropical cyclones, uncertainties, downscaling based on texture analysis, and weather generators. To cover all of these would take more space than I think is appropriate for a blog like this.
One important issue, which merits wider attention, was data sharing. The lack of open and free data is still a problem, especially if we want to tackle the World Climate Research Programme’s grand challenges. European and US data are freely available, and the Israeli experience indicates that open access is beneficial.