Like all human endeavours, the IPCC is not perfect. Despite the enormous efforts devoted to producing its reports, with multiple levels of peer review, some errors will sneak through. Most of these will be minor and inconsequential, but sometimes they might be more substantive. As many people are aware (and as John Nielsen-Gammon outlined in a post last month and Rick Piltz goes over today), there is a statement in the second volume of the IPCC (WG2), concerning the rate at which Himalayan glaciers are receding, that is not correct and not properly referenced.
IPCC
Are the CRU data “suspect”? An objective assessment.
Kevin Wood, Joint Institute for the Study of the Atmosphere and Ocean, University of Washington
Eric Steig, Department of Earth and Space Sciences, University of Washington
In the wake of the CRU e-mail hack, the suggestion that scientists have been hiding the raw meteorological data that underpin global temperature records has appeared in the media. For example, New York Times science writer John Tierney wrote, “It is not unreasonable to give outsiders a look at the historical readings and the adjustments made by experts… Trying to prevent skeptics from seeing the raw data was always a questionable strategy, scientifically.”
The implication is that something secretive and possibly nefarious has been afoot in the way data have been handled, and that the validity of key data products (especially those produced by CRU) is suspect on these grounds. This is simply not the case. [Read more…] about Are the CRU data “suspect”? An objective assessment.
Copenhagen
The ‘Copenhagen Diagnosis‘, a report by 26 scientists from around the world was released today. The report is intended as an update to the IPCC 2007 Working Group 1 report. Like the IPCC report, everything in the Copenhagen Diagnosis is from the peer-reviewed literature, so there is nothing really new. But the report summarizes and highlights those studies, published since the (2006) close-off date for the IPCC report, that the authors deemed most relevant to the negotiations in Copenhagen (COP15) next month. This report was written for policy-makers, stakeholders, the media and the broader public, and has been sent to each and every one of the COP15 negotiating teams throughout the world.
Among the points summarized in the report are that:
Both ice sheets (Greenland and Antarctica) are losing mass (and hence contributing to sea level rise). This was not certain at the time of the IPCC report.
Arctic sea ice has declined faster than projected by IPCC.
Greenhouse gas concentrations have continued to track the upper bounds of IPCC projections.
Observed global temperature changes remain entirely in accord with IPCC projections, i.e. an anthropogenic warming trend of about 0.2 ºC per decade with superimposed short-term natural variability.
Sea level has risen more than 5 centimeters over the past 15 years, about 80% higher than IPCC projections from 2001.
Perhaps most importantly, the report articulates a much clearer picture of what has to happen if the world wants to keep future warming within the reasonable threshold (2°C) that the European Union and the G8 nations have already agreed to in principle.
The full report is available at www.copenhagendiagnosis.org. Three of us at RealClimate are co-authors so we can’t offer an independent review of the report here. We welcome discussion in the comments section though. But read the report first before commenting, please.
Monckton’s deliberate manipulation
Our favorite contrarian, the potty peer Christopher Monckton has been indulging in a little aristocratic artifice again. Not one to be constrained by mere facts or observable reality, he has launched a sally against Andy Revkin for reporting the shocking news that past industry disinformation campaigns were not sincere explorations of the true uncertainties in climate science.
[Read more…] about Monckton’s deliberate manipulation
What the IPCC models really say
Over the last couple of months there has been much blog-viating about what the models used in the IPCC 4th Assessment Report (AR4) do and do not predict about natural variability in the presence of a long-term greenhouse gas related trend. Unfortunately, much of the discussion has been based on graphics, energy-balance models and descriptions of what the forced component is, rather than the full ensemble from the coupled models. That has led to some rather excitable but ill-informed buzz about very short time scale tendencies. We have already discussed how short term analysis of the data can be misleading, and we have previously commented on the use of the uncertainty in the ensemble mean being confused with the envelope of possible trajectories (here). The actual model outputs have been available for a long time, and it is somewhat surprising that no-one has looked specifically at them given the attention the subject has garnered. So in this post we will examine directly what the individual model simulations actually show.
The IPCC model simulation archive
In the lead up to the 4th Assessment Report, all the main climate modelling groups (17 of them at last count) made a series of coordinated simulations for the 20th Century and various scenarios for the future. All of this output is publicly available in the PCMDI IPCC AR4 archive (now officially called the CMIP3 archive, in recognition of the two previous, though less comprehensive, collections). We’ve mentioned this archive before in passing, but we’ve never really discussed what it is, how it came to be, how it is being used and how it is (or should be) radically transforming the comparisons of model output and observational data.
[Read more…] about The IPCC model simulation archive
Global dimming and global warming
Readers might remember a minor kerfuffle in EOS (the AGU house journal) in February this year in which Gerald Stanhill claimed to find a paradox in the contemporaneous effects of global warming and global dimming (a long term reduction of surface solar radiation, mainly due to aerosols and clouds). The article attracted attention mainly because the paradox was claimed to “pose [a challenge] to the consensus explanation of climate change”.
Rather than point out the subtle confusions (between surface and tropospheric forcing, and local and global signals) here, I and two co-authors wrote a comment to the journal. After a number of avoidable and unavoidable delays, this comment (along with another one and a reply) has now appeared in EOS (Nov 6 edition). By now of course, the original piece has been long forgotten and so the point in having the correspondence printed is unclear, but still…
For those who care, I’ll link our comment once it’s been posted on the GISS website (now available here), but the bottom line is clearly seen in the following figure:
That is, if you take all of the IPCC AR4 models (now called the CMIP3 ensemble), then over the twentieth century all of them show varying degrees of global warming, while at the same time they show significant global dimming. An earlier paper of ours had pointed to the aerosols (unsurprisingly) being the dominant cause for long term changes in dimming, but that changes in clouds on a decadal basis were responsible for much of the shorter term variability. Thus there doesn’t appear to be much ‘paradox’ left to worry about – both dimming and warming are seen in models and in observations.
Apologies for appearing to push my own papers here (not something we like to do particularly), but the published comment would have been better done as a blog post in February. There may be a lesson there….
CO2 equivalents
There was a minor kerfuffle in recent days over claims by Tim Flannery (author of “The Weather Makers”) that new information from the upcoming IPCC synthesis report will show that we have reached 455 ppmv CO2_equivalent 10 years ahead of schedule, with predictable implications. This is confused and incorrect, but the definitions of CO2_e, why one would use it and what the relevant level is, are all highly uncertain in many people’s minds. So here is a quick rundown.
Definition: The CO2_equivalent level is the amount of CO2 that would be required to give the same global mean radiative forcing as the sum of a basket of other forcings. This is a way to include the effects of CH4 and N2O etc. in a simple way, particularly for people doing future impacts or cost-benefit analysis. The equivalent amount is calculated using the IPCC formula for CO2 forcing:
Total Forcing = 5.35 ln(CO2_e/CO2_orig) (in W/m2, with the natural logarithm)
where CO2_orig is the 1750 concentration (278 ppmv).
Usage: There are two main ways it is used. Firstly, it is often used to group together all the forcings from the Kyoto greenhouse gases (CO2, CH4, N2O and CFCs), and secondly to group together all forcings (including ozone, sulphate aerosols, black carbon etc.). The first is simply a convenience, but the second is what matters to the planet. Many stabilisation scenarios, such as are being discussed in UNFCCC negotiations are based on stabilising total CO2_e at 450, 550 or 750 ppmv.
Magnitude: The values of CO2_e (Kyoto) and CO2_e (Total) can be calculated from Figure 2.21 and Table 2.12 in the IPCC WG1 Chapter 2. The forcing for CO2, CH4 (including indirect effects), N2O and CFCs is 1.66+0.48+0.07+0.16+0.34=2.71 W/m2 (with around 0.3 W/m2 uncertainty). Using the formula above, that gives CO2_e (Kyoto) = 460 ppmv. However, including all the forcings (some of which are negative), you get a net forcing of around 1.6 W/m2, and a CO2_e (Total) of 375 ppmv with quite a wide error bar. This is, coincidentally, close to the actual CO2 level.
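The arithmetic above can be sketched in a few lines of Python. The forcing values and the pre-industrial concentration are the AR4 numbers quoted in the text; the only step added here is inverting the forcing formula to solve for CO2_e:

```python
import math

# Pre-industrial CO2 concentration (1750), ppmv, as quoted above.
CO2_ORIG = 278.0

def co2_equivalent(total_forcing):
    """Invert F = 5.35 ln(CO2_e / CO2_orig) to get CO2_e in ppmv."""
    return CO2_ORIG * math.exp(total_forcing / 5.35)

# Kyoto-gas forcings (W/m2): CO2, CH4, CH4 indirect, N2O, CFCs.
kyoto_forcing = 1.66 + 0.48 + 0.07 + 0.16 + 0.34  # = 2.71 W/m2

# Net forcing including all terms (some negative, e.g. aerosols).
total_forcing = 1.6  # W/m2, approximate

print(co2_equivalent(kyoto_forcing))  # about 460 ppmv: CO2_e (Kyoto)
print(co2_equivalent(total_forcing))  # about 375 ppmv: CO2_e (Total)
```

Note that the same exponential inversion also shows why the 450 ppmv CO2_e stabilisation target corresponds to a forcing of roughly 5.35 ln(450/278) ≈ 2.6 W/m2.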
Implications: The important number is CO2_e (Total), which is around 375 ppmv. Stabilisation scenarios of 450 ppmv or 550 ppmv are therefore still within reach. Claims that we have passed the first target are simply incorrect; however, that is not to say these targets are easily achievable. It is even more of a stretch to state that we have all of a sudden gone past the ‘dangerous’ level. It is still not clear what that level is, but if you take a conventional 450 ppmv CO2_e value (which will lead to a net equilibrium warming of ~ 2 deg C above pre-industrial levels), we are still a number of years from that, and we have (probably) not yet committed ourselves to reaching it.
Finally, the IPCC synthesis report is simply a concise summary of the three separate reports that have already come out. It therefore can’t be significantly different from what is already available. But this is another example where people are quoting from draft reports that they have neither properly read nor understood and for which better informed opinion is not immediately available. I wish journalists and editors would resist the temptation to jump on leaks like this (though I know it’s hard). The situation is confusing enough without adding to it unintentionally.
Regional Climate Projections
Regional Climate Projections in the IPCC AR4
How does anthropogenic global warming (AGW) affect me? The answer to this question will perhaps be one of the most relevant concerns in the future, and is discussed in chapter 11 of the IPCC assessment report 4 (AR4) working group 1 (WG1) (the chapter also has some supplementary material). The problem of obtaining regional information from GCMs is not trivial, and has been discussed in a previous post here at RC and the IPCC third assessment report (TAR) also provided a good background on this topic.
The climate projections presented in the IPCC AR4 are from the latest set of coordinated GCM simulations, archived at the Program for Climate Model Diagnosis and Intercomparison (PCMDI). This is the most important new information that AR4 contains concerning the future projections. These climate model simulations (the multi-model data set, or just ‘MMD’) are often referred to as the AR4 simulations, but they are now officially being referred to as CMIP3.
One of the most challenging and uncertain aspects of present-day climate research is associated with the prediction of a regional response to a global forcing. Although the science of regional climate projections has progressed significantly since the last IPCC report, slight displacements in circulation characteristics, systematic errors in energy/moisture transport, coarse representation of ocean currents/processes, crude parameterisation of sub-grid and land surface processes, and overly simplified topography in present-day climate models make accurate and detailed analysis difficult.
I think that the authors of chapter 11 have overall done a very thorough job, although there are a few points which I believe could be improved. Chapter 11 of the IPCC AR4 working group I (WGI) divides the world into different continents or types of regions (e.g. ‘Small islands’ and ‘Polar regions’), and then discusses these separately. It provides a nice overview of the key climate characteristics for each region. Each section also provides a short round-up of the evaluations of the performance of the climate models, discussing their weaknesses in terms of reproducing regional and local climate characteristics.
Transparency of the IPCC process
Recently, a Financial Times op-ed criticised the IPCC for having contributors and peers drawn from a narrow professional circle. I don’t think this is fair, unless one regards a whole discipline as ‘narrow’. Furthermore, the recent public disclosure of both the review comments and the responses suggests a different story from the FT op-ed’s allegation of ‘refusing to disclose data and methods’. The IPCC has no control over the independent publication, but the disclosure of the comments and responses at least enhances the openness of the process behind the synthesis report.