There is a climate splash in Nature this week: a cover showing a tera-tonne weight, presumably meant to be made of carbon (could it be graphite?), dangling by a thread over the planet; two new articles (Allen et al and Meinshausen et al); a “News & Views” piece written by two of us; and a couple of commentaries urging us to “prepare to adapt to at least 4°C” and to think about what the worst-case scenario (at 1000 ppm CO2) might look like.
At the heart of it are the two papers which calculate the odds of exceeding a predefined threshold of 2°C as a function of CO2 emissions. Both find that the most directly relevant quantity is the total amount of CO2 ultimately released, rather than a target atmospheric CO2 concentration or emission rate. This is an extremely useful result, giving us a clear statement of how our policy goals should be framed. We have a total emission quota; if we keep going now, we will have to cut back more quickly later.
There is uncertainty in the climate sensitivity of the Earth and in the response of the carbon cycle, and the papers are extremely useful in the way that they propagate these uncertainties to the probabilities of different amounts of warming. Just looking at the median model results, many people conclude that a moderately optimistic but not terribly aggressive scenario such as IPCC B1 would avoid 2°C warming relative to pre-industrial. But when you take into account the uncertainty, you find that there is a disturbingly high likelihood (roughly even odds) that it won’t.
Both papers come to the same broad conclusion, summarized in our figure, that unless humankind puts on the brakes very quickly and aggressively (i.e. global reductions of 80% by 2050), we face a high probability of driving climate beyond a 2°C threshold taken by both studies as a “danger limit”. Comparison of the two papers is complicated by their different units: mass of carbon versus mass of CO2 (moles, anyone? Is there a chemist in the house?). But chugging through the math, we find the papers to be broadly consistent. Both papers conclude that humankind is already about half-way toward releasing enough carbon to probably reach 2°C, and that most of the fossil fuel carbon (the coal, in particular) will have to remain in the ground.
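For those chugging through the math themselves, the conversion between the two units is just the molar-mass ratio of CO2 to carbon (44/12, about 3.67). A minimal sketch in Python (the function names are ours, not from either paper; the example value is a round illustrative number, not a figure from either study):

```python
# Conversion between mass of carbon and mass of CO2, via molar masses
# (C = 12 g/mol, CO2 = 44 g/mol).

C_TO_CO2 = 44.0 / 12.0  # ~3.67 tonnes CO2 per tonne C

def gtc_to_gtco2(gtc):
    """Convert gigatonnes of carbon to gigatonnes of CO2."""
    return gtc * C_TO_CO2

def gtco2_to_gtc(gtco2):
    """Convert gigatonnes of CO2 to gigatonnes of carbon."""
    return gtco2 / C_TO_CO2

print(gtc_to_gtco2(1000))  # a tera-tonne of carbon is ~3667 Gt CO2
```

The same ratio works in either direction, which is all that is needed to put the two papers' quotas on a common scale.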
We feel compelled to note that even a “moderate” warming of 2°C stands a strong chance of provoking drought and storm responses that could challenge civilized society, leading potentially to the conflict and suffering that go with failed states and mass migrations. Global warming of 2°C would leave the Earth warmer than it has been in millions of years, a disruption of climate conditions that have been stable for longer than the history of human agriculture. Given the drought that already afflicts Australia, the crumbling of the sea ice in the Arctic, and the increasing storm damage after only 0.8°C of warming so far, calling 2°C a danger limit seems to us pretty cavalier.
Also, there are dangers from CO2 emissions other than the peak warming, such as the long tail of the CO2 perturbation, which will dominate the ultimate sea-level response, and the acidification of the ocean. A building may be safe from earthquakes, but if it is susceptible to fires it is still considered unsafe.
The sorts of emission cuts that are required are technologically feasible, if we were to build wind farms instead of coal plants, construct an integrated regional or global electrical power grid, and undertake a crash program in energy efficiency. But getting everybody to agree to this is the discouraging part. The commentary by Parry et al advises us to prepare to adapt to climate changes of at least 4°C, even though they recognize that it may not be possible to buy our way out of most of the damage (to natural systems, for example, including the irreversible loss of many plant and animal species). Anyway, how does one “adapt” to a train wreck? There is also the fairness issue, in that the beneficiaries of fossil energy (rich countries today) are not the ones who pay the costs (less-rich countries decades from now). We wonder why we were not advised instead to prepare to adapt to a crash curtailment of CO2 emissions, which sounds to us considerably less frightening.
p.s. For our German-speaking readers: Stefan’s commentary on the KlimaLounge blog.
Mark says
re 595, no you had it right the first time.
If some huge tragedy comes up where a pulse of energy is needed, there will still be easily available oil/coal/gas if we stop using it all up before it’s gone.
Hank Roberts says
Seems we’re more likely to have some huge tragedy where a pulse of energy is the problem, though.
http://www.agu.org/pubs/crossref/2009/2009GL037525.shtml
McPhee, M. G., A. Proshutinsky, J. H. Morison, M. Steele, and M. B. Alkire (2009), Rapid change in freshwater content of the Arctic Ocean, Geophys. Res. Lett., 36, L10602, doi:10.1029/2009GL037525.
“… The dramatic reduction in minimum Arctic sea ice extent in recent years has been accompanied by surprising changes in the thermohaline structure of the Arctic Ocean, with potentially important impact on convection in the North Atlantic and the meridional overturning circulation of the world ocean ….”
Kevin McKinney says
A late addition, and semi-off topic–I chose this thread because it’s under the “Greenhouse Gases” category–but I’ve got a “life and times” article up on Claude Pouillet, the protean–but too little remembered–French physicist credited with the first estimate of what we now call the solar constant, and in particular his 1838 paper Mémoire sur la chaleur solaire, sur les pouvoirs rayonnants et absorbants de l’air atmosphérique, et sur les températures de l’espace.
Pouillet was an elegant experimentalist, and refined Fourier’s work on atmospheric heat transport. As with the Fourier article that preceded it, the present page is more focused on the life and times of the subject, and less on the scientific detail–I’m trying to supply context, not so much to elaborate or interpret the paper itself. Those looking for “human face” accounts of classic GW science may check it out at:
http://hubpages.com/hub/The-Science-of-Global-Warming-in-the-age-of-Napoleon-III
Those interested in the original paper itself can find it here:
http://wiki.nsdl.org/index.php/PALE:ClassicArticles/GlobalWarming/Article2
Next up, Tyndall…
Eric Rehm says
I am confused about one paragraph in the accompanying article “Climate crunch: A burden beyond bearing” by Richard Monastersky in the same 29 April 2009 issue of Nature.
Under the section “Slow Recovery” (p. 1093), Monastersky summarizes the accompanying Meinshausen et al. 2009 article, stating “For the period 2000 to 2050, they find that the world would have to limit emissions of all greenhouse gases to the equivalent of 400 gigatonnes of carbon in order to stand a 75% chance of avoiding more than 2°C of warming.”
However, I can’t find the 400 GT figure in the Meinshausen et al. article. Instead, Meinshausen et al. say: “…we find that the probability of exceeding 2°C can be limited to below 25% (50%) by keeping 2000–49 cumulative CO2 emissions from fossil sources and land use change to below 1,000 (1,440) Gt CO2”
Where does Monastersky’s 400 GT figure come from?
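For reference, here is my attempt to put the paper's figures in the same units as the article, converting Gt CO2 to Gt carbon with the standard 12/44 molar-mass ratio (a quick sanity check, not anything from either text):

```python
# Convert Meinshausen et al.'s cumulative-emission figures (Gt CO2)
# to gigatonnes of carbon, using molar masses (C = 12, CO2 = 44).

def gtco2_to_gtc(gtco2):
    return gtco2 * 12.0 / 44.0

for gtco2 in (1000, 1440):
    print(f"{gtco2} Gt CO2 = {gtco2_to_gtc(gtco2):.0f} Gt C")
# 1000 Gt CO2 = 273 Gt C   (the 25%-exceedance figure)
# 1440 Gt CO2 = 393 Gt C   (the 50%-exceedance figure)
```

Neither CO2-only figure works out to exactly 400 Gt C, and Monastersky's sentence refers to “all greenhouse gases,” not CO2 alone, so the conversion by itself doesn't seem to explain the number.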