Climate modeling groups all across the world are racing to add their contributions to the CMIP5 archive of coupled model simulations. This coordinated project, proposed, conceived and specified by the climate modeling community itself, will be an important resource for analysts and for the IPCC AR5 report (due in 2013), and beyond.
There have been previous incarnations of the CMIP projects going back to the 1990s, but I think it’s safe to say that it was only with CMIP3 (in 2004/2005) that the project gained a real maturity. The CMIP3 archive was heavily used in the IPCC AR4 report – so much so that people often describe those models and simulations as the ‘IPCC models’. That is a reasonable shorthand, but is not really an accurate description (the models were not chosen by IPCC, designed by IPCC, or run by IPCC) even though I’ve used it on occasion. Part of the success of CMIP3 was the relatively open data access policy which allowed many scientists and hobbyists alike to access the data – many of whom were dealing with GCM output for the first time. Some 600 papers have been written using data from this archive. We discussed some of this success (and some of the problems) back in 2008.
Now that CMIP5 is gearing up for a similar exercise, it is worth looking into what has changed – in terms of the model specifications, the requested simulations and the data serving to the wider community. Many of these issues are being discussed in the current CLIVAR newsletter (Exchanges no. 56). (The references below are all to articles in this pdf).
There are three main novelties this time around that I think are noteworthy: the use of more interactive Earth System models, a focus on initialised decadal predictions, and the inclusion of key paleo-climate simulations as part of the suite of runs.
The term Earth System Model is a little ambiguous, with some people reserving it for models that include a carbon cycle, and others (including me) using it more generally to denote models with more interactive components than are used in standard (AR4-style) GCMs (e.g. atmospheric chemistry, aerosols, ice sheets, dynamic vegetation etc.). Regardless of terminology, the 20th Century historical simulations in CMIP5 will use a much more diverse set of model types than did the similar simulations in CMIP3 (where all models were standard coupled GCMs). That both expands the range of possible evaluations of the models and increases the complexity of that evaluation.
The ‘decadal prediction’ simulations are mostly being run with standard GCMs (see the article by Doblas-Reyes et al, p8). The different groups are trying multiple methods to initialise their ocean circulations and heat content at specific points in the past and are then seeing if they are able to better predict the actual course of events. This is very different from standard climate modelling where no attempt is made to synchronise modes of internal variability with the real world. The hope is that one can reduce the initial condition uncertainty for predictions in some useful way, though this has yet to be demonstrated. Early attempts to do this have had mixed results, and from what I’ve seen of the preliminary results in the CMIP5 runs, significant problems remain. This is one area to watch carefully though.
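(As an aside for readers wondering how "better predict the actual course of events" gets quantified: a common approach is to compare the skill of the initialised hindcasts with that of uninitialised runs, lead time by lead time. The sketch below is purely illustrative – the arrays are random placeholders, not CMIP5 output – but it shows the shape of the calculation.)

```python
import numpy as np

# Hypothetical setup: hindcasts launched from n_start past dates, each run
# out to n_lead forecast years; obs holds the matching observed anomalies.
rng = np.random.default_rng(0)
n_start, n_lead = 30, 10
obs = rng.standard_normal((n_start, n_lead))                 # placeholder "observations"
init_hc = obs + rng.standard_normal((n_start, n_lead))       # placeholder initialised hindcasts
uninit = rng.standard_normal((n_start, n_lead))              # placeholder uninitialised runs

def anomaly_correlation(forecast, observed):
    """Correlation across start dates, computed separately for each lead year."""
    f = forecast - forecast.mean(axis=0)
    o = observed - observed.mean(axis=0)
    return (f * o).sum(axis=0) / np.sqrt((f ** 2).sum(axis=0) * (o ** 2).sum(axis=0))

for lead, (ri, ru) in enumerate(zip(anomaly_correlation(init_hc, obs),
                                    anomaly_correlation(uninit, obs)), start=1):
    print(f"lead year {lead}: initialised r={ri:+.2f}, uninitialised r={ru:+.2f}")
```

If initialisation adds value, the initialised correlations should beat the uninitialised ones at short leads and converge towards them as the memory of the initial conditions fades.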
Personally, I am far more interested in the inclusion of the paleo component in CMIP5 (see Braconnot et al, p15). Paleo-climate simulations with the same models that are being used for the future projections allow for the possibility that we can have true ‘out-of-sample’ testing of the models over periods with significant climate changes. Much of the previous work in evaluating the IPCC models has been based on modern period skill metrics (the climatology, seasonality, interannual variability, the response to Pinatubo etc.), but while useful, this doesn’t encompass changes of the same magnitude as the changes predicted for the 21st Century. Including tests with simulations of the last glacial maximum, the Mid-Holocene or the Last Millennium greatly expands the range of model evaluation (see Schmidt (2010) for more discussion).
The CLIVAR newsletter has a number of other interesting articles, on CFMIP (p20), the scenarios being used (RCPs) (p12), the ESG data delivery system (p40), satellite comparisons (p46 and p47) and the carbon-cycle simulations (p27). Indeed, the range of issues covered, I think, presages the depth of interest that the CMIP5 archive will eventually generate.
There will be a WCRP meeting in October in Denver that will be very focused on the CMIP5 results, and it is likely that much of the context for the AR5 report will be reflected there.
Richard Whiteford says
This is a must read book for all climate presenters:
http://www.amazon.com/Living-Denial-Climate-Emotions-Everyday/dp/0262515857/ref=sr_1_1?s=books&ie=UTF8&qid=1313067079&sr=1-1
Dikran Marsupial says
It is impressive that climate modelling has reached the point where decadal prediction is being seriously considered. The first I heard about it was the paper by Fildes and Kourentzes on “Validation and forecasting accuracy in models of climate change”. I suspect that those performing decadal predictions are well aware of proper validation procedure already.
[Response: Actually, that isn’t a given. This is a new endeavour and although it can draw on experience with seasonal forecasting, it has its own issues (lack of sufficient examples of usable hindcasts from which to derive bias corrections, for instance, and rapidly evolving observational networks, etc.). The whole thing is very much at the experimental stage – how to initialise, how to evaluate, how to predict are all up in the air – at least for now. – gavin]
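(To illustrate the bias-correction point in the response above: one simple, commonly used approach is a lead-time-dependent drift correction estimated from whatever hindcasts exist. The numbers below are invented placeholders; the point is only that with few start dates the drift estimate is itself noisy, which is part of the problem mentioned.)

```python
import numpy as np

rng = np.random.default_rng(1)
n_hindcasts, n_lead = 8, 10   # e.g. 8 past start dates, 10 forecast years each

# Invented hindcast and observed global-mean temperatures (deg C).
hindcasts = 14.0 + 0.10 * np.arange(n_lead) + rng.normal(0, 0.1, (n_hindcasts, n_lead))
observed  = 14.3 + 0.02 * np.arange(n_lead) + rng.normal(0, 0.1, (n_hindcasts, n_lead))

# Mean model drift as a function of forecast lead time, estimated from the hindcasts.
drift = (hindcasts - observed).mean(axis=0)

# A new forecast is then corrected by removing the lead-dependent drift.
raw_forecast = 14.0 + 0.10 * np.arange(n_lead)     # placeholder raw forecast
corrected = raw_forecast - drift

print("estimated drift by lead year:", np.round(drift, 2))
print("corrected forecast:          ", np.round(corrected, 2))
```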
PeteB says
Gavin,
I just skim-read Schmidt (2010). Sounds very interesting, and a great way of testing models, but it looks like a large project needing interdisciplinary input. Is there any ‘grand plan’ on this with timescales? I guess this is likely to take years to really get going.
[Response: Indeed. Actually this is a big focus for a few people (including me, but it goes quite wide). Getting the paleo runs included in CMIP took a lot of lobbying from PMIP, and getting people to realise the potential for using this resource is ongoing. I’m part of an effort to run a workshop next year to focus exactly on this, for instance. You are right that this will take years, but I’m hopeful it will happen. – gavin]
William Freimuth says
Hindcasting problem? The problem is most of the world is either unaware or in total denial of the emerging crisis. Obviously climate models were not available in the Dark Ages (which we’re perhaps still in) but data exists to make a better case for “change”.
For Christ’s sake don’t point out the difficulties….point out the problem!
Could there be a court set up whose job it is to track those who are lying and enforce penalties relative to the damages forthcoming? If nothing else, at least get off the defensive and start charging those in the way with crimes. Perhaps Limbaugh could be sent an assessment of his ‘offenses’ as the damages due to neglect mount. BTW, lying is one of the essential violations of Christendom’s B1G (commandments).
[Response: Careful, you are sounding a bit like Steve McI, only in reverse. –eric]
Pete Dunkelberg says
Re # 2: Do Climate Models need Independent Verification and Validation?
Edward Greisch says
1 Richard Whiteford: Read it. I Agree. Also read “The Authoritarians” by Bob Altemeyer. Free download from:
http://home.cc.umanitoba.ca/~altemey/
Also scary: http://thinkprogress.org/romm/2011/08/11/293781/attacks-on-climate-science-education
David Horton says
Bit hard on William, Eric. It seems to me too not so much like rearranging deck chairs on the Titanic as determining exactly the position of each one before the boat sank. Do any scientists still believe that overcoming the Koch brothers’ funding of a massive disinformation campaign is just a matter of refining models? Do you think that, say, a President Bachmann, Perry, or indeed Obama, will suddenly say “Oh, at last, decadal prediction, now I can act”?
Of course I know we need to keep refining models, but in the current, and foreseeable, political climate, it seems like merely an academic exercise.
Hero says
Thanks for all your hard work Gavin, and to the scientists around the globe for their contribution to this important body of work.
Pete H. says
Re #4, For Christ’s sake DO point out the difficulties and DO refine the models. That’s how science progresses. And if we get to the point where the American public will tolerate witch trials, then the message will have already made it through. So there will be no point in retribution. Your comment makes me wonder if you’re just trying to bait the RC bloggers.
Re #7 “Do any scientists still believe that overcoming the Koch brothers’ funding of a massive disinformation campaign is just a matter of refining models?” What makes you think refining models is about overcoming the Koch brothers? Or that it is a strictly academic exercise?
The practical side of refining models? Policy makers need temporally and spatially refined models to make plans to deal with the inevitable (in the pipeline) changes we are going to experience. And let’s not forget, what at one time seemed like strictly academic exercises is what brought us to the current level of knowledge about climate.
[Response: Thank you for getting it, because a lot of folks (see above) sure don’t.–Jim]
richard pauli says
Pete #8 re: #7…
Ummm, what policy? I see very little policy that is based on AGW science – most is based on politics and economics and carbon fuel interests. Lots of good ideas, good plans, great science, even a few token laws that impart symbolic action – mpg goals, carbon tracking, etc
There is some good legislative success around stopping acid rain and the freon/ozone hole problem – but those were very small and economic no-brainers. But when it comes to global warming — there is NOT MUCH policy in place that exerts real physical interaction with this problem. Especially in corralling the rampaging elephant in the room of carbon combustion. No constraints on coal – and no curtailing of crude. We all talk. But nothing is happening.
Kevin McKinney says
#8–
Good points, Pete. But the current case of Dr. Monnett seems to me perilously close to a witch trial *now*–true, criminal prosecution will not be undertaken, but Dr. Monnett is suffering real harm–psychologically and to his reputation at least, and with a significant risk of career and financial harm to come–apparently on the basis of pure speculation. And a significant number of folks are not only ‘tolerating,’ but cheering.
Joe Hunkins says
Thank you Gavin – great project and nice description, along with your 2008 “IPCC” modelling intro: https://www.realclimate.org/index.php/archives/2008/05/what-the-ipcc-models-really-say/
Jim says
If some people want to drag this post down into the swamp hole of politics, go elsewhere. This is a science post. The number of interesting and relevant and educational-for-everyone topics that could be discussed here is damn near infinite, but the politics of action against climate change…
…is not one of them.
Thanks
Kevin McKinney says
Good point, Jim, there’s “Unforced Variations” available for such. Or at least, arctic discussion. A (mild) mea culpa.
MS says
Thank you, Gavin for writing about this topic and for the link to Schmidt (2010).
I understand that the (mid) Pliocene climate could give us clues to what the long-term consequences of enhanced anthropogenic warming might be.
I am wondering if the data and the model runs really show something similar to the Pliocene or if it is too early to tell.
Does the Pliocene seem to be a good “model” for future warming, or is it suspected that there would be major differences?
[Response: There will almost certainly be major differences, but I still think insight can be gained. The key for the Pliocene model/data comparison is the relative imprecision of the CO2 (and other GHG levels), and the uncertainty in the ice sheet/vegetation reconstructions. The Lunt et al (2010) paper discusses some of this. – gavin]
richard pauli says
#13 Jim – you touch a very important aspect of climate change – human politics. But I cannot agree with ignoring it.
You may not want to talk politics, but all the climate science I read concludes that humans are a direct cause of excessive warming. So we may not want to remove human influence from this science.
As we model and propose future scenarios – just how, pray tell, do we remove politics from the discussion? And don’t we want to influence public policy?
You ask that politics be ignored, but the climate models themselves are specifically predicated on different social reactions to carbon emission change.
Perhaps we could talk of the psychology of denial and the human reticence to change – something rigorously studied by psychologists who also write peer reviewed papers. (admittedly a vast subject area, this indeed deserves its own web site)
But I hope you are not saying that we must remove human psychology from modeling climate scenarios.
Edward Greisch says
13 Jim: What web site would you suggest for the politics of action against climate change? [Not counting climateprogress.org or Daily Kos] And what web site would you suggest for social sciences of climate change? No matter how much you dislike politics, if you want to get anything done, you have to go there. The Contributors should be running for the US Senate because GW would not be happening if the Senate had 60 scientists in it.
Yes, policy makers do need computer simulations that will tell them what actions would actually work. That is, if any actions would actually work. It is true at the present time that we could put an end to GW. It will not be true some time in the future. The models should tell us how the costs will ramp up and when the last possible moment to save ourselves will be. The social sciences tell us why the public doesn’t get it. That is also something we have to know if we are to make a difference.
It IS important to continue doing climate science. Research is also part of education. We must have the fully trained scientists to be witnesses before Congress. There must always be new research to show Congress, not something done by a previous generation. As computing power increases, there must be ever better models that will show policy makers in ever increasing detail what their options are.
Hank Roberts says
> Now that CMIP5 is gearing up for a similar exercise, it is worth
> looking into what has changed – in terms of the model
> specifications, the requested simulations and the data serving
> to the wider community. Many of these issues are being discussed in …
> the current CLIVAR newsletter (Exchanges no. 56).
> (The references below are all to articles in this pdf).
…
> The CLIVAR newsletter has a number of other interesting articles,
> on CFMIP (p20), the scenarios being used (RCPs) (p12), the ESG data
> delivery system (p40), satellite comparisons (p46, and p47) and the
> carbon-cycle simulations (p27).
Ah, homework. Thank you.
One Anonymous Bloke says
I agree with Jim – there are much better (ie: more effective) places to discuss politics; an email to your local MP/representative, for example.
Radge Havers says
Climate science communication –> behavioral and social science implications –> Unforced Variations.
No?
Jim Eager says
MS @15, although CO2 levels and eventually global mean temperature may be similar, one key difference between the Pliocene and the present is a completely different ocean current regime due to the closure of the Isthmus of Panama.
John Mashey says
This all sounds good, and sort of makes me wish I were still helping design and sell supercomputers to modelers … well, maybe again.
But I remain interested in the unusual CO2 drop into 1600AD.
http://i39.tinypic.com/if0m5g.jpg
Is anyone specifically modeling that event (given the rarity of a CO2 drop like that)?
One of Ruddiman’s hypotheses was that this was caused in part by massive post-Columbus die-offs in the Americas and the resulting large-scale reforestation. For the latest, see one of the interesting articles in the August issue of The Holocene.
Without arguing whether or not that hypothesis is good, the question is: suppose that the reforestation sequestration of CO2 was within the uncertainties, what effects (especially regional) would climate models show for that period (say 1500-1700), given other known forcings? Would these effects be different if the CO2 drop were mostly due to other causes?
[Response: Thanks for the reminder on this interesting topic, and the good question. John is referring to a topic closely related to those covered in this recent post (which contains a link to the The Holocene issue mentioned). See also earlier RC posts here and here–Jim]
CM says
OK, stupid question: why’d they bump the numbering to 5?
BillS says
#7
“You may have heard the French slogan l’art pour l’art – popularly Latinized as ars gratia artis in MGM’s roaring logo – and typically translated into English as art for art’s sake. However, the more common interpretation for the Latin ars is actually ‘science’. Starting from this basic point, here is an argument for scientific literacy… science for your sake… science for science’s sake.” By Chris Garrigues
The remainder of this interesting essay can be found here:
http://www.zcommunications.org/science-for-sciences-sake-by-chris-garrigues
Some do science simply for science’s sake….
Buck Smith says
Is the source code for any of the models available for download?
Pete H. says
#17 What web site would you suggest for the politics of action against climate change?
Richard Rood often discusses politics, policy, and planning.
http://www.wunderground.com/blog/RickyRood/article.html
There is also DeSmog Blog
http://www.desmogblog.com/
Andrew says
Climate models are generally underestimating the decline of arctic sea ice.
A recent study found that this could be explained if the decline were attributed half to “natural variability” and the rest to greenhouse gases:
http://www.nsf.gov/news/news_summ.jsp?cntn_id=121359&WT.mc_id=USNSF_51&WT.mc_ev=click
However, how can they be sure that the assumed albedo of snow does not need significant improvement?
There is at least one paper with a better snow albedo model.
However, it’s not clear if this could also explain the discrepancy between models and recent sea ice observations.
http://www.agu.org/pubs/crossref/2011/2010JD015507.shtml
One Anonymous Bloke says
#25 Buck Smith, have you actually looked for any? I doubt it, since at the top of this very page there is a link: ‘Data Sources’.
Aaron Lewis says
My own observations are consistent with Rampal et al’s conclusions in the Journal of Geophysical Research – Oceans. That is, the forecasts of climate models are significantly off: Arctic sea ice is thinning, on average, four times faster than the models say, and it’s drifting twice as quickly. In this case, the ice variability studies that use multiple runs of these models to suggest the possibility of ice recovery may simply reflect the current climate models’ inadequate analysis of heat transport into the Arctic. This suggests that without the near-term feedbacks from loss of Arctic sea ice, these models dramatically understate long-term effects and impacts of global warming.
Past histories of large programming efforts (e.g., IBM OS/360) suggest that large programming efforts do not make large non-linear strides forward even with massive increases in effort. However, with feedbacks such as albedo effects, water vapor feedbacks, CH4 from melting permafrost, CH4 from clathrate decomposition, and the ice dynamics of large ice sheets, the effects and impacts of AGW are likely to make large non-linear strides forward. Thus, the difference between reality and the model forecasts is likely to increase.
Moreover, as long as there are new models in the pipeline, risk managers and policy makers will consider the question of global warming open. They will hope that the next model will show that AGW is not as bad as we thought.
Hank Roberts says
> Buck Smith … source code
Yes.
http://www.google.com/search?q=cmip5+“source+code”
http://scholar.google.com/scholar?q=cmip5+%22source+code%22
Steve Bloom says
Re 15/21: My understanding is that the Central American Seaway was closed or at least mostly so by 3.3 mya, but there are certainly other things (e.g. IIRC the height of the Rockies) that made the Pliocene warm period climate a little different in its details from what we would get with the same CO2 levels run to equilibrium.
The trick is that while knowing the details of past higher-CO2 equilibrium states is useful, we won’t be seeing anything like them, since what we actually have is an unnaturally rapid CO2 transient that is having/will likely have all sorts of feedback effects (e.g. sink saturation, ocean acidification, possible large methane pulse) that wouldn’t amount to much under natural conditions. As Jim Hansen has forcefully pointed out, even the PETM (the biggest transient nature has been able to manage in the Phanerozoic) is a poor analogy since it started out in ice-free conditions. As we can see from the record of recent deglaciations, ice sheets like to collapse quickly and have all sorts of effects that we would probably prefer to avoid, not that a too-rapid warming in ice-free conditions would be a walk in the park (consider e.g. the implications of a too-fast shift in fertile zones due to expansion of the tropics). It seems clear enough at this point that we won’t be avoiding a lot of them.
Steve Bloom says
Gavin, I’ve been following with great interest what seems to be a campaign by Knutti (with various co-authors) to in effect decertify some of the poorer-performing models or, I suppose, force improvements. What’s your view of that and how is it likely to be reflected in the AR5?
[Response: “decertify” is not a useful word in this context. Rather the idea is that it should be possible to find metrics that would show that different models merit different levels of credibility in their projections. Any specific prediction for an outcome would use input from all relevant models but they would be weighted based on prior assessments of their credibility. This is much harder to do than it sounds for various reasons – not least of which is that you have to justify the weighting scheme, find skill metrics that actually correlate to projection outcomes etc. This was all quite well discussed in the IPCC experts meeting report. – gavin]
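(A toy version of the weighting idea, just to fix terms – the projections and skill scores below are invented, and choosing and justifying the skill metric is exactly the hard part the response refers to.)

```python
import numpy as np

# Invented 21st-century warming projections (deg C) from five models,
# and an invented skill score for each (higher = judged more credible).
projections = np.array([2.1, 2.8, 3.4, 2.5, 4.0])
skill = np.array([0.9, 0.7, 0.4, 0.8, 0.2])

weights = skill / skill.sum()            # one possible normalisation of the weights
print(f"unweighted mean:     {projections.mean():.2f} C")
print(f"skill-weighted mean: {np.sum(weights * projections):.2f} C")
```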
prokaryotes says
What is the climate sensitivity in CMIP5? I’m looking forward to reading some feedback about David Wasdell’s paper on climate sensitivity.
A few bits…
In addition to initiating a process of global warming, the anthropogenic disturbance has also triggered the action of a complex web of interconnected feedback mechanisms
which amplify the effect of the original disturbance. The value of the amplification factor determines the required increase in average surface temperature if thermal equilibrium is eventually to be restored.
[..]
Taking account of the contribution of the CO2 with no feedback amplification we arrive at a correlate temperature 0.8°C below the pre-industrial benchmark.
Using the Charney amplification factor of 2.5 the figure decreases to -2.0°C.
Incorporating the carbon cycle feedbacks of the Hadley model yields -3.8°C.
Adding other slow feedbacks for the Hansen amplification factor yields -4°C. However, the empirically derived value for the average surface temperature during the depth of the ice ages stands at 5.0°C below the pre-industrial benchmark.
This provides us with an anchor point of [180 ppm, -5.0°C] through which the amplification line representing the sensitivity of the whole earth system must pass. The second point on the line is of course the pre-industrial benchmark of 280 ppm and 0.0°C. Projecting that forward into the next doubling of CO2 concentration yields an AF of 6.5, a climate sensitivity of 7.8°C, and a Temperature-Forcing ratio of 2°C per W/m². Those figures are just over 2½ times the values derived from the Charney Sensitivity.
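(Purely to make the quoted arithmetic explicit – this reproduces the draft's own calculation using its assumed 4 W/m² per CO2 doubling and 1.2°C no-feedback response, and is not an endorsement of the inputs or the result.)

```python
import math

dT = 5.0                              # quoted glacial-to-preindustrial warming (deg C)
doublings = math.log2(280 / 180)      # ~0.64 of a CO2 doubling
sensitivity = dT / doublings          # ~7.8 deg C per doubling ("Earth System Sensitivity")
amplification = sensitivity / 1.2     # ~6.5, relative to the 1.2 C no-feedback response
temp_per_forcing = sensitivity / 4.0  # ~2.0 deg C per W/m², using the quoted 4 W/m² per doubling
print(round(sensitivity, 1), round(amplification, 1), round(temp_per_forcing, 1))
```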
[..]
In 2005, Ferdinand Engelbeen, a Belgian scientist, conducted a regression analysis of the correlated values of temperature and CO2 concentration based on the Vostok records. It was posted on the Real-Climate web site and little further attention was paid to it
[..]
The high level of certainty associated with the Earth System Sensitivity of at least 7.8°C for a doubling of CO2, requires that the Charney Sensitivity (of 3°C) should now be abandoned and replaced by that figure for all future strategic negotiations.
[..]
The outcome of the COP 15 deliberations affirmed the need to limit temperature rise to the 2°C ceiling, and this element of the Copenhagen Accord [16] was subsequently embedded in the Cancun Agreement of COP16. These positions are totally dependent on the Charney Sensitivity. Leaving aside for the moment the challenge that even a 2°C rise in temperature would take us well beyond dangerous climate change and into the domain of “extremely dangerous climate change”, we note that the Earth System Sensitivity indicates that a sustained CO2 concentration of 440 ppm would result in an equilibrium increase of 5°C above the pre-industrial benchmark. In this case the 2°C guardrail has already been overwhelmed by some 60 ppm. The 2°C threshold was passed when the concentration reached 330 ppm.
[..]
Utilising the ensemble of climate models that underpin the IPCC Fourth Assessment Report, Malte Meinshausen produced a probability density function (PDF) showing the clustering of the model outputs around a climate sensitivity of 3°C [13] (co-incident with the Charney Sensitivity). With very few exceptions, that ensemble limits its treatment of feedback mechanisms to the same set of fast feedbacks utilised in the original Charney analysis.
[..]
If we explore the Hansen amplification factor of 5.0 (a sensitivity of 6°C for a doubling of atmospheric CO2) then we see he predicts an equilibrium rise of 4°C with a stabilised CO2 concentration of 440 ppm. The “safe” ceiling of 2°C is reached with a concentration of 350 ppm. That is why he consistently asserts that we need to reduce CO2 concentration to below 350 ppm while warning that even then, the temperature rise would expose the system to further amplification from slow feedbacks as well as initiate a dangerous increase in sea level.
[..]
Exploring the C-ROADS Simulator
Using the best available expertise in system dynamics simulation, field leaders from the MIT Sloan School of Management and Ventana Systems created the C-ROADS simulator in preparation for the COP 15 gathering in Copenhagen [21]. The acronym stands for “Climate Rapid Overview and Decision Support”. The simulator provides a visual interface that responds in real time to inputs of proposed reductions in CO2 emissions, relating outcomes to atmospheric concentration trajectories and implications for the increase in global temperature. The simulator is now hosted independently at http://ClimateInteractive.org
Its underlying model architecture has been stringently reviewed by an august panel of leading scientists. They validated its accuracy in representing the “state-of-the-art” climate models used in the preparation of the IPCC Fourth Assessment Report, (the same set utilised by Meinshausen to derive his PDF of Climate Sensitivity). They went on to recommend it for use as the official simulator for the UNFCCC negotiations. ClimateInteractive staff and simulation platform were extensively involved in the preparation of “The Emissions Gap Report” of the UNEP [22], released in Nov. 2010 prior to the COP 16 gathering in Cancun. Following the promulgation of the Copenhagen Accord, nearly 140 countries associated themselves with the document and over 80 countries, representing about 80 per cent of global emissions, have appended targets and/or mitigation actions. (UNEP p.38) These promises and commitments were entered into the C-ROADS platform and the resulting increase in average global temperature by the year 2100 was presented in the form of a Climate Scoreboard thermometer.
The C-ROADS simulator uses the Charney Sensitivity and limits the effects of positive feedback to the fast mechanisms associated with that value.
[..]
All the work on climate sensitivity is based on paleo records of slow, close to equilibrium behaviour at a global level. Those conditions no longer apply.
[..] the high level of climate sensitivity, combined with rapid change and far-from-equilibrium dynamics, exposes us to a severe risk of triggering an episode of runaway climate change.
Link to full paper and blog post..
http://climateforce.net/2011/08/13/climate-shift-impact-risk-assessment-revisited/
JimCA says
Can anyone shed light for us laymen about the recent NCAR prediction that arctic ice loss might temporarily abate during the next decade?
In a nutshell: Is this a credible prediction, and if so, is there a simple explanation?
When I look at the trends over the past 30 years, especially ice volume, it’s hard to believe those losses would suddenly stop for a while.
Didactylos says
Buck Smith: Yes!
Click on “data sources” at the top of the page for 7 major climate models, and various others.
Hank Roberts says
> JimCA
Asked and answered:
https://www.realclimate.org/index.php/archives/2011/08/unforced-variations-aug-2011/comment-page-5/#comment-213032
prokaryotes says
David Wasdell talks about his climate sensitivity assessment and computer models in a recent interview (july 2011)
http://www.youtube.com/watch?NR=1&v=tDAQ9lXzc1c
prokaryotes says
Then, one day, to your horror, you open a journal of Uighur studies and find a lead article proving that everybody has been interpreting Uighur wheat production records wrong, and that all previous estimates of what the Uighur numbers mean were off by a factor of two. https://www.realclimate.org/index.php/archives/2005/12/natural-variability-and-climate-sensitivity/
I got a response from Ken Caldeira and added more content to this blog post … http://climateforce.net/2011/08/13/climate-shift-impact-risk-assessment-revisited/
When reading about models and simulations I find it would make sense to list which of the climate forcings (or, at thresholds, sub-feedbacks) and what CS is considered. The logarithmic approach of Wasdell I think is very reasonable. The question is about thresholds, and there are so many feedbacks which are still not used, and I find it is important to make it clearer that we use really conservative estimates – which do not consider all variables. So I’m still reading into it and looking for more feedback…
Pete Dunkelberg says
Crowd sourcing science:
Pete Dunkelberg says
On the other hand, Prokaryotes @ 37 “So I’m still reading into it….”
Yes, we must all be wary of confirmation bias, not to mention overestimating the size of every imaginable feedback.
Hank Roberts says
> Ken Caldeira about the Wasdell paper ….
> http://climateforce.net/2011/08/13/climate-shift-impact-risk-assessment-revisited/
Caldeira’s response is well worth reading in full.
Nomination for a FAQ item.
[Response: Caldeira refers to the ‘Eocene methane spikes’ as evidence for high sensitivity. The problem with this is that both the temperature change of the Eocene and the greenhouse gas levels are highly uncertain. Contrary to popular view, the Eocene data are simply not good enough to ‘challenge’ climate models with midrange sensitivity. See e.g. the open access paper The early Eocene equable climate problem revisited by Huber and Caballero (2011) in Climate of the Past. I think it highly unlikely anyone is going to be able to demonstrate high sensitivity. It is even more unlikely we can rule it out. Except by waiting 50 years or so and seeing what happens of course. –eric]
wili says
I am glad to see some discussion of climate sensitivity.
It seems to me that historical comparisons have the disadvantage that we are adding GHGs at a much faster rate than at any other point in the history of the earth, if I understand correctly.
It seems to me that this very fast rate of increase could change the nature of certain feedbacks. For seabed methane in fairly shallow waters, for example, a relatively slow increase in GHGs and global temperatures may melt glaciers and polar ice enough to increase sea levels, thus increasing the pressure that helps keep clathrates in place, before the ocean warms up enough to melt the clathrates.
As it is, we are warming the oceans, particularly the Arctic very quickly, while sea level rise is still measured in centimeters at most.
IIRC, Wasdell has been criticized by Archer in the past. Does this recent work seem more solid in Archer’s view?
reCaptcha: Bremen ogionti
Aaron Lewis says
Eric inline comment on Hanks comment 41:
It does not matter what the Eocene conditions were at the start of its CH4 releases, because we are there (or very close). The observed methane plumes from sea beds in the Arctic, observed melt of permafrost, observed release of methane from Arctic wetlands, and large number of high “outlier” methane concentrations in global carbon cycle monitoring programs all suggest that we are seeing the start of carbon-cycle methane releases. AGW has put a lot of heat in our oceans, and there is no doubt that more of that heat will be transferred to permafrost and clathrates resulting in additional melt/decomposition. We are at the start of carbon cycle feedback.
CH4 concentration in the northern atmosphere (including high outliers) is now close to 2 ppm based on local, current emissions. That is almost like having another 100 or 150 ppm of CO2 in the air (effect calculated on an annual basis), so we need to think about our current GHG concentration as 500 ppm CO2-eq in 2011. I know that means assuming CH4 is 70 times more effective as a greenhouse gas than CO2, but 70 is a good number for one year, and what we are worried about in this case is the local warming that will drive next year’s permafrost melt. We do not need a 20-year greenhouse equivalent number, because in 20 years the permafrost will be mostly seasonal wetlands, all producing CH4. For example, CH4 trapped under ice over the winter and released as the ice melts, just in time to jump-start local warming of the wetlands the next spring.
We do not need to wait 50 years. Go count those wetland ponds. You can do it from your desk with satellite photos. Now, there are a lot of them, and every year there are more ponds. Every year many of them get a little bigger. Every year, most of them have a little longer melt season. Every year they get a little warmer, so that they produce a little more CH4 per acre. It is a non-linear feedback cycle resulting in higher sensitivity.
Now is the time to plan for it.
prokaryotes says
From the newsletter:
Simplified, fast models (so called intermediate complexity models or EMICs) were also used for these time slice simulations in order to provide a reference for their use for long-term transient experiments and in palaeoclimate studies.
The main focus for time slice experiments were the Last Glacial Maximum (LGM), 21 000 years before present (BP), and the mid-Holocene (MH), 6 000 years BP: intervals which correspond to times when differences in boundary conditions led to extreme climates that are relatively well documented by palaeoenvironmental data.
The third phase of PMIP and its role in CMIP5: An important aspect of PMIP3 is the inclusion of three key time periods as part of the Phase 5 Coupled Modelling Intercomparison Project (CMIP5, Taylor et al., 2009; 2011) within Tier 1 (the Last Glacial Maximum 21,000 years ago, and mid-Holocene 6,000 years ago) and Tier 2 (Last Millennium).
__
So Wasdell points out that these past climate changes cannot serve as a basis for an accelerated climate change out of the equilibrium range, a few magnitudes larger.
Wasdell’s critique on modeling, and more bits from his draft:
Additional change in greenhouse gas concentrations is caused by non-anthropogenic feedbacks which are sensitive to climate change. Additional carbon dioxide, water-vapour and eventually methane, combined with the temperature-driven change in ice and snow albedo, together with complex oceanic, vegetative and cloud-system feedbacks, all contribute to amplify the original disturbance. The value of the eventual equilibrium rise in average surface temperature depends on the amplification factor applied to the original anthropogenic disturbance by the feedback system.
About Charney Sensitivity
The report explicitly excludes the role of the biosphere in the carbon cycle (and so takes no note of the carbon-cycle and vegetation feedbacks). It also ignores the transfer of heat to the deep oceans, a position that leads to a fast approach to dynamic thermal equilibrium. Our current observation and understanding of this factor leads to slower predictions of the rate of temperature rise.
“If the Charney sensitivity, supported by our modern computer models, projects that a doubling of the concentration of atmospheric carbon-dioxide leads to a temperature rise of 3°C at equilibrium, then why, in the empirically measured behaviour of the planetary system, does an increase of only 56% in CO2 concentration (from 180 ppm to 280 ppm) lead to a 5°C change in temperature?”
If we take into account the non-linear relationship between the variables as expressed in the semi-log (base 2) presentation, the 56% increase in absolute value of the CO2 concentration across this particular range, equates to only 63.4% of the effect of doubling, i.e. a forcing of 2.54 W/m². Applying the Charney Sensitivity to this proportion yields an increase of only 2°C for the change from coldest point of the ice-age to the pre-industrial benchmark. Historical data tells us that the shift should be 5°C. It just does not compute. The inescapable conclusion is that the computer modelling ensemble, together with the Charney sensitivity and supported by Hansen’s “empirical sensitivity” are all omitting something fundamental. They are grossly under-representing the contribution of the complex feedback system to the amplification of the effects of change in CO2 concentration.
The implications are profound. The whole international strategic response to climate change is based on the output of the computer ensemble. The basis is recognised to be “conservative”, but to be under-representing the threat by a factor of two-and-a-half is a culpable collusion with a process of collective denial.
In an attempt to close the gap between computer modelling and empirical measurement, Hansen et al offered a hybrid solution in their earlier paper “Target Atmospheric CO2: Where Should Humanity Aim?”, published March 2008 [7]. They started with the assertion: “Paleoclimate data show that climate sensitivity is ~3°C for doubled CO2, including only fast feedback processes. Equilibrium sensitivity, including slower surface albedo feedbacks, is ~6°C for doubled CO2 for the range of climate states between glacial conditions and ice-free Antarctica.” [op. cit. p1]
The methodological approach is summarised in the paragraph:
“Climate models alone are unable to define climate sensitivity more precisely, because it is difficult to prove that models realistically incorporate all feedback processes. The Earth’s history, however, allows empirical inference of both fast feedback climate sensitivity and longterm sensitivity to specified GHG change including the slow ice sheet feedback.”
After careful and technical evaluation of the long-term slow feedback mechanisms, they conclude that:
“Global climate sensitivity including the slow surface albedo feedback is 1.5°C per W/m², or 6°C for doubled CO2, twice as large as the Charney fast-feedback sensitivity.”
This “Hansen Upgrade” is represented by the green line on the semi-log (base 2) scale. The sensitivity of 6°C for a doubling of CO2 yields an Amplification Factor of 5.0. However, it still falls short (by some 4 W/m²) of the forcing required to balance a 5°C rise in temperature.
Towards the end of 2009, Mark Pagani et al published a paper on “High Earth-system climate sensitivity determined from Pliocene carbon dioxide concentrations” [Nature Geoscience Letters 20 December 2009] [10]. They concluded that “the Earth-system climate sensitivity has been significantly higher over the past five million years than estimated from fast feedbacks alone”.
Writing in the “Perspectives” section of the Jan. 2011 edition of Science, Jeffrey Kiehl reviewed current peer-reviewed academic papers reporting on the reconstruction of values of atmospheric CO2 concentration reaching back through ~100 million years. The authors also derived values for earth system climate sensitivity across this period. Kiehl’s summary conclusion was that the data for 30 to 40 million years before the pre-industrial benchmark indicate that Earth’s climate feedback factor is ~2°C per W/m². That is equivalent to a climate sensitivity of 8°C for a doubling of atmospheric concentration of CO2, with an amplification factor of 6.7.
Re-working Kiehl’s figures using the graphic simulator leads to a marginally higher outcome.
Earth surface temperature decreased by 16°C during the period, requiring a shift of 52.8 W/m² of forcing to balance the dynamic thermal equilibrium. During the same period, CO2 concentration declined from 1000 ppm to 280 ppm, equivalent to 1.87 of the doubling/halving forcing from CO2 alone. CO2 change therefore contributed some 7.48 W/m² towards the overall forcing, leaving a balance of 45.32 W/m² as the contribution from the dynamic feedback system. That yields an amplification factor of 7.0, a sensitivity value of 8.47°C for a doubling of CO2, and a climate feedback factor of 2.1°C per W/m².
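(Again only to spell out the arithmetic as quoted, using the draft's stated 52.8 W/m², 1.87 doublings, 4 W/m² per doubling and 1.2°C no-feedback response; note that log2(1000/280) is actually closer to 1.84.)

```python
dF_total = 52.8            # quoted forcing change over the period (W/m²)
doublings = 1.87           # quoted CO2 change, 1000 ppm -> 280 ppm, in doublings
dF_co2 = doublings * 4.0   # ~7.48 W/m² from CO2 alone, per the draft's 4 W/m² per doubling
residual = dF_total - dF_co2        # ~45.3 W/m² attributed to feedbacks
amplification = dF_total / dF_co2   # ~7.1
sensitivity = amplification * 1.2   # ~8.5 deg C for a doubling of CO2
print(round(dF_co2, 2), round(residual, 2), round(amplification, 2), round(sensitivity, 2))
```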
Rapid Climate Change in Far-from-Equilibrium Conditions
Historically climate change at a global level has been slow and in conditions of dynamic thermal equilibrium. Net radiative imbalance has remained close to zero and the earth system has responded to change at a pace that allowed continuous adaptation of the bio-geo-chemical systems. Within those conditions there have been examples of comparatively rapid change in limited sub-system behaviour where specific tipping points have been activated by the slow global change, and the sub-system has moved from one stable state to another. All the work on climate sensitivity is based on paleo records of slow, close to equilibrium behaviour at a global level. Those conditions no longer apply.
The Earth System Sensitivity would indicate an expected rise in temperature at equilibrium of 3.8°C. There is therefore an expected rise of 3.0°C still in the pipeline, to which we are already committed. If non-CO2 GHGs are included then the expected increase would rise to 5.4°C (4.6°C still in the pipeline).
Historically a change in CO2 concentration of 100 ppm has taken place over a period of some 10,000 years. Humanity has now generated the same change in the space of a single century, one hundred times faster than at any point in the historical record (apart perhaps from the effects of the impact of a massive asteroid).
Net Radiative Imbalance during the past has not exceeded 0.01 W/m². Anthropogenic forcing over the last century has generated a net radiative imbalance of between 1.0 and 3.0 W/m². This rate of global heating is of the order of 300 times the historical maximum. It has pushed the earth system significantly away from equilibrium and activated increasing time-delay between forcing and the eventual achievement of a new state of dynamic thermal equilibrium.
Under these conditions a range of feedback processes are brought into operation that can be considered negligible when the system is very close to equilibrium:
Time delay in mixing of ocean layers (thermal inertia of the deep ocean heating) leads to relative heating of the surface, increased stratification, less up-welling of cold nutrient-rich water, decay in plankton take-up of CO2.
Increased acidification of the surface layer leads to lowered efficiency of the ocean sink of atmospheric CO2.
Hotter ocean surface combines with hotter atmosphere to increase the water-vapour feedback and so enhance the endothermic phase-change feedback that increases forcing while bypassing the temperature-sensitive radiative damping negative feedback.
Heat transfer from equatorial to high latitude polar regions is partially taken up in the endothermic phase-change of net ice melt. The resultant decrease in albedo constitutes a positive feedback which is also partially independent of the temperature-sensitive radiative damping negative feedback.
The greater the net radiative imbalance the longer the time-lag to establish new dynamic thermal equilibrium. Non-temperature-sensitive feedbacks, driven by increased CO2 concentration or by energy-flux in distinction from increased temperature, all contribute to amplification of the forcing, so increasing the time-lag and setting up second-order feedback reinforcement.
These non-temperature-sensitive feedbacks continue to accelerate global heating even during periods of increased heat-transfer to deep ocean with consequent slowing of the rate of change of average global surface temperature.
The pace of change overwhelms the capacity for smooth adaptation, evolution and mobility of the biological systems leading to patterns of die-back and burn that transfer carbon from biomass to atmosphere. That increases the carbon-cycle feedback dynamics.
Taken all together these phenomena enhance the system sensitivity and increase the amplification factor beyond the value of the Earth System Sensitivity previously developed from slow and close-to-equilibrium patterns of change. The value of the amplification factor of 6.5, representing a Sensitivity value of 7.8°C for a doubling of the concentration of atmospheric CO2, should therefore be taken as a conservative minimum figure in our current situation.
Rapid climate change, in conditions of dis-equilibrium, precipitates the activation of an interconnected series of sub-system tipping-elements [24]. That in turn drives turbulence and inherent unpredictability in the global climate system. There is also an increasing frequency of extreme events in local weather conditions.
At the overall global system level, the increasing power of amplifying feedback dynamics could push the system beyond the critical threshold which signals the onset of a period of self-amplifying or “runaway” climate change for which there is currently no modelling capacity. The subject of the boundary conditions of runaway behaviour in the earth climate system as a whole is addressed in the second section of this paper under the title of “Beyond the Stable State”.
___
Notice that Ken Caldeira’s response is based on a paper with Pagani from 2006 and Wasdell cites a paper from him, dated 2009.
Notice that I could not find the Engelbeen post here at RC, though I have not checked all his comments, but he posted it at WUWT.
wili says
@ ALewis “70 times more effective as a greenhouse gas than CO2, but 70 is a good number for one year”
I’m not sure I follow your wording here, but you should be aware that Shindell et alia in “Improved Attribution of Climate Forcing to Emissions“ found that the actual short-term global warming potential of methane was 105 times that of CO2.
http://www.sciencemag.org/cgi/content/abstract/326/5953/716
reCaptcha: mickshin rejoiced
prokaryotes says
The Pagani paper
High Earth-system climate sensitivity determined from Pliocene carbon dioxide concentrations
Climate sensitivity—the mean global temperature response to a doubling of atmospheric CO2 concentrations through radiative forcing and associated feedbacks—is estimated at 1.5–4.5°C (ref. 1). However, this value incorporates only relatively rapid feedbacks such as changes in atmospheric water vapour concentrations, and the distributions of sea ice, clouds and aerosols2. Earth-system climate sensitivity, by contrast, additionally includes the effects of long-term feedbacks such as changes in continental ice-sheet extent, terrestrial ecosystems and the production of greenhouse gases other than CO2. Here we reconstruct atmospheric carbon dioxide concentrations for the early and middle Pliocene, when temperatures were about 3–4°C warmer than preindustrial values3–5, to estimate Earth-system climate sensitivity from a fully equilibrated state of the planet. We demonstrate that only a relatively small rise in atmospheric CO2 levels was associated with substantial global warming about 4.5 million years ago, and that CO2 levels at peak temperatures were between about 365 and 415 ppm. We conclude that the Earth-system climate sensitivity has been significantly higher over the past five million years than estimated from fast feedbacks alone.
The magnitude of Earth-system climate sensitivity can be assessed by evaluating warm time intervals in Earth history, such as the peak warming of the early Pliocene ∼4–5 million years ago (Myr). Mean annual temperatures during the middle Pliocene (∼3.0–3.3 Myr) and early Pliocene (4.0–4.2 Myr) were ∼2.5°C (refs 3, 4), and 4°C (ref. 5) warmer than preindustrial conditions, respectively. During the early Pliocene, the equatorial Pacific Ocean maintained an east–west sea surface temperature (SST) gradient of only ∼1.5°C, which arguably resembles permanent El Niño-like conditions6. Meridional5,7 and vertical ocean temperature gradients8 were reduced, and deep-ocean ventilation enhanced, relative to today9,10. Deterioration in Earth’s climate state from 3.5 to 2.5 Myr led to an increase in Northern Hemisphere glaciation11. By ∼2 Myr, subtropical Pacific meridional SST gradients resembled modern conditions5, and the Pacific zonal SST gradient (∼5°C) was similar to the gradient observed today, with a strong Walker circulation6.
Tectonics and changes in ocean12–14 and atmospheric circulation15,16 were potentially important factors in climate evolution during this time. http://www.geo.umass.edu/courses/geo763/Pagani.pdf
Re eric’s comment: “Contrary to popular view, the Eocene data are simply not good enough to ‘challenge’ climate models with midrange sensitivity. See e.g. the open access paper The early Eocene equable climate problem revisited by Huber and Caballero (2011) in Climate of the Past.”
From the abstract you link, it states “We find that, with suitably large radiative forcing, the model and data are in general agreement for annual mean and cold month mean temperatures, and that the pattern of high latitude amplification recorded by proxies can be largely, but not perfectly, reproduced.”
The models are not perfect either, so I don’t see the problem here with the “popular view”, especially when assessing a wider range of studies.
Hank Roberts says
> Engelbeen
Dunno; one of these eight or so hits might lead you to it:
http://www.google.com/search?q=site%3Arealclimate.org+%2BEngelbeen+%2B2005+%2Bregression+%2Bcorrelated+%2BVostok
> biological systems
I do wonder whether there are big surprises waiting for the physicists once biological systems are better modeled. I’d suspect there are cross-discipline areas that simply can’t be modeled yet that, while they may produce only slight wiggles in the models, would produce tipping points/crashes/excursions in reality.
Since evolution has changed the biology of the planet so dramatically between each of the major extinctions, the past isn’t much of a guide to _what_ could be modeled.
But is there any way to get at this better by looking at the paleo work?
For example this:
http://www.clim-past.net/5/297/2009/cp-5-297-2009.html
Clim. Past, 5, 297-307, 2009
http://www.clim-past.net/5/297/2009/
doi:10.5194/cp-5-297-2009
Ecosystem effects of CO2 concentration: evidence from past climates
Or looking at older work, this one (it’s a Letter):
http://www.nature.com/nature/journal/v429/n6991/abs/nature02566.html
suggested that a new core drilling investigation around a group of hydrothermal vent complexes ought to be able to resolve the question about the timing of a methane excursion. I haven’t found out whether the drilling was done. There’s a copy
http://folk.uio.no/hensven/nature/svensen_etal_nature04.pdf
This is stuff the modelers presumably know far more about than blog readers — I’d love to hear more from the people who are doing the modeling, but definitely don’t want dumb questions to slow down the actual work they’re doing. Just very curious.
Pete Dunkelberg says
@ 44: “The whole international strategic response to climate change is based on the output of the computer ensemble.”
I don’t think so, one reason being that humanity’s response is based on denying this, as you have noticed elsewhere.
@ 46: Re Eric plus Huber_and_Caballero_2011_The_early_Eocene_equable_climate_problem_revisited.pdf.
If you look into that paper, they find that the Eocene tropics were warmer than was thought 10 to 15 years ago. This facilitates solving the problem via higher pCO2. Lots more. Applying a sensitivity of 7 to that pCO2 would really get earth cooking.
prokaryotes says
Re Pete Dunkelberg, again i quote from the Wasdell paper:
The higher the concentration of any particular greenhouse gas, the less efficient it becomes at inhibiting infra-red radiation in the particular wavelength zone associated with its specific molecular structure. A long history of experimental verification has shown the relationship between concentration and absorption efficiency to be logarithmic. In particular, the change associated with a doubling of the concentration of carbon-dioxide is known to reduce its efficiency as a greenhouse gas. The forcing associated with each doubling is a constant 4 watts per square metre (W/m²) at the earth surface. That requires a change of 1.2°C in surface temperature to re-balance the energy budget. Logarithmic functions of this kind produce a constant output for any halving or doubling of the parameter across a given range.
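(For reference, the logarithmic relationship being described can be written down directly; the sketch below uses the widely cited simplified fit of Myhre et al. (1998), which gives about 3.7 W/m² per doubling rather than the 4 W/m² figure quoted above.)

```python
import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Simplified CO2 radiative forcing fit (Myhre et al. 1998), in W/m²."""
    return 5.35 * math.log(c_ppm / c0_ppm)

# The logarithmic form means every doubling adds (roughly) the same forcing:
print(round(co2_forcing(560, 280), 2))    # ~3.71
print(round(co2_forcing(1120, 560), 2))   # ~3.71
# Glacial (180 ppm) to pre-industrial (280 ppm):
print(round(co2_forcing(280, 180), 2))    # ~2.36
```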
Hank Roberts says
Given any one model — is the climate sensitivity something that emerges from a variety of assumptions chosen for that particular model, based on how it comes out? Or from each run of that model?
(if you had computer and time enough to let each scenario run go a few thousand years of scenario time, til the temperature leveled off and started to decline — which I think isn’t practical)
Or does the model run a collection of scenarios, for a few hundred years each, through the first fast feedbacks, based on a single sensitivity number, to see what the other factors produce?
Or — I’m sure I don’t have this clear enough, I’m aiming for the high school level or simpler words here — can one of the modelers give an idea of what the various CMIP groups are doing with their models?