Nature has an interesting editorial this week on the state of the science for attributing extreme events. This was prompted by a workshop in Oxford last week where, presumably, strategies, observations and results were discussed by a collection of scientists interested in the topic (including Myles Allen, Peter Stott and other familiar names). Rather less usual was a discussion, referred to in the Nature piece, on whether the whole endeavour was scientifically worthwhile, and even if it was, whether it was of any use to anyone. The proponents of the ‘unscientific and pointless’ school of thought were not named and so one can’t immediately engage with them directly, but nonetheless the question is worthy of a discussion.
This workshop was a follow-up to one held in 2009, which took place in a very different environment. The meeting report was typical of a project that was just getting off the ground – lots of potential, some hints of success. Today, there is a much richer literature on the topic, and multiple approaches have been tried to generate the statistical sample required for statements of fractional attribution.
But rather than focus on the mechanics for doing this attribution, the Nature editorial raises more fundamental questions:
One critic argued that, given the insufficient observational data and the coarse and mathematically far-from-perfect climate models used to generate attribution claims, they are unjustifiably speculative, basically unverifiable and better not made at all. And even if event attribution were reliable, another speaker added, the notion that it is useful for any section of society is unproven.
Both critics have a point, but their pessimistic conclusion — that climate attribution is a non-starter — is too harsh.
Nature goes on to say:
It is more difficult to make the case for ‘usefulness’. None of the industry and government experts at the workshop could think of any concrete example in which an attribution might inform business or political decision-making. Especially in poor countries, the losses arising from extreme weather have often as much to do with poverty, poor health and government corruption as with a change in climate.
Do the critics (and Nature sort-of) have a point? Let’s take the utility argument first (since if there is no utility in doing something, the potentially speculative nature of the analysis is moot). It is obviously the case that people are curious about this issue: I never get as many media calls as in the wake of an extreme weather event of some sort. And the argument for science merely as a response to human curiosity about the world is a strong one. But I think one can easily do better. We discussed a few weeks ago how extreme event attribution via threshold analysis or absolute metrics reflected a view of what was most impactful. Given that impacts generally increase very non-linearly with the size/magnitude of an event, changes in the frequency or intensity of extremes have an outsized influence on costs. And if these changes can be laid at the feet of specific climate drivers, then they can certainly add to the costs of business-as-usual scenarios, which are then often compared to the cost of mitigation. Therefore improved attribution of shifts in extremes (in whatever direction) has the potential to change cost-benefit calculations and thus policy directions.
Additionally, since we are committed to a certain amount of additional warming regardless of future trends in emissions, knowing what is likely in store in terms of changing extremes and their impacts feeds directly into what investments in adaptation are sensible. Of course, if cost-effective investments in resilience are not being made even for the climate that we have (as in many parts of the developing world), changes to the calculations for a climate-changed world matter less. But there are many places where investments are being made to hedge against climate changes, and the utility is clearer there.
Just based on these three points, the question of utility would therefore seem to be settled. If reliable attributions can be made, this will be of direct practical use for both mitigation strategies and adaptation, as well as providing answers to persistent questions from the public at large.
Thus the question of whether reliable attributions can be made is salient. All of the methodologies to do this rely on some kind of surrogate for the statistical sampling that one can’t do in the real world for unique or infrequent events (or classes of events). The surrogate is often specific climate simulations for the event with and without some driver, or an extension of the sampling in time or space for similar events. Because of the rarity of the events, the statistical samples need to be large, which can be difficult to achieve.
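To make concrete how such paired samples turn into an attribution statement, here is a minimal sketch of the standard fraction-of-attributable-risk calculation (the probabilities below are purely illustrative, not results from any particular study):
; fraction of attributable risk from paired event probabilities (illustrative numbers only)
p1 = 1./50.             ; modelled chance per year of the event with the driver included
p0 = 1./200.            ; modelled chance per year of the event with the driver removed
print, p1/p0            ; risk ratio: the event is 4 times more likely with the driver
print, 1. - p0/p1       ; fraction of attributable risk: 0.75 of the risk attributable to the driver
The hard part, of course, is generating p1 and p0 reliably in the first place, which is where the sample-size problem bites.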
For the largest-scale extremes, such as heat waves (or days above 90ºF etc.), multiple methodologies – via observations, coupled simulations, targeted simulations – indicate that heat waves have become more likely (and cold snaps less likely). In such cases, the attributions are increasingly reliable and robust. For extremes that lend themselves to good statistics – such as the increasing intensity of precipitation – there is also good coherence between observations and models. So the claim that there is some intrinsic reason why extremes cannot be reliably attributed doesn’t hold water.
It is clearly the case that for some extremes – tornadoes or ice storms come to mind – the modelling has not progressed to the point where direct connections between the conditions that give rise to the events and climate change have been made (let alone the direct calculation of such statistics within models). But in-between these extreme extremes, there are plenty of interesting intermediate kinds of extremes (whose spatial and temporal scales are within the scope of current models) where it is simply the case that the work has not yet been done to evaluate whether models suggest a potential for change.
For instance, it is only this year that sufficient high-frequency output has been routinely archived for the main climate models to permit a multi-model ensemble analysis of extreme events and their change in time – and with sufficient models and sufficient ensemble members, these statistics should be robust in many instances. As of now, this resource has barely been tapped, and it is premature to declare that the mainstream models are not fit for this purpose until someone has actually looked.
Overall, I am surprised that neither Nature nor (some?) attendees at the workshop could find good arguments supporting the utility of attribution of extremes – as the science improves, these attributions will surely become part of the standard assessments of impacts to be avoided by mitigation, or moderated by adaptation. We certainly could be doing a better job of analysing the data we already have in hand to explore whether, and to what extent, models can be used for which kinds of extremes, but it is wrong to say that such attempts are per se ‘unverifiable’. As to whether we are better off having started down this path, I think the answer is yes, but this is a nascent field and many different approaches and framings are still vying for attention. Whether this brings on any significant changes in policy remains to be seen, but the science itself is at an exciting point.
Don Williams says
Re Tom Scharf at 100,
Maybe that hurricane energy went somewhere else — and maybe the insurance companies are not raking in as much money as you think.
a) From http://www.usgs.gov/newsroom/article.asp?ID=2343#.UGEEHo09zl8
“The epic flooding that hit the Atlanta area in September was so extremely rare that, six weeks later this event has defied attempts to describe it. Scientists have reviewed the numbers and they are stunning.
“At some sites, the annual chance of a flood of this magnitude was so significantly less than 1 in 500 that, given the relatively short length of streamgaging records (well less than 100 years), the U.S. Geological Survey cannot accurately characterize the probability due to its extreme rarity,” said Robert Holmes, USGS National Flood Program Coordinator. “Nationwide, given that our oldest streamgaging records span about 100 years, the USGS does not cite probabilities for floods that are beyond a 0.2 percent (500-year) flood.”
“If a 0.2 percent (500-year) flood was a cup of coffee, this one brewed a full pot,” said Brian McCallum, Assistant Director for the USGS Georgia Water Science Center in Atlanta. “This flood overtopped 20 USGS streamgages – one by 12 feet. The closest numbers we have seen like these in Georgia were from Tropical Storm Alberto in 1994. This flood was off the charts.” …
“Applying rainfall frequency calculations, we have determined that the chance of 10 inches or more occurring at any given point are less than one hundredth of one percent”, said Kent Frantz, Senior Service Hydrologist for the National Weather Service at Peachtree City. “This means that the chance of an event like this occurring is 1 in 10,000.”
b) Wiki says Georgia estimated damages were $500 million
http://en.wikipedia.org/wiki/2009_Southeastern_United_States_floods
c) There was massive flooding in the Midwest in March 2008 and in the upper Midwest in June 2008.
http://en.wikipedia.org/wiki/March_2008_Midwest_floods
d) The May 2010 floods in Tennessee were 1000 year floods.
http://en.wikipedia.org/wiki/2010_Tennessee_floods
e) In 2011, Hurricane Irene caused $296 million in damages in New York alone — one of the most costly storms there ever.
http://en.wikipedia.org/wiki/Effects_of_Hurricane_Irene_in_New_York
David B. Benson says
Of course one could solve the relevant exercise in chapter 6 of Ray Pierrehumbert’s “Principles of Planetary Climate” to see that, to a decent approximation, the annualized global precipitation increase goes as the square of the temperature increase.
Marcus says
Dear Don Williams,
You seem to be very keen on having climate maps based not on climate statistics, but on scenario projections from climate-model simulations, as mandatory(?) guidelines for engineering in the USA.
For safety thresholds and so on, I assume. You also seem to frame this as a kind of responsibility of climate scientists.
I encourage you to read this RC article:
https://www.realclimate.org/index.php/archives/2012/06/far-out-in-north-carolina/
Especially the section “North Carolina politics” and maybe also the discussion.
People like Senator Inhofe will go as far as combating such attempts by law, at least when public funds are involved.
Don Williams says
To Marcus at 103:
1) I am not competent to prescribe how climate maps should be constructed–
I merely agree with Gavin’s comment above that climate data provided by the
US Government should be the best possible — hence, based on current
science.
2) As I noted above, NOAA’s Atlas 14 Volume 2 says a 100 year storm
event in my area has a 24 hour rainfall of 7.63 inches. Yet NOAA’s data
archive shows the following rainfall events in this area in recent years:
Aug 28, 2011: 8 inches; October 1, 2010: 9.23 inches; Sept 29, 2004: 6.92 inches; Sept 17, 1999: 9.4 inches.
To the list of floods I noted in post 101 above I would add the
New England Flood of May 2006.
3) Obviously the maps have to be based partially on climate statistics
but I don’t see how NOAA could have collected a record of rainfall
occurring on the Atlantic ocean’s surface every six miles for the past
100 years and going up to 400 miles offshore. Plus the continental
interior does not have things called currents. So I think Atlas 14 is missing
something for US East Coast areas which receive large amounts of
rainfall from offshore hurricanes. But I do not have the professional
qualifications needed to publish a critique of Atlas 14 in a reputable journal.
4) Senator Inhofe does not run NOAA. Senator Inhofe also has not
repealed the First Amendment for scientific journals. If someone at least
published a critique of Atlas 14, then common citizens could point to that
critique in local government hearings and argue for additional stormwater
measures as a precaution. If US scientists fear for their funding, then
have some European publish it. As I noted, Atlas 14 is the most important
and influential climate model in the USA today.
You don’t even have to provide an alternative model — just note where
reality is not matching government climate data in major ways. I know —
improbable extreme events can happen. But not, I think, several times
in the same location within a few years.
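A rough way to put a number on that intuition, treating each year as an independent trial and taking the Atlas 14 value as the true annual probability (both of which are assumptions):
; back-of-envelope check of repeat exceedances of a '100-year' rainfall at one location
p = 0.01                        ; annual chance of exceeding the 100-year 24-hour rainfall
n = 13                          ; years spanned by the events listed above (1999-2011)
p0 = (1.-p)^n                   ; chance of no exceedances in the window
p1 = n*p*(1.-p)^(n-1)           ; chance of exactly one exceedance
print, 1.-p0, 1.-p0-p1          ; at least one (~0.12), at least two (~0.007)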
[Response: Atlas 14 is not a climate model. The documentation is quite clear, and the people involved easily findable. If you have a problem with it, you should be contacting them in the first instance. – gavin]
SecularAnimist says
Tom Scharf wrote: “Global cyclone energy is running near all time lows.”
For the 12 Atlantic hurricane seasons 2000-2011, the ACE for 8 seasons has been “above normal”, with 4 of those seasons rated as “hyperactive”. Only 2 seasons had ACE “below normal”. Three of the ten Atlantic seasons with the highest numbers of named storms have occurred since Katrina (2007, 2008 and 2010).
Tom Scharf wrote: “We haven’t had a CAT3 landfall in the US since 2006, an all time record.”
The number of hurricanes that happen to make landfall in the US has nothing to do with anything.
And of course, by cleverly selecting CAT3 as an arbitrary cutoff point, you get to ignore CAT2 Hurricane Ike (2008), which caused nearly $28 billion in damages, CAT2 Hurricane Gustav (2008), which caused over $4 billion in damages, CAT1 Hurricane Irene (2011), which caused nearly $16 billion in damages, and Tropical Storm Lee (2011), which caused over $1.6 billion in damages and produced unprecedented flooding all over the east coast, as far north as New York and Vermont.
You are just reciting talking points.
tokodave says
We might also point Tom to Jeff Masters’ Wunderblog for an update on the Pacific supercyclones… http://classic.wunderground.com/blog/JeffMasters/show.html, or his earlier discussions of this year’s series of derechos that hit the east coast or….
Charles says
The National Geographic article showed that US weather disaster losses increased by only about 60% ($339 B to $541 B) from the previous 15 years to the most recent 15, and that the main reason was “more people living on higher-value properties in vulnerable areas.”
Using monetary values to describe natural disasters does not necessarily relate to strength or frequency.
David says
The climate policy discussion is now primarily about economics. I think it is essential that extreme weather events due to our warming of the global climate are correctly attributed so that costs can be assessed. As you say, Gavin, one cannot perform a cost/benefit analysis without accurate figures for both sides of the equation.
The key fact is that analyses to date suggest that the costs of inaction on climate so vastly outweigh those of preventing further damage, that even a little extra evidence of this could move leaders to action. But they do need numbers, and those depend on being able to attribute weather events to human agency.
Don Williams says
1) Re David at 108, before you make a strong effort to provide info for
cost-benefit analysis, you need to ask if rational cost-benefit analysis
exists in the US political system.
For example, we have a transport system based on $40 per gallon
gasoline –$4/gal at the pump and $36/gal on our income tax to support military
operations in the Middle East. But better alternatives don’t arise because the market
signal – pump price – is kept artificially low. A simple fix would be to add a $20/gal
security fee to the pump price –with an offsetting tax credit so people are not worse
off. If you then let people pocket any savings they get from going to cheaper
alternatives, you would have a boom in alternative fuel development.
That might also significantly reduce carbon emissions, although I’ve heard
electric cars referred to as coal-burning vehicles.
2) If the insurance industry isn’t eating the increased damages from
heavy flooding, then their corporate clients are via higher premiums. Someone
would benefit from remedial measures.
3) But, in my personal opinion, the campaign finance system puts a heavy
thumb on the scale in the USA.
Hank Roberts says
> add a $20/gal security fee to the pump price –with an
> offsetting tax credit so people are not worse off.
Carbon Tax & 100% Dividend … James E. Hansen …
http://www.columbia.edu/~jeh1/2009/WaysAndMeans_20090225.pdf
GlenFergus says
Gavin at #104:
Mr Williams seems a little excited, but his starting point is relevant enough. In my experience, engineering design rainfall intensities have a direct, measurable and substantial effect on the cost of civil infrastructure. A confident, model-based prediction of how they may change over time should be of considerable public policy utility.
(Another example: The Australian equivalent of Atlas 14 is currently under revision, and early indications are for a substantial design rainfall increase. That is based on data, not model projections, which are being addressed separately.)
Chris Dudley says
Peter Thorne (#84),
I think that if we were to go back and retroactively eliminate the emissions from 1850, it would have astonishingly little effect on the climate signals we are seeing today. Eliminating the emissions from 2011, however, and continuing to do so, would give us a noticeable change from our projected BAU course. There is a problem with the analysis you are invoking. Huge and growing emissions today are much, much more important to climate change than tiny emissions in the past, not the other way around.
Peter Thorne says
Chris Dudley (#112)
I fear you entirely misunderstand my point. The climate system consists of a set of fast atmospheric responses and much slower cryospheric and oceanic responses. So, it takes a long time for the effect of CO2 emissions to be fully equilibrated and for the climate to find a new steady state. Even if we all decided to go and live in caves today, the climate would continue to warm for at least a couple of centuries until the slow response components were in equilibrium with the imposed change in LW radiative balance we have already ‘achieved’. So, the freshly minted CO2 molecules the coal-fired power plant down the road belched out for me to type this message have proportionately much less effect on the state of today’s climate than the emissions I made when I first used a BBC B+ computer at primary (elementary) school. By the same token, how much C-word emits today is much less important than how much OECD countries emitted prior to 1950 in explaining the state of today’s climate. So, not all carbon emissions are equal in the context of a transient climate change: the historical emissions, which have gone further towards equilibrium with the slow-changing climate system components, are more important in explaining the transient (but NOT the final) climate state. My kids can blame the CO2 I emitted writing this for events visited upon them when they are adults …
Chris Dudley says
Peter (#113),
I see the origin of your mistake now. You think there is momentum in the climate system. There is not. There is only lag. We are at the temperature today that would be equilibrium for the carbon dioxide concentration of 1982. It is only the concentration above that (emitted in recent years, not 1850) that induces further temperature increase. If, as you say, we were all to go live in caves, there would be a continued increase in temperature until the concentration fell to the energy balance level and then temperature would fall as the concentration continued to fall. If the lag is as short as 30 years (years back to 1982) then the temperature would rise an additional 0.3 C or so and then start to fall around 2050, not centuries from now. For longer lags, you get a later turn around but also a lower maximum temperature. For a 3 century lag we’d get a less than 0.1 C further increase in temperature and turn around to cooling around 2150. There is a little more discussion here: https://www.realclimate.org/index.php/archives/2012/08/an-update-on-the-arctic-sea-ice/comment-page-7/#comment-249977
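A toy version of the ‘lag, not momentum’ picture, holding the concentration fixed at 400 ppm purely for illustration (the lag and sensitivity below are assumptions, not fitted values):
; temperature relaxes toward the equilibrium implied by the current concentration
tau = 30.                            ; assumed lag time constant (years)
s = 3.                               ; assumed sensitivity (C per doubling)
co2 = replicate(400., 100)           ; hold concentration at 400 ppm for illustration
teq = s*alog(co2/285.)/alog(2.)      ; equilibrium warming implied at each step (~1.5 C)
t = fltarr(100) & t(0) = 0.8         ; start from roughly today's warming
for i=1,99 do t(i) = t(i-1) + (teq(i-1)-t(i-1))/tau
print, t(99)                         ; creeps up toward teq and never past it: lag, not momentum
If the concentration were falling instead of held fixed, the temperature would simply chase it downward once it crossed the equilibrium line.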
Peter Thorne says
Chris (#114)
Where did I suggest this was anything other than the fact that there are lags, or imply anything about momentum? Under transient climate change, the fact that there are lags means the emissions of 30 years ago are more important than today’s in explaining today’s climate. Nothing to do with momentum and absolutely everything to do with lags. Under transient climate change it is impossible to argue, on the basis of our physical understanding of the climate system, that the CO2 molecules I emit now have a greater bearing on today’s climate system than those emitted 100 years ago. Not even the fast atmospheric response component has realized the impact of today’s emissions yet, whereas a similar amount of CO2 that was emitted 100 years ago has had a chance to equilibrate with many components of the climate system. In explaining the climate in the here and now, those 100-year-old emissions ‘per capita CO2′ have far more say. At the equilibrium state they will have equal say. But here and now we are mainly reaping what our parents sowed. This is why scenario projections in CMIP3/5 only meaningfully start to diverge in global mean temperature around mid-century. What we do now will make a difference for our grandkids. In terms of the next 30 years, what we do now makes little odds … there is substantial interest already due. That is what happens when you have a system which has substantive physical lags and you tweak it.
Hank Roberts says
http://www.agu.org/pubs/crossref/pip/2012GL053338.shtml
European winter climate and its possible relationship with the Arctic sea ice reduction in the recent past and future as simulated by the models of the Climate Model Intercomparison Project phase 5 (CMIP5) is investigated, with focus on the cold winters. … A pattern of cold-European warm-Arctic anomaly is typical for the cold events in the future, which is associated with the negative phase of the Arctic Oscillation. These patterns, however, differ from the corresponding patterns in the historical period, and underline the connection between European cold winter events and Arctic sea ice reduction.
David B. Benson says
Chris Dudley & Peter Thorne — The time scale you are considering is far too short. Think centennially.
Peter Thorne says
David (#117)
Or even millenially (is that even a word?) for the ice sheet components. The ocean takes about 200 years (roughly) to mix completely down to the abyssal ocean, but for the big ice sheets all bets are off as to when (if ever) they would be in perfect equilibrium with the climate system (except if they melted entirely, of course, but let’s not even countenance that …). I was using the shorter term for analogy – it’s hard to think about your great^90 grandkids, after all …
David B. Benson says
Peter Thorne @118 — Try millennially.
http://www.merriam-webster.com/dictionary/millennially
Chris Dudley says
Peter (#115),
Putting labels on identical particles will get you into trouble. But, I’m not sure that is really where you are having difficulty. I would say also that there were no emissions from 100 years ago that are comparable to today’s. But, mainly I think what you are considering a transient response is what is confusing you. The temperature response to a Heaviside function does take a while to complete. But to get a Heaviside function in concentration, you need ongoing emissions because carbon dioxide is absorbed out of the atmosphere as time goes by. If you take account of that, you’ll understand that the most recent emissions are the most important.
You claim that today we are only seeing the effect of emissions from thirty years ago, and that more recent emissions have yet to have any physical effect. But we can rather easily calculate a limit on what would have happened to temperature today had we all gone to live in caves in 1982. Again, we take a 30-year lag as the shortest possible lag for the system, and see that there would have been only 0.17 C of warming from 1982 to today instead of the 0.4 C that actually occurred. The more recent emissions are responsible for at least 58% of the warming. For a 3-century lag, they’d be responsible for 88% of the warming since 1982.
How quickly and how far a variable moves under lag conditions depends on the target it is trying to match. That target is the temperature we’d calculate for a specific climate sensitivity. I’ve been using 3 C per doubling. Raise the concentration of carbon dioxide, and the lagged variable responds immediately to that change. Double the concentration today and the temperature begins to rise today. It may take a while to finish, but it does not take a while to start. Your claim is basically that it does wait to begin to respond. But that is not how energy systems work.
If you run this script, which ends emissions when the concentration reaches 340 ppm, the solid line in the first panel is the target temperature and the dashed line is the response with a thirty-year lag, dot-dashed with a 90-year lag and triple-dot-dashed with a 300-year lag (applied starting when emissions end).
;use IDL or (free) Fawlty languages
a=findgen(1000) ;year since 1850
b=fltarr(1000) ;BAU concentration profile
b(0)=1
for i=1,999 do b(i)=b(i-1)*1.02 ;2 percent growth
;plot,b(0:150)*4.36+285.,/ynoz ; 370 ppm year 2000
c=(18.+14.*exp(-a/420.)+18.*exp(-a/70.)+24.*exp(-a/21.)+26.*exp(-a/3.4))/100. ;Kharecha and Hansen eqn 1
e=fltarr(1000) ;annual emissions
for i=1,999 do e(i)=b(i)-b(i-1)
d=fltarr(1000) ; calculated concentration
t=340.-285. ;target concentration anomaly above pre-industrial (340 minus 285 ppm)
f=0 ;flag to end BAU growth
for i=1,499 do begin & d(i:999)=d(i:999)+e(i)*c(0:999-i)*4.36*2. & if d(i) gt t then begin & e(i+1:999)=0 & f=1 & h=i & endif else if f eq 1 then e(i+1:999)=0 & endfor ;factor of two reproduces BAU growth
!p.multi=[0,2,2]
g=alog(d/285.+1)/alog(2.)*3; instantaneous temperature curve.
gg=g*0.8/1.5
for i=h, 998 do gg(i+1)=gg(i)+(g(i)-gg(i))/30. ; 30 year lag curve
gg6=g*0.8/1.5
for i=h, 998 do gg6(i+1)=gg6(i)+(g(i)-gg6(i))/90. ; 90 year lag curve
gg12=g*0.8/1.5
for i=h, 998 do gg12(i+1)=gg12(i)+(g(i)-gg12(i))/300. ; 300 year lag curve
plot,a+1850.,alog(d/285.+1)/alog(2.)*3.,/ynoz,xtit='Year',ytit='Final warming (C)',charsize=1.5 ; Temperature assuming 3 C per doubling of carbon dioxide concentration
oplot,a+1850.,gg,linesty=2
oplot,a+1850.,gg6,linesty=3
oplot,a+1850.,gg12,linesty=4
plot,a(0:499)+1850,e(0:499),xtit='year',ytit='carbon dioxide emissions (AU)',charsize=1.5 ;emission profile in arbitrary units
plot,a+1850.,d+285.,/ynoz,xtit='Year',ytit='carbon dioxide concentration (ppm)',charsize=1.5 ; atmospheric carbon dioxide concentration in ppm
Jim Larsen says
114 Chris D said, “There is only lag. We are at the temperature today that would be equilibrium for the carbon dioxide concentration of 1982. It is only the concentration above that (emitted in recent years, not 1850) that induces further temperature increase.”
Crapola. Stop CO2 emissions and temperatures spike immediately. Stuff equalizes over a thousand years, but the aerosol spike over a few months trumps all.
Philip Machanick says
One bunch have no problem with using attribution to guide investment decisions. Fossil fuel companies (remember the bunch who want us to believe climate change is a hoax) are rushing to exploit the opportunity opened up by declining Arctic sea ice.
How, I wonder, do they know that decline is a reliable trend, not a short-term blip that will correct, unless they trust the very science they have paid professional deniers to traduce? It really does not make sense to invest heavily in Arctic resource extraction if you expect that it will revert to being mostly inaccessible over summer.
More on my blog.
Peter Thorne says
Chris,
I think we are arguing semantics.
But how, logically, can the carbon I emit this very moment have more impact on the climate right now than carbon emitted a century ago? It can’t. In partitioning ‘blame’ for today’s climate (which was my point about what the legal folks at the meeting were interested in; as a meeting organizer I WAS there, after all …), when matters as well as how much, because of the interaction between the forcing and the physical processes, some of which are very slow. The effect is not instantaneous. The climate is much further out of equilibrium with today’s CO2 than with the day before’s, than the day before that’s, etc. etc. ad pretty much infinitum, since we ever started burning fossil fuels on an industrial scale. Yes, it’s more complicated than that in reality, but this is blog comments, not a manuscript proof … and the point stands!
When and how much CO2 was emitted are both important factors in understanding today’s climate state under a transient climate change, when the system stands out of equilibrium. If the industrial revolution had started in 1950 rather than 1850, but we had released the same amount of CO2, the climate state (and CO2 burden) would be different to today. Agreed? If so, as I say, we are arguing semantics over the same point here. When and how much both matter in explaining the climate today.
Chris Dudley says
I don’t think millennialism really helps in this discussion. First, if you are going to take 3000 years to melt the ice, that hardly counts as a large energy imbalance, and the temperature should be at its fast-feedback equilibrium, or fairly close to it if the carbon dioxide concentration has been held steady for a while. Second, slow feedbacks come into play, leading to a 6 C per doubling sensitivity according to Hansen. That makes for a lag in the sensitivity, to which we would subsequently apply a lag in the temperature response. But the temperature response lag is likely quick compared to the slow feedback lag, so essentially you just have a sensitivity knob. And that is a different sort of behavior.
Chris Dudley says
David (#117),
Hansen et al. 2011 give some preference to the intermediate climate response function shown in their fig. 9 based on in situ ocean heat uptake measurements. http://pubs.giss.nasa.gov/abs/ha06510a.html
In that case, 75% of the response is complete within 100 years and 50% within 8 years. The long tail is not all that important, I’d say, even though it does linger over centuries.
JCH says
Not even the fast atmospheric response component has realized the impact of today’s emissions yet. … -Peter Thorne
In what form will that impact be? Just reading blogs, it appears it’s going to be lumped in with internal variability and passed off as all natural. The arctic sea ice was just melted by the AMO, as a for instance.
Chris Dudley says
Peter (#123),
I think you must have gone to the pub with Mr. Zeno while you were at the meeting. He is a great one for semantics. Or maybe you only got half way to the pub on account of the engrossing conversation?
Here is a persuasive argument that today’s emissions are important: Suppose we set them to zero. The processes that remove carbon dioxide from the atmosphere continue to function. The concentration of carbon dioxide begins to fall dramatically. The rate of temperature increase declines immediately and becomes zero within decades. It is a strong effect. If the absence of today’s emissions is that important, then so is their presence.
In partitioning blame, we are looking at the effects of climate change. Blame goes to those who made it dangerous, who carried us over the threshold. Those are the recent emitters, not the past emitters. Intent is also important. Those large emitters who are increasing their emissions are culpable. Those who are cutting their emissions are not.
And, we don’t need any great effort to exact restitution. WTO rules on environmentally motivated trade tariffs are adequate for making an effective instrument.
Jim Larsen says
127 Chris D said, “Those large emitters who are increasing their emissions are culpable. Those who are cutting their emissions are not.”
Humans generally fall into two categories today: Those with large emissions who are in a financial bind and so are emitting a tad less, which conveniently allows them to crow about their happenstance morality. Then there’s those whose finances consist of coins, and have the audacity to consider buying a 70MPG(?) micro-car, thus incurring your wrath.
Peter Thorne says
Chris (#127)
An alternative, equally valid, viewpoint is that blame is by the proportion of responsibility. You can’t blame the single CO2 molecule that took us over a threshold for 100% of the damage that accrues. That is backwards to the point of absurdity. Under a natural law tenet, blame is by proportional responsibility for consequence, regardless of whether that consequence was intended or not.
Which returns me to my original point that for today’s extreme event (not tomorrow’s) yesterday’s polluter physically has more responsibility than today’s does precisely because there are substantive physical lags in the climate system. Their equal amount of emissions has a disproportionately larger impact on today’s climate state than today’s polluter does.
CO2 also has no recognized half-life. It’s not like the snow in your desktop snow globe. It will take millennia for all the CO2 to be scavenged. A fair proportion of the CO2 we emitted in the late 19th century is likely still hanging around in the atmosphere.
And, finally, last time I checked, climate model runs that ceased emissions today tended to take centuries to millennia to stabilize their global mean temperatures, because those long-term components equilibrating does matter in a fully blown AOGCM. But I would defer to (and welcome insights from) Gavin or others more expert on the specific point of how long AOGCMs take to stabilize if we stop emitting anything tomorrow. Beyond that, I second Jim (121) that the immediate effect would be a substantive spike as the short-lived aerosol cooling effect dropped off a proverbial cliff.
Hank Roberts says
> Chris Dudley
> … declines immediately ….
Can you post a citation supporting what you’re saying?
“the lifetime of the surface air temperature anomaly might be as much as 60% longer than the lifetime of anthropogenic CO2 ….”
Lifetime of Anthropogenic Climate Change: Millennial Time Scales of Potential CO2 and Surface Temperature Perturbations
DOI: 10.1175/2008JCLI2554.1
Cited by: http://scholar.google.com/scholar?hl=en&lr=&cites=11690433131824561442&um=1&ie=UTF-8&sa=X&ei=wolkUKCJCKS6iwLSuICgDQ&ved=0CCkQzgIwAA
Chris Dudley says
Hank (#130),
Don’t quote out of context. The answer is right there.
Chris Dudley says
Peter (#129),
I’ve demonstrated that more recent emissions are more important for today’s warming than earlier emissions; you have just asserted the opposite. As they say in Monty Python, that is mere contradiction, not argument.
I’m still working on your last question in #123. I want to implement one of Hansen’s temperature response functions. But I’d guess now the answer will be just the opposite of what you’d expect.
Chris Dudley says
Jim (@121),
Why is there a huge spike in temperature if the oceans are delaying the full temperature response? There could be a rapid change in forcing, but it is not all that important to my calculation since I am gauging against the climate sensitivity for a doubling of carbon dioxide. If it spikes a little higher, it just turns over to cooling all the sooner. I’m using Kharecha and Hansen, 2008 Global Biogeochem. Cycles, 22, GB3012 eqn. 1 for this. Time scales are given there.
Jim Larsen says
133 Chris D asks about the temperature spike:
Aerosols are generally accepted (though with disagreement) as being a significant cooling effect. If we stop emitting carbon today, within a few months the aerosols will be essentially washed out by rain. Do the math, and things warm up way fast. After that, it’s a 1000 year slide to cooler weather.
David B. Benson says
Chris Dudley @125 — Thanks for the link.
I ran a simple two box model against GISS net forcings. In doing so I found that an ‘air+upper ocean’ component had a time constant of about 1+ years and an ‘intermediate ocean’ component of 30 to 60 years, maybe longer.
An electrical circuit analogy is two coupled series RC circuits so that both capacitors share the resistor from the voltage supply and the second capacitor’s sole resistor is connected to the + side of the first capacitor.
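For concreteness, here is a minimal sketch of that sort of two-box model, driven by a constant step forcing rather than the GISS net forcings, with placeholder values for the heat capacities and coupling (all illustrative, not the fitted values):
; two-box energy balance model: fast 'air+upper ocean' box coupled to a slow 'intermediate ocean' box
nyr = 300
f = 3.7                    ; step forcing, roughly a CO2 doubling (W/m^2)
lam = 1.25                 ; feedback parameter (W/m^2 per K), i.e. ~3 C per doubling at equilibrium
c1 = 5.                    ; heat capacity of the fast box (W yr m^-2 K^-1)
c2 = 100.                  ; heat capacity of the slow box
gam = 0.7                  ; heat exchange coefficient between the boxes (W/m^2 per K)
t1 = fltarr(nyr) & t2 = fltarr(nyr)
for i=1,nyr-1 do begin & t1(i) = t1(i-1) + (f - lam*t1(i-1) - gam*(t1(i-1)-t2(i-1)))/c1 & t2(i) = t2(i-1) + gam*(t1(i-1)-t2(i-1))/c2 & endfor
plot, findgen(nyr), t1, xtit='Year', ytit='Warming (C)'    ; fast box: most of its response within a few years
oplot, findgen(nyr), t2, linesty=2                         ; slow box: the multi-decadal tail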
Chris Dudley says
Jim (#134),
Yes, but cooling against what? My target temperature in the lag calculation is the full fast feedback climate sensitivity effect. Aerosols are a cooling effect on that, so really, I’m calculating as though the aerosols are already washed out. And, the sooner we meet the target temperature as it falls owing to moving to the caves, the sooner cooling starts (albeit from a higher perch).
Ron R. says
Per my last comment, seems like a good reason to experiment with a virus that can kill hundreds of millions if it escapes, doesn’t it?
Let’s hope it doesn’t escape.
http://www.lauriegarrett.com/index.php/en/blog/3186/
Superman1 says
Peter, Chris,
Every credible paper I have read on the temperature consequences of rapid/immediate elimination of fossil fuel combustion presents three components to temperature. First is the 0.8 C over pre-industrial already realized. Second is the ‘climate warming commitment’ due to the lags from effects of CO2 and associated emissions, and the estimates of temperature increase for this component seem to concentrate around 0.6-0.7 C. Third is the aerosol forcing from the rapid precipitation of the fossil sulphates that would occur at the termination of fossil fuel combustion, and here the estimates range from 0.5-1.5 C. So, total temperatures that would occur in a few decades after elimination of fossil fuel combustion today would be in the range of 2.0-3.0 C. And, these computations do not include some of the major feedbacks.
I have seen no credible evidence that temperatures could be stabilized at 2.0-3.0 C, due to the triggering and enhancing of synergistic positive feedbacks. In fact, I have seen no credible evidence that we are not already in that destabilizing phase at 0.8 C. What are our options under these scenarios?
Any useful actions we take depend on the climate dynamical mode. I liken it to the state of a reactive energy production system. Consider a fusion reactor, for example. It can operate in three main modes: driven, ignition, burn. In the driven mode, the energy out is some multiple of the energy in. In the ignition mode, the system is transitioning to self-sustainability, and is starting to lose memory of the energy input. In the burn mode, the system is fully self-sustaining (although internal quenching is possible if waste [combustion products] cannot be removed adequately), and has lost memory of the energy input.
In climate change, I believe we have gone past the driven mode, and are either at burn or ignition. To extricate the planet from this catastrophe, two steps are required. First, fossil fuel combustion has to terminate essentially immediately, to prevent the problem becoming even more intractable. Second, excess CO2 and other greenhouse gases in the atmosphere have to be removed, possibly in concert with other temperature reduction measures such as aerosol addition, to ensure that the self-sustaining positive feedbacks are quenched.
But, this whole discussion, while interesting, is equivalent to the Medieval argument as to how many angels can dance on the head of a pin. There is not only no chance of fossil fuel combustion ending rapidly, but all the evidence points to as much exploration and extraction of fossil fuels as allowed by governments, and increasing global demand for fossil fuels in the foreseeable future. And, the real culprits are not the fossil fuel companies and their media and denialist henchmen, although their hands are certainly dirty and covered with blood. They are the fossil fuel equivalent of the drug cartels. The real culprits are the intensive energy use addicts, you and me. We drive the whole process. PBS and WSJ could present in-depth stories tomorrow about the reality of climate change as outlined above, and it would not make one whit of difference in fossil energy consumption among the American people. The Koch Brothers are wasting their money on the Heartland Institute and their ilk. If the truth were told, it would have no noticeable difference on behavior with respect to fossil fuel use.
Chris Dudley says
Superman1,
I think you must be double counting somewhere. Ending emissions at a carbon dioxide concentration of 400 ppm means a factor of 1.4 over the pre-industrial concentration. ln(1.4)/ln(2) is about 0.5, so we should see only half the effect of a doubling at most. For a sensitivity of 3 C per doubling, that comes out to a maximum of 1.5 C of warming, where you give a range of 2-3 C. Thus you are implying a sensitivity of 4 to 6 C per doubling or more. Only a quarter of that range (4-4.5 C) has much support.
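For reference, the same arithmetic spelled out (285 ppm taken as pre-industrial, as in the script I posted earlier):
print, alog(400./285.)/alog(2.)       ; fraction of a doubling at 400 ppm, ~0.49
print, 3.*alog(400./285.)/alog(2.)    ; committed warming ceiling at 3 C per doubling, ~1.5 C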
Ric Merritt says
Re Superman1 #138 and Chris Dudley #139: Superman1 has been at this on a couple different threads. It’s perfectly legit to refer to feedbacks, and it can even be good to speculate about feedbacks, if you don’t present speculation as settled or consensus. But Superman1 needs to acknowledge that his bottom line after feedbacks is different (way warmer) than that of most others.
Having a different opinion of the bottom line from most professionals isn’t terrible all by itself. But failure to even acknowledge the opinion gap is a flag, a very large and red one. Superman1, if your claim is that your bottom line is close to the science consensus, you’re going to have to cite a wide variety of scientists who live and work near the middle of the metaphorical road.
Superman1 says
Ric Merritt and Chris Dudley #139 and #140,
It’s difficult to identify a ‘consensus’ on projected temperature increases from ceasing fossil fuel combustion essentially immediately. There is a wide spread, as Ric points out. Further, I would be leery of any ‘consensus’ in such a politically and commercially sensitive area as climate change. Any ‘consensus’ could very well be the ‘consensus’ the research sponsors want to hear. In the Arctic ice projection sphere, I would have put much more weight on someone who knew the business, such as Peter Wadhams, than the ‘consensus’ reached by the IPCC Report in 2007. In the end, one has to go with their best judgment as to which of the myriad numbers is most credible.
Additionally, we have been basing our discussions on unclassified papers, model results, and measurements. The DoD and intel agencies need the most accurate projections available, in order to be able to do credible outyear planning and procurement. What I’d really like to see are the numbers these agencies are assuming for their models and for their planning, and the methane and other climate-critical measurements their classified sensors are producing. My guess is their numbers are far closer to the ones I use, even on the high side perhaps, compared to other numbers I have seen in the threads.
Nevertheless, a recent letter to Climatic Change (Correlation between climate sensitivity and aerosol forcing and its implication for the “climate trap”: A Letter. Katsumasa Tanaka & Thomas Raddatz) summarizes the situation as follows:
“Climate sensitivity and aerosol forcing are dominant uncertain properties of the global climate system. Their estimates based on the inverse approach are interdependent as historical temperature records constrain possible combinations. Nevertheless, many literature projections of future climate are based on the probability density of climate sensitivity and an independent aerosol forcing without considering the interdependency of such estimates. Here we investigate how large such parameter interdependency affects the range of future warming in two distinct settings: one following the A1B emission scenario till the year 2100 and the other assuming a shutdown of all greenhouse gas and aerosol emissions in the year 2020. We demonstrate that the range of projected warming decreases in the former case, but considerably broadens in the latter case, if the correlation between climate sensitivity and aerosol forcing is taken into account. Our conceptual study suggests that, unless the interdependency between the climate sensitivity and aerosol forcing estimates is properly considered, one could underestimate a risk involving the “climate trap”, an unpalatable situation with a high climate sensitivity in which a very drastic mitigation may counter-intuitively accelerate the warming by unmasking the hidden warming due to aerosols.”
“Furthermore, our illustration shows that, with the large spread in the interdependent simulations after the emission shutdown, the global temperature overshoots the common climate policy target of 2°C warming in the case of CS>5°C. Furthermore, the rate of warming after an emission shutdown exceeds another common target of 0.2°C/decade even with a small CS.”
I can quote other studies and other temperature ranges, but a 2 C temperature increase is more than adequate for the argument I make.
Chris Dudley says
Peter (#123),
So, I took some time to implement the Hansen, Sato, Kharecha and von Schuckmann (2011) Green’s function formalism for their intermediate (fig. 9) temperature response function, and you will be unhappy with the result. In that formalism, as soon as emissions are set to zero, not just the no-lag fast-feedback response but also the ‘lagged’ climate system response to that forcing begins to fall immediately. There is indeed about a 25-year lag on the growing portion of the curve, but no lag on turning to cooling. This is because their function makes 17% of its final response in the first year. I suspect that this comes from the need to accommodate the climate response to volcanic eruptions.
To me, this seems unphysical since the energy imbalance remains so the temperature should still rise.
So, we’ll just continue with my 30 year lag implementation which targets the fast feedback sensitivity temperature response. You asked to have the industrial revolution delayed by a century but then have total emissions
up to today be the same. For ease of implementation, I delayed the onset of emissions by 112 years, took every third year of the original emissions curve and multiplied that new curve by three to preserve the total.
Rather than rising to 400 ppm, the foreshortened concentration curve rises to 433 ppm. That corresponds to a peak target temperature of 1.8 C above pre-industrial rather than 1.5. However, even though the target temperature is higher, as those more impetuous emitters move to their caves their actual temperature is lower than ours on move day by 0.3 C, since they are lagging a later curve. Not to worry though. They reach a temperature 0.02 C higher than the maximum we’d reach (1.16 C above pre-industrial) and only 11 years later than us. So, there is very little difference in temperature behavior despite the rather different emissions profiles. On beyond the year 2100, things are nearly identical. This echoes what a number of scientists have said about the quantity of emissions, rather than the time profile, being what is important in the long run. And, since the highest temperature reached is very nearly identical, the overall (dangerous) consequences are the same. So, no, your proposed change does not demonstrate that timing is all that important. And, it is still the most recent emissions that are responsible for taking us over the threshold for dangerous climate change, as I demonstrated earlier.
http://pubs.giss.nasa.gov/abs/ha06510a.html
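For anyone who wants to see the bones of the Green’s-function bookkeeping, here is a stripped-down sketch. The response function below is a made-up two-exponential, not the actual Hansen et al. (2011) intermediate fit, and the toy concentration path simply stops rising in 2012 instead of declining as in the script above:
; Green's-function (response-function) approach: convolve forcing increments with R
n = 400
yr = findgen(n)                                     ; years since 1850
co2 = 285.*exp(alog(400./285.)*((yr < 162.)/162.))  ; toy concentration: reaches 400 ppm in 2012, then held flat
f = 5.35*alog(co2/285.)                             ; radiative forcing (W/m^2)
r = 1. - 0.4*exp(-yr/2.) - 0.6*exp(-yr/200.)        ; assumed fraction of equilibrium response realized after yr years
s = 3./(5.35*alog(2.))                              ; C per (W/m^2), i.e. 3 C per doubling
t = fltarr(n)
for i=1,n-1 do t(i:n-1) = t(i:n-1) + s*(f(i)-f(i-1))*r(0:n-1-i)
plot, yr+1850., t, xtit='Year', ytit='Warming (C)'
oplot, yr+1850., s*f, linesty=2                     ; the no-lag (equilibrium) response, for comparison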
Chris Dudley says
Superman1 (#141),
James Hansen does not always seem to say what funders want to hear. He’s derived a climate sensitivity for fast feedbacks including natural aerosols of 3 ± 0.5 C per doubling. Since your concern is anthropogenic aerosols, you should be using the paleoclimate-derived figure to make a separation. Clearly, your concerns seem overblown if that approach is taken.
Chris Dudley says
Jim (#128),
Sorry I missed that one. You speak of happenstance morality but I am thinking of new EPA regulations and new CAFE standards, rather intentional efforts to avoid harm.
Let us not forget that some countries can and will skip fossil fuel tech based development. And they will not have to pay twice for their energy infrastructure as we will since they do not face sunk costs. Those which insist on pushing forward with fossil fuel based development are causing dangerous climate change in addition to short changing themselves with outdated tech.
David B. Benson says
Chris Dudley @142 — What makes you think that the energy imbalance remains?
I expect the result you obtained from your ’emissions’ set to zero computational experiment; I assume that also included setting aerosols to zero.
Chris Dudley says
David (#145),
The energy imbalance remains owing to the elevated concentration of carbon dioxide in the atmosphere, which takes a while to decline and reach a point where the actual temperature and the temperature predicted for equilibrium agree.
You can see this if you save one of the scripts I’ve posted here to a file and use the command @filename.txt at the command line of this program: http://fl.net23.net/
Probably best to save the script in the lib subdirectory in that program package so you don’t have to fiddle with the !path variable.
Jim Larsen says
144 Chris D said, “Let us not forget that some countries can and will skip fossil fuel tech based development. And they will not have to pay twice for their energy infrastructure as we will since they do not face sunk costs. Those which insist on pushing forward with fossil fuel based development are causing dangerous climate change in addition to short changing themselves with outdated tech.”
Hmmm, no. Fossil fuels were/are free energy other than externalities. Pure-t-profit and externalities. Well, MAYBE 10% of the price is cost, but… The West used that free energy to build their economies. Also free minerals. Essentially, you’re saying, “We brutalized the planet and DESERVE to continue to do so (at a slightly reduced rate) unencumbered by mere foreigners’ desire to share in the planetary birthright. Obviously, foreigners have NO right to OUR birthright.”
Nope. Somebody who unfairly consumes nearly all resources has NO right to not share those resources fairly. Once the USA drops below the average CO2 emissions of the planet for the entire history of the industrial revolution, THEN you’ve got a point. Until then, you’re just slurping at the trough.
And you never answered my three questions.
1. The USA has REFUSED to increase efficiency, and is now implementing a warped MPGe policy which will allow for the continued spewing of carbon (and even worse, CH4) by pretending about half of the carbon and ALL of the CH4 doesn’t happen when considering electric-based carbon-spewing vehicles even though biofuels and synfuels are the most likely path to motor vehicle solutions. Does that make the USA culpable?
2. How does the size of a country matter with regard to culpability?
3. Dang, I forgot the third… but hey, look it up and answer, OK?
Chris Dudley says
[edit – this is all now OT]
Superman1 says
Chris Dudley #143,
“James Hansen does not always seem to say what funders want to hear. He’s derived a climate sensitivity for fast feedbacks including natural aerosols of 3 \pm 0.5 C per doubling. Since your concern is anthropogenic aerosols, you should be using the paleoclimate derived figure to make a separation. Clearly, your concerns seem overblown if that approach is taken.”
I have examined about two handfuls of papers addressing mainly, or more typically in part, the climate effects of immediate cessation of fossil fuel combustion. They all agree that we have made a ‘warming’ commitment to increased future temperatures, but they disagree substantially on the magnitude of that commitment. The range seems to be from about 1.2 C to perhaps triple that. It depends on values assumed for climate sensitivity and aerosol forcing, both of which have potentially wide discrepancies in their values.
Hansen’s CS estimation as you have shown is but one among many, although Hansen certainly has impeccable credentials. In his CS derivation, there seems to be the implicit assumption that it is not dependent on the levels of CO2, on the temperature, or on the rate of CO2 concentration increase. I have no idea of how sensitive the results are to these assumptions.
Further, it is based on an atmosphere of eons ago in contact with a physical boundary that existed eons ago. How that relates to the atmosphere of today in contact with the physical boundaries of today is unclear to me. The implied assumption appears to be conditions are sufficiently similar between then and now for CS estimation purposes. I have no way of knowing how true that is.
So, we have myriad estimates of committed temperature increase based on myriad estimates of CS and myriad estimates of AF, made by researchers whose personal agendas may or may not be from the same sheet of music. Under those conditions, I have no idea what it means to generate a ‘probability distribution’ of expected outcomes, and base some type of climate mitigation around some ‘mean’. Given the magnitude of the stakes involved in guessing wrong, I would be more inclined to use the precautionary principle and focus on the potential of higher projected temperatures until more convincing opposing data is forthcoming. I’m willing to stay with 2 C at this point, although, as I stated in my previous post, a committed increase of 1.5 C probably wouldn’t alleviate my concern about entering into the ‘ignition/burn’ phase appreciably.
Back to your comments. The first sentence of your response relates to a point I have made in a number of posts, including the one to which you responded. That is the influence of research sponsors on what gets published and disseminated. From the chapter on Climate Catastrophe in Ahmed’s Crisis of Civilization, I have extracted the following interesting excerpt:
“The United Nations IPCC report has commendably shifted the debate on climate change by publicly affirming firstly an overwhelming scientific consensus on the reality of human emissions generated climate change, and secondly a startling set of scenarios for how global warming will affect life on Earth by the end of this century if existing rates of increase of CO2 emissions continue unabated. But according to a growing body of scientific evidence, the IPCC’s findings in 2007 were far too conservative – and dangerous climate change is more likely to occur far sooner, with greater rapidity, and higher intensity, than officially recognized by governments.
INACCURACIES IN THE INTERGOVERNMENTAL PANEL ON CLIMATE CHANGE 2007
A number of British researchers expressed grave reservations shortly after the release of the UN IPCC Fourth Assessment Report. In particular, David Wasdell, who was an accredited reviewer of the IPCC report, told the New Scientist that early drafts prepared by scientists in April 2006 contained “many references to the potential for climate to change faster than expected because of ‘positive feedbacks’ in the climate system. Most of these references were absent from the final version.” His assertion is based “on a line-by-line analysis of the scientists’ report and the final version,” which was agreed in February 2007 at “a week-long meeting of representatives of more than 100 governments.” Below we highlight three examples from Wasdell’s analysis:
1) In reference to warnings that natural systems such as rainforests, soils and oceans would be less able in future to absorb greenhouse gas emissions, the scientists’ draft report of April 2006 warned: “This positive feedback could lead to as much as 1.2 degrees Celsius of added warming by 2100”. The final version of March 2007 though only acknowledges that feedback exists and says: “The magnitude of this feedback is uncertain.”
2) The April 2006 draft warned that global warming will increase the amount of water vapour released into the atmosphere, which in turn will act like a greenhouse gas, leading to an estimated “40-50 percent amplification of global mean warming”. In the final March 2007 report this statement was replaced with “Water vapour changes represent the largest feedback”.
3) In relation to the acceleration of breakup of arctic and antarctic ice sheets, the April 2006 draft paper talked about observed rapid changes in ice sheet flows and referred to an “accelerated trend” in sea-level rise. The government-endorsed final report of March 2007 said that “ice flows from Greenland and Antarctica … could increase or decrease in future.”54
4) The conclusion that “North America is expected to experience locally severe economic damage, plus substantial ecosystem, social and cultural disruption from climate change related events” was removed from the final version.55
In other words, the IPCC Fourth Assessment Report excluded and underplayed direct reference to the overwhelming probability of the rapid acceleration of climate change in the context of current rates of increase of CO2 emissions and positive feedbacks. Wasdell put it down to possible political interference, and there are reasonable grounds for this conclusion.
As noted by Mike Mann, director of the Earth System Science Center at Pennsylvania State University, and a past lead author for the IPCC: “Allowing governmental delegations to ride into town at the last minute and water down conclusions after they were painstakingly arrived at in an objective scientific assessment does not serve society well.”56
The possible watering-down of the IPCC’s 2007 Fourth Assessment Report is part of a wider pattern. In the same month, a joint survey by the Union of Concerned Scientists and Government Accountability Project concluded that 58 per cent of US government-employed climate scientists surveyed complained of being subjected to: 1) “Pressure to eliminate the words ‘climate change,’ ‘global warming’, or other similar terms” from their communications; 2) editing of scientific reports by their superiors which “changed the meaning of scientific findings”; 3) statements by officials at their agencies which misrepresented their findings; 4) “The disappearance or unusual delay of websites, reports, or other science-based materials relating to climate”; 5) “New or unusual administrative requirements that impair climate-related work”; 6) “Situations in which scientists have actively objected to, resigned from, or removed themselves from a project because of pressure to change scientific findings.” Scientists reported 435 incidents of political interference over the preceding five years.57 Such large-scale systematic political interference with climate science lends credence to the concern that climate scientists feel unable to voice their real views about the urgency posed by global warming.”
[Response:David Wasdell is not a reliable source (see here or discussions in the comment thread here) – gavin]
Hank Roberts says
Jim — look at the sidebar. Right hand side of every page at RealClimate.
Those are the hosts’ recommendations.
Those are discussions of related topics, like those you want people focusing on, including the one you forgot.
Many of us read and some participate in those discussions, or are involved in real life politics rather than blogging.
Personally I recommend reading EcoEquity — there’s a lot there going back years, about what you’re interested in.
If you read nothing else on that page, read footnote 1 and follow the link to the 4-minute video there.
The questions you want talked about — have been and are being talked about extensively.
Point to well thought out discussion already in progress.