What is the long term sensitivity to increasing CO2? What, indeed, does long term sensitivity even mean? Jim Hansen and some colleagues (not including me) have a preprint available that claims that it is around 6ºC based on paleo-climate evidence. Since that is significantly larger than the ‘standard’ climate sensitivity we’ve often talked about, it’s worth looking at in more detail.
We need to start with some definitions. Sensitivity is defined as the global mean surface temperature anomaly response to a doubling of CO2 with other boundary conditions staying the same. However, depending on what the boundary conditions include, you can get very different numbers. The standard definition (sometimes called the Charney sensitivity) assumes the land surface, ice sheets and atmospheric composition (chemistry and aerosols) stay the same. Hansen’s long term sensitivity (which might be better described as the Earth System sensitivity) allows all of these to vary and feed back on the temperature response. Indeed, one can imagine a whole range of different sensitivities that could be clearly defined by successively including additional feedbacks. The reason why the Earth System sensitivity might be more appropriate is that it determines the eventual consequences of any particular CO2 stabilization scenario.
Traditionally, the decision to include or exclude a feedback from consideration has been based on the relevant timescales and complexity. The faster a feedback is, the more usual it is to include it. Thus, changes in clouds (~hours) or in water vapour (~10 days) are undoubtedly fast and get included as feedbacks in all definitions of the sensitivity. But changes in vegetation (decades to centuries) or in ice sheets (decades(?) to centuries to millennia) are slower and are usually left out. But there are other fast feedbacks that don’t get included in the standard definition for complexity reasons – such as the changes in ozone or aerosols (dust and sulphates for instance), which are also affected by patterns of rainfall, water vapour, temperature, soil moisture, transport and clouds (etc.).
Not coincidentally, the Charney sensitivity corresponds exactly to the sensitivity one gets with a standard atmospheric GCM with a simple mixed-layer ocean, while the Earth System sensitivity would correspond to the response in an (as yet non-existent) model that included interactive components for the cryosphere, biosphere, ocean, atmospheric chemistry and aerosols. Intermediate sensitivities could however be assessed using the Earth System models that we do have.
In principle, many of these sensitivities can be deduced from paleo-climate records. What is required is a good enough estimate of the global temperature change and measures of the various forcings. However, there are a few twists in the tale. Firstly, getting ‘good enough’ estimates for global temperature changes is hard – this has been done well for the last century or so, reasonably for a few centuries earlier, and potentially well enough for the really big changes associated with the glacial-interglacial cycle. While sufficient accuracy in the last few centuries is a couple of tenths of a degree, this is unobtainable for the last glacial maximum or the Pliocene (3 million years ago). However, since the signal is much larger in the earlier periods (many degrees), the signal-to-noise ratio is similar.
Secondly, although many forcings can be derived from paleo-records (long-lived greenhouse gases from bubbles in the ice cores most notably), many cannot. The distribution of sulphate aerosols even today is somewhat uncertain, and at the last glacial maximum, almost completely unconstrained. This is due in large part to the heterogeneity of their distribution, and there are similar problems for dust and vegetation. In some sense, it is the availability of suitable forcing records that suggests what kind of sensitivity one can define from the record. A more subtle point is that the ‘efficacy’ of different forcings might vary, especially ones that have very different regional signatures, making it more difficult to add up different terms that might be important at any one time.
Lastly, and by no means leastly, Earth System sensitivity is not stable over geologic time. How much it might vary is very difficult to tell, but, for instance, it is clear that from the Pliocene to the Quaternary (the last ~2.5 million years of ice age cycles), the climate has become more sensitive to orbital forcing. It is therefore conceivable (but not proven) that any sensitivity derived from paleo-climate will not (in the end) apply to the future.
We’ve often gone over the Charney sensitivity constraint for the Last Glacial Maximum. There is information about the greenhouse gases (CO2, CH4 and N2O), reconstructions of the ice sheets and vegetation change, and estimates of the dust forcing. A recent estimate of the magnitude of these forcings is around 8 +/- 2 W/m2 (Schneider von Deimling et al, 2006). This implicitly folds any other aerosol or atmospheric chemistry changes in with the sensitivity (or, equivalently, assumes that their changes are negligible). So given a temperature change of about 5 to 6ºC, this gives a Charney sensitivity of around 3ºC (ranging from 1.5 to 6 if you do the uncertainty sums).
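The arithmetic behind that estimate is simple enough to sketch: sensitivity is the temperature change per unit forcing, scaled to the ~3.7 W/m2 of a CO2 doubling. The way the extremes are paired below is a crude illustration only, which is why it gives a narrower spread than the quoted 1.5 to 6ºC range (a proper treatment of the uncertainties is wider).

```python
# Back-of-envelope Charney sensitivity from the LGM numbers quoted above:
# ~8 +/- 2 W/m2 of forcing against ~5-6 C of cooling. The way the
# uncertainties are combined here is an illustrative assumption, not the
# method used in the literature.
F_2X = 3.7  # W/m2 forcing for a doubling of CO2 (standard simplified value)

def charney_sensitivity(delta_t, delta_f):
    """Sensitivity (C per CO2 doubling) from temperature and forcing changes."""
    return delta_t / delta_f * F_2X

# Central estimate: ~5.5 C cooling against ~8 W/m2 of forcing
print(round(charney_sensitivity(5.5, 8.0), 1))  # 2.5

# Crude bounds from pairing the extremes of both ranges
print(round(charney_sensitivity(5.0, 10.0), 1))  # low end
print(round(charney_sensitivity(6.0, 6.0), 1))   # high end
```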
Hansen suggests that the dust changes should be considered a fast feedback as well (as could the CH4 changes?) and that certainly makes sense if vegetation changes are included on the feedback side of the equation. Since all of these LGM forcings are the same sign (i.e. they are all positive feedbacks for the long term temperature change), that implies that the Earth System sensitivity must be larger than the Charney sensitivity on these timescales (and for this current geologic period). So far so good.
Hansen’s first estimate of the Earth System sensitivity is based on an assumption that GHG changes over the long term control the amount of ice. That gives a scaling of 6ºC for a doubling of CO2. This is, however, problematic for two reasons: first, most of the power of this relationship is derived from times when there were large N. American and European ice sheets. It is quite conceivable that, now that we are left with only Greenland and Antarctica, the sensitivity of the temperature to the ice sheets is less. Secondly, it subsumes the very special nature of orbital forcing – extreme regional and seasonal impacts but very little impact on the global mean radiation. Hansen’s estimate assumes that an overall cooling of the same magnitude as the LGM would produce the same extent of ice sheets as was seen then. That may be the case, but it is not a priori obvious that it must be. Hansen rightly acknowledges these issues, and suggests a second constraint based on longer term changes.
Unfortunately, prior to the ice core record, our knowledge of CO2 changes is much poorer. Thus while it seems likely that CO2 decreased from the Eocene (~50 million years ago) to the Quaternary through variations related to tectonics, the exact magnitude is uncertain. For reasonable values based on the various estimates, Hansen estimates a ~10 W/m2 forcing change over the Cenozoic from this alone (including a temperature-related CH4 change). The calculation in the paper is however a little more subtle. Hansen posits that the long term trend in the deep ocean temperature in the early Cenozoic period (before there was substantial ice) was purely due to CO2 (using the Charney sensitivity). He then plays around with the value of the CO2 concentration at the initiation of the Antarctic ice sheets (around 34 million years ago) to get the best fit with the CO2 reconstructions over the whole period. What he ends up with is a critical value of ~425 ppm for the initiation of glaciation. To be sure, this is fraught with uncertainties – in the temperature records, the CO2 reconstructions and the reasonable (but unproven) assumption concerning the dominance of CO2. However, the bottom line is that you really don’t need a big change in CO2 to end up with a big change in ice sheet extent, and hence that the Earth System sensitivity is high.
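For context, a ~10 W/m2 change corresponds to a large CO2 ratio under the standard simplified forcing expression, F = 5.35 ln(C/C0); the snippet below (which assumes that formula applies over this range) just inverts it:

```python
# The simplified CO2 forcing expression F = 5.35 ln(C/C0) (W/m2), used here
# to see what concentration ratio a ~10 W/m2 Cenozoic forcing change implies.
import math

def co2_forcing(c, c0):
    """Radiative forcing (W/m2) for CO2 concentration c against baseline c0 (ppm)."""
    return 5.35 * math.log(c / c0)

# Sanity check: a doubling gives the familiar ~3.7 W/m2
print(round(co2_forcing(560, 280), 1))  # 3.7

# Inverting for 10 W/m2: the required concentration ratio
ratio = math.exp(10 / 5.35)
print(round(ratio, 1))  # 6.5, i.e. nearly three doublings
```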
So what does this mean for the future? In the short term, not much. Even if this is all correct, these effects are for eventual changes – that might take centuries or millennia to realise. However, even with the (substantial) uncertainties in the calculations and underlying assumptions, the conclusion that the Earth System sensitivity is greater than the Charney sensitivity is probably robust. And that is a concern for any policy based on a stabilization scenario significantly above where we are now.
Russell Seitz says
Since Hansen’s latest estimate is bound to arouse controversy, I’ve posted a revised graphic précis of past sensitivity estimates. Levenson’s 2006 compilation did not include Arrhenius’ second thoughts, published a decade after he produced the Victorian estimate that set the roller coaster in motion:
http://adamant.typepad.com/seitz/2008/04/target-of-fear.html
Alastair McDonald says
Re #36 where outeast wrote:
“… William Connolley … modelled the impacts of the loss of Arctic sea ice … “Well, it would lower the albedo, though perhaps not by as much as you might expect due to clouds and sun angle. Our study found little *long term* impact, because the ice largely regrew each winter. That included albedo effects.”
Hope that’s of interest.”
It is interesting but not surprising. It sounds as though Connolley limited the SST to -1C, and so a major positive feedback from the greenhouse effect of water vapour would have been suppressed. Moreover, as CEP Brooks explained, it is highly unlikely that the sea ice will reform in winter if there is no multi-year ice there to provide a surface where the air can be cooled well below the freezing point of sea water.
Moreover, he was using an atmosphere-only model, and even the full ocean-atmosphere model cannot replicate the melting Arctic ice. Vellinga and Wood (2002) had already tried to replicate the rapid warming at the end of the Younger Dryas with that model, but its forcing was insufficient.
The problem is that the modellers are using the wrong paradigm! See https://www.realclimate.org/index.php/archives/2008/04/target-co2/index.php?p=509#comment-84027
I think I would rather take Hansen and Broecker’s ideas than Connolley’s. Broecker believes that sea ice caused the Younger Dryas http://www.amnh.org/sciencebulletins/earth/f/glaciers.20050331/essays/59_1.php
and Hansen believes that albedo is the main forcing. So although I am interested to hear of Connolley’s work I am far from convinced that loss of the Arctic sea ice will not lead to disaster :-(
Thomas Lee Elifritz says
A concentrated solar system in a desert area 250km by 250km would supply all the World’s current electricity demands.
Ok, fair enough. Now simulate what the local effects of such a large array would be when going to the reflective, the black to transparent state on nanosecond time scales.
Such will be the state of condensed matter physics very shortly. If we can do that locally, we should be able to do it globally. Not only does that solve the power problem, but that also solves the thermostatic problems.
Blair Dowden says
Re #40: Thanks for the Toggweiler & Russell paper. I had not realized the importance of the increasing vertical temperature gradient on wind speeds. One would expect a different global wind (and ocean current) pattern than the current one driven by the polar-equator temperature difference. This could have important effects on some regional climates.
However, I think increased ocean circulation would reduce CO2 levels by removing it from the atmosphere faster than it can be returned on the other end of the cycle. The deep ocean is presumably undersaturated relative to the high atmospheric CO2 levels of today and the future, and would therefore absorb some of the CO2. So I still do not fully understand Hansen’s statement that I quoted.
My main problems with the Hansen paper remain:
1) I think he is extrapolating a high glacial era climate sensitivity into that for a warmer climate where it will be significantly less.
2) He is using the conditions for ice sheet formation as the criteria for ice sheet melting, not taking the thermal inertia of the ice sheet into account.
[Response: Regarding your second point, you are confusing thermal inertia with hysteresis. The thermal inertia of the ice sheets says that to deglaciate, the warming must persist for a sufficiently long time. As Gavin notes, it is uncertain how long “long” is, but it’s probably a good bit more than a century. Hysteresis, on the other hand, would say that no matter how long you wait, the deglaciation happens at a lower CO2 threshold than the initiation, since the ice sheet creates conditions that tend to maintain itself. Rob DeConto’s work shows there is some hysteresis in this problem, but not much, when Milankovic effects are taken into account. In fact, if the initiation happened during Milankovic conditions that are favorable to glaciation, while you increase CO2 at a time that’s favorable for deglaciation, the deglaciation could well happen at a lower CO2 level than the initiation (though that’s not a case Rob computed explicitly). –raypierre]
bill mckibben says
For anyone who’s more or less convinced by Hansen, please join us at 350.org. Our slightly insane goal is to make 350 the most well-known number on the planet in the next 18 months. We need artists and musicians and political organizers (on Saturday 350 bicyclists rode in circles in Salt Lake City). It’s not completely insane, either–our team organized 1400 rallies in all 50 states last April for domestic climate action. Now we’re just trying to do it for the, you know, whole planet.
For my money, Hansen’s paper is the most important development in the climate debate since the IPCC 1995 report. I think it’s going to turn out to be our last real shot at turning public opinion, and with it political action, away from incremental change and towards transformative action. Many thanks to realclimate for keeping us all apprised, and please join us at 350.org
Chuck Booth says
Re # 29 Jim Bullis: “…why would CO2 solubility decrease if surface water warmed slightly and deeper water warmed more?”
Because CO2 solubility is inversely related to temperature (http://jcbmac.chem.brown.edu/myl/hen/carbondioxideHenry.html)
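For a rough feel of the size of that effect, the temperature dependence of the Henry's law constant for CO2 can be sketched with the van 't Hoff relation; the 0.034 mol/(L·atm) and 2400 K values below are common textbook approximations, assumed here purely for illustration.

```python
# Sketch: Henry's law constant for CO2 as a function of temperature, using
# the van 't Hoff form k_H(T) = k_H(298K) * exp(C * (1/T - 1/298K)).
# k_H ~ 0.034 mol/(L·atm) at 25 C and C ~ 2400 K are typical literature
# values for CO2, assumed here for illustration.
import math

K_H_298 = 0.034     # mol/(L·atm) at 298.15 K (25 C)
C_VANT_HOFF = 2400  # K, temperature-dependence constant for CO2

def henry_co2(temp_c):
    """Approximate Henry's law constant for CO2 at temp_c degrees Celsius."""
    t = temp_c + 273.15
    return K_H_298 * math.exp(C_VANT_HOFF * (1.0 / t - 1.0 / 298.15))

# Colder water dissolves more CO2 at the same partial pressure:
print(henry_co2(5) > henry_co2(25))  # True
```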
Thomas says
53:
If we are using solar power, then at least for the active wavelengths of the device(s) we will have high absorption, i.e. panels will have very low albedo. The best way to mitigate this is to decrease overall energy demand, and increase the PV efficiency. Max efficiency today is around 40%. But current human energy needs are roughly 1 part in 10,000 of insolation, so even at 10% efficiency we are only increasing solar heat absorbed at the surface by 0.1%.
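A one-line sketch of that arithmetic, taking the commenter's 1-in-10,000 demand figure as given:

```python
# If delivered energy must equal 1/10,000 of insolation and conversion is
# 10% efficient, the panels must intercept ten times their output.
demand_fraction = 1 / 10_000  # human energy demand as a fraction of insolation (assumed)
efficiency = 0.10             # assumed PV conversion efficiency

intercepted_fraction = demand_fraction / efficiency
print(f"{intercepted_fraction:.1%}")  # 0.1%
```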
But the larger point is that our species controls a substantial surface area of the planet, and perhaps, if albedo management were made a priority, we could provide sufficient negative shortwave forcing by this means. For affecting the Earth’s climate system, there is no need for short time scale albedo changes. Of course, in a science fiction sense, coupling a weather model with the ability to rapidly control the albedo of a significant part of the Earth’s surface would allow some degree of control over the weather.
Some other comments, for the experts to reply, as I’m not capable of properly evaluating:
regarding paleoclimate ice sheet decay rates: it can be argued that Northern ice sheets were highly vulnerable to collapse into the sea, i.e. a significant chunk of the Laurentide was grounded over Hudson Bay, and also major parts of the European ice sheet over the North Sea? GIS does not seem to possess this sort of vulnerability to rapid collapse, although the WAIS might be vulnerable. Another thing I don’t hear discussed is how the albedo of a melting part of an ice sheet might evolve over time. If buried dust accumulates on the surface of the ice as it melts (this is common in Alpine glaciers, which are very much dirtier in general – and probably have larger dust particles), how low might the summertime albedo become?
I was unconvinced by Connolley’s ice free arctic summer runs. I think he treated the ice/ocean as a fixed boundary condition for the GCM. I would think that the greatest energy anomaly would be the increased solar absorption, which is largely absorbed by the ocean. Some of this heat will be “liberated”, by the later freezeup in the fall. But probably more importantly the incremental heat would likely be transported deeper into the ocean. I suspect it would take decades for this change in oceanic circulation to have much of a global impact.
FurryCatherder says
Re #53: I’m not sure what you’re trying to get to with that question. An actual 250km x 250km solution wouldn’t work for myriad reasons, so trying to figure out what good or bad things would happen with it are pointless.
Re #45: What sequestration points to is that we have to stop recycling anything that contains carbon and comes from renewable sources. We also need to come up with the Backyard Carbonizer, that can replace the trashcan and recycle bin as the disposal place of first resort. A small parabolic reflector pointed at the bottom of a large cast iron pot should reduce most carbon based rubbish to carbon in short order. Better yet, get those thermal depolymerization guys rolling out their plants faster.
pete best says
Re #42, I believe it is in relation to where Antarctica is today, i.e. the south pole. 600 million years ago it did not exist as such, and probably not 300–150 million years ago either. Plate tectonics and all that.
pete best says
http://video.google.co.uk/videoplay?docid=9171659355384722877&q=james+hansen&total=186&start=0&num=10&so=0&type=search&plindex=1
If we just use all of the conventional oil and gas reserves we will hit 450 ppmv of CO2 and risk a different planet. James Hansen.
Pekka Kostamo says
RE #54: I think there is a major component of hysteresis just due to the thickness of the East Antarctic ice. Melting starts only when the temperature at the surface altitude of 3 km rises above 0 degC for some substantial time each year.
I believe a balmy summer day there means presently about -15 degC. A 6 degC rise in global average might just about cause some surface melting at that altitude because of polar amplification. Maybe this is one of the tipping points.
Return of the ice cover will start on the bare ground at a low altitude and will then require lower global average temp conditions.
Greenland and Western Antarctica are likely to melt first precisely because of their low altitude surfaces.
Barton Paul Levenson says
Russell Seitz posts:
[[Since Hansen’s latest estimate is bound to arouse controversy, I’ve posted a revised graphic précis of past sensitivity estimates. Levenson’s 2006 compilation did not include Arrhenius’ second thoughts, published a decade after he produced the Victorian estimate that set the roller coaster in motion:]]
I note that you have Hansen’s estimate as the final one. Did you miss the repeated notice that that was a long-term sensitivity and not a short-term sensitivity? Apples and oranges, buddy.
Lowell says
After reading the paper, I note it is interesting that the only CO2 proxies around the 34 million year ago Antarctic glaciation period are in fact …
… 1500 ppm (not 450 ppm) ???
The only references within 10 million years of the period are from Pagani et al 2005 (1,500 ppm +/- 500 ppm) and Retallack 2001 (1,000 ppm +/- 500 ppm).
Chuck Booth says
Possibly of relevance to this discussion:
Stanford Report, April 2, 2008
Phytoplankton species deviates from norm: No CO2 absorbed in photosynthesis
Findings could affect scientists’ understanding of amount of carbon dioxide phytoplankton pull from atmosphere
A widespread species of ocean-dwelling microorganisms has been found to employ a never-before-seen alternative method of photosynthesis…The discovery has implications not only for scientists’ basic understanding of photosynthesis—arguably the most important biological process on Earth—but also for the amount of carbon dioxide that phytoplankton pull from the atmosphere….
“There is a new twist on photosynthesis here, and that has to be accounted for when it comes to CO2 modeling,” Bailey said, adding that, in some cases, the models may overestimate the amount of carbon fixation that occurs in nutrient-poor waters.
It is not yet clear what the finding might mean to studies of long-term global warming, he said, but it will have to be incorporated into any models that include carbon fixing by phytoplankton as a factor…
http://news-service.stanford.edu/news/2008/april2/plant-040208.html
SecularAnimist says
Isn’t the question of a “target CO2” level moot, given that anthropogenic CO2 emissions are not only increasing every year, but accelerating, and all indications are that global fossil fuel use and associated emissions will continue to increase for years or decades before they peak and begin to decline?
The “target” that really matters is the year in which global CO2 emissions will be less than the previous year and will thereafter rapidly decline to near zero. It seems implausible that humanity will achieve that “target” in time to prevent catastrophic warming and climate change.
D Price says
Re #53 when I say 250km by 250km that’s just the total area. They do not all have to be in the same piece of desert. They can be scattered in smaller units if need be.
Erik Hammerstad says
Re #47 Ike. According to James Annan the number for ocean heat content rise in the Levitus Science 2000 paper was an arithmetic error, see end of http://julesandjames.blogspot.com/2008/04/frogs-and-blogs.html
Otherwise I agree that ocean warming is a key issue, but it’s now problematic in that Argo is not showing any warming since 2004, while the combination of Jason and GRACE finds the warming to be continuing at the same rate as earlier. The latest presentation on this issue is here http://ibis.grdl.noaa.gov/~leuliett/presentations/osm_2008.pdf
Thomas Lee Elifritz says
It seems implausible that humanity will achieve that “target” in time to prevent catastrophic warming and climate change.
In 1908 it seemed implausible that one could step onto a modern jetliner and fly anywhere in the world, or that man could walk on the moon. There was this one guy, Konstantin Tsiolkovsky, who thought it might be possible.
The fact that now people understand that not only do we need to reduce carbon output to zero, but we have to remediate the atmosphere back to a previous CO2 level, is the first step. If we can do that, 300 or 320 ppm is just as reasonable, indeed, most people studying the problem understand that 350 is probably the upper limit. Hansen et al. have merely quantified it to the best of their abilities, and within the state of the art of science.
You’re on a spaceship, carbon dioxide is increasing, you know you have a problem, yet you continue to breathe. Breathing is necessary to solve that problem that you know exists, and you know you do have to solve it.
There is of course a time limit with this problem.
Anyone who now claims there isn’t a problem to be solved has been adequately warned, and the minimum bar is set.
Good luck! On this post I linked to my blog. I don’t come in here much, only when something of significance happens or is published. Hansen again has risen to the challenge.
GeologyJoe says
We’re doomed. Move inland ;)
Hank Roberts says
> moot
Nope. Else we’d be living with whatever level of pollution industry gave us at their high points.
You never know what is enough until you know what is more than enough.
Blair Dowden says
Re Raypierre’s response to #54: Thank you for clarifying the difference between thermal inertia and hysteresis. Orbital forcings are obviously critical to the formation of the Laurentide Ice Sheet, but on the timescale of the ice sheets of Antarctica and Greenland, would they not just be noise in the average temperature signal?
I am trying to get a feel for how much hysteresis there is in the Antarctic ice sheet. Three million years ago CO2 was at 400ppm, temperatures were 2 or 3 degrees higher and sea level was about 30 meters higher. Since CO2 is now almost 400ppm, does it follow that if it stays at that level we will get 30 meters, or is it more like 15, 20 or 25?
Lawrence Brown says
What I get from this latest paper on estimating a climate sensitivity factor is that it’s a moving target. It reinforces the fact that there are many uncertainties involved. Feedback parameters are complex in their relationships.
Increasing surface temperature increases evaporation and adds to atmospheric H2O which is a greenhouse gas that adds to temperature rise, but this can also increase cloud cover which increases albedo which has a cooling effect and leads to temperature decrease. Go figure! It’s a problem, and must add to the headaches already confronting climate modelers.
CobblyWorlds says
#54 Blair Dowden,
I agree that at first glance it might seem that increased overturning should increase CO2 uptake because of high current CO2 levels, but the deeps actually have high concentrations of CO2.
Have you read David Archer’s post on reduced ocean uptake? https://www.realclimate.org/index.php/archives/2007/11/is-the-ocean-carbon-sink-sinking/
specifically 5th para down.
It would have been better for me to have linked to it earlier, but I got distracted and forgot. Sorry.
D Price says
One thing a poster mentioned is the location of Antarctica. While all the other continents have shifted position over millions of years, Antarctica doesn’t seem to have moved at all from the South Pole. Is there a reason for this?
Blair Dowden says
Re #73: CobblyWorlds, thanks, I read that article but I guess it did not sink in (pardon the pun). The idea is that the deep ocean has excess carbon dioxide because of decaying organic matter. The present ocean circulation is not sufficient to recycle it back to the atmosphere quickly enough. But I have just learned that wind speeds, and presumably ocean circulation, are higher with warmer temperatures. Therefore CO2 must have been building up in the deep ocean for millions of years. If that rate is not very slow there must be a large imbalance.
As is often the case, I get a piece of information, but not the whole picture. I hope someone can make sense of it for me.
Blair Dowden says
Here is my picture of the global wind – ocean current system. There are two main sources of atmospheric circulation. The horizontal gradient (temperature difference between equator and poles) decreases with increasing global temperature, while the vertical gradient increases. It is not clear which one is stronger. Ocean circulation depends on wind strength, although maybe the vertical gradient is more effective. Ocean circulation appears to have been weaker at the last glacial maximum. This would suggest that increasing greenhouse gases will increase ocean circulation. But in the Cretaceous, when CO2 levels were very high, the ocean circulation was so weak that the ocean became stratified, and led to ocean anoxic events where the lower levels had no oxygen. There is a piece of the puzzle missing here.
Hank Roberts says
> deep ocean
Blair, I’d be real curious to know the sources for any actual observations — since there are a fair number of photographs of whale skeletons on the deep ocean floor, that suggests there’s not enough CO2 dissolved there to make the pH effective at dissolving bone, so I’d wonder where measurements were taken.
There are a few, for example this fascinating observation:
http://www.pnas.org/cgi/content/full/103/38/13903
Lakes of liquid CO2 in the deep sea
(footnotes there all worth following)
But that’s associated with a hydrothermal system, not decaying material.
wayne davidson says
It’s not often that Dr Gavin Schmidt is wrong about something with respect to climate. But I just learned that he failed on his prediction that worldwide warmer temperatures would come back at year end.
They did not wait that long: warmth came back full force in March just past, with a staggering +1.4 C anomaly for the Northern Hemisphere. http://data.giss.nasa.gov/gistemp/ . My respect for Gavin as a tireless advocate of good climate science is not lowered at all; he taught many of us about sensitivity, and so we shall remember that temperatures are not so hard to predict – with sensitivity in mind, there can be variances…
Jim Eaton says
Re: #73: D Price Says: “One thing a poster mentioned is the location of Antarctica. While all the other continents have shifted position over millions of years Antartica doesn’t seem to have moved at all from the South Pole. Is there a reason for this?”
I think you are referring to what John Lang wrote in #42: “Antarctica is an unusual case in plate techtonics since continental drift has placed Antarctica around the south pole for most of the past 600 million years including various times when it was locked together with other continental plates.”
Unless I am missing something, I thought much of what became Antarctica was equatorial from the Neoproterozoic Era until sometime after Gondwana drifted southward (late Ordovician). It wasn’t until the Mesozoic that Antarctica made its way down to the south pole (about 140my).
Alastair McDonald says
Re #74 where D Price asks why Antarctica has not moved.
I have never seen a reason given, but then I have not seen that question asked before.
My own idea is that it is due to centrifugal/centripetal forces. Because Antarctica is centred on the South Pole the centripetal forces cancel each other out. India, on the other hand, moved away from the region of the Southern Ocean and was accelerated as it approached the equator. That acquired momentum has resulted in it crashing into Asia and building the Himalayas.
Cheers, Alastair.
P. Lewis says
Re D Price
The thought is essentially incorrect. However, it hasn’t moved much compared with how far the other continents have moved since ~90 Mya. See the powerpoint animation on the linked page.
It seems to me (possibly wrongly, since tectonics isn’t my bag) that the movement of the other continents has constrained Antarctica’s movement over that last period.
Pascal says
raypierre
you say:
“[Response: We should also recall the implications of Dave Archer’s “long tail” of anthropogenic CO2. Something like 20% of the anthropogenic CO2 will stick around for much more than 1000 years. That tells us something about how much peak CO2 we can tolerate without committing the planet to extreme consequence of the very long term warming. For example, if we reached a peak CO2 of 1200 ppm, then the long term tail contains about 280 + (1200-280)/5 or 464 ppm CO2. That would mean that even if the deglaciation took a very long time to set in, the CO2 would indeed stay above the threshold level for a long time. This is only a crude illustrative calculation, and to do it right would take a proper ocean carbon cycle model, but I hope it gives the idea of what Hansen’s calculation implies even if the catastrophic changes require a very long time to act. –raypierre]”
this is logical.
But Hansen’s calculation gives us a CO2 target at 350ppm.
If I use your equation, I find a short term target at 630 ppm to get a 350 ppm tail.
If I use 450 ppm now, we get, without problem, a 314 ppm tail.
So why 350 ppm now or in few decades?
Because 314 ppm is a dangerous concentration?
I understand that the phenomena are non-linear, but we have known that for a long time.
So what is the real interest of this new article of Hansen’s?
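Pascal's arithmetic just inverts raypierre's rule of thumb quoted above; a minimal sketch (the 1/5 persistence factor is the crude illustration from the quoted response, not a carbon-cycle model):

```python
# raypierre's "long tail" rule of thumb: roughly 1/5 of the anthropogenic
# CO2 excess over the 280 ppm preindustrial baseline persists long term.
def long_tail(peak_ppm, baseline=280.0):
    """Long-term CO2 (ppm) remaining after the fast ocean uptake."""
    return baseline + (peak_ppm - baseline) / 5

print(long_tail(1200))  # 464.0 (raypierre's example)
print(long_tail(630))   # 350.0 (Pascal's inversion)
print(long_tail(450))   # 314.0
```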
Pascal says
hum
following up my previous post.
314 ppm implies 5.3 ln(314/280) = 0.607 W/m2 of forcing.
even with 1.5 °C.m2/W, the temperature increase is, “only”, 0.9°C since the preindustrial age, close to today’s increase.
It’s very surprising that such a minor temperature anomaly is so dangerous.
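Pascal's numbers check out under the standard simplified forcing expression (using the full 5.35 coefficient rather than his rounded 5.3):

```python
# Forcing for 314 ppm vs the 280 ppm preindustrial baseline, and the implied
# equilibrium warming for Pascal's assumed sensitivity of 1.5 C per W/m2.
import math

forcing = 5.35 * math.log(314 / 280)  # W/m2
warming = 1.5 * forcing               # C, for the assumed 1.5 C.m2/W
print(round(forcing, 2), round(warming, 2))  # 0.61 0.92
```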
Ray Ladbury says
Blair, The deep oceans are quite isolated from the surface. The water is colder, saltier and denser. CO2 dissolves more readily under pressure and at low temperature, and this further increases density and stratification. In the very deep ocean, CO2, CH4 and H2O condense out and form clathrates. I’ve had friends on submarines, and they confirm that even in a hurricane, things are very quiet in deep water. You hear about damage to drilling platforms and pipes, etc. What people forget is that that is all pretty shallow. The ocean is another world.
Aaron Lewis says
The value of climate sensitivity should inform important economic and political decisions. As a decision tool, Hansen’s more conservative (precautionary) value may be a more prudent one to use than a lower value, which might be more consistent with the body of atmospheric science.
The value we select will affect how we deal with climate warming. A larger number says, “Now, we must act more aggressively!” A smaller number says, “We can procrastinate and survive.”
If we pick a larger number, and act now, later we can say, “Oh, the actual climate sensitivity number is a little smaller.” If we pick a smaller number, and thereby fail to act in time, there will be no later. Hansen’s larger value is more prudent.
The kind of Arctic sea ice retreat that we have had over the last 3 years was predicted by the IPCC models not to occur for another 40 years. If the IPCC says 40, and the correct answer is 4 years, then we cannot do engineering based directly on the IPCC predictions and projections. We must incorporate safety factors of 100, or 1,000, or even 100,000, where large populations are at risk. Even Hansen’s number may not include a sufficient safety margin for direct use as an engineering basis of design where large numbers of people are at risk.
Our climate sensitivity number is not merely a matter of what can be defended in a publication; it is a matter of what the wise policy is.
pete best says
It is interesting to note that Dr Hansen’s new work, suggesting an increased climate sensitivity and limiting GHG emissions to no higher than 450 ppm CO2e (which corresponds, with high probability, to an average temperature increase of less than 2°C), means that he is taking the stance (politically) that we can use all of the conventional oil and gas but must leave the coal in the ground.
Coal is not being left in the ground, and unconventional oil sources aren’t either. Oil is currently fetching $110 a barrel on the open world market, and that is making people want to produce more of it, in line with global economic growth of between 2% and 3% per annum. CTL is coming online this year outside of South Africa; China, India, the USA, etc. all have projects of this nature, and come 2020 a billion barrels a year could be coming from this source. The Athabasca tar sands are also going to end up supplying around 1.5 to 2 million barrels a day come 2020. As conventional oil surges in price due to the effects of peak production, alternative oil sources come online from fossil fuel feedstocks. This is why we must have a biofuel solution that displaces oil in order to have any chance of not reaching 450 ppm CO2e in record time.
New coal-fired power plants are coming online too, without CCS, at the present time. Retrofitted by 2020? Let’s hope so, but it is a tall order.
James Hansen just seems to have reduced our chances of limiting sea level rise and atmospheric warming somewhat.
Khebab says
I have a question:
How confident are we that the CO2–temperature lag is ~800 years?
My understanding is that it is difficult to estimate the ice age-gas age difference in ice core records. I have found a recent article that is suggesting that the 800 years lag is maybe overestimated.
http://www.clim-past.net/3/527/2007/cp-3-527-2007.pdf
I think it is an important question because this lag is the main evidence supporting the GHG amplification theory, am I right?
Can you point me to pertinent peer-reviewed literature?
Wotan says
The 6°C rise is no more than a guess. It has an air of authenticity since this is the figure that Arrhenius arrived at in his calculations. However, Arrhenius also calculated that the rise in CO2 we have experienced since around 1900, when he published the paper, should have produced 2 degrees of warming, and it just hasn’t. What is not pointed out is that Arrhenius withdrew his predictions when the nature of infrared absorption became better understood and he realised that his projections were far too big.
John Smith says
Americans today (being the world’s primary energy consumers) are an example of the kinds of organisms that evolve in a degraded energy system. Evolution is only the transformation of matter-energy from an available into an unavailable state. As matter-energy becomes depleted or degraded, recessive genetic characteristics grow more frequent.
Neil B. says
Since you’re talking about CO2 sensitivity, maybe this concept of mine works better on this thread. It does involve “tactical” issues, not just the science. One thing to pin skeptics on, considering their rejection of the clearness of “global warming”: ask, “OK, so you’re not sure that global warming is definitely occurring, or, if it is, you’re not sure it is mostly man-made. But do you at least accept the action and presence of greenhouse gases? Water vapor is one GHG [some skeptics like to talk of how the effects of water vapor swamp those of CO2]; do you also accept that CO2 is a greenhouse gas? And if so, shouldn’t we at least be concerned about a long-term rise in its concentration, even if we can’t agree on just what the outcome has been and will be?” That would take away a lot of steam from their evasions, and force some acknowledgment of the causal stresses in any case.
BTW, with possible solar cooling etc. we really should IMHO take a closer look at the interaction of all stimuli and not focus narrowly on the CO2 issue.
tyrannogenius
Kevin Underhill says
Re: #54 (specifically Ray Pierrehumbert’s comment on that post)
Ray says:
“Hysteresis, on the other hand, would say that no matter how long you wait, the deglaciation happens at a lower CO2 threshold than the initiation, since the ice sheet creates conditions that tend to maintain itself.”
Should that not be:
“Hysteresis, on the other hand, would say that no matter how long you wait, the deglaciation happens at a HIGHER CO2 threshold than the initiation, since the ice sheet creates conditions that tend to maintain itself.”
(caps only to show the suggested change)
It sounded like Ray was trying to present a general case in the preceding to explain hysteresis, before going on to outline a particular possibility where the hysteresis effect would disappear, and you could get a scenario that appears to be contrary to hysteresis:
“In fact, if the initiation happened during Milankovic conditions that are favorable to glaciation, while you increase CO2 at a time that’s favorable for deglaciation, the deglaciation could well happen at a lower CO2 level than the initiation”
I think I understand this later point by Ray. So to toss out numbers (made up by me, for illustrative purposes only) to see if I understand:
Perhaps in one set of conditions, the glaciation trip point is 350 ppm CO2 and the deglaciation trip point is 375 ppm. CO2 drops below 350 ppm and glaciation occurs. And perhaps CO2 continues to drop, say to 275 ppm. Then a long time after glaciation occurs, other things change (for example: orbit, precession, etc.) and the new glaciation trip point is down to 300 ppm, with a deglaciation trip point of 325 ppm. So even though there is still hysteresis in the new trip points, any apparent hysteresis with respect to the original glaciation event has effectively disappeared, and you could get deglaciation with CO2 only rising back to 325 ppm, even though the original glaciation occurred at the higher level of 350 ppm.
Am I following OK?
-Kevin
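Kevin's made-up trip points can be sketched as a tiny state machine. All the numbers below are his illustrative ones, and the `step` helper is my own hypothetical construction, not anything from a real ice-sheet model:

```python
def step(state, co2, glaciate_below, deglaciate_above):
    """Switch climate state only when CO2 crosses the relevant trip point."""
    if state == "interglacial" and co2 < glaciate_below:
        return "glacial"
    if state == "glacial" and co2 > deglaciate_above:
        return "interglacial"
    return state

# Original orbital conditions: glaciate below 350 ppm, deglaciate above 375 ppm.
state = step("interglacial", 340, 350, 375)  # CO2 drops below 350 -> glacial
state = step(state, 275, 350, 375)           # keeps dropping; stays glacial

# Orbital conditions change: trip points shift down to 300/325 ppm.
state = step(state, 330, 300, 325)           # deglaciation at only ~330 ppm
print(state)  # "interglacial", below the original 350 ppm glaciation point
```

Each pair of trip points still shows hysteresis (glaciation and deglaciation thresholds differ), yet deglaciation happens below the CO2 level that triggered the original glaciation, which is the scenario Kevin describes.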
CobblyWorlds says
#76 Blair Dowden,
“As is often the case, I get a piece of information, but not the whole picture. I hope someone can make sense of it for me.”
Same here.
Understanding “reality” is like chasing the horizon. However far you go, you’ve always got further to go.
To add to what Ray said, check out Ekman Transport: http://oceanmotion.org/html/background/ocean-in-motion.htm and note in particular:
“At a depth of about 100 to 150 m (330 to 500 ft), the Ekman spiral has gone through less than half a turn. Yet water moves so slowly (about 4% of the surface current) in a direction opposite that of the wind that this depth is considered to be the lower limit of the wind’s influence on ocean movement.”
Then try to imagine that in the context of figure 1 of Toggweiler & Russell.
#78 Wayne Davidson,
La Niñas can be very hard to predict; check out the spread on this current model ensemble forecast: http://www.cpc.ncep.noaa.gov/products/analysis_monitoring/enso_advisory/figure5.gif
This year will be a cool one globally: http://news.bbc.co.uk/1/hi/sci/tech/7329799.stm
My favourite from Dr Schmidt was his “further education in radiation physics.”
https://www.realclimate.org/index.php?p=58
IMHO the result is a very informative and thought provoking discussion.
Wayne Davidson says
Cobbly, ENSO being difficult? I guess so, in some ways… But I am a bit perplexed by why there are no Density Weighted Temperature charts out there, anywhere. DWTs would be the equivalent of a sun disk measurement: a snapshot of the temperature of the entire atmosphere, much less variable than surface temperatures, and extremely useful for trend making. As for 2008 being cold, I don’t think so, not anymore.
Jim Eaton says
Just a comment for those who can’t understand why CO2 levels at several hundred parts per million can be significant.
This week, Hawaii Volcanoes National Park was shut down because sulfur dioxide levels reached 9.1 parts per million in the atmosphere. That’s well above the 2 parts per million that triggers a declaration of Hawaii County’s highest alert level, a Code Purple.
Some compounds can have major effects, even at seemingly low concentrations.
pete best says
http://blogs.reuters.com/environment/2008/04/11/coaly-smoke-green-ire-over-huge-india-coal-plant/
A 4 GW coal-fired power plant is to be built in India using supercritical coal technology for cleaner burning. No CCS in sight. It will produce up to 23 million tonnes of CO2 per annum and is partly sponsored by the IMF/WB.
With an operating life of 60 years, we sure look like we are tackling climate change, don’t we?
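A quick back-of-envelope check of the cumulative emissions implied by the figures in pete best's comment (the plant's actual output would depend on load factor and fuel quality, which aren't given):

```python
# Numbers from the comment above: 23 Mt CO2/yr over a 60-year plant life.
annual_co2_mt = 23           # million tonnes of CO2 per year
lifetime_years = 60
total_gt = annual_co2_mt * lifetime_years / 1000.0  # convert Mt to Gt
print(total_gt)              # ~1.38 Gt CO2 over the plant's lifetime
```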
Matt says
Before I finish reading this excellent piece, can I propose a second, economics-based sensitivity?
If we had a trend for storm damage from global warming, as a percentage of all storm damage, then we would have a liability reserve account for mitigation. The sensitivity to CO2, therefore, becomes whether the mitigation payments made to “green” individuals are sufficient to induce enough individuals into green mode, and what level of storm damage we will accept over the long term.
In other words, do CO2 liabilities equal mitigation costs?
Earl Killian says
Could you do a post on the subject suggested by your sentence “Secondly, it subsumes the very special nature of orbital forcing – extreme regional and seasonal impacts but very little impact on the global mean radiation.”?
sidd says
On 10th April 2008, Khebab wrote:
“this [ice core T-CO2] lag is the main evidence supporting the GHG amplification theory, am I right?”
I do not think this is the case, i.e. that the lag is the main evidence supporting GHG amplification. The physics is fairly well understood: the main mechanisms for removal of volcanic CO2 are photosynthetic fixing, weathering of rocks, and CaCO3 formation by oceanic life. These are suppressed during glaciation, resulting in accumulation of CO2 in the atmosphere.
But I could be wrong; if so, I await correction.
sidd
CobblyWorlds says
#93 Hello Wayne,
With regard to this year’s global average temperature, my opinion would still be swayed most by the behaviour of this La Niña; as the model spread shows, it’s likely to persist, but how intense it will be seems unclear. I am aware of the trends in the Northern Hemisphere: last year its warming was still on an upward exponential, despite a slight cooling on a global basis. See the three-latitude plot from GISS: http://data.giss.nasa.gov/gistemp/graphs/Fig.B.lrg.gif
So whilst I still think the proposition that we’re going to have a slightly cooler year than last is reasonable, I think this year is likely to show an anomaly greater than or (almost) equal to 1 degree for GISS 23.6–90°N.
DWT data – I’d have to think about it, but not now, my learning curve looks something like the GISS northern hemisphere anomaly on the link above. ;)
Whichever way the Arctic goes this year it’ll be fascinating, but I still wouldn’t bet against Orheim’s previous statement.
PS Ben Saunders has been stymied by kit failure in his bid for the North Pole speed record: http://north.bensaunders.com/journal/entry/the-fortress-02-04-2008-18-38-00/
“The ridge was monstrous: nowhere was it less than two stories high, it stretched as far as either horizon…
…as I turned to face north, I was greeted with a troubling sight: another giant ridge, then beyond that an endless view of rubble ice so smashed up that it made anything I saw at the start of this expedition seem like child’s play…
An interesting thought occurred as I scrambled on. It’s quite possible that no one else has seen multi-year sea ice (ice that’s thick enough to survive the summer) in this state before… The consensus among the experts at Eureka was that the ice on the Canadian side this year was more fractured than they’d ever seen before.”
Jim Eager says
Re Matt @ 96: “If we had a trend for storm damage from global warming, as a percentage of all storm damage, then we have a liability reserve account for mitigation….
In other words, do CO2 liabilities equal mitigation costs?”
Why limit the account to storm damage? Why not account for loss of agricultural output due to heat and water stress and salinization? Or the collapse of fisheries? Or the displacement of populations due to flooding of low-lying coastal regions and spreading desertification?
Storm damage is only one dimension of the future impact of climate change.