Guest Commentary by Jim Bouldin (UC Davis)
How much additional carbon dioxide will be released to, or removed from, the atmosphere by the oceans and the biosphere in response to global warming over the next century? That is an important question, and David Frank and his Swiss coworkers at WSL have just published an interesting new approach to answering it. They empirically estimate the distribution of gamma, the temperature-induced carbon dioxide feedback to the climate system, given the current state of knowledge of reconstructed temperature and carbon dioxide concentration over the last millennium. It is a macro-scale approach to constraining this parameter; it does not attempt to refine our knowledge about carbon dioxide flux pathways, rates or mechanisms. Regardless of general approach or specific results, I like studies like this. They bring together results from actually or potentially disparate data inputs and methods, which can be hard to keep track of, into a systematic framework. By organizing, they help to clarify, and for that there is much to be said.
Gamma has units of ppmv per ºC. It is thus the inverse of a climate sensitivity in which CO2 is the forcing and T is the response. Carbon dioxide can, of course, act as both a forcing and a (relatively slow) feedback; slow at least when compared to faster feedbacks like water vapor and cloud changes. Estimates of the traditional climate sensitivity, e.g. Charney et al. (1979), are thus not affected by the study. Estimates of more broadly defined sensitivities that include slower feedbacks (e.g. Lunt et al. (2010), Pagani et al. (2010)) could be, however.
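In symbols (my shorthand here, not notation taken from the paper):

```latex
% gamma: the CO2 response per unit temperature change
\gamma \equiv \frac{\Delta \mathrm{CO_2}}{\Delta T}
  \quad \left[\mathrm{ppmv}\,/\,^{\circ}\mathrm{C}\right]
% versus a sensitivity with the roles reversed (CO2 forcing, T responding):
S \equiv \frac{\Delta T}{\Delta \mathrm{CO_2}}
  \quad \left[^{\circ}\mathrm{C}\,/\,\mathrm{ppmv}\right]
```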
Existing estimates of gamma come primarily from analyses of coupled climate-carbon cycle (C4) models (analyzed in Friedlingstein et al., 2006), and a small number of empirical studies. The latter are based on a limited set of assumptions regarding historic temperatures and appropriate methods, while the models display a wide range of sensitivities depending on assumptions inherent to each. Values of gamma are typically positive in these studies (i.e. increased T => increased CO2).
To estimate gamma, the authors use an experimental (“ensemble”) calibration approach, analyzing the time courses of reconstructed Northern Hemisphere T estimates, and ice core CO2 levels, from 1050 to 1800 AD. This period represents a time when both high resolution T and CO2 estimates exist, and in which the confounding effects of other possible causes of CO2 fluxes are minimized, especially the massive anthropogenic input since 1800. That input could completely swamp the temperature signal; the authors’ choice is thus designed to maximize the likelihood of detecting the T signal on CO2. The T estimates are taken from the recalibration of nine proxy-based studies from the last decade, and the CO2 from three Antarctic ice cores. Northern Hemisphere T estimates are used because their proxy sample sizes (largely dendro-based) are far higher than in the Southern Hemisphere. However, the results are considered globally applicable, due to the very strong correlation between hemispheric and global T values in the instrumental record (their Figure S3, r = 0.96, HadCRUT basis), and also between ice core and global mean atmospheric CO2.
The authors systematically varied both the proxy T data sources and the methodological variables that influence gamma, and then examined the distribution of the nearly 230,000 resulting values. The varying data sources include the nine T reconstructions (Fig 1), while the varying methods include things like the statistical smoothing method, and the time intervals used both to calibrate the proxy T record against the instrumental record and to estimate gamma.
Figure 1. The nine temperature reconstructions (a), and 3 ice core CO2 records (b), used in the study.
Some other variables were fixed, most notably the calibration method relating the proxy and instrumental temperatures (via equalization of the mean and variance for each, over the chosen calibration interval). The authors note that this approach is not only among the mathematically simplest, but also among the best at retaining the full variance (Lee et al, 2008), and hence the amplitude, of the historic T record. This is important, given the inherent uncertainty in obtaining a T signal, even with the above-mentioned considerations regarding the analysis period chosen. They chose the time lag, ranging up to +/- 80 years, which maximized the correlation between T and CO2. This was to account for the inherent uncertainty in the time scale, and even the direction of causation, of the various physical processes involved. They also estimated the results that would be produced from 10 C4 models analyzed by Friedlingstein (2006), over the same range of temperatures (but shorter time periods).
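To make the mechanics concrete, here is a minimal sketch of those two steps (variance-matching calibration, then correlation-maximizing lag selection) as I understand them. The code and names are mine, not the authors'; gamma then falls out as the regression slope at the chosen lag:

```python
import numpy as np

def calibrate(proxy, instrumental):
    """Rescale a proxy T series so its mean and variance equal the
    instrumental record's over the calibration interval (the simple
    scheme the authors fixed; cf. Lee et al. 2008)."""
    z = (proxy - proxy.mean()) / proxy.std()
    return z * instrumental.std() + instrumental.mean()

def gamma_at_best_lag(T, co2, max_lag=80):
    """Slide CO2 against T by up to +/- 80 years, keep the lag with the
    highest |correlation|, and return (lag, r, slope in ppm per deg C)."""
    best = None
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:                      # positive lag: CO2 lags T
            t, c = T[:len(T) - lag], co2[lag:]
        else:                             # negative lag: CO2 leads T
            t, c = T[-lag:], co2[:len(co2) + lag]
        r = np.corrcoef(t, c)[0, 1]
        if best is None or abs(r) > abs(best[1]):
            best = (lag, r, np.polyfit(t, c, 1)[0])
    return best
```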
So what did they find?
In the highlighted result of the work, the authors estimate the mean and median of gamma to be 10.2 and 7.7 ppm/ºC respectively; as indicated by the difference between the two, the distribution has a long tail to the right (Fig. 2). The previous empirical estimates, by contrast, come in much higher, about 40 ppm/ºC. The choice of proxy reconstruction, and the target time period analyzed, had the largest effects on the estimates. The estimates from the ten C4 models were higher on average; an empirical estimate is about twice as likely to fall in the model estimates’ lower quartile as in the upper. Still, six of the ten models evaluated produced results very close to the empirical estimates, and the models’ range of estimates does not exclude those from the empirical methods.
Figure 2. Distribution of gamma. Red values are from 1050-1550, blue from 1550-1800.
Are these results cause for optimism regarding the future? Well, the problem with knowing the future, to flip the famous Niels Bohr quote, is that it involves prediction.
The question is hard to answer. Empirically oriented studies are inherently limited in applicability to the range of conditions they evaluate. As most of the source reconstructions used in the study show, there is no time period between 1050 and 1800, including the medieval times, which equals the global temperature state we are now in; most of it is not even close. We are in a no-analogue state with respect to mechanistic, global-scale understanding of the inter-relationship of the carbon cycle and temperature, at least for the last two or three million years. And no-analogue states are generally not a real comfortable place to be, either scientifically or societally.
Still, based on these low estimates of gamma, the authors suggest that surprises over the next century may be unlikely. The estimates are supported by the fact that more than half of the C4-based (model) results were quite close (within a couple of ppm) to the median values obtained from the empirical analysis, although the authors clearly state that the shorter time periods that the models were originally run over make apples-to-apples comparisons with the empirical results tenuous. Still, this result may be evidence that the carbon cycle components of these models have, individually or collectively, captured the essential physics and biology needed to make them useful for predictions into the multi-decadal future. Also, some pre-1800, temperature-independent CO2 fluxes could have contributed to the observed CO2 variation in the ice cores, which would tend to exaggerate the empirically estimated values. The authors did attempt to control for the effects of land use change, but noted that modeled land use estimates going back 1000 years are inherently uncertain. Choosing the time lag that maximizes the T to CO2 correlation could also bias the estimates high.
On the other hand, arguments could also be made that the estimates are low. Figure 2 shows that the authors also performed their empirical analyses within two sub-intervals (1050-1550, and 1550-1800). Not only did the mean and variance differ significantly between the two (mean/s.d. of 4.3/3.5 versus 16.1/12.5 respectively), but the R squared values of the many regressions were generally much higher in the late period than in the early (their Figure S6). Given that the proxy sample size for all temperature reconstructions drops fairly drastically going back in time, especially before their 1550 dividing line, it seems at least reasonably plausible that the estimates from the later interval are more realistic. The long tail (the possibility of much higher values of gamma) also comes mainly from the later time interval, so values of gamma from, say, 20 to 60 ppm/ºC (e.g. Cox and Jones, 2008) certainly cannot be excluded.
But this wrangling over likely values may well be somewhat moot, given the real world situation. Even if mean estimates as high as, say, 20 ppm/ºC are more realistic, this feedback rate still does not compare to the rate of increase in CO2 from fossil fuel burning, which at recent rates would exceed that amount within one to two decades.
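The back-of-envelope version (my numbers, chosen only for scale):

```latex
% ~1 degC of warming at gamma = 20 ppm/degC releases ~20 ppm of CO2,
% which fossil fuel burning at ~2 ppm/yr matches in about a decade:
\frac{20\ \mathrm{ppm}}{2\ \mathrm{ppm\,yr^{-1}}} \approx 10\ \mathrm{yr}
```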
I found some other results of this study interesting. One such involved the analysis of time lags. The authors found that in 98.5% of their regressions, CO2 lagged temperature. There will undoubtedly be those who interpret this as evidence that CO2 cannot be a driver of temperature, a common misinterpretation of the ice core record. Rather, these results from the past millennium support the usual interpretation of the ice core record over the later Pleistocene, in which CO2 acts as a feedback to temperature changes initiated by orbital forcings (see e.g. the recent paper by Ganopolski and Roche (2009)).
The study also points up the need, once again, to further constrain the carbon cycle budget. The fact that a pre-1800 time period had to be used to try to detect a signal indicates that this type of analysis is not likely to be sensitive enough to figure out how, or even if, gamma is changing in the future. The only way around that problem is via tighter constraints on the various pools and fluxes of the carbon cycle, especially those related to the terrestrial component. There is much work to be done there.
References
Charney, J.G., et al. Carbon Dioxide and Climate: A Scientific Assessment. National Academy of Sciences, Washington, DC (1979).
Cox, P. & Jones, C. Climate change – illuminating the modern dance of climate and CO2. Science 321, 1642-1644 (2008).
Frank, D. C. et al. Ensemble reconstruction constraints on the global carbon cycle sensitivity to climate. Nature 463, 527-530 (2010).
Friedlingstein, P. et al. Climate-carbon cycle feedback analysis: results from the C4MIP model intercomparison. J. Clim. 19, 3337-3353 (2006).
Ganopolski, A, and D. M. Roche, On the nature of lead-lag relationships during glacial-interglacial climate transitions. Quaternary Science Reviews, 28, 3361-3378 (2009).
Lee, T., Zwiers, F. & Tsao, M. Evaluation of proxy-based millennial reconstruction methods. Clim. Dyn. 31, 263-281 (2008).
Lunt, D.J., A.M. Haywood, G.A. Schmidt, U. Salzmann, P.J. Valdes, and H.J. Dowsett. Earth system sensitivity inferred from Pliocene modeling and data. Nature Geosci., 3, 60-64 (2010).
Pagani, M., Z. Liu, J. LaRiviere, and A.C. Ravelo. High Earth-system climate sensitivity determined from Pliocene carbon dioxide concentrations. Nature Geosci., 3, 27-30 (2010).
John P. Reisman (OSS Foundation) says
OT: I have a new project.
The simplest, fastest way to make a difference regarding sound policy on greenhouse gas emissions: Sign the Petition!
http://www.climatelobby.com
I designed it to fit anywhere in the world. As results come in, I will begin reporting on signatures by country.
Bob says
#88, Barton:
My first thought was axial tilt, but I guess you could presume that your hemispherical world is in permanent equinox.
Alternately… do GHGs have a sort of accidental heat-distributing effect? That is, heat would cause infrared radiation to be radiated up from the surface, but not always straight up; it would radiate up at many angles. At the same time, energy is radiated back down by GHGs, again not straight down but in a range of angles. In this way, heat could zig-zag from the equator toward the poles (and from one side of the equator to the other, but for your model, you could just bounce back the energy that leaves your hemisphere, i.e. assume that what you are losing over the equator is what you are getting back from the other side).
Separately, how are you treating reflectivity? Is it a function of angle of incidence? If so, perhaps you have it set too high. For a rough surface like true terrain, even if the reflectivity is high at high angles, and the angle relative to the average surface of your hemisphere is higher closer to the pole, it would still hit raised objects (boulders, trees, hillsides) more directly. Perhaps you need to adjust the reflective parameters downwards at higher angles of incidence to account for a non-uniform (i.e. non-smooth) surface.
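If it helps to experiment, one crude way to parameterize that suggestion (a toy blend of my own devising, not a standard scheme) is to let a roughness factor damp the increase of reflectivity at grazing angles:

```python
import numpy as np

def effective_albedo(zenith_deg, a0=0.06, roughness=0.5):
    """Toy angle-dependent surface albedo.

    a0        -- albedo for an overhead sun
    roughness -- 0 = perfectly smooth (full grazing-angle increase),
                 1 = fully rough (reflectivity stays near a0)
    The smooth term uses a Schlick-style ramp as a crude stand-in for
    Fresnel behavior; a real surface would need a proper BRDF.
    """
    z = np.radians(zenith_deg)
    smooth = a0 + (1.0 - a0) * (1.0 - np.cos(z)) ** 5
    return roughness * a0 + (1.0 - roughness) * smooth

for z in (0, 30, 60, 80):                 # sun overhead ... near grazing
    print(z, round(float(effective_albedo(z)), 3))
```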
David B. Benson says
Brian Fox (82) — The 800 year figure comes from the end of the LGM and the beginning of the warming to the Holocene. I know of two sources for this and don't doubt it for that time. However, the data for the transition to the interglacial in MIS 11 has CO2 leading temperature at Vostok.
The point of the paper currently under discussion is that CO2 follows temperature after a short, 80 year pause. I actually find that rather long, but ok, let’s use that for now. But that’s not what the LGM->Holocene data says! Well, something else was also active, such as strong uptake by plants, or…
The central point is that climate is rather complex and our proxy data doesn’t enable us to see the full story.
Susan Kraemer (91&92) — Go find the full melt sea level rise estimates for GIS, WAIS and EAIS. Add a little more for the Patagonian ice fields and the Tibetan glaciers. That’ll give an upper bound. Also, please read David Archer’s “The Long Thaw”.
L. David Cooke says
RE: 88
Hey BPL,
A fixed, evenly stepped, 9-band zonal model will clearly bake the lower zones and freeze the upper zones. Missing the ~12 deg. shift in the zones every 3 months is a key reason for your extremes at the edges.
If I were you, first, I would reduce the zones. There are three simple zones (though if you want to add seasonality I would suggest five zones). Secondly, you are clearly missing seasonality. For simplicity, the use of an alternating 3/5 zone model might work for this purpose.
Taking your simple model further is where it can get interesting. It would be great if you could install regional variations with multiple latitude ranges for each zone. To achieve a rough demonstration of what I think I saw regarding insolation in the data record would be cool.
The final part you might want to reconsider is the inter-zonal flow of heat content. As of right now there is insufficient data to support this value from what I can find in the public domain.
In my opinion, there is a small problem where you are plugging in the inter-zonal exchange. As you go up in latitude, I believe the exchange for a given inter-zonal transfer of heat content would be greater (due to reduced surface area and non-reduced heat content). You may want to attempt an estimate based on the GOES East and West water vapor or even IR images as an approximation of the amount of flow as it appears from geostationary orbit.
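A quick sketch of the geometric part of that point (my own toy numbers): both the boundary length and the band area shrink toward the pole, so a fixed poleward heat flux implies a larger per-unit-area effect at high latitude.

```python
import numpy as np

R = 6.371e6  # Earth's radius in meters

def band_area(lat_lo_deg, lat_hi_deg):
    """Area of a latitude band: 2*pi*R^2*(sin(hi) - sin(lo))."""
    lo, hi = np.radians(lat_lo_deg), np.radians(lat_hi_deg)
    return 2 * np.pi * R**2 * (np.sin(hi) - np.sin(lo))

def boundary_length(lat_deg):
    """Circumference of a latitude circle: 2*pi*R*cos(lat)."""
    return 2 * np.pi * R * np.cos(np.radians(lat_deg))

# The 60-90N cap has ~37% the area of the 30-60N band, so the same
# heat flux crossing 60N is spread ~2.7x thinner per unit area.
print(band_area(60, 90) / band_area(30, 60))      # ~0.37
print(boundary_length(60) / boundary_length(30))  # ~0.58
```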
(Keep in mind the heat content added to the Poles would generally be near the tropopause and not play into ground temperatures. Instead the Polar inter-zonal heat content would likely evaporate high altitude clouds.)
(I was hoping in 2000 that we would soon have ground station Lidar similar to the NCAR tests in OK at each NWS station/metro-airport. With the change over to the GPS based sodium/water vapor tracking UCAR Cosmic Suominet dataset and the CloudSat/Calipso tools this should provide a fairly good heat content snapshot. 16 of these overpasses per day could provide a fairly good low resolution macro model of the inter-zonal heat flow, though how to get it into your model is beyond me.)
Rather than building a basic model, I would be more inclined to try to build a spreadsheet or database matrix for the graphics interface; but, to each their own. (I am unlikely to attempt it on my own due to being mathematically challenged. I would not be able to form the necessary equations…)
Cheers!
Dave Cooke
Charlie Laurel says
#97–The article is dated August 29, 2000. Not a new report. Perhaps see Hansen’s new book “Storms of My Grandchildren” for his recent views (and a story connected to the article you point to).
Susan Kraemer says
#103 thanks, David Benson.
So, per David Archer – we would have ultimately 50 meters sea level rise. Anybody got a link to maps of what the remainder of our continents would look like with 50 meters higher sea levels?
Septic Matthew says
106, Susan Kraemer: So, per David Archer – we would have ultimately 50 meters sea level rise. Anybody got a link to maps of what the remainder of our continents would look like with 50 meters higher sea levels?
I thought that even a 20 foot sea level rise was only a typo, and that there was no reliable forecast of anything like that much. Are we talking here about the year 3010 or something?
Bob says
#106, Susan Kraemer —
0 to 14m. Set the water level with the drop down at the top. Zoom out and move around (it’s just google maps). Florida, Delaware and New Orleans are “fun” to look at under 14m of water.
100 meter sea rise maps
30 and 100 meter sea rise maps for the U.S. east coast
Bob says
#106, Susan Kraemer —
66m sea rise map
Note: all maps are presented without warranty or guarantee (i.e. you have to trust the people that generated and hosted them, and that could be anyone).
Thomas says
Thanks, Jim. BTW, I’m about an hour’s drive south of you.
[Response: Ah, beautiful downtown Stockton?]
By my last paragraph, I’m questioning whether the industrial age has brought the planet into what would essentially be a new geologic age, which some have called the Anthropocene. The premise is that the roughly 7 billion humans have so changed the way the world works that all the empirical findings about the way the planetary system responds are brought into question. When I think about land cover, I think a great deal of it is affected by human actions other than changes in atmospheric composition/climate. For instance, farmland is different from prairie/forest. Grazed land is different from natural land, i.e. vegetation and soil are affected over time, even if the density of humans on the ground is quite low. Even for semi-pristine areas, hunting of things like apex predators and grazers is changing the way these ecosystems function. So the question naturally arises: aside from the potential for nonlinearities, is the current/future planet likely to produce the same value of gamma as the preindustrial planet? Even in the ocean, quite aside from climate and chemical changes, we’ve removed most of the larger fish, so the whole ecology is likely to function differently than in the past.
[Response: Very good points that I wish people were more aware of. There is an element that wishes to have the world believe we think all problems are due to climate change. No, rather climate change creates new problems, and exacerbates or complicates other ones driven by other factors.]
Now, to go back to my original argument: I was making essentially a mathematical statement. Consider the function DELTAC(T), which represents the change in atmospheric CO2 concentration (from some baseline amount) as a function of global temperature. As long as DELTAC is continuous and has a first derivative at T, by definition gamma is that derivative. Obviously over a sufficiently large change of T the slope may change. But I can see no reason a priori to expect that an infinitesimal change in T won’t affect the permafrost carbon storage. There should in general always be some areas of marginal permafrost, and the boundary of permafrost/no permafrost might be expected to continually change as the temperature is changed. Hence a measurement of gamma over even a small range of temperature should sample the permafrost effect. None of which invalidates concerns over potential nonlinearities and tipping points. But I hope it gives some credibility to the paper’s claim that gamma can be estimated in this way.
[Response: Excellent underlying concept and argument for their approach. The extent to which it will apply to thawing permafrost remains unclear however, because we don’t know how much permafrost thawing is implicated in their empirical estimates–possibly very little.]
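In symbols, Thomas's claim as I read it (my transcription, with a hypothetical function name):

```latex
% If \Delta CO_2(T) is the CO2 change as a function of temperature and is
% differentiable, then gamma is simply its local slope:
\gamma(T) \;=\; \frac{d\,\Delta\mathrm{CO_2}(T)}{dT},
% so even a small sampling window [T,\,T+\delta T] picks up the
% marginal-permafrost contribution, provided the function is smooth there.
```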
kasphar says
#100 and #105
Marvellous how science has progressed in the past 10 years. Where will the thinking be in another 10?
Hank Roberts says
For Susan Kramer, I scanned quickly through what Google Scholar finds searching sea level rise map
Short answer, “it depends” — for example, this publication offers some maps for short-term (a century or two) changes but with caveats, including specific cautions about this to journalists who may reprint their maps:
http://epa.gov/globalwarming/climatechange/effects/downloads/maps.pdf
“Efforts to project flooding and shoreline change require (1) data on land and water surface elevations, and (2) a model of coastal processes. Some questions can be answered with elevation data and no model. For example, if mean high water has an elevation of 1 meter, then in areas with little wave erosion, the 1.5-meter contour is a good estimate of the area that would be inundated at high tide if the sea rises 50 centimeters, assuming that no measures to hold back the sea are implemented. At the other extreme, along the typical ocean-coast barrier island, a good model of erosion is important; but the precise location of the 1.5-meter (5-foot) contour may be almost completely irrelevant.4 In areas where wetlands dominate, one needs both good elevation information and a model of how wetlands erode and accrete, as well as a scenario regarding future shore protection efforts….
…
… Those interested in the elevations of specific locations should consult a topographic map. Although the map illustrates elevations, it does not necessarily show the location of future shorelines. Coastal protection efforts may prevent some low-lying areas from being flooded as sea level rises; and shoreline erosion and the accretion of sediment may cause the actual shoreline to differ from what one would expect based solely on the inundation of low land.”
Edward Greisch says
91, 92 106 Susan Kraemer: The maximum possible sea level rise is something like 400 feet. Any more requires boiling oceans. Note I said maximum possible, not expected. I think David Archer is correct. The average shore is at about a 1/2 degree slope from horizontal. You can get the expansion of water from the Chem-Phys Handbook.
PS: Get yourself a degree in Physics, or at least take the freshman and sophomore courses. It is really essential in your line of work now as a protection against misinformers.
Barton Paul Levenson says
Dave,
Thanks for your comments. I will take them seriously.
The equation I’m using is Ladv = kA (BTs(i) – LastTs), where kA is a proportionality constant, BTs(i) is the surface temperature in band i, and LastTs is the mean global annual surface temperature on the previous iteration. This follows a suggestion Tom Huber made in 1997.
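For anyone following along at home, here is a toy version of how such a term can slot into an iterated zonal energy balance. This is my own sketch with made-up parameter values, not Barton's actual model:

```python
import numpy as np

SIGMA = 5.670e-8    # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1361.0         # solar constant, W m^-2
K_ADV = 0.5         # stand-in for the proportionality constant kA

lats = np.radians(np.arange(5.0, 90.0, 10.0))  # 9 band centers, 5..85 deg
Ts = np.full(lats.size, 288.0)                 # starting band temps, K
albedo, emissivity = 0.3, 0.6                  # crude uniform values

for _ in range(500):
    last_mean = Ts.mean()                      # "LastTs" in the equation
    absorbed = (S0 / 4.0) * np.cos(lats) * (1.0 - albedo)  # crude insolation
    L_adv = K_ADV * (Ts - last_mean)           # warm bands export heat
    balance = np.maximum(absorbed - L_adv, 1.0)
    target = (balance / (emissivity * SIGMA)) ** 0.25
    Ts = 0.7 * Ts + 0.3 * target               # under-relax for stability

print(np.round(Ts - 273.15, 1))                # equator-to-pole temps, deg C
```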
Completely Fed Up says
Susan, got a map with contour lines on it? One with a 50m contour?
PS Al Gore’s presentation had one with, IIRC, a 20m rise and shows Florida afterward.
Completely Fed Up says
Pablo: “I am not a physical scientist, and I rely on people like you to characterize that error as “infant-school level”.”
You don’t NEED a physics education to know, Pablo. All you need to know is that blankets keep you warm. Then G&T’s assertion that greenhouse gasses cannot warm the earth is ridiculous.
This is my REAL 100% TRUE-BLOOD beef with people: you’d all rather not think.
I completely get when you would learn from Gavin’s link because that’s information that doesn’t require common sense alone.
But that element is 100% common sense based. All it needs is a little clarity.
I suppose that is why they used so many equations: scare people off by making them THINK that this is a serious and learned physics paper and that someone with a physics education is the only one that has a chance of reading it.
But waaay back my new spectrum turned up. Wasn’t working. Ohshitohshit.
My grandad was there. He was an invalid, supposedly too old to understand computers, but rather than let that stop him he asked, “Have you plugged it in fully?”.
HE DIDN’T LET WHAT HE DIDN’T KNOW STOP HIM FROM THINKING.
You shouldn’t either.
And your lack of physics education should have led you to “how do blankets work”. G&T won’t have answered and you’d get shouted down on WUWT. But you could either argue the rebuttals over there or come here and ask about the ones where some law of physics is being applied and you haven’t been *taught* how to view it.
Not knowing isn’t wrong.
Not thinking is beneath you.
Completely Fed Up says
Fox: “Personally I’d have thought that if a temperature rise caused a rise in CO2 then the rise in CO2 would be the same regardless of the cause of temperature ”
But you attributed the same correlation when the cause changed.
So why would they be the same result?
If you’re warmed because you have a blanket on, that’s one result. If you’re warm because of a chemical burn, that’s another.
Yet your body knows the extra heat on the skin surface. So why is the result (death/discomfort) different?
Your skin knows it’s burning because the cause is different.
Why do you insist that there must be only one cause for one effect?
Barton Paul Levenson says
Or Bob, rather. Sorry. It’s 5 in the morning and I’ve been up since I got a wrong number at 1:30…
Ray Ladbury says
Susan Kraemer,
Here’s a map I found. It’s not “Waterworld,” but a pretty significant land area–and even moreso if you consider current population densities.
http://sb350.pbworks.com/f/1247713617/700px-Global_Sea_Level_Rise_Risks.png
Dr Nick Bone says
Re: 57. Thanks for your comment Jim, though I don’t see how it would be hard to get to 560ppm fairly soon. For instance, the IPCC A1B scenario reaches this level by about 2060, and we are tracking it closely so far.
[Response: At a rate of 1.87 ppm per year (Mauna Loa, 1998-08, http://cdiac.ornl.gov/ftp/trends/co2/maunaloa.co2), you don’t get to 560 until about 2100–Jim]
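(For reference, the linear arithmetic behind that inline reply:)

```latex
\frac{560 - 390\ \mathrm{ppm}}{1.87\ \mathrm{ppm\,yr^{-1}}}
  \approx 91\ \mathrm{yr} \;\Longrightarrow\; \sim 2100
```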
However, I also set up a small spreadsheet to look at some other scenarios. Even the current level of 390 ppm produces big long-term problems, as it gives ~1.4 degrees of Charney warming, another ~1.4 of slow feedback (albedo) warming and then ~50 ppm of extra CO2 from carbon feedbacks. Adding the warming effect of that ~50 ppm and then summing to the limit, we end up with CO2 at around 460ppm, and warming of ~4.3 degrees. That is in the range of melting all ice sheets on the planet (back to the Eocene).
[Response: 50 is from where? From this study, 2.8 x 10.2 ppm/degree = 28.6 ppm. And it also assumes your albedo feedbacks are correct. Jim]
What about a 350ppm “target”? Still no good, even if we suck CO2 out of the atmosphere to achieve it. The result after allowing for all the slow feedbacks, including carbon feedbacks, is a creep back up to ~400 ppm and ~3 degrees of warming. We are in the Pliocene with ~20-30 metres of sea level rise. Remember this is the “radical” suggestion that environmental groups are campaigning for (and which is endorsed by low-lying island states).
How about an ultra-radical reduction to 300ppm, as endorsed by (say) the “Climate Code Red” authors? Cranking through the feedback limits takes us back up to ~320 ppm and ~1.1 degrees of warming. A little bit warmer than we are now: enough to remove the summer Arctic sea ice, and continue the melt-down on Greenland and West Antarctica. Ultimately we still get several metres of sea level rise.
Also, this assumes that gamma remains in the region of ~16 ppm per degree C for future warming. If it is actually as high as 30 ppm per degree, then here are the results.
[Response: Where are you getting these numbers?]
560ppm -> 850ppm and 9.5 degrees C of warming
390ppm -> 580ppm and 6.3 degrees
350ppm -> 500ppm and 5 degrees
300ppm -> 375ppm and 2.5 degrees
Notice that the gamma effects are surprisingly large in the last two cases, because the feedback series converges rather slowly.
Before anyone accuses me of “alarmism”, I should stress that these changes wouldn’t happen all at once. They are the ultimate effects, accumulated over several centuries, though that will be of little comfort to the inhabitants of those centuries. (Sea level just keeps rising, crop belts keep shifting, communities have to relocate again and again, and more and more species go extinct.)
Completely Fed Up says
“I thought that even a 20 foot sea level rise was only a typo, and that there was no reliable forecast of anything like that much. Are we talking here about the year 3010 or something?”
We’re talking “if all the ice went bye-bye”, I think.
20ft was if Greenland and the WAIS went, not if all continental ice went bye-bye.
In fact 50m might be merely if all of Greenland and Antarctica went bye-bye.
Pablo N. says
Completely Fed Up: I’m just grateful that RealClimate exists to direct us to concise rebuttals. That’s all. Didn’t mean to give an impression that I don’t think. Back to work I go.
Brian Fox says
Thanks to all who have attempted to relieve me of my ignorance on this.
#98 dhogaza made an interesting point about partial pressure CO2 above the oceans, which made me think that it doesn’t matter whether the process is equilibrium or not. If it is then the driving force for absorption in sinks is reduced, so atmospheric CO2 will be higher, and if it’s an irreversible CO2 release process then more CO2 is added, again raising atmospheric concentrations.
A couple of folk have made the point that both the value of gamma and its time dependence depend on starting conditions, e.g. level of glaciation, which makes me wonder whether attempting to reduce CO2 feedback to a single constant is a simplification too far. Certainly seems to have confused me…
[Response:Few parameters are scale independent. As long as you define the scale, you’re fine–Jim]
Anyone able to answer the point about what value of gamma is assumed in or derived from coupled climate/carbon cycle models and to what extent this study informs model development ?
[Response: The models’ average is just slightly higher than the median reported by the authors. About 6 of the 10 are within a couple of ppm of it. Pretty good correspondence overall]
Completely Fed Up #117
“Why do you insist….”
I’m genuinely bemused as to why you think I’m insisting on anything. Just interested in understanding, that’s all. It does seem to be a general point of agreement that forcing has the same effect regardless of source, which allows climate feedback to be applied equally to, for instance, solar and CO2 forcings. So it seems odd to expect carbon cycle feedback to be different. I don’t at all insist that it is though, very happy to have it explained.
I’ll stop now and go and re-read AR4 on the carbon cycle.
Completely Fed Up says
“Didn’t mean to give an impression that I don’t think.”
But you didn’t, Pablo. At best you avoided the effort. You assumed that this paper could only be answered by someone with a physics training.
You don’t: just ask yourself what would be the consequence of something you DO know about if their paper were true.
If blanketing of the earth by heat retaining CO2 didn’t work, how would heat-retaining blankets work?
Hank Roberts says
> Septic Matthew says: 9 February 2010 at 11:41 PM
> I thought that even a 20 foot sea level rise was only a typo
Look back at our previous exchange of comments on that.
“20 feet by 2100” is the famous Associated Press Wire typo, widely repeated in text and headlines and still possible to find uncorrected if you look for it.
What matters to us is _rate_of_change_.
John P. Reisman (OSS Foundation) says
#116 Completely Fed Up
You mean blankets actually keep you warm?
…so I’ve been freezing needlessly all this time since G&T, because I was naive enough to believe that lots of equations on a page must mean something, no matter who wrote them…
Shucks, and I was just starting to get used to the antibiotics…
(cough, cough)
;)
—
The Climate Lobby
Sign the Petition!
http://www.climatelobby.com
L. David Cooke says
RE: 102
Hey Bob,
I believe the BPL modeling experiment was in relation to a thought shared in an earlier thread regarding polar amplification. The simple radiation model he is working with, as you note, would mainly provide a reasonable explanation for the Temperate Zone or the average.
In the case of total average energy your suggestions may be correct; however, in a simple model such as this, a variation in the 5-10% range is acceptable. As long as the inter-zonal transfer is correct, the gray body temperature emitted at 45 Deg. should be close to the real world measured value, if you have accurate standard values for negative and positive feed-backs.
(Without seasonality you have to throw out about 40% of the extremes of the upper and lower limits. (Hence the polar terminal 202 K would be closer to a 230 K min and the equatorial terminal value of 357 K would be closer to a 323 K max, creating an average at 45 Deg. of approximately 276 K, if my math is correct. I fear this ends up being about 8 to 10 Deg. below the accepted average, or, stated another way, within 2.8% of the accepted real world value. (I have been here before, only rather than using a thermodynamic integral I estimated it using longhand… and MS Excel linked via ODBC to MS Access.)))
By integrating high resolution seasonality into the model at the offset it should achieve a similar output without the need to apply parametric corrections to the results. This should deliver a more accurate terminal value. This also would allow multiple variables to be run together with a more accurate energy transfer at the interaction points.
(Also note: it is likely that much of the difference between the model and the real world accepted value would be due to the factors you are suggesting. Whereas the difference between the seasonal versus non-seasonal model values at the extreme latitudes would likely be the total inter-zonal transport. (At issue is partial radiation during the latitude transfer, which involves speed and heat content.))
Cheers!
Dave Cooke
David N says
QUOTE:Comment 11
Second, using meatball physics, [etc.]
[Response: meatball physics? :)]
ENDQUOTE:
You know, meatball physics – where you assume a spherical meatball….
(sorry…)
[Response:
Physics was never my strong suit. We think in terms of inverted conical tree boles (when we think at all)–Jim]
Edward Greisch says
120 Dr Nick Bone: You are not alarmist, but I am greatly alarmed. 6 degrees C of total warming is the for-sure extinction point for Homo sapiens. I have listed the references in comments to RealClimate so often that I hesitate to do so again. Here is a shortened list of kill mechanisms:
The #1 kill mechanism is famine. See “The Long Summer” by Brian Fagan and “Collapse” by Jared Diamond. The rain moves, disrupting agriculture. Already happening.
See: “Six Degrees” by Mark Lynas
http://www.marklynas.org/2007/4/23/six-steps-to-hell-summary-of-six-degrees-as-published-in-the-guardian
Lynas lists several kill mechanisms, the most important being famine and methane fuel-air explosions. Other mechanisms include fire storms.
H2S bubbling out of hot oceans is the final blow at 6 degrees C warming:
“Under a Green Sky” by Peter D. Ward, Ph.D., 2007.
http://www.sciam.com/article.cfm?articleID=00037A5D-A938-150E-A93883414B7F0000&sc=I100322
So, how do we keep the total warming below 2 degrees C? Are we already doomed? We don’t have time to set up a self-sustaining colony on Mars. How do we convince Republicans to quit filibustering? How do we out-spend the coal industry on advertising? I’m forwarding your comment to my senators.
Andy says
Comment by Completely Fed Up – 20ft was if Greenland and the WAIS went, not if all continental ice wend bye-bye.
No, 20 feet if either WAIS OR Greenland melts completely. Actually, more like 15 feet for WAIS and 23 feet for GIS. Also, I see 50 meters for all ice. I think it is closer to 80 meters. But really, who cares at that point?
Garrett says
@#130
I live near Cleveland where we are 128m above sea level. So, that means I’m safe :P
Nick Bone says
Re: 120 and Jim’s comments.
For the first calculations, I assumed a gamma of about 16 ppm per degree C, based on the transition from the LGM value of 180 ppm to pre-industrial CO2 at 280 ppm, with a temperature change of about 6 degrees. This was also in line with comments 1 and 2 above, and with the paper’s estimates for the latter period (1550-1800), which you suggested were more plausible. I fully agree that if gamma is as low as 10.2 ppm then the impacts are not nearly so great (will recalculate if you like).
[Response: OK, thanks.]
Equally, the impact is not as great if the albedo sensitivities are much lower (I was using Hansen’s estimates here, again based on the gap between LGM and pre-industrial.)
For the latter calculations, I considered what would happen hypothetically for a gamma of 30ppm per degree C, which is in the long tail of this paper. Not well supported but not ruled out either.
[Response: OK]
The exact numbers in both cases came from an iterative procedure:
1. Take a CO2 level left in the atmosphere from human impacts.
2. Calculate long-term warming response (after allowing for albedo feedbacks).
3. Calculate additional CO2 as a result of the warming response found at step 2.
4. Calculate additional warming induced by this additional CO2 from step 3.
5. Calculate additional CO2 induced by the additional warming from step 4.
[Response: Yes, got that, just wasn’t sure on the rationale for your numbers–thanks–Jim]
and so on until the series converges. It was a fairly simple Excel sheet: if the process is not described well enough by the above then I will mail you the sheet.
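That loop is a geometric-series style fixed-point iteration, and it converges as long as the per-cycle gain (gamma times the warming per extra ppm) stays below one. A minimal sketch of it, with my own constants standing in for Nick's spreadsheet values:

```python
import math

def equilibrium_with_carbon_feedback(co2_ppm, gamma=16.0, ess=6.0,
                                     tol=1e-6, max_iter=10_000):
    """Iterate steps 1-5 above until the series converges.

    co2_ppm -- CO2 left in the atmosphere from human impacts (step 1)
    gamma   -- extra CO2 per deg C of warming, ppm/degC (steps 3 and 5)
    ess     -- long-term sensitivity, deg C per CO2 doubling
               (6.0 here = 2x a 3 degC Charney sensitivity, Hansen-style)
    """
    BASELINE = 280.0                        # pre-industrial CO2, ppm
    total = co2_ppm
    for _ in range(max_iter):
        warming = ess * math.log2(total / BASELINE)   # steps 2 and 4
        new_total = co2_ppm + gamma * warming         # steps 3 and 5
        if abs(new_total - total) < tol:
            return total, warming
        total = new_total
    raise RuntimeError("feedback series does not converge")

# Roughly reproduces the 390 ppm case quoted above (~460 ppm, ~4.3 degC):
print(equilibrium_with_carbon_feedback(390.0))
```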
Pablo N. says
Fed Up: Chill out. I realize that if I had copied and pasted Tscheuschner’s name correctly into the search box, I would have found the damn page myself and wouldn’t be getting dressed down by someone who assumes I’m lazy. Be fed up all you want, but I suggest not taking out your angst on people who are trying to keep up.
[Response: Some articles, like that one, are common denialist talking points and they can trigger responses. Keep coming back, there are lots of helpful people here who can answer honest questions–Jim]
Thomas says
Jim, thanks for the comments, you have an amazing amount of patience, and I suspect that is what is needed, as without several iterations to try to achieve clear thought, some people will not understand it. [And that is not meant to be disrespectful, heck I wish I had had someone to explain concepts that I didn’t quite grasp].
[Response: Thanks Thomas. You touched on a suite of important ideas in your post–I wanted to respond to more of them, but alas, time constraints. Maybe will happen still. No disrespect felt whatsoever.]
In any case I’m about a half hour West of Stockton, but I have a son at UofP.
[Response: Ah, much better, out in the Delta somewhere, maybe near the river? I love it out there.]
jyyh says
Thank you Jim (for the reply), I think I need to go to read the entire study.
[Response: And thank you for making the effort to learn. Yes, always try to go as close to the source of the information as you can. ps -there is a brief synopsis of the article in that issue of Nature also, that might help too.]
Edward Greisch says
132 Nick Bone: I’m waiting for the results of your new calculation. Can you give it a time scale? What do we have to do by when? We should have taken action a long time ago? Now we have to do geo-engineering as well?
Completely Fed Up says
Pablo: “I realize that if I had copied and pasted Tscheuschner’s name correctly into the search box, I would have found the damn page myself”
I missed the bit where you said that. So apologies because without that it looked like you went “Ohh, scary maths!” and turned off.
You don’t have to understand everything.
Read some of Feynman’s autobiographies. He’s always going on about how he makes a real-life analogue of some mathematical setup, and then when someone says “and the conclusion is…” he looks at that real-life analogue and goes “no, that’s not what this fuzzy spiky thing does”.
Remember: even Feynman wasn’t born with QCD information built in. He had to learn it.
Jim: the problem for me isn’t that it’s a common counterpoint by dittos, but that you don’t even have to know anything about science beyond what a few terms mean, terms anyone who stayed in school until 16 was taught.
Rather than a prime example of denialist idiocy, it’s a prime example of how bad their science is and a prime example of people believing something just because it looks technical and says what they want.
Just about ANYONE can see the problems in G&T’s paper.
Just about NOBODY who complains against AGW bothers to look.
Completely Fed Up says
“Also, I see 50 meters for all ice. I think it is closer to 80 meters. But really, who cares at that point?”
Someone whose house is at 27m ASL?
:-)
Kevin McKinney says
@#137–
Yes, G & T and similar work do have problems evident to those who, like myself, are equipped with little more than logic and a little determination to think.
For me, the classic example from G & T is probably the bit where they attempt to use a pot on a stove as a “counterexample” to the greenhouse effect. (Great analogy, other than the fact that radiative physics are pretty much irrelevant to the pot on the stove, while being the very essence of the greenhouse effect.)
A different example would be the “we don’t need no stinking greenhouse effect” meme which comes out in various blog discussions. This is the idea that “a non-conducting N2/O2 atmosphere” will by itself raise planetary temperatures by somehow “keeping heat in.” (It seems intuitively obvious to some, apparently, but those folks must be overlooking the fact that zero heat can leave the atmosphere–any atmosphere– by conduction in the first place. “Nothing from nothing leaves nothing,” and this line of er, thought, clearly gives us “bupkes.”)
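The arithmetic behind that "bupkes" is standard and easy to check (a sketch with textbook constants; nothing here is specific to any one model):

```python
# With no infrared-active gases, the surface radiates straight to space,
# so it can do no better than the airless effective temperature:
SIGMA = 5.670e-8          # Stefan-Boltzmann constant, W m^-2 K^-4
S0, albedo = 1361.0, 0.3  # solar constant and planetary albedo

Te = ((S0 / 4) * (1 - albedo) / SIGMA) ** 0.25
print(round(Te, 1), "K")  # ~255 K, i.e. ~33 K below the observed ~288 K
```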
Ray Ladbury says
CFU and Pablo N.,
Look Pablo is clearly one of the white hats. He’s trying to learn–and that’s what this forum is about. Let’s help him all we can and try to make learning a pleasurable experience. It is remaining blinkered and stupid that we want to make unpleasant.
[Response: Bingo.
Jim]
Completely Fed Up says
“140
Ray Ladbury says:
11 February 2010 at 9:08 AM
CFU and Pablo N.,
Look Pablo is clearly one of the white hats.”
Where did I say he was a denialist?
NOWHERE.
Where did I say he was trolling?
NOWHERE.
What did I say?
“You don’t need to know physics to know his paper is bull”.
[Response: CFU, Ray’s not accusing you of calling him a denialist, he’s just urging you to be as helpful as possible to those with legitimate questions. We can clearly see that Pablo qualifies in that regard, and in fact is a prototypical example of the type of person we want to help. Until someone demonstrates that s/he stubbornly refuses to listen to rational arguments or read suggested info, the onus is on us to make an honest effort (which you did in your initial response). People come here with all manner of levels of understanding. Now, no more on this please, let’s move forward–Jim]
Septic Matthew says
125, Hank Roberts: What matters to us is _rate_of_change_.
That was why I asked if the expected date was 3010.
John E. Pearson says
Pablo N. says: Fed Up: Chill out.
I agree.
Completely Fed Up says
“[Response: CFU, Ray’s not accusing you of calling him a denialist, he’s just urging you to be as helpful as possible to those with legitimate questions.]”
And I’m being helpful: I’m pointing out that there’s a lot more available to someone than just going to authority and asking.
For some of the most egregious stuff made up to “prove” AGW is wrong (and G&T’s workings is among the very best examples), it’s enough just to read it skeptically and go “if this was right, what do I think this means for what I *do* know?”.
[Response: Yes, it’s helpful when we all stick to substantitive arguments, and generally not when we don’t. No more on this topic–Jim]
[edit]
Dr Nick Bone says
Re 132 and 136: I’ve put a spreadsheet and summary table here.
Note that I calculated a range of cases for the table, using Earth System Sensitivity of 2xCharney (as estimated by Hansen et al) or 1.5xCharney (as estimated by Lunt et al) and gamma ranging from 10 to 30.
I also looked at the case for Charney sensitivity alone, which gives an estimate of the effects prior to any slow albedo feedbacks. I used the typical estimate of 3 degrees C for CO2 doubling, though you can vary this if you like.
Re: 129. Edward, I haven’t read Lynas’ book (or Ward’s for that matter). Also, I’m not a climate scientist, so can’t comment with any real authority on them: my PhD is in a different field. But judging by the summary you linked to, the 5 and 6 degree sections seem mis-calibrated. From what I’ve read, most of the Eocene was hotter than 6 degrees above pre-industrial (with the PETM representing an especially big spike on top of an already hot background inherited from the Paleocene). Though the climate was very, very different from today’s, the world avoided being burned up by methane fireballs or suffocating from H2S poisoning. Those effects seem to require much higher warming still. Maybe we get them with 4xCO2 from industrial pollution and a big methane belch, but that’s just me speculating (and maybe Lynas is speculating too).
Jim Bouldin says
Bob, see inline responses to your post #30, as I see it.
Jim
Pablo N. says
I realize now that this comment thread was the wrong place to pose my question – I should have spent a little more time searching, myself. Sorry to muddy the stream. And I seem to have miscommunicated my appreciation for RC as a source for non-climate scientists to get trusted refutations of the blizzard of denialist talking points which we are all dealing with inside and outside of the blogosphere. This, I know, was one of the reasons why RC was established, as non-climate scientists are integral to discrediting the talking points in public and pushing for action.
The other motivation behind RC was to have enlightened discussions about emerging research. So, thank you, Jim, for your presentation. Despite having set myself up as someone who knows squat about the greenhouse effect and is unwilling to learn, I understood a great deal.
[Response: Great! There’s a LOT of material to learn from here, so keep at it!–Jim]
Edward Greisch says
138 Completely Fed Up: My house is 600 feet above sea level now. I’m safe, sea level wise. How about your house?
139 Kevin McKinney: “equipped with little more than logic and a little determination to think.” You need to do EXPERIMENTS in your own LABORATORY. The ancient Greeks had logic. It got them almost nowhere. Galileo did experiments. That is why Galileo was the inventor of SCIENCE. Prior to Galileo, there was no science.
What Science is all about is really quite simply and elegantly stated in the book: “Science and Immortality” by Charles B. Paul 1980
University of California Press
The Eloges of the Paris Academy of Sciences (1699-1791)
page 99: “Science is not so much a natural as a moral philosophy”. [Drylabbing [fudging data] will get you fired.]
page 106: Nature isn’t just the final authority, Nature is the Only authority.
Scientists do not vote on what is the truth. There is only one vote and Nature owns it. We find out what Nature’s vote is by doing Scientific [public and replicable] experiments. Scientific [public and replicable] experiments are the only source of truth.
145 Dr Nick Bone: Thanks. I hope you are right. The Scientific American article also says 6 degrees, but I think I heard elsewhere that the End-Permian Great Death was maybe 20 degrees C of warming.
David B. Benson says
Septic Matthew (107) — A sea level rise of 50 meters will take quite long to happen, many centuries most likely. However, at current and projected levels of CO2, this seems rather likely; that isn’t even all of the ice. Please do read David Archer’s “The Long Thaw”.
Completely Fed Up says
“148
Edward Greisch says:
11 February 2010 at 3:20 PM
138 Completely Fed Up: My house is 600 feet above sea level now. I’m safe, sea level wise. How about your house?”
About 400ft ASL.
Why do you want to know? (heck, why did you feel the need to let me know?)