The long-awaited NAS synthesis report on surface temperature reconstructions over the last few millennia is being released today. It’s a long (155 page) report and will take a while to digest, but we applaud the committee for having tried to master a dense thicket of publications and materials on the subject over a relatively short time.
It is probably expecting too much for any one report to put to rest all the outstanding issues in a still-developing field. And given the considerable length of the report, we have little doubt that keen contrarians will be able to mine it for skeptical-sounding sentences and cherry-picked findings. However, it is the big-picture conclusions that have the most relevance for the lay public and policymakers, and it is reassuring (and unsurprising) to see that the panel has found reason to support the key mainstream findings of past research, including points that we have highlighted previously:
1) The authors of the report accurately describe the considerable uncertainties that were acknowledged by seminal earlier studies. In particular, Mann et al (1999), which was entitled (emphasis added) “Northern Hemisphere Temperatures During the Past Millennium: Inferences, Uncertainties, and Limitations”, emphasized the uncertainties and caveats, particularly with regard to reconstructing large-scale surface temperature patterns prior to about AD 1600.
The report makes due note of this (pg. 119 of the report):
The Mann et al. large-scale surface temperature reconstructions were the first to include explicit statistical error bars, which provide an indication of the confidence that can be placed in the results. In the Mann et al. work, the error bars were relatively small back to about A.D. 1600, but much larger for A.D. 1000–1600. The lower precision during earlier times is caused primarily by the limited availability of annually resolved paleoclimate data: That is, the farther back in time, the harder it is to find evidence that provides reliable annual information. For the period before about A.D. 900, annual data series are very few in number, and the non-annually resolved data used in reconstructions introduce additional uncertainties.
2) The authors accurately note that, despite those uncertainties, the key conclusions reached by those studies (i.e., that hemispheric-scale warmth in recent decades is likely unprecedented over at least the past millennium) have been substantiated by many other studies, and the confidence in those conclusions appears greater, not lesser, after nearly an additional decade of research (pg. 109 of the report):
The basic conclusion of Mann et al. (1998, 1999) was that the late 20th century warmth in the Northern Hemisphere was unprecedented during at least the last 1,000 years. This conclusion has subsequently been supported by an array of evidence that includes the additional large-scale surface temperature reconstructions and documentation of the spatial coherence of recent warming described above (Cook et al. 2004, Moberg et al. 2005, Rutherford et al. 2005, D’Arrigo et al. 2006, Osborn and Briffa 2006, Wahl and Ammann in press), and also the pronounced changes in a variety of local proxy indicators described in previous chapters (e.g., Thompson et al. in press). Based on the analyses presented in the original papers by Mann et al. and this newer supporting evidence, the committee finds it plausible that the Northern Hemisphere was warmer during the last few decades of the 20th century than during any comparable period over the preceding millennium.
3) Despite the attempts of some commentators to conflate the evidence for the existence of human influences on climate with the validity of a single reconstruction (e.g. that of Mann et al), it is quite clear that the evidence for anthropogenic impacts on climate is strong irrespective of whether or not the original “hockey stick” is correct. The report makes repeated note of this key point, for example on page 4 of the report:
Surface temperature reconstructions for periods prior to the industrial era are only one of multiple lines of evidence supporting the conclusion that climatic warming is occurring in response to human activities, and they are not the primary evidence.
and again on page 9 of the report:
The reconstruction produced by Dr. Mann and his colleagues was just one step in a long process of research, and it is not (as sometimes presented) a clinching argument for anthropogenic global warming, but rather one of many independent lines of research on global climate change.
4) That it is time for the paleoclimate research community to move beyond the now-tired debates about the “Hockey Stick”. The authors, like most scientists currently working in the field, recognize the need to reduce current uncertainties by aiming to:
- obtain additional, improved, and updated paleoclimate proxy records that can help decrease the existing uncertainties,
- focus greater attention on the relative strengths and weaknesses of alternative types of proxy information such as tree rings, corals, and ice cores (see e.g. here and here) so that more robust climate reconstructions can be formed making use of the complementary information available from “multi-proxy” networks,
- pay attention to legitimate (rather than specious, e.g. here and here) issues with regard to the strengths and weaknesses of alternative paleoclimate proxy reconstruction methods [it was encouraging, for example, that the authors of the report favor the use of some combination of the standard measures of the fidelity or “skill” of paleoclimate reconstructions (“RE” and “CE”) generally used by paleoclimate researchers (a small worked example follows this list), and dismiss as without merit the use of simple correlation coefficients], and
- move beyond the often inappropriate focus given to hemispheric mean temperature, and give greater future attention to the detailed spatial patterns of past temperature changes, as well as reconstructions of precipitation and atmospheric circulation variables. These can provide greater insight into the underlying dynamics of the climate system and the key role that dynamical modes such as the El Nino/Southern Oscillation may play in climate change.
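For readers unfamiliar with those two skill scores, here is a minimal sketch of how RE and CE are conventionally computed. The formulas are the standard ones used in the dendroclimatology literature; the data below are invented purely for illustration:

```python
import numpy as np

def re_ce(obs_cal, obs_ver, rec_ver):
    """Reduction of Error (RE) and Coefficient of Efficiency (CE).

    Both compare a reconstruction's squared error over the verification
    period to that of a 'no-skill' benchmark: RE benchmarks against the
    calibration-period mean, CE against the verification-period mean.
    Values near 1 indicate high skill; by construction CE <= RE.
    """
    err = np.sum((obs_ver - rec_ver) ** 2)
    re = 1.0 - err / np.sum((obs_ver - np.mean(obs_cal)) ** 2)
    ce = 1.0 - err / np.sum((obs_ver - np.mean(obs_ver)) ** 2)
    return re, ce

# Invented toy series, for illustration only:
rng = np.random.default_rng(0)
obs_cal = rng.normal(0.2, 0.2, 50)           # instrumental data, calibration period
obs_ver = rng.normal(0.0, 0.2, 50)           # instrumental data, verification period
rec_ver = obs_ver + rng.normal(0, 0.1, 50)   # a reconstruction with some error
print(re_ce(obs_cal, obs_ver, rec_ver))
```

A simple correlation coefficient, by contrast, is insensitive to bias and amplitude errors in a reconstruction, which is why it is a much weaker test.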
While we agree with the bottom-line conclusions in the report, this is not to say that we don’t also have some criticisms:
The report provides an unbalanced discussion of some significant technical details. For example, there is quite a bit of discussion of possible biases involved in centering conventions used in Principal Component Analysis (PCA). Yet the report only vaguely alludes to the fact that published work [see both Wahl and Ammann (2006) and Rutherford et al (2005) cited in the report] clearly demonstrates that this doesn’t introduce any significant bias as long as statistically significant patterns in the data are not discarded.
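To see why, consider a toy example (our own construction, not the code of any of the cited studies): a set of noisy synthetic proxies sharing one common signal, decomposed with PCA under two different centering conventions. The conventions shuffle variance among the leading patterns, but as long as enough significant components are retained, the reconstructed common signal is essentially the same:

```python
import numpy as np

rng = np.random.default_rng(1)
nt, nprox, k = 500, 30, 5
t = np.arange(nt)
signal = np.where(t > 450, (t - 450) / 50.0, 0.0)        # flat, then a late rise
proxies = (np.outer(signal, rng.uniform(0.5, 1.5, nprox))
           + rng.normal(0.0, 1.0, (nt, nprox)))          # common signal + noise

def pc_truncation(data, center_rows, k):
    """Rank-k PCA approximation of data, with the mean taken over
    center_rows (i.e., the centering convention)."""
    anom = data - data[center_rows].mean(axis=0)
    u, s, vt = np.linalg.svd(anom, full_matrices=False)
    return (u[:, :k] * s[:k]) @ vt[:k]

full = pc_truncation(proxies, slice(None), k)        # full-period centering
late = pc_truncation(proxies, slice(450, None), k)   # late-period centering

# The mean reconstructed series under the two conventions are nearly
# identical once enough significant patterns are kept:
print(np.corrcoef(full.mean(axis=1), late.mean(axis=1))[0, 1])
```

Discard too many components and the convention can matter; retain the statistically significant ones and it does not, which is the substance of the published rebuttals.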
The report calls into question the confidence in certain fairly specific previous conclusions, e.g. the tentative conclusion in Mann et al (1999) that the 1990s and 1998 were the warmest decade and year, respectively, of the past 1000 years. There are two important points here left unmentioned in the report: (1) Mann et al (1999) attached the qualifier “likely” to these conclusions, which in standard (e.g. IPCC) parlance corresponds to a roughly 2/3 probability, i.e., implies somewhat better than even odds of being true, a fairly conservative conclusion. The conclusion was followed by the statement “More widespread high-resolution data which can resolve millennial-scale variability are needed before more confident conclusions can be reached…”. (2) The conclusions regarding the decade of the 1990s and the year 1998 follow from reasonable assumptions. The late 20th century Northern Hemisphere average warmth, according to Mann et al (1999) and all subsequent studies, appears anomalous in at least the past 1000 years. So the base state about which higher-frequency (e.g. interannual) fluctuations occur was substantially higher at the end of the 20th century than during any earlier comparable period. Unless the interannual fluctuations in hemispheric mean temperature during earlier centuries were significantly greater in amplitude than during the 20th century (and there is no obvious evidence that they were), it reasonably follows that the thresholds reached during the 1990s or during 1998 (an anomalously warm decade and year, respectively, from the perspective of the instrumental record) are unlikely to have been breached in earlier centuries.
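The arithmetic behind point (2) is easy to sketch with invented numbers. Suppose the late 20th century base state sits 0.4 C above that of earlier centuries and interannual fluctuations have a similar spread in both eras; then the odds of an earlier year clearing a 1998-like threshold are tiny:

```python
import numpy as np

rng = np.random.default_rng(2)
sigma = 0.15      # assumed interannual spread (deg C), similar in both eras
offset = 0.4      # assumed elevation of the late-20th-century base state
n = 1_000_000     # simulated 'years' in the earlier era

early = rng.normal(0.0, sigma, n)    # fluctuations about the earlier base state
threshold = offset + 2 * sigma       # an anomalously warm year like 1998

print((early > threshold).mean())    # ~1e-6 or so: essentially never breached
```

All the numbers above are assumptions chosen for illustration; the qualitative conclusion only requires that the earlier fluctuations were not much larger in amplitude than the modern ones.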
The committee does not seem to have grasped fully the significance of some very recently published results that are cited in the report, notably the papers by Rutherford et al (2005) and Wahl and Ammann (2006) that further demonstrate the robustness of the Mann et al (1998; 1999) conclusions and subject some published criticisms of those conclusions to rigorous scrutiny. The authors cite somewhat uncritically the von Storch et al (2004) study arguing that climate field reconstruction techniques can significantly underestimate long-term trends, despite the fact that errors have now been acknowledged in that study (see here and here), and that an independent study not cited, but published well before the report was drafted, comes to very different conclusions. This reflects one of a number of inevitable minor holes in this quickly prepared report.
One of our main criticisms, though, doesn’t involve the report itself, but the press release that accompanied it. We’ve noted before the importance of making sure that the press will be able to correctly contextualise a release, and the bad consequences when that doesn’t happen. Well, in this case the press release announcing the publication of the report was often inconsistent with what was actually stated in the report. It was titled: ‘High Confidence’ That Planet Is Warmest in 400 Years; Less Confidence in Temperature Reconstructions Prior to 1600, which is not news at all and almost trivially true. However, it is likely to be misinterpreted to imply that there is no confidence in reconstructions prior to 1600, which is the opposite of the conclusion of the report. Additionally, the text appears to have confused the key distinction between our knowledge of global mean temperature in past centuries (which is very limited owing to the sparseness of long available proxy data in the Southern Hemisphere, and for which a reconstruction was not attempted by Mann et al or most other researchers) and our knowledge of Northern Hemisphere mean temperature (which is considerably better; hence the emphasis on this quantity in past work).
Finally, it is worth pointing out and emphasising that the report provides absolutely no support for the oft-heard claims that the original hockey stick was the result of ‘programming errors’, or was ‘not reproducible’, or there was some scientific misconduct involved. These claims were always spurious and should now finally be laid to rest. Hopefully, we can all start to move forward with the science again.
Hank Roberts says
> 43, 32
Gar, thanks for the short list of links; here’s hoping the next decade will allow turning things around.
I wish someone from Mr. Gore’s website would look in here and participate — and pick up this sort of information!
The switch — “not enough info to change, oops, now it’s too late to change” — as predicted, has been the theme of the PR advocacy “science” baloney factories.
John L. McCormick says
RE: #48 Eric, I guess I set myself up by asking for China’s renewable energy equipment purchase orders. So, I ask if you or others can advance the discussion on China’s (and the US’s) reversal of their appetite for fossil fuels.
From British Petroleum’s 2005 energy statistics I see the following 2005 coal and electric generation data for China compared to US:
Coal (million tons of oil equivalent):
US: 575.4 (a 1.9% increase over 2004)
China: 1081.9 (a 10.9% increase over 2004)
Electricity generation (terawatt-hours):
US: 4239 (a 2.0% increase over 2004)
China: 2475 (a 12.6% increase over 2004)
China is reportedly building a new coal-fired electric generation station every week.
Private automobile ownership among newly employed Chinese shows every indication of continuing its fast pace.
According to the May 12 China Daily:
Total demand for new vehicles is forecast to climb by 12 percent to 5.6 million units this year, with sales of passenger cars rising 15 percent to 2.6 million units, it said.
I am more interested in the macro side of global energy demand, because the new and existing fixed assets propping up the energy demand growth are the sources of climate-forcing gases.
Without diminishing the importance of Chinese use of efficient rooftop water heaters, I see no connection between that information and my opinion that adapting (NOT AT THE EXPENSE OF EVERY EFFORT TO MITIGATE CLIMATE-FORCING GASES) to what is about to hit us in the face is not a cop-out; it is a critical part of our discussion.
Eli Rabett says
It would be pretty hard for heat and electrical generation in China NOT to become more efficient, given the state of the art even ten years ago when I visited there.
There has been a top level commitment to increased efficiency. For example see http://tinyurl.com/n27vz
“China is shifting the national energy policy by putting its first priority on energy conservation and improving the energy efficiency from its previous emphasis on energy exploitation.
The move will help the country control the emission of carbon dioxide to meet possible Kyoto Protocol obligations years ahead of schedule, experts said. ”
“…According to the government’s blueprint, the energy consumption for every 10,000 yuan (US$1,210) gross domestic product (GDP) is expected to drop by 16 per cent from 2.68 tons of coal equivalent in 2002 to 2.25 tons in 2010.
By 2020, the average consumption will further reduce to 1.54 tons of coal equivalent, 43 per cent lower than the level in 2002.
By 2010, the energy consumption efficiency of major industrial products, such as steel, aluminum and electricity, is expected to reach the level of developed countries in the early 1990s. ”
“…After experiencing GDP growth of 204 per cent since 1990, Chinese carbon dioxide pollution increased by 44.5 per cent to 3.31 billion tons in 2002.”
Fernando Magyar says
Re: 48
As someone who lives in hurricane-prone South Florida and experienced massive power outages after Wilma, I was most struck by the gorgeous sunny weather that lasted for almost a week after the hurricane had passed. It occurred to me that this was the week when power outages were most severe. No hot water, no electricity, no refrigeration, and no gasoline, because most gas stations didn’t have any working pumps. As I rode my bicycle past the downed trees and power lines I saw rooftop after rooftop (most of them intact, by the way) bathed in golden Florida sunshine, and almost nobody was taking advantage of it. It bothers me to this day. Especially given that there seems to be a massive campaign under way to promote gasoline-powered generators for emergency home use in this area. I won’t even get into problems such as potential carbon monoxide poisoning stemming from the improper use of such generators. Anyone who sat in long lines trying to get gasoline from the gas stations should wonder if that is such a good idea. Not to mention that there are already reports of homeowners storing gasoline unsafely in their homes. Now wouldn’t it seem that South Florida, among a few other states in similar circumstances, might be the ideal candidate for some kind of emergency solar- and wind-powered survival kit? I think it should be possible for FEMA to promote something like that instead of gasoline-powered generators. I did some research, and with off-the-shelf components such as the equipment commonly available on your average sailboat or RV it should be feasible. Add to that a solar evacuated-tube hot water heater… I have relatives who own a small farm in Brazil, and they use a few coils of black garden hose, which works pretty well.
For a few hundred dollars retail it is possible to add a 12v DC refrigerator. I think it would sure beat trucking in 18 wheelers with ice and water that sit idling for days in parking lots without knowing where they are supposed to unload.
Is it just me, or does anyone else see a potential for planting the seeds of change here? If anyone has the means to put these kits together, I have a feeling there might be a market down here. Heck, I’ll go door to door and sell them. Certainly Home Depot should carry them right next to the gasoline generators with the big red signs saying “Warning: Danger, improper operation of this device could cause serious injury or death to you and the global environment”. Sorry, sometimes I get a little frustrated waiting around for things to start changing! I even have an acronym: H.A.E.S.S., Hurricane Aftermath Emergency Survival System. BTW, even if you don’t believe in global warming, the consensus seems to be that we are in for a decade or two of increased hurricane activity, so there will still be a need…
John L. McCormick says
RE: #53, Eli, you are reading too much into a Dec. 2004 China Daily story. I will agree the Chinese government realizes it has a massive energy challenge ahead and that efficiency will be a priority in its planning.
However, China has virtually no domestic oil production and very limited economically recoverable reserves of petroleum. Thus, it is shopping around the world for supply while it is now capitalizing on coal liquefaction plants to provide domestic sources of oil. They are not building carbon sequestration into those facilities and we can be sure the continuing improvement in the Chinese lifestyle will cause greater pressure on the electric sector.
Efficiency improvements are vital to China’s future but it is working from a base of 1.35 billion real and potential energy consumers.
How does the China of 2006 figure into your comment?
Eric Swanson says
We’ve drifted way off topic, but here goes anyway.
RE: #52 – I agree that China is investing rapidly in coal for electric generation, which won’t be good, given the high CO2 emissions that will result. And yes, it appears that China is fast becoming another automobile nation, with freeways being built rapidly. Furthermore, there are reports that China’s air traffic is also increasing rapidly. Their oil imports are surging as a result. As you point out, they still have a way to go to catch up with our total electric production, which would still only give their 1,200 million people about 1/4 the kWh per person that we enjoy in the U.S. There is a small opportunity for them to do better, as they make the compact fluorescent lights that use only 1/3 the kWh of incandescent bulbs. That, with greater attention to overall efficiency and the use of more solar, may reduce their impact for a while. The trends are all quite scary to my mind, as our government in the U.S. still appears to be unwilling to admit that there is much of a problem.
RE: #54 – Yes, there are lots of ways to prepare for an emergency. Campers have refrigerators that operate on propane as well as electricity. Some can be powered 3 ways, with 12 volt as well as 120v. However, the typical RV refrigerator costs a bundle, perhaps $700. PV arrays aren’t cheap at about $4 per watt, and one would also need storage, typically batteries. If the backup is to provide AC, then there’s the extra cost of an inverter. Generators are cheap, since they (hopefully) won’t be used very often. Most people could get by with 1000 watts, unless one wanted to heat water or pump a deep well. All one need do is stockpile a 55 gallon drum of gasoline (or 2?). If one wants to go whole hog, a relatively large PV array can be installed and hooked to the AC grid, so one actually gets a dollar return for having the backup system.
Duncan Munro says
This is a link to the online version of the NAS report. This report is very easy to read, as it was written specifically to be understandable by laypeople. It is nearly 200 pages long, but much of the content consists of graphs and illustrations, plus a long bibliography, so that it is easy to read the entire report in a couple of hours.
http://www.nap.edu/catalog/11676.html#toc
Duncan Munro
Don Baccus says
50: Stephen McIntyre told BBC News that he felt the report had upheld “virtually all of our technical criticisms of MBH and did not reject or refute any direct points that we made.”
This is a bit like William Dembski’s declaring the verdict in the Dover ID trial a victory for his side …
Fernando Magyar says
Re: 56
Not to take too much more time on this topic: I am aware of the costs involved in setting up a large grid-tied PV system. I was thinking of something much more modest; we are talking emergency, very basic survival for a few days, not luxury living in a five-star hotel. Keep in mind that a few million users were completely without power for a few days after Wilma, and they were literally in the dark for the duration. I came up with a minimal system for a cost of around $1,700 for off-the-shelf components, including solar panels, a charge controller, an inverter to run a small AC TV, a couple of deep-cycle batteries, and a very small 16 quart portable 12v DC freezer/refrigerator.
As for storing 55 gallon drums of gasoline, I think that our local fire marshals have a tendency to frown on that practice within the confines of the city. Especially if large segments of the population were to suddenly adopt that practice all at once.
Mark Zimmerman says
Grid tied systems must shut down in power outages. The utilities understandably do not want their workers zapped when trying to repair outages.
For back-up power, I recommend a diesel generator and biodiesel. Biodiesel is not flammable and is nontoxic; non-polluting when used as fuel and relatively greenhouse-gas neutral.
Eli Rabett says
Given that in a decade China had a 204% increase in GDP with only a 45% increase in CO2 emissions, and a lot of possible future improvements in efficiency, I would not be as pessimistic as John McCormick.
Another important point is that as the standard of living and energy efficiency increase in China, much of the Asian black-cloud forcing will disappear, which will counteract increases in CO2 generation.
[Response: Almost every thread seems to eventually drift into discussions of this sort, and I agree it’s interesting. However, it’s way, way off topic for this article. Please, let’s find some other site for discussing issues like China’s energy future, and get back to things related to the problem of estimating the climate of the past few millennia. –raypierre]
Steve Sadlov says
RE: #40 – I really must wonder the degree to which most “climate scientists” have been exposed to, or have taken the time to learn about, the sorts of finite element modeling systems commonly used in mechanical engineering, fluid dynamics, aeronautical engineering, hydraulics, thermal engineering and electrical engineering/electromagnetic engineering. I think many in this field, having developed their models in-house during the early days of supercomputers, may be sort of insular in their views regarding modeling and regarding just what is possible today. To be fair, I must give proper credit to the work that has been done to date. That said, a bit more openness to investigating new modeling techniques and tools, leveraging demonstrated effective applications used widely in industry, may allow some of the fundamental problems of developing effective GCMs to be solved.
[Response: Wonder no longer. First off, there have been a number of attempts to use finite element modelling in climate related problems – particularly in the ocean – but none have yet made it into the fully coupled models. It’s worth stepping back and thinking about why that is. Finite element modelling is at its best in dealing with situations where you know a priori where the interesting small scale stuff is likely to happen, so that you can concentrate your resolution and resources there. This works very well for aerofoil modelling (where you know where the wake is, etc.), coastal ocean modelling (where you know where the topography is), etc. It doesn’t work well in situations where the small scale features are either ubiquitous or can appear anywhere. In those circumstances you are just as well off with a finite difference scheme on a regular grid and a uniform increase in resolution as resources allow. There is a substantial overhead in adopting a finite element grid over a finite difference scheme, so there has to be a big improvement in results for the same computational effort. In the atmosphere, you need resolution in the boundary layer, near the tropopause, in the tropics, in mid-latitudes, near mountains, etc. This ends up requiring more resolution almost everywhere, and so it ends up being just as straightforward to stick with a finite difference model and have better resolution everywhere. It’s more of a close call for ocean models, but (as far as I’m aware) nobody has a functioning ocean model (including all the physics of mixing, isopycnal diffusion, etc.) that is global in extent and could be coupled to an atmospheric model (though I am aware of efforts in that direction). However, the biggest problems with GCMs are not related to the grid or to advection, but to parameterisations of unresolved physics (clouds, moist convection, mixing, etc.). Finite element modelling is completely irrelevant to that. – gavin]
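For readers who have not seen either discretization, here is the regular-grid finite-difference baseline gavin describes, in its simplest possible form: a toy 1-D diffusion step with made-up parameters (our own illustration, nothing to do with any actual GCM code). The point is that on a uniform grid the update rule is identical everywhere, which is exactly the simplicity one gives up by moving to an unstructured finite element mesh:

```python
import numpy as np

# One forward-Euler step of the 1-D heat equation on a regular, periodic grid.
nx, dx, dt, kappa = 200, 1.0, 0.4, 1.0               # toy grid and parameters
T = np.exp(-0.01 * (np.arange(nx) - nx / 2) ** 2)    # initial temperature bump

lap = (np.roll(T, -1) - 2 * T + np.roll(T, 1)) / dx**2   # centered Laplacian
T = T + dt * kappa * lap                                 # same update everywhere
```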
isaac held says
Following up on Gavin’s response to Steve (#62):
Check http://www.nrlmry.navy.mil/PDE/2006/Program6.htm for a taste of the most recent developments in numerical methods for the fluid dynamics in atmospheric models (there is a bit on ocean models as well). By the way, most ice sheet models are finite element models.
Steve Sadlov says
Good discussion. RE: #62 – 63. The overhead challenge should be less of a factor, with increasingly inexpensive computing resources and supercomputer power in much more commercial, commodity servers. As for finite element vs finite difference, personally, I am not willing to abandon the former just yet. It would be particularly interesting to understand what the implications would be for a finite element model geared toward picking up small-scale turbulence and small-scale thermal gradients. Maybe at some point some sort of hybrid finite difference – finite element construct would be something to consider. Interesting link – the navy site.
Geert Brethouwer says
Re # 64. Whatever numerical technique you use (finite difference, finite element, spectral elements), the grid will be too coarse to resolve the full range of turbulence scales. The smallest turbulence scales are of the order of a centimeter in the atmosphere. Even in the foreseeable future it is impossible to resolve that. So the use of subgrid models for the turbulence in atmospheric (and also ocean) models is unavoidable, even when finite elements are used. The question is just which method is the most accurate and efficient (a finer grid will of course increase the accuracy), and it is not certain that finite elements are better in that respect.
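A back-of-envelope count makes this point vivid (the numbers below are order-of-magnitude assumptions, not from any model):

```python
# Grid cells needed to resolve ~1 cm turbulence over the whole atmosphere.
earth_surface_m2 = 5.1e14   # ~ surface area of the Earth
atmos_depth_m = 1.0e4       # ~ depth of the troposphere
cell_m = 0.01               # ~ smallest turbulence scale (1 cm)

cells = earth_surface_m2 * atmos_depth_m / cell_m**3
print(f"{cells:.1e} grid cells")   # ~5e24 -- hence subgrid turbulence models
```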
Mark Zimmerman says
Inhofe’s Senate Committee issued a statement yesterday in response to the AP report that climate scientists liked Gore’s movie. They cite the Academy of Sciences’ report as debunking the hockey stick!
“The AP also chose to ignore Gore’s reliance on the now-discredited “hockey stick” by Dr. Michael Mann, which claims that temperatures in the Northern Hemisphere remained relatively stable over 900 years, then spiked upward in the 20th century, and that the 1990’s were the warmest decade in at least 1000 years. Last week’s National Academy of Sciences report dispelled Mann’s often cited claims by reaffirming the existence of both the Medieval Warm Period and the Little Ice Age”
http://epw.senate.gov/pressitem.cfm?party=rep&id=257909
[edited]
shargash says
Re: #66: Wow! is all I can say.
“And recent articles have chosen to ignore astronomy’s reliance on the now-discredited heliocentric theory of the solar system by reaffirming that the sky is blue and water is wet.”
David donovan says
#66.
In the U.S., is there any law against the government releasing patently false and/or misleading information? Guess I am not surprised at this coming from Inhofe’s committee, with the U.S. Supreme Court agreeing to hear the case related to the EPA and CO2 emissions.
Ferdinand Engelbeen says
Re #39
Just back from a few weeks in Ireland and a one-day course in Oxford (Unravelling Climate Change Models), I was not aware of the publication of the NAS report until reading the discussions here and at the other side of the barricade…
I learned in Oxford, with a simple climate model, that it is possible to change the different responses to forcings within large margins, where different sets have nearly the same (temperature) result for the 1900–2000 period (despite a truncation of the solar series): for example, halving the sensitivity for 2xCO2 + halving the effect of aerosols + tripling the sensitivity for solar. The main difference was in the “projections”, where the expected warming was halved too, no matter what scenario was used.
Is this plausible? That largely depends on natural variability in the pre-industrial past. If there was a huge variability (as indicated by the reconstructions of Esper, Moberg and Huang), then much of the last century’s warming is natural and the response to CO2 is at the low side. If there was little variance (as indicated by MBH98/99 and Crowley and Lowery, …), then the response to CO2 must be higher to cover the 1900–2000 period.
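A contrived toy model shows the degeneracy described here; the forcing histories below are invented and deliberately constructed so that the trade-off is exact, but the idea carries over to realistic forcings:

```python
import numpy as np

yr = np.arange(1900, 2001)
# Invented forcing histories (W/m^2), constructed purely for illustration:
f_co2 = 0.015 * (yr - 1900)                    # steadily rising greenhouse forcing
f_aer = -0.010 * np.clip(yr - 1950, 0, None)   # post-1950 aerosol cooling
f_sol = 0.25 * (f_co2 + f_aer)                 # chosen so the trade-off is exact

def temp(s_co2, s_aer, s_sol):
    """Toy energy balance: temperature response as a weighted sum of forcings."""
    return s_co2 * f_co2 + s_aer * f_aer + s_sol * f_sol

base = temp(1.0, 1.0, 1.0)
alt = temp(0.5, 0.5, 3.0)   # halved CO2 and aerosol responses, tripled solar
print(np.max(np.abs(base - alt)))   # ~0 (to rounding): same 1900-2000 history

# In a projection where CO2 keeps rising but solar stays flat, the 'alt'
# parameter set warms only half as much -- exactly the observation above.
```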
I hope that the millennium project really explores the full width of past variance and doesn’t restrict the responses by limits (like a minimum response to tropospheric aerosols). I have just read the report of the Cubasch et al. (2006) simulations, which give more Moberg-like results, but I have the impression that they used a quite large overall sensitivity…
Dave R. says
Perhaps someone can answer three questions for me:
1) If I understand it properly, this report says that the variances in surface temperature measurements of the paleoclimate using proxies (e.g. ice cores) are too large to definitively compare those temperatures to today’s. Quote from the report: “Very little confidence can be assigned to statements concerning the hemispheric mean or global mean surface temperature prior to about A.D. 900 because of sparse data coverage and because the uncertainties associated with proxy data and the methods used to analyze and combine them are larger than during more recent time periods.”
Are the large variances in paleoclimate surface temperature measurements also present in measurements of CO2 concentrations using ice cores?
[Response: You need to distinguish our ability to calculate global or hemispheric means (which requires enough data to average out the regional variations) from the quality of any one record going back into the past. Prior to 900 AD the number of well dated, high resolution series drops dramatically, and so spatial averaging becomes much more problematic. This is a very different issue from the lower resolution but much longer data series like the Vostok ice core or the ocean sediment cores. While they do get less well resolved as you go back in time (due to compression effects etc.), they are useful for giving local climate information back much further (> 100,000 years). The greenhouse gas measurements are in some ways more useful because the gases are relatively well mixed in the atmosphere, and so one point on the globe is enough to capture the main features over time. Indeed, the match between the different ice cores for both CO2 and CH4 indicates that this is a good assumption. So in summary, the NAS conclusion applies to mean temperatures, not to local records or the GHG history. -gavin]
2) In both the Vostok and the more recent EPICA ice core analyses, CO2 increase has been shown to lag temperature increase by several hundred years. The hypothesis seems to be that some other forcing starts the warming, and CO2 released by the warming continues it. So what stops the warming? If CO2 causes a temperature increase, and a temperature increase causes more CO2 (released from melting tundra, etc.), why does the warming slow and reverse?
[Response: The ‘other forcing’ is the effects of orbital variations which are a pace-maker for the glacial/inter-glacial cycles. CO2 responds to those changes, and in turn causes more changes but like many feedbacks the effect converges (and doesn’t run away) because the strength of the feedback (the ‘gain’) is sufficiently small (though still positive). (I’ll have a post explaining this a bit better up shortly…) – gavin]
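A minimal numeric sketch of that converging-feedback arithmetic, with an assumed gain (the real climate gain has to be estimated from data and models):

```python
g = 0.5     # assumed feedback gain: each degree of warming feeds back g more
dT0 = 1.0   # initial forced warming (deg C, illustrative)

total, term = 0.0, dT0
for _ in range(50):          # sum the series dT0 * (1 + g + g^2 + ...)
    total += term
    term *= g

print(total, dT0 / (1 - g))  # both ~2.0: the feedback amplifies but converges
```

For any g < 1 the series converges to dT0/(1-g); a runaway would require g >= 1.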
3) Are the Vostok and EPICA data (estimates and variances) available somewhere for download?
[Response: NCDC World Data Center for Paleoclimatology. http://www.ncdc.noaa.gov/paleo/icecore.html ]
Thanks in advance.
Bryan Sralla says
1) Questions from a skeptical (but willing to learn) geologist. Are there any conclusive analogs from the geological record [Tertiary or Quaternary] where CO2 increases precede a significant warming event? Granted this would not exclude CO2 as a major climate forcer, but a good “real world” analog is always nice to strengthen any scientific argument.
It seems to me the models are forecasting a cause-effect relationship (CO2-initiated warming event) that appears not to have occurred in nature for at least 65 my. That definitely causes this contrarian to at least ponder the validity of the current equations and assumptions.
[Response: Well, the PETM (~55 Mya) is probably the most likely candidate for a ‘clean’ carbon-driven climate response. But since most changes in climate are complicated by feedbacks, you need to be more sophisticated in your search for analogs. For instance, while acknowledging that the glacial/interglacial cycles are paced by Milankovitch cycles, examining the record, it’s clear that GHGs (CO2+CH4+N2O) were important feedbacks and contributors to the cooling (about half the LGM signal, for instance). – gavin]
2) As a novice weather buff, I understand from ensemble weather forecasting that small changes in the initial state cause significant loss of skill at short forecast lead times. Even the ensemble mean loses all skill and approaches climatology after a relatively short lead time. How then do we have confidence in our skill to forecast a more complex climate system, with many more loosely constrained variables, across multi-decadal periods? How do we know that largely unconstrained variables in the model, such as potential variations in the THC, changes in the intensity of the hc, etc., will not regionally counterbalance any averaged temperature change, and send certain climate zones (i.e. North Atlantic, Europe) off in entirely different directions than now forecast by the models? Are there really examples of a weak THC and very warm paleoclimates in the North Atlantic? If you could tackle a couple of these it would be appreciated.
[Response: The answer is (partly) contained in your question! The effects of the initial conditions die out after a while (a month or so for the atmosphere, longer though for coupled models) and the models converge to a statistically steady climatology. The projections into the multi-decadal future are based on the changes of that climatology – not the individual weather paths. When however it comes to big shifts in the ocean circulation, we are not as confident – partly because we don’t have many good validation tests for those magnitudes of changes (but see here for some ideas) – and partly because of the large range of responses in the models themselves (of course, the two issues are linked). We can test the sensitivity to the initial ocean state quite easily, and within any one model this seems to be small for our current conditions. But we can’t easily test the dependency on parameterisation choices – except through these ‘ensembles of opportunity’ like the AR4 models – and they don’t provide a statistically satisfying set. Hence the large amount of discussion on the topic and continued debate. – gavin]
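The weather-versus-climate distinction in that reply can be illustrated with the classic Lorenz-63 toy system: two trajectories started a hair apart become completely different (weather is unpredictable), yet their long-run statistics agree closely (the ‘climate’ of the attractor is predictable). A crude sketch:

```python
import numpy as np

def lorenz_traj(x0, n=200_000, dt=0.005, s=10.0, r=28.0, beta=8.0 / 3.0):
    """Integrate the Lorenz-63 system with forward Euler (crude but adequate)."""
    traj = np.empty((n, 3))
    x = np.array(x0, dtype=float)
    for i in range(n):
        dx = np.array([s * (x[1] - x[0]),
                       x[0] * (r - x[2]) - x[1],
                       x[0] * x[1] - beta * x[2]])
        x = x + dt * dx
        traj[i] = x
    return traj

a = lorenz_traj([1.0, 1.0, 1.0])
b = lorenz_traj([1.0, 1.0, 1.000001])   # tiny perturbation of the initial state

print(a[-1], b[-1])                     # individual paths: totally different
print(a[:, 2].mean(), b[:, 2].mean())   # long-run statistics: nearly identical
```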
Ron Taylor says
Re 71.1) “It seems to me the models are forecasting a cause-effect relationship that appears not to have occured in nature…”
The superposition of Antarctic ice core data for temperature and CO2 concentration appears to show that they are in virtual lockstep, with sometimes one, sometimes the other apparently leading. That suggests to me a highly coupled dynamic system in which an external forcing that changes one parameter will trigger, through the coupling, same-direction changes in the other. Much of the coupling is in the form of positive feedbacks that can substantially exacerbate the initial perturbation.
External forcing can be, for example, things like variations of solar radiance, orbital effects, or the addition of greenhouse gases to the atmosphere. To believe that the latter is not such a forcing requires suspension of the laws of physics. It is a little like adding a blanket to the bed, getting too warm in the night, then complaining in the morning that someone must have changed the thermostat setting.
This is the perspective of an engineer who has been concerned about AGW for almost thirty years. It is no doubt simplistic, but I trust the experts on this site to correct any conceptual errors.
Bryan Sralla says
Re: #72 “with sometimes one, sometimes the other apparently leading.”
Like you, I was previously under this assumption. I have never found any paleoclimate research that soundly backs this up, however. In every study of paleoclimate (that I am aware of) where this relationship could be determined, temperature leads CO2. This seems to hold all the way back to at least the Eocene. (A proposed mechanism for the Paleocene event (biogenic methanogenesis) seems odd, but I will not dispute it here.) That is a long time, across numerous glacial-interglacial cycles!! It at least seems like an interesting pattern to me.
“To believe that the latter is not such a forcing requires suspension of the laws of physics.”
I only wish atmospheric physics was as easy to understand as your blanket analogy. If it had been, I might have been a meteorologist!!
Hank Roberts says
Bryan, you do understand the physics about how CO2 traps infrared? Or is that something you are doubtful about?
If you accept the physics, you understand why once CO2 increases in the atmosphere, solar heat is trapped and the planet warms up.
Is it that part of the description you don’t understand, why a change in CO2 traps heat?
Bryan Sralla says
Re: #74 Bryan, you do understand the physics about how CO2 traps infrared? Or is that something you are doubtful about?
Yes. So that we are all on the same playing field, Wien’s displacement law and the First Law of Thermodynamics are all on solid ground, although admittedly I don’t remember Wien’s constant off the top of my head (It’s been a few years since my last physics class).
The blanket analogy from Taylor, however, gives some insight into how many approach this complex climate problem. It’s much like learning mathematics from a standard textbook. At the beginning of any unit, there is a fundamental principle, and a few very straightforward equations with only a few variables (i.e. the blanket problem). At the end of the unit, there are usually a few more challenging problems that might include several variables. “Real” climate prediction is like trying to work all the challenging problems in the textbook simultaneously, with an enormously long list of variables obtained from word problems that are printed backward, and coming up with the right answer. Not an easy problem, even if you have a brand new calculator with a fresh set of batteries!
Hank Roberts says
OK. And have you read the compilation here?
http://illconsidered.blogspot.com/2006/04/historically-co2-never-causes.html
Ron Taylor says
Bryan, I am not suggesting that a detailed analysis of the problem is simple. However, it seems to me that at least the short-term (century or two) effect of adding massive amounts of CO2 to the atmosphere can only have one of three possible results: (1) the earth gets warmer, (2) the earth gets cooler, or (3) no effect at all. What physical process or processes could lead one to conclude that the answer is (2) or (3)? The models, which, while not perfect, are incredibly sophisticated and the best we have, point to (1). I am not trying to be overly clever here, I just do not understand the basis of your argument.
By the way, Jim Hansen used the blanket analogy long before I did.
Bryan Sralla says
Standard radiative physics based on a correct treatment of the top-of-atmosphere balance– physics going back at least to Arrhenius– yields a surface warming of about 1C in response to a doubling of CO2, when water vapor feedback is neglected (Real Climate, 2006).
We all agree that the physical equations suggest only a small warming due to the insulating effect of doubling CO2 concentrations. Obviously other feedback mechanisms (water vapor) must come into play to get the type of AGW that is being hyped.
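That back-of-envelope number is easy to verify from two standard physical inputs (textbook values, not taken from the report):

```python
import math

sigma = 5.67e-8    # Stefan-Boltzmann constant, W m^-2 K^-4
T_emit = 255.0     # effective emission temperature of the Earth, K

dF = 5.35 * math.log(2)        # forcing from doubled CO2: ~3.7 W/m^2
lam = 4 * sigma * T_emit**3    # Planck (no-feedback) response: ~3.8 W/m^2 per K

print(dF / lam)    # ~1 K of surface warming before any feedbacks
```

Everything beyond that ~1 C hinges on the feedbacks, which is where the real debate lies.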
In thermodynamics, a minus sign on work is needed, because when work is done by the system, energy (heat) flows out of it. I just do not yet accept the notion that all the processes by which the dynamic ocean/atmosphere system receives energy, performs work, then evacuates the energy to stay in balance are well understood. For example, what is the exact interaction between the ocean conveyor and the climate? Since the THC must ultimately be fueled by solar IR, it would seem it must have an increased input of energy to then perform more work (speed up) and in turn give off heat to the atmosphere, which is then evacuated out of the system through further work (atmospheric circulation). Does heat energy primarily flow from the ocean to the atmosphere, or, to some degree, the other way around? The model suggestions that the THC will slow down (and may eventually collapse) while much of the North Atlantic warms drastically seem to be in contrast to paleoclimate indicators that suggest the system does not work this way.
In a final thought, I like numerical models. I use them in my field of structural geology. They are a great tool to help the scientist understand the interaction of the physical processes involved and quantify uncertainty. One must use a model with great caution, however, to make forecasts. It must show repeated skill to be used as a predictive tool. Even the most robust model with a good data set is subject to uncertainty. After all, it is a model and not reality. That will be my last post for now on this website. I hope it gives the readers some insight into the nature of my skepticism. I now have to go back to my real job of exploring for more greenhouse gas.
Steve Sadlov says
RE: #78 – I like your approach, very agnostic. I have increasingly come to realize that in order to understand what is happening with the climate, there needs to be a major bump up in the use of quantum physics. It’s almost as if we need to work across this broad range, from the macro level currently embodied in the GCMs down to the quantum level, drilling down a whole lot more into what is happening in terms of all the actual interactions between the EM radiation / photons and all the matter in question. Probably what has been missing in the mainstream “climate science” thread of research has been the presence of some really good photonic spectra, cosmic radiation, thermo, and atomic physics folks and others along these lines. Without a more in-depth understanding of the true, real-world energetics that are in play, from the macro down to the sub-micro, we are all pounding sand.
Dan says
There are several excellent discussions regarding the minimal role of cosmic radiation here. Just type in “cosmic radiation” or “cosmic rays” in the Search bar at the top of the page.
Robert Curtin says
Minimal role of cosmic radiation? Interesting; I guess the temperature of the earth’s surface must be more dependent on internal nuclear decay? Sorry, I’m sure you meant to say changes in cosmic radiation. The problem is of course that any measure of changes in cosmic radiation is not going to go back very far. What I find disturbing about these discussions is that there is very little discussion of what temperature curve the human element is plotted onto. There may be a lot of evidence of global warming, but it is not very predictive of future conditions without some understanding of what the temperature conditions would be minus this or that factor.