A few weeks ago I was at a meeting in Cambridge that discussed how (or whether) paleo-climate information can reduce the known uncertainties in future climate simulations.
The uncertainties in the impacts of rising greenhouse gases on multiple systems are significant: the potential impact on ENSO or the overturning circulation in the North Atlantic, probable feedbacks on atmospheric composition (CO2, CH4, N2O, aerosols), the predictability of decadal climate change, global climate sensitivity itself, and perhaps most importantly, what will happen to ice sheets and regional rainfall in a warming climate.
The reason paleo-climate information may be key in these cases is that all of these climate components have changed in the past. If we can understand why and how those changes occurred, that understanding might inform our projections of changes in the future. Unfortunately, the simplest use of the record – just going back to a point that had similar conditions to what we expect for the future – doesn’t work very well because there are no good analogs for the perturbations we are making. The world has never before seen such a rapid rise in greenhouse gases with the present-day configuration of the continents and with large amounts of polar ice. So more sophisticated approaches must be developed, and this meeting was devoted to examining them.
The first point that can be made is a simple one. If something happened in the past, that means it’s possible! Thus evidence for past climate changes in ENSO, ice sheets and the carbon cycle (for instance) demonstrate quite clearly that these systems are indeed sensitive to external changes. Therefore, assuming that they can’t change in the future would be foolish. This is basic, but not really useful in a practical sense.
All future projections rely on models of some sort. Dominant in the climate issue are the large-scale ocean-atmosphere GCMs that were discussed extensively in the latest IPCC report, but other kinds of simpler, more specialised or more conceptual models can also be used. The reason those other models are still useful is that the GCMs are not complete. That is, they do not contain all the possible interactions that we know from the paleo record and modern observations can occur. This is a second point – interactions seen in the record, say between carbon dioxide levels or dust amounts and Milankovitch forcing, imply that there are mechanisms that connect them. Those mechanisms may be only imperfectly known, but the paleo-record does highlight the need to quantify them if models are to become more complete.
The third point, and possibly the most important, is that the paleo-record is useful for model evaluation. All episodes in climate history should (in principle) allow us to quantify how good the models are and how appropriate our hypotheses for past climate change are. It’s vital to note the connection though – models embody much data and many assumptions about how climate works, but for their climate to change you need a hypothesis – like a change in the Earth’s orbit, or volcanic activity, or solar changes etc. Comparing model simulations to observational data is then a test of the two factors together. Even if the hypothesis is that a change is due to intrinsic variability, a model simulation that looks for the magnitude of intrinsic changes (possibly due to multiple steady states or similar) is still a test of both the model and the hypothesis. If the test fails, it shows that one or the other element (or both) must be lacking, or that the data may be incomplete or mis-interpreted. If it passes, then we have a self-consistent explanation of the observed change that may, however, not be unique (but it’s a good start!).
But what is the relevance of these tests? What can a successful model of the impacts of a change in the North Atlantic overturning circulation or a shift in the Earth’s orbit really do for future projections? This is where most of the attention is being directed. The key unknown is whether the skill of a model on a paleo-climate question is correlated to the magnitude of change in a scenario. If there is no correlation – i.e. if the projections of the models that do well on the paleo-climate test span the same range as those of the models that do badly – then nothing much has been gained. If, however, one could show that the models that did best, for instance at mid-Holocene rainfall changes, systematically gave a different projection, for instance of greater changes in the Indian Monsoon under increasing GHGs, then we would have reason to weight the different model projections to come up with a revised assessment (see the sketch below). Similarly, if an ice sheet model can’t match the rapid melt seen during the deglaciation, then its credibility in projecting future melt rates would/should be lessened.
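To make the weighting idea concrete, here is a minimal sketch (Python, with entirely invented numbers; the inverse-square weighting is just one arbitrary choice, not a recommendation) of how paleo-test skill could in principle be turned into weights on future projections:

```python
import numpy as np

# Hypothetical example: five models, each with a skill score from a
# paleo-climate test (say, RMS error against mid-Holocene rainfall
# reconstructions; lower is better) and a projected change (say, in
# Indian Monsoon rainfall, %) under a GHG scenario.  All numbers are
# invented purely for illustration.
paleo_rmse = np.array([0.8, 1.1, 2.5, 0.9, 3.0])
projection = np.array([12.0, 10.5, 4.0, 11.0, 3.5])

# One simple weighting choice: inverse-square of the paleo error,
# normalised to sum to 1.  (Real schemes must also worry about model
# independence, observational uncertainty, etc.)
weights = 1.0 / paleo_rmse**2
weights /= weights.sum()

print(f"Unweighted ensemble mean:  {projection.mean():.1f}")
print(f"Paleo-skill-weighted mean: {np.sum(weights * projection):.1f}")
# If the weighted and unweighted means differ systematically, the paleo
# test carries information about the projections; if they span the same
# range, nothing much has been gained.
```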
Unfortunately, apart from a few coordinated experiments for the last glacial period and the mid-Holocene (i.e. PMIP) with models that don’t necessarily overlap with those in the AR4 archive, this database of model results and tests just doesn’t exist. Of course, individual models have looked at a wide variety of paleo-climate events, ranging from the Little Ice Age to the Cretaceous, but this serves mainly as an advance scouting party to determine the lay of the land rather than a full road map. Thus we are faced with two problems – we do not yet know which paleo-climate events are likely to be most useful (though everyone has their ideas), and we do not have the databases that allow us to match the paleo simulations with the future projections.
In looking at the paleo record for useful model tests, there are two classes of problems: what happened at a specific time, or what the response is to a specific forcing or event. The first requires a full description of the different forcings at one time, the second a collection of data over many time periods associated with one forcing. An example of the first approach would be the last glacial maximum where the changes in orbit, greenhouse gases, dust, ice sheets and vegetation (at least) all need to be included. The second class is typified by looking for the response to volcanoes by lumping together all the years after big eruptions. Similar approaches could be developed in the first class for the mid-Pliocene, the 8.2 kyr event, the Eemian (last inter-glacial), early Holocene, the deglaciation, the early Eocene, the PETM, the Little Ice Age etc. and for the second class, orbital forcing, solar forcing, Dansgaard-Oeschger events, Heinrich events etc.
But there is still one element lacking. For most of these cases, our knowledge of changes at these times is fragmentary, spread over dozens to hundreds of papers and subject to multiple interpretations. In short, it’s a mess. The missing element is the work required to pull all of that together and produce a synthesis that can be easily compared to the models. That this synthesis is only rarely done underlines the difficulties involved. To be sure, there are good examples – CLIMAP (and its recent update, MARGO) for the LGM ocean temperatures, the vegetation and precipitation databases for the mid-Holocene at PMIP, the spatially resolved temperature patterns over the last few hundred years from multiple proxies, etc. Each of these has been used very successfully in model-data comparisons and has been hugely influential inside and outside the paleo-community.
It may seem odd that this kind of study is not undertaken more often, but there are reasons. Most fundamentally, it is because the tools and techniques required for doing good synthesis work are not the same as those for making measurements or for developing models. It could in fact be described as a new kind of science (though in essence it is not new at all) requiring, perhaps, a new kind of scientist: one who is at ease dealing with the disparate sources of paleo-data and aware of their problems, and yet conscious of what is needed (and why) by modellers. Or, alternatively, modellers who understand what the proxy data depend on and who can build that into the models themselves, making for more direct model-data comparisons.
Should the paleo-community therefore increase the emphasis on synthesis and allocate more funds and positions accordingly? This is often a contentious issue since whenever people discuss the need for work to be done to integrate existing information, some will question whether the primacy of new data gathering is being threatened. This meeting was no exception. However, I am convinced that this debate isn’t the zero sum game implied by the argument. On the contrary, synthesising the information from a highly technical field and making it useful for others outside is a fundamental part of increasing respect for the field as a whole and actually increases the size of the pot available in the long term. Yet the lack of appropriately skilled people who can gain the respect of the data gatherers and deliver the ‘value added’ products to the modellers remains a serious obstacle.
Despite the problems and the undoubted challenges in bringing paleo-data/model comparisons up to a new level, it was heartening to see these issues tackled head on. The desire to turn throwaway lines in grant applications into real science was actually quite inspiring – so much so that I should probably stop writing blog posts and get on with it.
The above condensed version of the meeting is heavily influenced by conversations and talks there, particularly with Peter Huybers, Paul Valdes, Eric Wolff and Sandy Harrison among others.
Rod B says
Gusbobb (117), as a fellow skeptic whose skepticism lies, in part, in the same area, I’ll try to clarify some of your simpler questions. A CO2 molecule absorbs IR radiation only in discrete frequencies, though there are large numbers of lines within the primary CO2 absorption band of 15 microns, +/- 2. These frequencies are based on the quantum energy levels of bond vibration, and are somewhat problematic ala quantum mechanics. This absorption does not increase the real temperature of CO2 molecules.
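As a rough illustration of the energies involved (a standard textbook calculation, nothing specific to this thread), a photon at the centre of that band carries

$$E = \frac{hc}{\lambda} = \frac{(6.626\times10^{-34}\,\mathrm{J\,s})(2.998\times10^{8}\,\mathrm{m/s})}{15\times10^{-6}\,\mathrm{m}} \approx 1.3\times10^{-20}\,\mathrm{J} \approx 0.083\,\mathrm{eV},$$

i.e. a wavenumber of about 667 cm^-1, which is the energy gap of the CO2 bending-mode vibration.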
The re-emission of energy from vibrational energy will also be within the same frequency lines, though because of the Doppler effect it may appear to be a different frequency to a subsequent CO2 molecule. [A similar effect can apply to the initial CO2 molecule absorbing primary radiation.] Re-emission stems from the molecule’s tendency to return to its internal equilibrium ala equipartition. More likely, especially at lower, denser altitudes, the CO2 molecule will relax (lose) its absorbed vibration energy to another molecule’s translation energy (1/2mv^2) via collision. The collision is most likely with the predominant O2 or N2 molecules and will show up as a temperature rise in the collidee – producing atmospheric heating. Other stuff can also increase the atmosphere’s temperature.
O2 or N2 will do its common transferring energy back and forth through collision (including an occasional translation transfer to, say, CO2), spreading the wealth, as it were, per Maxwell. It might also relax its translation energy through emission – up or down, but this emission is of Planck’s blackbody nature, not the discrete IR absorption/emission limited to vibration and/or rotation energy levels/modes unique to CO2, methane, H2O, etc. (but which can also emit blackbody-like.) Escaping radiation comes from GHG re-emissions at high sparse altitudes, atmospheric blackbody radiation, or some that sneaks its way directly from the surface.
If I’ve erred here we’ll know soon enough.
Rod B says
Hank (124), I always wondered how you did an RC cite. Thanks! Now if you’ll explain how to include smiley faces, I’ll be happy…. though maybe not Gavin
Martin Vermeer says
Rod B #151:
Yep ;-) (made by just typing a text smiley inside blanks, and verifying with preview)
Only your last paragraph requires a further comment. You write:

“It might also relax its translation energy through emission – up or down, but this emission is of Planck’s blackbody nature, not the discrete IR absorption/emission limited to vibration and/or rotation energy levels/modes unique to CO2, methane, H2O, etc.”
True, but. At Earth atmospheric densities N2, O2 etc. are practically transparent in the thermal IR, which means according to Kirchhoff-Bunsen that they emit very little. It’s really the greenhouse gases doing the job of both absorbing and emitting, but in local thermodynamic equilibrium with the non-greenhouse gases. Kirchhoff-Bunsen says that the emission intensity for a frequency is equal to the product of Planck’s curve (for that frequency and local temperature) and absorptivity of the air parcel in %.
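In symbols (per frequency $\nu$, for a thin parcel of air; this is just a restatement of the sentence above):

$$I_\nu^{\mathrm{emit}} = \epsilon_\nu\,B_\nu(T), \qquad \epsilon_\nu = \alpha_\nu,$$

where $B_\nu(T)$ is the Planck curve at the local temperature and $\alpha_\nu$ is the absorptivity of the parcel.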
When you look down at the atmosphere from above, you see what is piecewise a Planck black-body radiation curve, though not a single one as a whole. The reason is that for some frequencies, e.g., within the 15µ absorption band, the radiation seen by the satellite comes from a level 10 km or more above ground, where it is cold; whereas at other frequencies (outside absorption bands) we are looking straight down to the ground, which is much warmer. So the net result is a “chimera” of Planck curves emitted at different temperatures.
It is worth playing with Dave Archer’s software. It shows (as a 1-D model simulation) what gets radiated to space. If you remove everything but CO2, you’ll see how around wave number 700 cm-1 (i.e., 15µ) radiation comes from a level high up where temps are 220K (yellow Planck curve), i.e., the tropopause and above. Outside the band, radiation comes from the level of 297K (bit below the green Planck curve) which is the tropical ground temperature. Etcetera. So you look down to different levels depending on frequency. This is only a model, but real measurements from real satellites looking like this exist too.
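For the curious, here is a minimal sketch (Python, standard physical constants; the two temperatures are the ones quoted above) of the Planck radiances behind that picture:

```python
import numpy as np

H = 6.626e-34    # Planck constant (J s)
C = 2.998e8      # speed of light (m/s)
KB = 1.381e-23   # Boltzmann constant (J/K)

def planck_radiance(wavenumber_cm, temp_k):
    """Planck spectral radiance B_nu(T) for a wavenumber given in cm^-1."""
    nu = wavenumber_cm * 100.0 * C               # wavenumber -> frequency (Hz)
    return 2 * H * nu**3 / C**2 / np.expm1(H * nu / (KB * temp_k))

# Inside the CO2 band (~667 cm^-1, i.e. 15µ) the satellite sees emission
# from near the tropopause (~220 K); in window regions it sees the warm
# tropical surface (~297 K).
wn = 667.0
print(planck_radiance(wn, 220.0))  # cold, in-band emission
print(planck_radiance(wn, 297.0))  # warm, surface emission
# The observed spectrum stitches such curves together: a "chimera" of
# Planck curves evaluated at different emission temperatures.
```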
Ray Ladbury says
Rod,
One nit. You continue to distinguish between “blackbody” and quantum radiation. There is no distinction. Quantum interactions with surrounding matter are simply the mechanism by which the radiation field comes into equilibrium with itself and with the surrounding matter. Of course the radiation field can only interact with the materials that have absorption bands that overlap its wavelengths. For CO2, that band is around 15 microns. No matter what you do, you won’t get CO2 to radiate outside this band (you can distort the band, but only so far), and diatomic gasses such as N2 and O2 will only radiate as a result of a collisional interaction that alters their magnetic symmetry.
Alastair McDonald says
Re #151
Hi RodB,
You do smiley faces by typing : – ) without spaces :-) and sad faces by typing : – ( :-(
You wrote “This absorption does not increase the real temperature of CO2 molecules”, which is not strictly correct. I am assuming that by “real temperature” you mean the kinetic energy of the molecules. All gas molecules have this type of energy, and that is what is measured with mercury thermometers. The absorption of infrared radiation by greenhouse gases causes these molecules to be vibrationally excited, and much of this energy is lost to other air molecules through collisions, raising the kinetic energy (real temperature) of the air. The warmer air molecules then share their kinetic energy with the CO2 molecules, so raising the real temperature of the CO2.
CO2 does not emit Planck black body radiation. Only solids and liquids do that. But the CO2 vibration and vibration-rotation lines tend to merge and produce a band emission to space which can look like a segment of the Planck function.
HTH,
Cheers, Alastair.
Paul Middents says
Re #151
Rod B writes
“These frequencies are based on the quantum energy levels of bond vibration, and are somewhat problematic ala quantum mechanics.”
What is problematic about them?
Alastair McDonald says
Re #153
Martin,
You wrote:
“Kirchhoff-Bunsen says that the emission intensity for a frequency is equal to the product of Planck’s curve (for that frequency and local temperature) and absorptivity of the air parcel in %.”
Can you give me a reference for that fact? Kirchhoff died in 1887 and Bunsen in 1899, but Planck did not publish his function until 1900, although one was already thought to exist. As I understand it, it was not until 1915 that Einstein showed that gases obey Kirchhoff’s law.
Cheers, Alastair.
Lynn Vincentnathan says
RE #134 & your claim that scientists are arrogant, acting as if they know it all.
Let me go over (again) my gripes with both science and denialism: science requires 95% confidence there’s a link before making a claim (I also demand that in the social science research I and my students do), while denialists require either 99% or 101% confidence.
Scientists have to avoid the false positive (making untrue claims) to preserve their reputation. But those concerned with living in the world (people, policy makers, environmentalists) should be concerned with avoiding false negatives (failing to address true problems).
I call this “THE MEDICAL MODEL” – we’d be disconcerted if our doc told us he/she was only 94% confident our lump was cancerous, and to come back in a year to see if it could get up to 95% so they could operate.
We buy insurance without much certainty a tornado or other harm will damage our house. The virtue is called prudence.
So too we should have rigorously started mitigating AGW at least by 1990 (5 years before the first scientific studies started reaching 95% confidence, or the .05 alpha significance level, in 1995) — especially since doing so actually saves us money and helps the economy (I found out to my amazement). Any other strategy is foolhardy, profligate, and immoral.
What we need is not so much more and more science (tho keep up the good work, all you scientists!), but more and more persons with strength of character and morality, persons who are not ashamed to stand up, do the right thing, and mitigate AGW.
Geoff Wexler says
Re:#151
Black (grey) air?
Martin Vermeer has answered you already and I’ll try to express it another way. We are now down to the level of quantum theory. Matter can only radiate by jumping between quantum levels and in this case, the transition which involves the emission of photons has to satisfy another condition. If you start with something symmetrical like an oxygen molecule and make it vibrate it will still be symmetrical. That means that the negative charge in the molecule cannot have moved relative to the positive. The symmetrical vibrating molecule will not experience any interaction with a vibrating electric field. An incoming photon cannot excite such a vibration and conversely such a vibration cannot decay by emission of a photon. But things are quite different for triatomic molecules like CO2 and H2O which have electrically asymmetrical vibrational states.
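For reference, the selection rule being described here can be written down compactly (a textbook statement, not specific to this thread): a vibration can absorb or emit a photon only if it changes the molecule’s electric dipole moment,

$$\left(\frac{\partial\boldsymbol{\mu}}{\partial Q}\right)_0 \neq 0,$$

where $\boldsymbol{\mu}$ is the dipole moment and $Q$ the normal coordinate of the vibration. For homonuclear N2 and O2 this derivative vanishes by symmetry; for the bending and asymmetric-stretch modes of CO2 (and all three modes of H2O) it does not.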
So can air never radiate infra-red at all? I am now speculating and opening myself to correction. Might it be possible (at a very low level) by breaking the symmetry in a collective way? The first possibility might be during an oxygen-nitrogen (and possibly an O2-O2) collision, when we have a different set of quantum states existing for a short time. Has this ever been considered? Tyndall never observed it, as far as I know, but that was a long time ago. It is thus likely to be far too small to affect the greenhouse.
A quite different question. A low-level contrarian argument is based on the assertion that CO2 is only a tiny proportion of the air. The rough answer of course is that CO2 and H2O are 100% of what matters and that this argument diverts attention to the less significant non-absorbing parts of the atmosphere. But what about a more careful thought experiment: what would happen to our greenhouse if the oxygen and nitrogen were removed? This would remove a heat sink and alter the convection and lapse rate. Is the answer obvious?
CL says
Re 158
Well said, Lynn Vincentnathan. But strength of character and morality are not sufficient, when there are such powerful forces which negate efforts to mitigate AGW. Many decades of indoctrination and propaganda have produced massive inertia. Maybe it takes a sort of social Kuhnian paradigm shift, requiring that the old thinking in the old (and powerful) heads has to die before anything really changes….e.g.
http://www.orionmagazine.org/index.php/articles/article/2962
Martin Vermeer says
Alastair #157:
Good question! According to Wikipedia, Kirchhoff discovered black-body radiation as early as 1862.
Kirchhoff’s law can be proven from the principle of thermodynamic equilibrium. I’m sorry to use Wikipedia again, but I don’t own any statistical physics textbook; any good one should do.
(I didn’t manage to find anything on Einstein 1915 regarding Kirchhoff’s law. Did Einstein do experimental work? I know he worked theoretically on stimulated emission.)
What Max Planck did was find a closed analytical expression for this empirical curve (a surprisingly fractious problem!). And reluctantly conclude that a reasonable physical interpretation would be that the radiation comes in “packets” of energy proportional to the frequency… and the rest is history :-)
Ken Rushton says
Back to the Present – it looks like a blob of unusually warm air from eastern Siberia is about (May 6th) to flow over the thinnest part of the Arctic ice. Will the reported “unusually thin ice” melt through a month or two early, or is the ice actually not quite so fragile?
I suspect somewhere between the two. See: NH Jetstream and Northern Sea ice
Hank Roberts says
> http://www.orionmagazine.org/index.php/articles/article/2962
Excellent article.
Those with old fading eyes like mine, note, this link is “orion” like the constellation, not “onion” like the humor magazine.
Martin Vermeer says
Re #159 Geoff Wexler:
This was mentioned briefly in the comments to raypierre’s Venus article:
https://www.realclimate.org/index.php/archives/2008/03/venus-unveiled/#comment-82917
Martin Vermeer says
#125 Hank, didn’t think I used any math, or did I?
SecularAnimist says
Lowell wrote (#133): “Methane concentrations in the atmosphere have stabilized and are probably falling now so all the doom and gloom discussion about Methane releases are not supported by the facts.”
That is incorrect. According to the latest data from NOAA, methane levels rose sharply in 2007 after a decade of stability:
Greenhouse Gases, Carbon Dioxide and Methane, Rise Sharply in 2007
ScienceDaily, Thursday 24 April 2008
Unfortunately, “gloom and doom” discussion about methane is supported by the facts.
Ray Ladbury says
Martin and Alastair, I recommend the derivation of the Planck distribution and related material in Landau and Lifshitz, “Statistical Mechanics”–elegant and straightforward. Well, as straightforward as anything ever is in stat mech.
Sue says
A brief social-science note on the end of this article:
All sciences benefit from skilled synthesizers, especially those who can make the core ideas of a field accessible to those outside it. However, the current organization of higher education, and the way resources are allocated within it, provides the greatest reward to those who publish small pieces of new data in refereed journals over those who write the book-length pieces necessary for synthesis. To accomplish a change of emphasis in paleo-climatology would require reforming the entire higher-education enterprise — something that would be a good idea, but not likely to happen quickly enough to make a difference for this particular project.
Philip Machanick says
It’s fun watching all those people picking over computer architecture vs. performance details I worked through for my PhD c. 1996 (John Mashey may even remember it: he was an examiner).
No one has asked me to help with performance problems since then, yet a lot of the “big iron” problems of the early 90s have come back as commodity performance problems with multicore.
But anyway, back to the main question: I find it really encouraging that the paleo climate is starting to get more attention. Of course there are huge difficulties but anything that sheds more light on where we are going has to be good (except for the doubters, whose standard of proof will increase to 200% — Lynn Vincentnathan #158).
Chuck Booth says
Re #134 Greg Smith:
Presumably you are similarly dismissive of the modelling done by earth scientists? For example:
Determining Chondritic Impactor Size from the Marine Osmium Isotope Record
François S. Paquay, Gregory E. Ravizza, et al.
Science, 11 April 2008: 214-218.
The difference in osmium concentrations and isotopes between seawater and asteroids allows reconstruction of impact occurrence and size, including for the Cretaceous.
Ferdinand Engelbeen says
Re #166 and others about methane levels…
One needs to look at the real figures… The “sharp” increase of methane levels in 2007 pushed them from about 1790 to 1795 ppbv, not much beyond the year-by-year variability of methane levels in the past decade, which were nearly stable.
See: http://www.esrl.noaa.gov/gmd/ccgg/iadv/ and look for CH4.
An interesting point is that the d13C content of methane is slightly increasing. This may point to a change in source (either more human-induced, like from rice paddies, or more fossil methane), but I have not (yet) seen figures for the d13C content of different methane sources. Unfortunately, d13C measurements in methane only started around the time that methane levels stabilised.
About historical events: one can’t compare the current situation with the PETM event, where enormous amounts of methane were released, far more than natural oxidation could transform into CO2 and water. Better to compare with the previous interglacial, the Eemian.
The Eemian on average was up to 3°C warmer than the present, during at least 5,000 years. This caused melting of a (large?) part of the Greenland ice sheet and temperatures in the Alaskan tundra at least 5°C warmer than today’s. If we may suppose that the higher temperatures were also present in the rest of the Arctic, then we have a clue as to what permafrost thawing can add in extra methane.
Methane levels in ice cores followed temperature trends during the whole Eemian (see here) and reached a maximum of about 700 ppbv at maximum temperature, from a level of 650 ppbv at a temperature equal to the current one. Today we have reached 1795 ppbv, largely due to human emissions, of which land use change may be the largest contribution. Thus even if the temperature increases to Eemian levels, that would add no more than 50 ppbv to current methane levels…
Greg Smith says
Re posts 147 and 170
I don’t regularly post on any forums but do feel compelled to answer both of the above. Geologists and geophysicists use models constantly to try to evaluate what we can’t see or directly measure, that is, what is under the earth. We also have a pretty good appreciation of palaeoclimate, both on a long-term (i.e. geological) time frame and also over a human scale. We need to understand the latter to make predictions of rock facies away from known data points such as a well bore or mine. Making predictions under uncertainty is our business, and the more successful of us are better at putting together the disparate pieces of often unreliable or fragmentary information required to make predictions. We also understand that no model, no matter how sophisticated, with wonderful visualisation etc., is accurate. This is why we use Monte Carlo simulation so much. Even the most sophisticated reservoir modelling software used to make predictions of how, say, a petroleum reservoir will perform will only give an approximation of the real thing and will be subject to the accuracy of the algorithms used and the quality of the input data. Such simulators, which until the last ten years required supercomputers to run them, may be modelling 30 to 40 variables and are nothing like trying to model the atmosphere.
In addition, I have not read anywhere how climate modellers account for long-term (for humans, that is) variations in climate such as the PDO, which is making the news at present. Nor have I been able to find out how you account for random, but frequently happening, events such as volcanic eruptions. Both of these events obviously change the climate. In addition, how do you estimate changes in the level of forestation? Don’t forget that North Africa kept much of the Roman Empire happily fed for over 400 years. I could look it up but I doubt that there is a lot of wheat grown in Libya these days.
I guess I could also say that when people start saying things such as there is a “consensus” or the “science is settled”, I am unnerved, which explains my reference to Copernicus. We can never fully understand a complex system such as the earth, with millions of variables, and to say otherwise is arrogant. For example, electromagnetic theory in the 1890s seemed to physicists at the time to give a full explanation of the interaction between particles and electricity/magnetism, but the physicists were constrained by their inability to adequately measure very small-scale things and had their theories knocked into a cocked hat when Einstein came along. (As an aside, I wonder who peer reviewed him? “Peer review” is also a term I read frequently on blogs such as this.)
The point of all of this is that we never know enough about the physical world, something we have only been able to measure or image half adequately in the last twenty years or so.
So when a poor sod like Gus-bob asks a few, to you, dumb questions just remember to treat him with respect.
Donald Oats says
Re #134: Damn, does this mean I should chuck my book on the Physics of the Earth’s Interior? The blurb on the back says it’s for geophysicists and earth scientists, but it only has physics and maths (models) in it :-(
More seriously, in the modern sciences we can’t get very far before running smack into a model or the need for one: whether it is statistical only, mathematical, physical, or a more qualitative construction, a model is about the only thing that connects data to ideas about how that data came to be. Mathematical models based on the physical aspects of a system IMO push research towards understanding far more effectively than waving hands about and saying it is all too complicated.
CobblyWorlds says
#166 Secular Animist,
Further to what you post.
1)
NSIDC now state that the current condition of the Arctic ice pack points to a likelihood of a new record minimum this year. http://nsidc.org/arcticseaicenews/index.html That’s despite the past behaviour of the ice pack indicating that a record minimum is not followed by another record minimum the next year. Maslowski’s projection of an ice-free Arctic summer by 2013 may indeed be optimistic, as it didn’t include last year’s events. We could see the transition to a seasonally ice-free Arctic happen as fast as figure 3 of Nghiem “Rapid reduction of Arctic perennial sea ice” (2007) implies:
http://seaice.apl.washington.edu/Papers/NghiemEtal2007_MYreduction.pdf 303kb pdf
2)
2007’s melt caused surface Arctic Ocean waters in the exposed region to warm by as much as 5°C.
http://www.sciencedaily.com/releases/2007/12/071212201236.htm
3)
And at least part of the 540 billion tonnes of methane (clathrate) on the Arctic ocean shelf floor may already be beginning to destabilise.
http://www.spiegel.de/international/world/0,1518,547976,00.html
QUOTE
Data from offshore drilling in the region, studied by experts at the Alfred Wegener Institute for Polar and Marine Research (AWI), also suggest that the situation has grown critical. AWI’s results show that permafrost in the flat shelf is perilously close to thawing. Three to 12 kilometers from the coast, the temperature of sea sediment was -1 to -1.5 degrees Celsius, just below freezing. Permafrost on land, though, was as cold as -12.4 degrees Celsius. “That’s a drastic difference and the best proof of a critical thermal status of the submarine permafrost,” said Shakhova.
Paul Overduin, a geophysicist at AWI, agreed. “She’s right,” he said. “Changes are far more likely to occur on the sea shelf than on land.”
ENDQUOTE
Would an increasing amount of open water in the Arctic summer cause an increasing amount of vertical mixing through wind/sea-surface induced transport/turbulence?
I suspect so, but I’m not qualified enough to be sure either way.
Jim Cripwell says
In 162 Ken Rushton writes “Back to the Present – it looks like a blob of unusually warm air from eastern Siberia is about (May 6th) to flow over the thinnest part of the Arctic ice. Will the reported “unusually thin ice” melt through a month or two early, or is the ice actually not quite so fragile?” I wonder if someone can clear up a mystery for me. I understand that “old” ice is thicker and melts more slowly than new ice, but I don’t think I understand annual ice. This is the approximately 9 million sq km of ice that melts and freezes every year, and so is, by definition, always less than one year old. It is my understanding that the thickness of annual ice is purely a function of how soon winter arrives, and how cold it gets. In many parts of the Arctic this season, winter arrived early and was colder than average, so the annual ice in these parts is reported to be thicker than average. I have seen such reports from Hudson Bay and the Bering Strait. What is the situation with respect to annual ice in the Arctic at present?
Ray Ladbury says
Greg Smith, I will give you the benefit of the doubt that perhaps you are unfamiliar with Gusbob’s history. Suffice to say he came on here and hijacked more than one thread with off-topic diatribes on pure pseudoscience. Given that history, I would contend that he is treated with exceptional tolerance here.
As to your other contention that one must throw up one’s hands when confronted with a complicated system, I can only conclude that you have not done much modeling. Global climate models are dynamical models, and the parameters in the models are not adjustable parameters, but rather constrained by independent data. Such models are inherently more manageable than statistical models, where fit to the data determines parameter values and their error bars.
To say that we never know enough about the physical world is meaningless. Enough to do what? We do not understand everything about the solar system, and yet this does not stop us from landing space probes on planets and asteroids. Likewise, there is much we do not understand about climate, but what we do not understand does not invalidate our well constrained understanding of CO2 and other greenhouse gasses.
What is more, climate scientists understand the difference between climate – a purely long-term entity – and weather, under which we can file things like the variability of the PDO, etc., and which averages out over time. They also understand paleoclimate, which deals with ancient climates and is inherently on timescales much longer than a human life. Since you do not understand these things – and much else as well, it seems – I would contend that you are not in a position to comment competently on issues of climate science.
pat n says
When I look at the plot (The Eemian in the Vostok Ice Core, which was referenced by Ferdinand Engelbeen in his comment #171), I do not see how he can claim that “Methane levels in ice cores followed temperature trends during the whole Eemian”.
Jan Umsonst says
I have a question about the sensitivity of our planet’s climate system.
Is the vegetation on Earth like a buffer for climate change? Can I therefore say that vegetation is important for the sensitivity?
As I understand it, you use examples of past climate change to explain the future.
In the past there was a functioning ecosystem, with a lot more biomass on the continents (at least in times with the same global temperature), so in the past the ecosystem reacted to the warming process. For example rainforests: through evaporation they create cloud cover, like a protection against heat. In hotter times they started growing towards the poles, giving perhaps more cloud cover for the continents and a slower heating up of the whole (I think most kinds of forests do this, more or less). Also, if I remember correctly, in much hotter times most of our continents were covered by rainforests and other kinds of forests, which I guess acted like a buffer for continental temperatures.
In our times we will have finished off the rainforests maybe by 2040. More and more area will be needed for food production, while exhausted ground is left behind. With or without global warming, we will have finished off our ecosphere by 2100, with growing numbers and so on (only war, or climate change, can stop this).
If this is not stupid and makes sense, I come to my main question: is this principle in the IPCC report or not, a sinking sensitivity of the system towards 2100?
Sorry for interrupting.
Dan says
re: 172 and volcanic effects. The effects from volcanic eruptions on climate are short-term compared with the long-term warming trend. For example, look at the effects after Pinatubo. There was short-term cooling but the warming trend rapidly resumed. Furthermore, volcanic eruptions do not produce more CO2 than mankind despite the skeptic’s urban myth to the contrary.
spilgard says
Re 172:
Disrespect? With monumental patience, all of gusbobb’s points and questions have received detailed, relevant, lengthy and continuing responses. Disrespect would be to ignore him. The peevish and frustrated tone of some of the responses is no more than I would expect if I were to enter a geophysics forum and blithely overturn tectonic theory with the latest Hollow Earth ideas.
Jim Eager says
Re Greg Smith @ 172: “So when a poor sod like Gus-bob asks a few, to you, dumb questions just remember to treat him with respect.”
Greg, if you peruse gusbob’s earlier comments in the Galactic Glitch thread at
https://www.realclimate.org/index.php/archives/2008/03/a-galactic-glitch/langswitch_lang/sw
you’ll see that he is hardly the ‘poor sod’ that you assume, though to his credit he is now making an attempt to understand what people have been trying to tell him for quite some time now.
pete best says
Looks like the ice has taken a recent nose dive below 2007 and it’s only May.
http://nsidc.org/data/seaice_index/images/daily_images/N_timeseries.png
Alastair McDonald says
Martin and Ray,
I am not looking for a derivation of Planck’s function. What I would like to know is the justification for using Planck’s function for black-body radiation to calculate the emissions of line radiation, a very different beast. The Fraunhofer lines are where solar radiation does NOT fit Planck’s curve.
Cheers, Alastair.
dhogaza says
This is simply a restatement of the old denialist canard “if we can’t understand everything, we understand nothing”, used by anti-evolutionists, anti-climate scientists, anti-everythingists.
If you’re really a scientist, Greg, surely you can do better than this.
Ron Taylor says
I posted this on another thread this morning, but it seems to fit better here:
If you want to see something “chilling,” watch this Japanese video of polar satellite images, showing the disintegration of multi-year ice during last winter.
http://www.homerdixon.com/download/arctic_flushing.html
From the video:
The image is a low-resolution reproduction of a sequence of satellite images of Arctic ice this past fall and winter. The sequence runs in a continuous loop from October 01, 2007, to March 15, 2008. A link to the high-resolution video file is also provided.
Note the stream of multi-year ice flowing out of the Arctic basin down the east coast of Greenland at one o’clock in the image. As of the middle of March, most of the basin, including the pole itself, appears to be covered only by seasonal ice.
Ray Ladbury says
Alastair, blackbody radiation is not a distinct type of radiation from line radiation. Atoms, molecules, and solids can only radiate between their energy levels, so “blackbody” radiation is composed of all such modes that are thermally excited. That is why you get a much better approximation of a blackbody spectrum from a solid than a gas (more modes to add together and more thermal motion to smear out those modes).
The Planck distribution is simply the equilibrium energy density of a photon gas at temperature T. However, since photons are noninteracting, equilibrium can only be achieved by the photon gas interacting with surrounding matter. In effect, once the radiation field is in equilibrium with the matter, there will be as many excited modes decaying radiatively as there are photons exciting new modes, and the energy of the mode multiplied by the number of photons/phonons/collisions… capable of exciting the mode will give the energy density.
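For reference, the distribution in question (with the Bose-Einstein occupation factor written out):

$$B_\nu(T) = \frac{2h\nu^3}{c^2}\,\frac{1}{e^{h\nu/k_B T}-1},$$

where $(e^{h\nu/k_B T}-1)^{-1}$ is the equilibrium occupation number of each photon mode; this is exactly the “equilibrium energy density of a photon gas at temperature T” described above, expressed as spectral radiance.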
Ferdinand Engelbeen says
Re #177 Pat N,
The plot is a little confusing, but if you look at the start and end points of the warming and of the CH4 increase, there may be a slight (in geological terms) lag of the CH4 increase behind the temperature increase, of less than 1,000 years. After the maximum temperature/CH4 levels ended, there is no visible lag of CH4 if you take the (older) temperature proxy of Petit et al. as a base; or CH4 leads the temperature drop by several thousand years during a long period, but lags again at the moment temperature is at its minimum, if you take the (more recent) temperature proxy of Jouzel as a base.
It is physically possible that ice sheet growth, already under way, led to stronger reductions of methane release than can be deduced from the temperature curve…
Anyway, it doesn’t look as though methane release from permafrost and/or clathrates will be a problem for this century. A temperature increase of 3°C will not give more than 100 ppbv more methane (I made an error in #171: the CH4 scale in the graph needs to be doubled…). This leads to an increase of retained warmth (without feedbacks) of about 0.06 W/m2, not really impressive…
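As a rough cross-check on that order of magnitude: the simplified forcing expression of Myhre et al. (1998) for methane (neglecting the N2O overlap term) gives, for a rise from 1795 to 1895 ppbv,

$$\Delta F \approx 0.036\left(\sqrt{1895}-\sqrt{1795}\right) \approx 0.04\ \mathrm{W\,m^{-2}},$$

the same order of magnitude as the 0.06 W/m2 quoted above.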
CobblyWorlds says
#171, Ferdinand Engelbeen,
From those graphs of yours:
“This points to a low influence of CO2 on temperature.”
Does it really?
What I see is a process being initiated by insolation maxima for “Spring” (April May June), as suggested by Hansen et al in “Climate change and trace gases”, fig 2b, table 1, and accompanying text. http://pubs.giss.nasa.gov/abstracts/2007/Hansen_etal_2.html
CO2 and methane track closely until greater areas of the surface are exposed by the receding ice sheets and made available to biological activity at 13k yr BP. That’s when the methane races ahead of CO2; it rises in tandem with ice-sheet recession.
Likewise, your statement about 113k yr BP seems flawed to me, because again you’re seeing the methane level drop as more physical area is covered by ice, thus reducing its unit-area biological output.
CH4’s atmospheric residence time is much less than that of CO2 (for which Ocean/Atmosphere must be considered), so I’d expect it to react more rapidly to drops in temperature.
As for the Eemian/PETM as an analogue, I’m not so sure that the past offers much of a guide given the multiplicity of human impacts. We’ll find out soon enough whether the reduction of Southern Ocean CO2 uptake, the expansion of the Hadley Cells, the ongoing transition of the Arctic to a seasonally ice-free state, are part of a wider pattern of changes ahead of time, or if they’re outliers/”noise”.
From where I’m sitting it really looks like we’re much deeper into the briar patch than we can yet assert to a scientifically robust degree.
Jim Eager says
Greg Smith wrote: “Nor have I been able to find out how you account for random, but frequently happening, events such as volcanic eruptions.”
I’m very surprised by this statement, since James Hansen’s earliest circulation model from the 1980s included a random volcanic eruption and the model prediction matched the short-term aerosol cooling effect of the 1991 Pinatubo eruption fairly closely. This has been pretty widely discussed as an example of the model’s ability. One would have to not look very thoroughly to not know this.
Brian Patterson says
Hi,
I was wondering if there is anywhere on this site where I can find a response to the following report:
http://epw.senate.gov/public/index.cfm?FuseAction=Minority.SenateReport#report
I would like to make it clear that this report does not in any way undermine my view that we must strive to reverse the effects of climate change that most experts agree are occurring. I would also like to add that I think it is amazing that the media require a very high burden of proof that the climate change fears are justified. Let’s face it, if most experts in nuclear physics were to warn us of a 10% chance of meltdown in a nuclear plant, everyone would clamour to have it shut down. Yet the dangers of climate change are almost infinitely greater. Nonetheless, if anyone can point me to a well-informed response to the document I’ve just mentioned, I would be most grateful.
Thanks and best wishes,
Brian.
Martin Vermeer says
#183 Alastair,
Oh, but it does! What you see within those lines is the Planck curve of a level higher up in the Solar atmosphere, where temperatures are lower. Higher opacity (stronger absorption) means that the photosphere — the fuzzy visible surface from where light escapes to space — is at a higher level within these lines than in the continuum.
Also in the Solar atmosphere, like in the Earth one, there is a negative temperature gradient with height. Only, we’re talking about visible light here, not infrared. The negative gradient can also be observed in the form of “limb darkening”.
On a conceptual level, there is no difference between black-body radiation and line radiation. It’s all Planck; just the absorptivity == emissivity varies.
Greg Smith says
Re 176 Ray Ladbury
Ray, you have not answered my questions. Perhaps you or others in this forum could point me towards a site/papers that explain exactly and concisely what variables are in your “dynamic models”, what constraints are placed on the data that are entered into these models, and who decides which variables are relevant, which are not, and why. In addition, could you also please explain the algorithms and their inherent assumptions and error bars, which have been used to drive the models whose outcomes are being used to drive fundamental changes to the world’s economy. I, for one, would like to understand how they work before being dismissed as incompetent to talk about climate science. As I said in my initial post, I am neither a sceptic nor a warmer, but I am a scientist and want to make scientific assessments based on actual data and not rhetoric. I too am a fellow passenger on this spaceship GIGO.
Hank Roberts says
Pete wrote:
“Looks like the ice has taken a recent nose dive below 2007 and it’s only May.”
Pete, I expect you saw a temporary hiccup in the charting software output, which the site warns can happen. Click “Data Note” on the home page:
http://www.nsidc.org/arcticseaicenews/disclaimer1.html
pete best says
re #187, ’tis true, it’s halfway between the record and nominal
Barton Paul Levenson says
Ray Ladbury writes:

“For CO2, that band is around 15 microns. No matter what you do, you won’t get CO2 to radiate outside this band (you can distort the band, but only so far)”
I think CO2 actually does have other absorption lines, e.g. at 4.7 microns; it’s just that 14.99 microns is the real big one.
Barton Paul Levenson says
Geoff Wexler writes:

“what would happen to our greenhouse if the oxygen and nitrogen were removed? This would remove a heat sink and alter the convection and lapse rate.”
The greenhouse effect would be a bit less due to the lower total pressure.
Barton Paul Levenson says
Greg Smith writes:

“I could look it up but I doubt that there is a lot of wheat grown in Libya these days.”
Agriculture is Libya’s second largest economic sector after oil.
ANY scientist in ANY field would be familiar with “the scientific consensus” and would realize that that and peer review are how modern science works.
Nobody is saying we can fully understand it. We can understand it well enough to draw a number of conclusions. That has been true ever since Torricelli and others established that higher altitudes were generally colder and that pressure fell with altitude back in the 1600s.
Einstein’s peer reviewers were the editors of the peer-reviewed journal, Annalen der Physik, which he published in.
JCH says
Greg, I found this website about climate models to be helpful:
http://tinyurl.com/5mdu9y
Barton Paul Levenson says
Does anybody have a link to a debunking of the “400 scientists” list? I think I saw this covered on Deltoid and a few other science blogs, but I don’t remember the details. I think the main point was that most of the people on the list are not climatologists. Ross McKittrick, if I’m not mistaken, is a mining engineer.
Mike says
Eric (skeptic) – if you’re still on this page, I’ll post some information on clouds for you. It shows that reduction in albedo from 1985 to 1998 contributed many times more to global warming than greenhouse gases. Albedo has been increasing again in the last few years.