The Paleocene-Eocene Thermal Maximum (PETM) was a very weird period around 55 million years ago. However, the press coverage and discussion of a recent paper on the subject was weirder still.
For those of you not familiar with this period in Earth’s history, the PETM is a singular event in the Cenozoic (the last 65 million years). It was the largest and most abrupt perturbation to the carbon cycle over that whole period, defined by an absolutely huge negative isotope spike (> 3 permil in δ13C). Although there are smaller analogs later in the Eocene, the carbon flux that must have been brought into the ocean/atmosphere carbon cycle in that one event is on a par with the entire present reserve of conventional fossil fuels. A really big number – but exactly how big?
The story starts off innocently enough with a new paper by Richard Zeebe and colleagues in Nature Geoscience that tackles exactly this question. They use a carbon cycle model, tuned to conditions in the Paleocene, to constrain the amount of carbon that must have come into the system to cause both the sharp isotopic spike and a very clear change in the “carbonate compensation depth” (CCD) – the depth at which carbonates dissolve in sea water (a function of pH, pressure, total carbon amount etc.). There is strong evidence that the CCD rose hundreds of meters over the PETM – causing clear dissolution events in shallower ocean sediment cores. What Zeebe et al. come up with is that around 3000 Gt of carbon must have been added to the system – a significant increase on the original estimates of about half that much made a decade or so ago, though less than some high-end speculations.
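As an aside, the flavour of this kind of estimate can be captured with a back-of-envelope isotope mass balance. This is a minimal sketch only – Zeebe et al.’s actual model also uses the CCD constraint and a full carbon cycle – and the reservoir size and end-member d13C values below are illustrative assumptions, not their numbers:

```python
def carbon_added(m_reservoir, d13c_initial, d13c_excursion, d13c_source):
    """GtC of isotopically light carbon needed to shift the exogenic
    reservoir's d13C by the observed excursion (two-endmember mixing)."""
    d13c_final = d13c_initial + d13c_excursion  # excursion is negative
    # (M*di + m*ds)/(M + m) = df  =>  m = M*(di - df)/(df - ds)
    return m_reservoir * (d13c_initial - d13c_final) / (d13c_final - d13c_source)

# Assume a ~40,000 GtC ocean/atmosphere/biosphere reservoir near 0 permil,
# shifted by the > 3 permil negative spike:
for source, d13c in [("methane hydrate", -60.0), ("organic carbon", -25.0)]:
    mass = carbon_added(40_000.0, 0.0, -3.0, d13c)
    print(f"{source} ({d13c:.0f} permil source): ~{mass:,.0f} GtC")
```

The lighter the source, the less carbon you need to explain the spike – which is part of why methane hydrates (discussed below) are the leading candidate, and why pinning down the source matters so much for the total.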
Temperature changes at the same time as this huge carbon spike were large too. Note that this is happening on a Paleocene background climate that we don’t fully understand either – the polar amplification in very warm paleo-climates is much larger than we’ve been able to explain using standard models. Estimates range from 5 to 9 deg C warming (with some additional uncertainty due to potential problems with the proxy data) – smaller in the tropics than at higher latitudes.
Putting these two bits of evidence together is where it starts to get tricky.
First of all, how much does atmospheric CO2 rise if you add 3000 GtC to the system in a (geologically) short period of time? Zeebe et al. did this calculation and the answer is about 700 ppmv – quite a lot, eh? However, that is a perturbation to the Paleocene carbon cycle – which they assume has a base CO2 level of 1000 ppm – and so you only get a 70% increase, i.e. not even a doubling of CO2. And since the forcing that goes along with an increase in CO2 is logarithmic, it is the percentage change in CO2 that matters rather than the absolute increase. The radiative forcing associated with that is about 2.6 W/m2. Unfortunately, we don’t (yet) have very good estimates of background CO2 levels in the Paleocene. The proxies we do have suggest significantly higher values than today, but they aren’t precise. Levels could have been less than 1000 ppm, or even significantly more.
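For the curious, here is roughly how that number falls out using the common simplified logarithmic expression for CO2 forcing (Myhre et al. 1998); the paper’s own calculation may differ in detail, and the answer is quite sensitive to the assumed baseline:

```python
import math

def co2_forcing(c, c0):
    """Simplified CO2 radiative forcing (W/m2), Myhre et al. (1998)."""
    return 5.35 * math.log(c / c0)

base = 1000.0   # assumed Paleocene background (ppm)
rise = 700.0    # ~700 ppm from ~3000 GtC added

print(f"{co2_forcing(base + rise, base):.1f} W/m2")   # ~2.8, near the ~2.6 quoted
# The same absolute rise on a lower, equally plausible baseline is a
# much larger forcing -- which is why the uncertain background matters:
print(f"{co2_forcing(500.0 + rise, 500.0):.1f} W/m2")  # ~4.7
```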
If (and this is a key assumption that we’ll get to later) this was the only forcing associated with the PETM event, how much warmer would we expect the planet to get? One might be tempted to use the standard ‘Charney’ climate sensitivity (2-4.5ºC per doubling of CO2) that is discussed so much in the IPCC reports. That would give you a mere 1.5-3ºC warming, which appears inadequate. However, this is inappropriate for at least two reasons. First, the Charney sensitivity is a quite carefully defined metric that is used to compare a certain class of atmospheric models. It assumes that there are no other changes in atmospheric composition (aerosols, methane, ozone) and no changes in vegetation, ice sheets or ocean circulation. It is not the warming we expect if we just increase CO2 and let everything else adjust.
In fact, the concept we should be looking at is the Earth System Sensitivity (a usage I am trying to get more widely adopted) as we mentioned last year in our discussion of ‘Target CO2’. The point is that all of those factors left out of the Charney sensitivity are going to change, and we are interested in the response of the whole Earth System – not just an idealised little piece of it that happens to fit with what was included in GCMs in 1979.
Now for the Paleocene, it is unlikely that changes in ice sheets were very relevant (there weren’t any to speak of). But changes in vegetation, ozone, methane and aerosols (of various sorts) would certainly be expected. Estimates of the ESS taken from the Pliocene, or from the changes over the whole Cenozoic, imply that the ESS is likely to be larger than the Charney sensitivity, since vegetation, ozone and methane feedbacks are all amplifying. I’m on an upcoming paper that suggests a value about 50% bigger, while Jim Hansen has suggested a value about twice as big as Charney. That would give you an expected range of temperature increases of 2-5ºC (our estimate) or 3-6ºC (Hansen) (note that uncertainty bands are increasing here, but the ranges are starting to overlap with the observations). All of this assumes that there are no huge non-linearities in climate sensitivity in radically different climates – something we aren’t at all sure about either.
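To make the arithmetic concrete, here is a minimal sketch scaling each sensitivity by the number of CO2 doublings implied by a 70% rise (the ranges quoted above are rounded, so these numbers differ slightly):

```python
import math

doublings = math.log2(1700.0 / 1000.0)   # ~0.77 doublings for a 70% rise

for label, low, high in [
    ("Charney (2-4.5 C/doubling)", 2.0, 4.5),
    ("ESS ~1.5x Charney",          3.0, 6.75),
    ("ESS ~2x Charney (Hansen)",   4.0, 9.0),
]:
    print(f"{label}: {low * doublings:.1f} to {high * doublings:.1f} C")
```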
But let’s go back to the first key assumption – that CO2 forcing is the only direct impact of the PETM event. The source of all this carbon has to satisfy two key constraints – it must be from a very depleted biogenic source and it needs to be relatively accessible. The leading candidate for this is methane hydrate – a kind of methane ice that is found in cold conditions and under pressure on continental margins – often capping large deposits of methane gas itself. Our information about such deposits in the Paleocene is sketchy to say the least, but there are plenty of ideas as to why a large outgassing of these deposits might have occurred (tectonic uplift in the proto-Indian ocean, volcanic activity in the North Atlantic, switches in deep ocean temperature due to the closure of key gateways into the Arctic etc.).
Putting aside the issue of the trigger though, we have the fascinating question of what happens to the methane that would be released in such a scenario. The standard assumption (used in the Zeebe et al paper) is that the methane would oxidise (to CO2) relatively quickly and so you don’t need to worry about the details. But work that Drew Shindell and I did a few years ago suggested that this might not quite be true. We found that atmospheric chemistry feedbacks in such a circumstance could increase the impact of methane releases by a factor of 4 or so. While this isn’t enough to sustain a high methane concentration for tens of thousands of years following an initial pulse, it might be enough to enhance the peak radiative forcing if the methane was being released continuously over a few thousand years. The increase in the case of a 3000 GtC pulse would be on the order of a couple of W/m2 – for as long as the methane was being released. That would be a significant boost to the CO2-only forcing given above – and enough (at least for relatively short parts of the PETM) to bring the temperature and forcing estimates into line.
Of course, much of this is speculative given the difficulty in working out what actually happened 55 million years ago. The press response to the Zeebe et al paper was, however, very predictable.
The problems probably started with the title of the paper, “Carbon dioxide forcing alone insufficient to explain Palaeocene–Eocene Thermal Maximum warming”, which on its own might have been unproblematic. However, it was paired with a press release from Rice University titled “Global warming: Our best guess is likely wrong”, containing the statement from Jerry Dickens that “There appears to be something fundamentally wrong with the way temperature and carbon are linked in climate models”.
Since the know-nothings agree one hundred per cent with these last two statements, it took no time at all for the press release to get passed along by Marc Morano, posted on Drudge, and declared the final nail in the coffin for ‘alarmist’ global warming science on WUWT (Andrew Freedman at WaPo has a good discussion of this). The fact that what was really being said was that climate sensitivity is probably larger than produced in standard climate models seemed to pass almost all of these people by (though a few of their more astute commenters did pick up on it). Regardless, the message went out that ‘climate models are wrong’, with the implicit sub-text that current global warming is nothing to worry about – almost the exact opposite of the point the authors wanted to make (another press release from U. Hawaii was much better in that respect).
What might have been done differently?
First off, headlines and titles that simply confirm someone’s prior belief (even if that belief is completely at odds with the substance of the paper) are a really bad idea. Many people do not go beyond the headline – they read it, they agree with it, they move on. Also, one should avoid truisms. All ‘models’ are indeed wrong – they are models, not perfect representations of the real world. The real question is whether they are useful – what do they underestimate? overestimate? and are they sufficiently complete? Thus a much better, more specific title for the press release would have been “Global warming: Our best guess is likely too small” – and much less open to misinterpretation!
Secondly, a lot of the confusion is related to the use of the word ‘model’ itself. When people hear ‘climate model’, they generally think of the big ocean-atmosphere models run by GISS, NCAR or the Hadley Centre etc. for 20th Century climate and for future scenarios. The model used in Zeebe et al. was not one of these; instead it was a relatively sophisticated carbon cycle model that tracks the different elements of the carbon cycle, but not the changes in climate. The study’s conclusions about climate sensitivity used the standard range of sensitivities from IPCC TAR (1.5 to 4.5ºC for a doubling of CO2), a range that has been constrained – not by climate models – but by observed climate changes. Thus nothing in the paper related to the commonly accepted ‘climate models’ at all, yet most of the commentary made the incorrect association.
To summarise, there is still a great deal of mystery about the PETM – the trigger, where the carbon came from and what happened to it – and the latest research hasn’t tied up all the many loose ends. Whether the solution lies in something ‘fundamental’, as Dickens surmises (possibly related to our basic inability to explain the latitudinal gradients in any of the very warm climates), or whether it’s a different forcing function combined with more inclusive ideas about climate sensitivity, is yet to be determined. However, we can all agree that it remains a tantalisingly relevant episode of Earth history.
John P. Reisman (OSS Foundation) says
I think the title was an unfortunate choice and made it easier for it to go the wrong way, but it would have likely gone that way anyway as certain people are chomping at the bit for new information they can spin out of context.
In celebration of the latest denialist drama I finished a new page I have been playing with.
http://www.ossfoundation.us/projects/environment/global-warming/empirical-v.-observations/
this complements of course an old classic
http://www.ossfoundation.us/projects/environment/global-warming/models
and my favorite
http://www.ossfoundation.us/projects/environment/global-warming/myths/models-can-be-wrong
As always, any contextually relevant comments are appreciated if it looks like I am misrepresenting things.
Brian Schmidt says
#8 – another vote here for “spin cycle science”. I’ll also suggest “overspun science”.
I have a proposed (partial) solution to this problem: major universities siphon off a percentage of their PR budgets to create an independent, nonprofit, scientific press office. That office puts out its own press releases for papers produced by member universities. The arms-length relationship will produce more honest releases and headlines, ones that also clearly delineate when the author is talking about the paper’s conclusions and when, as in Dickens’s case, the author is speculating beyond what reviewers allowed.
The advantage to universities (aside from trivialities like Truth, etc.) would be perceived greater credibility of the resulting press release.
Hank Roberts says
> global warming on Mars
Theo, that’s #16: Mars is warming
http://www.skepticalscience.com/argument.php
SecularAnimist says
D Robinson wrote: “It’s only a small percentage of the media that bothers to cover anything counter to AGW theory, it tends to be fringe coverage at best …”
Do you consider The Washington Post and The Wall Street Journal and Fox News and CNN — all of which have repeatedly and prominently presented ExxonMobil-funded denialist propaganda as legitimate “debate” and have for DECADES, until quite recently, given denialist frauds and cranks “equal time” with the world’s climate science community — to be “fringe”?
Adam says
The many interesting twists to the PETM story tell me that we’re not so sure whether it’s missing feedbacks or missing carbon…
Methane has taken center stage since the composition of carbon in our atmosphere became very light, very fast during the PETM. There is pretty good evidence that this occurred in a single pulse, in less than 500 yrs:
http://es.ucsc.edu/%7Ejzachos/pubs/Zachos_etal_RoyS_07.pdf
There is also convincing evidence that the earth was warming a couple thousand years prior to the injection of light carbon (Sluijs et al., 2007, Nature).
Volcanoes were doing interesting things right around 55 Mya (breaking Greenland away from Europe and stuff; Storey et al., 2007, Science), and outgassing from the mantle does not change the carbon composition of the atmosphere…
So what’s so crazy about more carbon and less feedbacks?
glen says
I originally posted on the R. Zeebe et al. paper as OT on July 17th under “Sea ice minimum forecasts”, comment 43.
Gavin writes: “In fact, the concept we should be looking at is the Earth System Sensitivity (a usage I am trying to get more widely adopted) as we mentioned last year in our discussion of ‘Target CO2’. The point is that all of those factors left out of the Charney sensitivity are going to change, and we are interested in the response of the whole Earth System – not just an idealised little piece of it that happens to fit with what was included in GCMs in 1979.”
Science Magazine (31 July 09) has a book review by L. R. Kump of He Knew He Was Right: The Irrepressible Life of James Lovelock and Gaia / James Lovelock: In Search of Gaia by John Gribbin and Mary Gribbin.
LR Kump writes: “Throughout his book, Lovelock decries American science. He refers to the “disastrous mistake” of assuming “that all we need to know about the climate can come from modeling the physics and chemistry of the air in ever more powerful computers.” The geochemists’ box models of global biogeochemical cycles and the atmosphere and ocean scientists’ general circulation models ignore the physiology of a living planet. They assume linear parameterizations where life instills parabolas, with multiple equilibria and sharp transitions from homeostasis to positive feedback and system failure when pressed beyond optima. In Lovelock’s view, American science is too compartmentalized into narrow disciplines, too reductionist in approach, so well funded as to stifle creativity, and too reliant on computer models. Lovelock places higher value on observation and experimentation than on modeling. To understand his perspective, imagine Marcus Welby, M.D., using computer models to generate a prognosis for progression of a serious disease.
According to Lovelock, America has largely ignored or rejected Gaia theory, a claim that is somewhat difficult to rectify with the Gribbins’ conclusion that it has become increasingly respectable. In Lovelock’s view, Gaia puts American scientists at professional risk. [Indeed, a reviewer of the manuscript of a 1994 paper I coauthored with Lovelock (4) claimed that its publication would likely ruin my career.] He believes that the concept casts doubt on the way science is divided into disciplines, presumably challenging the value of their associated administrative and educational bureaucracies. “Gaia is a holistic concept and therefore unpalatable to rational Earth and life scientists.” As counterpoint, the Gribbins proffer the two American Geophysical Union Chapman Conferences on the topic and the elevated role that Earth system science has played in the scientific assessment of global warming. Yet we American scientists rarely utter the word Gaia except in critique of, or commentary on, these meetings or the writings of Lovelock, Lynn Margulis, and precious few others; Science’s internal search engine reveals no papers where Gaia was used except in this fashion.
Early in Final Warning, Lovelock rejects CO2 and climate stabilization schemes, referring to them as “no better than planetary alternative medicine.” Here he sees little prospect in alternative and renewable energy or in most geoengineering schemes, with the exception of the burial of intentionally charred biomass (5). Later—perhaps reflecting an evolution of thought during the writing of the book and an expression of the inventor in him—he argues that we now have little option but to try various geoengineering schemes to moderate what he feels are the inevitable dire consequences of “global heating.”
Do we do ourselves and society a disservice by ignoring Gaia? Are the intuitions of a planetary physician a more accurate prognosis for the future than those produced by supercomputers created by multidisciplinary teams of scientists? Climate scientists acknowledge the uncertainties of their projections and are working diligently to reduce them. But are we properly incorporating the feedbacks between climate and a globally potent biota? And do we even have time to refine these models? Lovelock thinks not. He calls for an immediate shift of focus to adaptation to a hothouse world, expecting that in the coming decades humanity will be forced to migrate to the few habitable refugia that remain (including the British Isles). The world population will be reduced from billions to millions, Gaia selecting those humans with the traits to live sustainably (her revenge). Models might come to serve an important alternative role even in this crisis phase, if and when it comes. Shifting from long-term projections to medium-term forecasting, future models with proper planetary physiology might identify the harbingers of the approach of a climate threshold, and if coupled to an expanded network of Earth observations, do so before it is too late.”
The Zeebe et al study itself concluded, “… our results imply a fundamental gap in our understanding of the amplitude of global warming associated with large and abrupt climate perturbations,” and that “this gap needs to be filled to confidently predict future climate change.”
Perhaps Lovelock’s second opinion for our Planet deserves a second look.
Doctor K says
http://en.wikipedia.org/wiki/Global_climate_model
I found the above site to be the most neutral in discussing climate models.
1. Cloud representation and convection appears to be poorly understood in the models and seems to be a fairly big contributor to the level of uncertainty.
2. Model grids and how they are applied at the poles also appear to cause some uncertainty.
3. Positive feedback contribution of greenhouse gas vs water vapour cannot be very certain until # 1 can be resolved.
Neven says
Ike Solem wrote: “http://www.sciencedaily.com/releases/2009/08/090806141512.htm
“Long Debate Ended Over Cause, Demise Of Ice Ages? Research Into Earth’s Wobble”
Denialists might want to try respinning that one, as it talks about how solar changes initiated the ice age”
Actually, this was on WUWT last week. After it was pointed out that the title was misleading, it was changed from ‘CO2 not involved’ to ‘CO2 not main driver’.
Coincidentally my comment there was: “This is one of those articles, like the one about the PETM event, that really makes me wonder what it’s doing on WUWT. The comments make for entertaining reading, though. The power of subjective interpretation never ceases to amaze me.”
Hank Roberts says
> most neutral
Checked the history?
Thomas says
Put oil firm chiefs on trial, says leading climate change scientist
James Hansen, one of the world’s leading climate scientists, will today call for the chief executives of large fossil fuel companies to be put on trial for high crimes against humanity and nature, accusing them of actively spreading doubt about global warming in the same way that tobacco companies blurred the links between smoking and cancer.
http://www.guardian.co.uk/environment/2008/jun/23/fossilfuels.climatechange
Good Idea!
Bogey says
When this occurred, and afterwards, the planet still had all its coal, oil, natural gas, etc. reserves intact. Not so this time.
dhogaza says
Doctor K speaks of …
And when challenged, returns a list of model issues that are not “unsubstantiated assumptions”.
Figures.
Jim Galasyn says
Thanks for the follow-up, Doctor K.
As I’m sure everybody agrees, the models are certainly not “complete,” but they certainly are useful, and improving all the time. But these widely acknowledged deficiencies are still far from being “numerous unsubstantiated assumptions”. To my mind, the assumptions are pretty well substantiated by observation and realized as parameterizations.
Joe Hunkins says
For me it is frustrating that the posts about such findings so often take the form of “look at how this is misinterpreted by climate denier evildoers and ignorant journalists” followed by the predictable complimentary comments.
Instead, there’s clearly a growing body of reasonably reviewed climate research that is challenging the veracity of existing climate models and some of the prevailing assumptions about natural climate variation.
Since the AGW hypothesis rests on the assumption that current models account well for natural variations these findings are significant and should be a core concern of any reasonably minded scientific person.
Rather than dismissing them pretty much out of hand as “inadequate”, “uncertain” or “denialist propaganda” as is generally done here, it would be enlightening to approach ideas that conflict with current models as good food for thought.
Lawrence Brown says
Re:56 “Positive feedback contribution of greenhouse gas vs water vapour cannot be very certain until # 1 can be resolved.”
Greenhouse gases are a forcing, not a feedback.
Hank Roberts says
Well, joeduck (Hunkins), your blog claims Realclimate is a place you’ve found
“personal abuse and reckless pseudo-science”
Want more of what you always get here?
Here’s mine again. Brace yourself:
Citations, please, for what you claim?
Which papers are you talking about,
where are you reading about them,
and why do you consider your source a reliable one?
Hank Roberts says
Another thought for John Mashey:
ScienceyPR
or maybe
Sciencey-FAIL
See the NYT column On Language:
How Fail Went From Verb to Interjection
dhogaza says
Well, then, list those ideas that conflict with current models that are “good food for thought” rather than last year’s (or, in many cases, last decade’s) garbage, and tell us why you think they are.
Let’s see if you can do better than Doctor K’s epic fail above, when, on being asked to list the “unfounded assumptions” built into models, he returned with a list of things that, well, aren’t assumptions at all, founded or unfounded.
Edward Greisch says
29 Doctor K: Uncertainty is a 2 edged sword. It cuts both ways. We can’t prove that global warming Will make us humans extinct in 200 years and we also can’t prove that it will Not make us extinct in 5 years. The climate scientists who run RealClimate are as good at probability, statistics, risk analysis and error bars as anybody, probably better, but you don’t see the hairy math here in this popular level publication. Since most people are allergic to math, they stop reading at the first mathematical symbol. That is why it is necessary to leave the math, especially the really hard math like statistics, out of the article intended for popular consumption.
David B. Benson says
Joe Hunkins (63) — AGW is not a hypothesis, but a consequence of what is known about Earth’s climate. Essentially all important parts have been worked out some time ago. You’ll see that by reading “The Discovery of Global Warming” by Spencer Weart:
http://www.aip.org/history/climate/index.html
Andy Revkin’s review:
http://query.nytimes.com/gst/fullpage.html?res=9F04E7DF153DF936A35753C1A9659C8B63
Thomas says
Mostly I have a science question. Why is the arctic amplification expected to be larger in Paleocene time? My semi-naive guess would be that lack (or minimal) snow-ice albedo feedback would greatly reduce arctic amplification. So what other process is overwhelming that one?
But, a comment on the denialism. Much of it is collective-ad-hominem in nature “those stupid scientists don’t know what they are talking about”. So any claim that we don’t understand climate sensitivity, even if the new data supports more dangerous higher sensitivity, can be used to discredit climate science in general. Once that is done, the precautionary principle can be thrown out the window.
Hank Roberts says
Gavin, to veer momentarily back to the subject you started here —
You’ve been clear about the many uncertainties of the paleo record before and during the PETM event. Have you tried to lay out an Earth System Sensitivity for that situation? I’d love to see that set side by side one for the current planet.
I’d also like to see that somehow show the difference in rates of change.
(There’s more but something’s hitting the spam filter so I’m chopping it here)
Hank Roberts says
Second chunk to test the spam filter:
For example, we know (or I think we do) that at the pre-human geological pace, CO2 in the atmosphere changed at a rate slow enough that ocean pH did not change (because it’s taken up by biogeochemical cycles that operate over millennia, not decades). http://scholar.google.com/scholar?q=plankton+cooled+greenhouse
That’s change at the pace of evolution (with one-year generations).
Compare that to the present when ocean pH is changing already. Very different situation. We’ll get selection effects, and we’ve got a much deeper archive of genetic variation since the past big extinctions — far more genes are out there even at low frequencies from previous climate excursions, nature doesn’t lose them, so when the next excursion comes those can show up very fast; we’re not waiting for hopeful monsters to evolve to suit a greenhouse warm planet, this time.
Copepods living away from the continental shelves already evolved, what, 200 million years or so ago? last time around, or was it the time before that …
Even though there may be genes available to be selected by changed climate, that works only at the population level (luckily organisms with a planktonic stage can spread fast!) — but for an ecosystem, timing problems show up at current rapid rates of course, e.g. http://rspb.royalsocietypublishing.org/content/272/1581/2561.abstract
“… many species are becoming mistimed due to climate change. We urge researchers with long-term datasets on phenology to link their data with those that may serve as a yardstick, because documentation of the incidence of climate change-induced mistiming is crucial in assessing the impact of global climate change on the natural world.”
I know much is beginning to be done with the biological cycling — I saw a paper by Dr. LeQuere mentioning that plankton researchers have many records of relative frequencies of species at various locations and times, contemporary and paleo, and that these ought to be made more easily available to modelers.
Dr. LeQuere has been arguing with other plankton folks for some years about the need to begin incorporating what we know into models sooner than later, e.g. (certainly not the latest word, just what I found easily)
http://plankt.oxfordjournals.org/cgi/reprint/28/9/871?ijkey=G9Ezn6vQr47zMjc&keytype=ref
Reply to Horizons Article ‘Plankton functional type modelling: running before we can walk’ Anderson (2005): I. Abrupt changes in marine ecosystems?
and
http://www.agu.org/pubs/crossref/2009/2009EO040004.shtml
Plankton Functional Types in a New Generation of Biogeochemical Models
C Le Quéré, S Pesant – Eos Trans. AGU, 2009 – agu.org
If it makes sense as an extended exercise — I’d really like to see or help with an attempt to lay out an Earth Systems description side by side for contemporary and PETM conditions, error bars and all.
Jeff goldstein says
This is more than an issue about public relations, press releases, and what to call ‘science’ that is not science. This is about an absolute crisis-level need to educate the public. Please let’s not use the term “know-nothings”. It is incredibly demeaning, and can be viewed to extend far beyond the vocal anti-AGW lobby group to the public at large. The research community (you) needs to work CLOSELY with the education community NOW to inform a voting public. The only reason John Boehner’s ‘climate change model’ gets traction is because of an uninformed public. The truth lies in the data and their interpretation by you. It must be translated rapidly and effectively for public consumption. Folks, we’re not fighting back enough. This is a war for the future of the planet and our children. Education is the only hope if we are to change the course of this ship.
FYI – when it comes to science education for the public at large, standard practice in museums and science centers is to assume the equivalent of 5th grade science literacy if the goal is conceptual understanding.
I said what I needed to say based on what I read in this thread. You can now throw darts and arrows.
Here is my humble attempt at an explanation. You can help me tune it.
A Day in the Life of the Earth: Understanding Human-Induced Climate Change, with foreword by Jim Hansen, Dir. of NASA/GISS
For teachers and parents to educate our children, and to educate the public. The crisis is real. The crisis is now.
http://blogontheuniverse.org/2009/06/13/a-day-in-the-life-of-the-earth/
Jeff Goldstein
Center Director
National Center for Earth and Space Science Education
USA
jeffgoldstein@ncesse.org
Hank Roberts says
Thomas, that looks like a productive question:
http://www.google.com/search?q=Why+is+the+arctic+amplification+expected+to+be+larger+in+Paleocene+time%3F
P. Lewis says
The English lexicon is replete with nouns and adjectives to describe the behaviour John Mashey wishes to define:
fib, lie, untruth, falsehood, fabrication, fiction, fairy story, tall tale, terminological inexactitude, whopper, …;
fraudulent, cheating, untrustworthy, false, untruthful, dishonourable, unscrupulous, unprincipled, corrupt, swindling, deceitful, deceiving, deceptive, lying, crafty, cunning, designing, mendacious, double-dealing, underhand, treacherous, perfidious, unfair, unjust, disreputable, rascally, roguish, knavish, crooked, shady, bent, ….
Grab a few.
Then, in comparison with words such as “asexual”, one could have “ascientific”.
Steve Hill says
Thank you Oakden Wolf for your work on the links – very useful.
Nagraj Adve says
In a recent book, ‘Ice, Mud and Blood’, the scientist Chris Turney mentions methane hydrates leaking out from Santa Barbara, California!! (I don’t have the book here with me and hence can’t give you the page number.)
I was under the impression that it will be some time before methane hydrates or clathrates thaw to release methane. Can anyone shed light or informed opinion on this?
[Response: There are many places on the continental shelf where there is a very dynamic balance between methane gas, hydrates and overlying sediment and water column. At these points (sometimes called seeps) you can find hydrates on the sea floor and in the process of outgassing. Santa Barbara is one such place, and there are recent reports from the Siberian continental shelf showing similar things. These haven’t been studied for very long and so the nature of the dynamics (is this outgassing roughly in balance with methane production? over what time period?) is unclear. Given the depths, water temperature stability, and geologic environment, this isn’t likely to be related to any ongoing climate changes. However, this is not my speciality, and so if anyone could add more details and refs. that would be helpful. – gavin]
Pekka Kostamo says
“Come Hell and High Water”, popularly known as the “Business As Usual” scenario.
An unfortunate choice of words, the latter, really. To most people it implies that doing nothing differently keeps things the same as before – so why worry?
Impact related nomenclature is called for.
pete best says
http://climatecongress.ku.dk/pdf/synthesisreport
This is worth a read – all those pesky climate change questions answered in the equivalent of an IPCC interim report, as of Copenhagen this year.
Lloyd Flack says
With the quite different continental configuration in the Paleocene, surely we would expect both the Charney sensitivity and the Whole Earth Sensitivity to be different from the current values. But would these sensitivities be higher or lower, and by how much? To what extent can this help explain the temperature changes in the PETM?
Hugh Laue says
Jeff #74 Excellent. Thank you. Thanks to Jim Hansen for his strong stand. Thanks to RealClimate for this beacon of sanity in a crazy world.
J. Bob says
#69 – But normal people can read temperature graphs without the need of a lot of statistics. A rule of thumb we used was “the more complicated the analysis, the more suspicious”.
Doctor K says
Jim.
Agree the wording was OTT on my part, but the effect of deficiencies/uncertainties in the climate models can result in a very wide range of outcomes. In my QRA field, to address the uncertainty of future events against historical probabilities of failures, we sometimes apply a factor (usually an order of magnitude) to the final results. The factor depends on the reliability of the historical data. I am not fully versed on the selected “forcing” (thanks LB) factor given to CO2 and how much of a range it has in the various models, but is there a method applied to address the uncertainty with this?
On another note: there are some on this blog who seem angry and intolerant re. some of my posts. I am trying to progress the dialogue, so will not get snotty back at any of you. Peace.
Rod B says
glen (56), does Lovelock hold that belief because he scientifically thinks it’s inevitable, or because his tack requires less work, as in ‘might as well lie back and enjoy it?’
Rod B says
dhogaza (62), though, Doctor K might be defining “substantiated” as having extensive external justification, instead of being strongly supported by the feelings and self-satisfaction of the modelers. (Which, btw, does not mean the modelers shouldn’t run with what they know.)
Hank Roberts says
P. Lewis, wrong words I think. Did you actually read the two press releases linked above and compare them? Read the one from Rice, note the date, Google how its hot-button phrases spread; read the one from Manoa, note the date, Google for mentions of it. Consider the difference between the two press releases. What did the Rice PR department do?
Well, where I grew up, it’d be called crapping in their own lunchpail.
Mark says
“A rule of thumb we used was “the more complicated the analysis, the more suspicious”.
Comment by J. Bob”
But the very simple analysis shows CO2 is the major driver. Nothing else manages the change anywhere near that closely. And the denialists trying to make out it’s something else are the ones making weird statistical jumps.
Yet somehow you feel that these are above reproach…
Jim Galasyn says
Hey Doctor K, fair enough. You might develop more confidence in our understanding of climate by starting with the basics. By this, I mean don’t worry too much about the big GCMs for now, and instead focus on the underlying thermodynamics. From fairly fundamental radiation physics, we can show that for a large increase in GHG forcing, the atmosphere and oceans must warm, in order to regain radiative equilibrium with space. To my mind, this much is uncontroversial, except to the most partisan.
The ongoing work of the climate science community is to refine our understanding of how this warming will unfold. This is a much more difficult (and interesting) question. I always recommend Ray’s new book, Principles of Planetary Climate, because it’s a rigorous but accessible treatment, and the online draft is free!
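To put that radiative-equilibrium argument in numbers, here is a minimal zero-dimensional sketch; the feedback parameter values are illustrative assumptions, not measured constants:

```python
SIGMA = 5.67e-8   # Stefan-Boltzmann constant (W/m2/K4)

def planck_response(t_eff):
    """No-feedback restoring rate, d(SIGMA*T^4)/dT, at the effective
    emission temperature: W/m2 of extra outgoing radiation per K."""
    return 4 * SIGMA * t_eff ** 3

t_eff = 255.0                          # Earth's effective emission temperature (K)
lam_planck = planck_response(t_eff)    # ~3.8 W/m2/K

forcing = 2.6   # W/m2, roughly the PETM CO2 forcing discussed up-thread

# Warming needed to re-balance outgoing with incoming radiation:
print(f"No-feedback warming: {forcing / lam_planck:.1f} K")
# Amplifying feedbacks (water vapour etc.) lower the effective restoring
# rate; ~1.2 W/m2/K corresponds to roughly 3 C per CO2 doubling:
print(f"With feedbacks: {forcing / 1.2:.1f} K")
```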
dhogaza says
So do you have any evidence that the only external justification of GCMs is the “feelings and self-satisfaction of the modelers”?
Or is that just the typical unjustified criticism based on the feelings and self-satisfaction of the denialist community that, for the most part, has never even peeked at the documentation on how, say, GISS Model E works.
Hank Roberts says
> more carbon and less feedbacks
Well, where is the missing carbon?
Where did it come from, and where did it go?
Well, maybe carbon came from geology, say heating of carbonate rocks or coal, if a big magma plume rose up.
Would that leave any evidence?
But if it’s excess carbon, to cause the PETM temperature spike, it had to go into the atmosphere, then be removed through biogeochemical cycling at geological speed.
Where would it go that it’s disappeared — and how might it come back to surprise us, if it’s hidden away somehow?
Hank Roberts says
Oh, wait — where’d this come from?
Adam, what’s your source for saying this?
> outgassing from the mantle does not change
> the carbon composition of the atmosphere
If you are referring to the isotope ratio that can’t be right.
http://www.britannica.com/EBchecked/topic/1424734/evolution-of-the-atmosphere/30206/Processes-affecting-the-composition-of-the-early-atmosphere
Rod B says
dhogaza (90), I deliberately and overtly did not say or imply “only.”
John Mashey says
(Next try to get past spam filter).
Thanks to all, so far, for comments on #8; I’m accumulating these suggestions.
Let me try again, as I obviously didn’t explain it well enough.
Anti-science seems a good term for {deliberate obfuscation, cherry-picking, purposeful mistitling, and claiming that a paper says something it obviously doesn’t, etc, etc}.
MALICE vs INCOMPETENCE
But, I always recall Napoleon’s:
“Never ascribe to malice, that which can be explained by incompetence.”
(And to be fair, “incompetence” may be overly pejorative, as sometimes the real problem is difficult, and it may take strong competence to succeed.)
While many acts are obviously malice or incompetence, some are less obvious, and it takes a while to discern which is which. Then, if one wants to make real improvements, one can at least try to help raise the level of competence (since malice doesn’t easily get fixed). That was the point of what to do about bad science reporting.
In another domain, read Wachter & Shojania’s Internal Bleeding – The Truth Behind America’s Terrifying Epidemic of Medical Mistakes. This is a calm discussion by two credible UCSF doctors about systemic errors and possible solutions – hospitals may kill people, but it’s not from malice. Then, I’d guess the publisher insisted on a sensationalist title.
SCIENCE TO WIDESPREAD PUBLICITY, WITH INCREASING NOISE
In #8, I’m concerned, not with malice, but with various forms of incompetence/error, for which I seek a good term, or terms, if different effects need them. Consider the progression from real science to widespread publicity, and the various sources of *noise* that may incrementally distort the original signal:
1) A researcher may claim (and often even believe) more significance than is justified. Of course, there may well be errors, especially in early papers on a subject, and arguments over significance are part of real science.
It is always worth recalling the military maxim: “The first report from the front is wrong”, and science by press conference can often make one nervous.
2) A researcher may perfectly well calibrate their results, but not be very good at communicating to a lay audience, especially regarding uncertainties. One might call this incompetence, but “lacking this skill” is more like it.
Even with strong competence, good summarization can be very difficult. It is a nontrivial skill to start with a scientific paper, replete with careful caveats, error-barred charts, tables, backup data, etc, and write a 2-page press release that does it justice. If you find somebody who can do that, take good care of them!
(As a related example familiar to this audience, it is instructive to read MBH98, MBH99, then follow that topic through the TAR WG I full report, TS, and finally SPM, looking especially at error bars (or gray depiction), and considering the likely interpretations by general audiences. Worse were derivative versions in which the gray uncertainty background disappeared entirely, giving a visual impression of exactitude never claimed by the real research. This is a hard problem, as it is much easier to draw and print/display one simple line than to show multiple lines and density plots (as per AR4 WG I, p.467). General audiences just don’t think this way. The same effect always arose in computer performance ratings, in which people wanted “one number”, despite experts telling them there was no such thing.)
3) A researcher may not write the press release, and the writer can easily err, quite unconsciously. Assuming the researcher has the time and patience to do good reviews, they may not even realize that the press release fails to convey to the public what they think it does. I.e., *they* know what their research said, and they may not realize that a 1-2-page press release may be ambiguous.
I spent much time (over decades) wrestling with marketing departments about press releases. If you haven’t done this, you may not realize how much effort is needed to make sure something claims as much as it should, but not too much. In some cases, you just run out of time. People mess up press releases *often*.
4) Media get press releases, maybe even interview the researcher, and then write stories. Occasionally, one finds malice (I experienced one like that, from a national magazine), but mostly, reporters’ skills and knowledge follow a normal distribution, from really awful to terrific, with most being more-or-less average.
At the very best, a fine reporter does their homework and a good interview, then writes an accurate, clear article. But even then, *someone else* normally puts the headline on the article, with tight space constraints, and *anything* can happen. (Think of a headline as TwitterScience, but with even fewer characters.)
5) Media are generally incented to make stories look more important, more exciting, more entertaining, more threatening (as in the book mentioned earlier).
6) But of course, there may well be anti-science malice tacked on at *any* stage of this process.
A PERSONAL HORROR STORY
I once did an interview with a fine WSJ reporter, who wrote a good article … and the result was:
The MIPS Stock Glitch Or How I lost 15% of a Company’s Stock Value in a Few Hours
This was amusing in retrospect, but not at the time.
a) I was one of the few techies allowed/encouraged to speak to press and customers, i.e., I was experienced at this, and being a VP, was quotable.
b) The writer was excellent. He’d studied the paper I wrote that started this, and did a good phone interview.
c) The article was well-written. I was very impressed by his ability to capture the essence of the issue in a short article. I was ecstatic that it was placed on the top-left corner of the second section, first page.
d) And it still blew up, badly… because the article headline had an unfortunate ambiguity. I don’t think this was malice on the part of the headline-writer. The headline wasn’t even wrong, just easy to misinterpret.
e) Compared to explaining complex science, this topic was *trivially easy*.
SUMMARY
If one views this as a signal-transmission issue, maybe the whole thing is science-noise, where some of the noise is natural, and some (anti-science) is purposeful attempt to distort or obscure the signal. Early in the process, it may be hard to tell which is which.
Doctor K says
Jim. High school physics also has a relationship between elevation and temperature, PV=nRT. Hence, as elevation increases, the temperature drops and vice versa. The 33K rise in temperature between elev. 4000m and sea level for a column of air can be attributed to that fact of physics under the ideal gas law. If the GH gas contribution were doubled, this value would not vary much. How do the models allow for this fact of physics? Has the media done any of their own investigative work on this? How has politics at the UN played a role in all of this? From what I can tell, the contribution of CO2 to atmospheric warming is drastically overstated, based on the above. Can anyone here explain this?
CM says
I’m a bit confused. Gavin’s post says (emphasis added):
“… a range that has been constrained – *not by climate models* – but by observed climate changes.”
I thought the range was the convergent outcome of both models (ever since Charney split the difference between Manabe’s and Hansen’s models) and observations?
[Response: That’s how it started, but the AR4 range (2 to 4.5deg C) is actually based on data-derived constraints. – gavin]
CM says
Re terminology, I like “spin science” (or “sexed-up science”, perhaps?).
A more specific suggestion for what the press release contained:
unsoundbite
thingsbreak says
@78[Nagraj Adve & Gavin]:
Kennett et al. 2000 and E. Solomon et al. 2009 and their refs might be helpful.
Mark says
“From what I can tell, the contribution of CO2 to atmospheric warming is overstated drastically, based on the above. Can anyone here explain this?”
No.
It’s rather like asking: since the Hamiltonian solution to an asymmetrically charged nucleus results in quantum imbalances, there seems to be an error in the assumption that atomic fusion will release any energy of use. Can anyone here explain this?
PV=nRT has nothing to do with CO2; its only contribution to AGW science is that the adiabatic lapse rate ensures the saturated-gas argument is incorrect, since that conclusion would only hold in a homogeneous gas, and the lapse rate ensures no such limitation applies.
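For anyone wanting to check the numbers in this exchange, a minimal sketch using standard textbook values (illustrative only). The point is that the lapse rate fixes how temperature falls with height, not the surface temperature itself; the surface value depends on the altitude from which the atmosphere effectively radiates to space, which is what greenhouse gases control:

```python
G = 9.81       # gravitational acceleration (m/s2)
CP = 1004.0    # specific heat of dry air at constant pressure (J/kg/K)

dry_adiabatic = G / CP * 1000.0   # dry adiabatic lapse rate, ~9.8 K/km
environmental = 6.5               # typical observed mean lapse rate, K/km

for name, rate in [("dry adiabatic", dry_adiabatic),
                   ("mean environmental", environmental)]:
    # Temperature drop over the 4 km column in Doctor K's example:
    print(f"{name}: {rate:.1f} K/km -> {rate * 4.0:.0f} K over 4 km")
```

Both values bracket the 33K-over-4000m figure quoted above; neither, on its own, tells you what the sea-level temperature is.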