Guest commentary by Spencer R. Weart, American Institute of Physics
I often get emails from scientifically trained people who are looking for a straightforward calculation of the global warming that greenhouse gas emissions will bring. What are the physics equations and data on gases that predict just how far the temperature will rise? A natural question, when public expositions of the greenhouse effect usually present it as a matter of elementary physics. These people, typically senior engineers, get suspicious when experts seem to evade their question. Some try to work out the answer themselves (Lord Monckton for example) and complain that the experts dismiss their beautiful logic.
The engineers’ demand that the case for dangerous global warming be proved with a page or so of equations does sound reasonable, and it has a long history. The history reveals how the nature of the climate system inevitably betrays a lover of simple answers.
The simplest approach to calculating the Earth’s surface temperature would be to treat the atmosphere as a single uniform slab, like a pane of glass suspended above the surface (much as we see in elementary explanations of the “greenhouse” effect). But the equations do not yield a number for global warming that is even remotely plausible. You can’t work with an average, squashing together the way heat radiation goes through the dense, warm, humid lower atmosphere with the way it goes through the thin, cold, dry upper atmosphere. Already in the 19th century, physicists moved on to a “one-dimensional” model. That is, they pretended that the atmosphere was the same everywhere around the planet, and studied how radiation was transmitted or absorbed as it went up or down through a column of air stretching from ground level to the top of the atmosphere. This is the study of “radiative transfer,” an elegant and difficult branch of theory. You would figure how sunlight passed through each layer of the atmosphere to the surface, and how the heat energy that was radiated back up from the surface heated up each layer, and was shuttled back and forth among the layers, or escaped into space.
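To make the single-slab picture concrete, here is the whole calculation in a few lines (a toy sketch with standard round numbers, not anything a climate modeler would use):

```python
# Toy "single slab" greenhouse model: the atmosphere is one uniform layer,
# transparent to sunlight and fully opaque to infrared. Round textbook
# numbers; this is an illustration, not a climate model.
SIGMA = 5.67e-8    # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1361.0        # solar constant, W m^-2
ALBEDO = 0.30      # planetary albedo

absorbed = S0 * (1.0 - ALBEDO) / 4.0      # ~238 W m^-2 averaged over the globe
Te = (absorbed / SIGMA) ** 0.25           # effective emission temperature, ~255 K
Ts = 2.0 ** 0.25 * Te                     # slab-model surface temperature, ~303 K

print(f"Effective temperature ~{Te:.0f} K, slab-model surface ~{Ts:.0f} K")
# The slab is simply "there" or "not there": the model has no vertical
# structure, no spectral detail, and no way to represent adding a bit more
# CO2 -- which is why it cannot give a credible number for global warming.
```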
When students learn physics, they are taught about many simple systems that bow to the power of a few laws, yielding wonderfully precise answers: a page or so of equations and you’re done. Teachers rarely point out that these systems are plucked from a far larger set of systems that are mostly nowhere near so tractable. The one-dimensional atmospheric model can’t be solved with a page of mathematics. You have to divide the column of air into a set of levels, get out your pencil or computer, and calculate what happens at each level. Worse, carbon dioxide and water vapor (the two main greenhouse gases) absorb and scatter differently at different wavelengths. So you have to make the same long set of calculations repeatedly, once for each section of the radiation spectrum.
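Schematically, the bookkeeping looks something like the sketch below. Every number in it is invented purely to show the structure of the calculation, not real spectroscopy, and a real radiative transfer code is vastly more elaborate:

```python
# Schematic of the layer-by-layer, band-by-band bookkeeping. The absorption
# values below are invented to show the structure of the calculation only.
N_LAYERS = 20                       # levels in the column of air
BANDS = {                           # hypothetical infrared bands and their
    "window (8-12 um)": 0.01,       # per-layer absorptances (made up)
    "CO2 band (~15 um)": 0.15,
    "H2O bands": 0.08,
}

surface_emission = 390.0            # W m^-2, roughly sigma * (288 K)^4

for band, a in BANDS.items():
    flux = surface_emission / len(BANDS)   # crude equal split across bands
    absorbed = 0.0
    for _ in range(N_LAYERS):              # march upward through the levels
        taken = flux * a                   # energy absorbed in this layer
        absorbed += taken                  # (a real model also re-emits it,
        flux -= taken                      #  up and down, and iterates)
    print(f"{band}: {flux:.1f} W m^-2 escapes to space, {absorbed:.1f} absorbed")

# Even this cartoon needs N_LAYERS x len(BANDS) evaluations; real radiative
# transfer uses many more levels, thousands of spectral intervals, and must
# iterate the upward and downward streams to equilibrium.
```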
It was not until the 1950s that scientists had both good data on the absorption of infrared radiation, and digital computers that could speed through the multitudinous calculations. Gilbert N. Plass used the data and computers to demonstrate that adding carbon dioxide to a column of air would raise the surface temperature. But nobody believed the precise number he calculated (2.5ºC of warming if the level of CO2 doubled). Critics pointed out that he had ignored a number of crucial effects. First of all, if global temperature started to rise, the atmosphere would contain more water vapor. Its own greenhouse effect would make for more warming. On the other hand, with more water vapor wouldn’t there be more clouds? And wouldn’t those shade the planet and make for less warming? Neither Plass nor anyone before him had tried to calculate changes in cloudiness. (For details and references see this history site.)
Fritz Möller followed up with a pioneering computation that took into account the increase of absolute humidity with temperature. Oops… his results showed a monstrous feedback. As the humidity rose, the water vapor would add its greenhouse effect, and the temperature might soar. The model could give an almost arbitrarily high temperature! This weird result stimulated Syukuro Manabe to develop a more realistic one-dimensional model. He included in his column of air the way convective updrafts carry heat up from the surface, a basic process that nearly every earlier calculation had failed to take into account. It was no wonder Möller’s surface had heated up without limit: his model had not used the fact that hot air would rise. Manabe also worked up a rough calculation for the effects of clouds. By 1967, in collaboration with Richard Wetherald, he was ready to see what might result from raising the level of CO2. Their model predicted that if the amount of CO2 doubled, global temperature would rise roughly two degrees C. This was probably the first paper to convince many scientists that they needed to think seriously about greenhouse warming. The computation was, so to speak, a “proof of principle.”
But it would do little good to present a copy of the Manabe-Wetherald paper to a senior engineer who demands a proof that global warming is a problem. The paper gives only a sketch of complex and lengthy computations that take place, so to speak, offstage. And nobody at the time or since would trust the paper’s numbers as a precise prediction. There were still too many important factors that the model did not include. For example, it was only in the 1970s that scientists realized they had to take into account how smoke, dust and other aerosols from human activity interact with radiation, and how the aerosols affect cloudiness as well. And so on and so forth.
The greenhouse problem was not the first time climatologists hit this wall. Consider, for example, attempts to calculate the trade winds, a simple and important feature of the atmosphere. For generations, theorists wrote down the basic equations for fluid flow and heat transfer on the surface of a rotating sphere, aiming to produce a precise description of our planet’s structure of convective cells and winds in a few lines of equations… or a few pages… or a few dozen pages. They always failed. It was only with the advent of powerful digital computers in the 1960s that people were able to solve the problem through millions of numerical computations. If someone asks for an “explanation” of the trade winds, we can wave our hands and talk about tropical heating, the rotation of the earth and baroclinic instability. But if we are pressed for details with actual numbers, we can do no more than dump a truckload of printouts showing all the arithmetic computations.
I’m not saying we don’t understand the greenhouse effect. We understand the basic physics just fine, and can explain it in a minute to a curious non-scientist. (Like this: greenhouse gases let sunlight through to the Earth’s surface, which gets warm; the surface sends infrared radiation back up, which is absorbed by the gases at various levels and warms up the air; the air radiates some of this energy back to the surface, keeping it warmer than it would be without the gases.) For a scientist, you can give a technical explanation in a few paragraphs. But if you want to get reliable numbers – if you want to know whether raising the level of greenhouse gases will bring a trivial warming or a catastrophe – you have to figure in humidity, convection, aerosol pollution, and a pile of other features of the climate system, all fitted together in lengthy computer runs.
Physics is rich in phenomena that are simple in appearance but cannot be calculated in simple terms. Global warming is like that. People may yearn for a short, clear way to predict how much warming we are likely to face. Alas, no such simple calculation exists. The actual temperature rise is an emergent property resulting from interactions among hundreds of factors. People who refuse to acknowledge that complexity should not be surprised when their demands for an easy calculation go unanswered.
Jim Eager says
Re Barton @548, should that be G.S. Callendar?
dhogaza says
In fact, one could state that he used a “back of the envelope”-style calculation …
Richard Sycamore says
Francois, if you try to discuss climate science here you’ll have to learn to put up with the hounding of the gatekeepers: Ray, Hank, Barton, David, etc. Your credentials have not earned you the right to question the consensus. The time for discussing the science is past; it is time for action. Etc.
Francois opened by suggesting that it is reasonable to ask that these models be explained IN DETAIL, that THAT should be part of the mandate of IPCC. And look at the controversy he generates from such a simple and reasonable suggestion. Why do you all feel it is not necessary to explain these models? What makes you believe that Arrhenius’s approximations are an adequate model? Did Arrhenius correctly predict that temperatures would fail to rise in the early 2000s, despite the huge increase of CO2? Did the GCMs? Explanations, please.
Dave Rado says
Francois Ouellette writes in #543:
Surely you aren’t really as ill-informed as you pretend to be? See here and here.
Dave
Dave Rado says
Richard Sycamore, #553 writes:
Your statement is a mite disingenuous given that you are posting on a blog devoted entirely to discussing the science. Almost none of the articles on this site discuss action; they almost all discuss science, and in considerable depth. Perhaps you should try reading the articles here before you post again?
and Richard Sycamore also writes:
See here and here.
For someone who claims to be interested in the science, you are astonishingly badly informed about it.
Dave
Ray Ladbury says
Francois, one needn’t have climate scientists as friends, enemies or even acquaintances. Rather, one needs to know how they work, what their motivations are and how they communicate. By lumping all into the same bin, I would contend that you demonstrate a profound ignorance of the culture of not just climate science but of most sciences in general. The culture of particle physics is quite distinct from space science or condensed matter physics. And all of these are distinct from biology, ecology and so on. It is your contempt for “academic science” that renders your opinion suspect.
As to your characterization of the scientific community as hysterics (or do you know of a respected scientific professional or honorific society that has come down on the opposite side of the consensus?), again, you do not seem to understand the process. The “academic science” has identified an effect and shown with reasonable confidence that it poses a threat. That is not alarmism, but rather the first step in risk mitigation: IDENTIFY THE THREAT. Only then can we evaluate and eventually remediate either the consequences or probability of occurrence.
You say that the “academic” nature of climate science makes it suspect. OK, Francois, where else would we look for understanding? Who but those trained in climate science have actually studied and advanced understanding of the issue?
And you have the temerity to come on here and accuse the entire “academic science” community of either fraud or incompetence and then complain: “See how I’m treated here!” Oh, PLEASE! Maybe if you started out with respect rather than contempt, you might have a friendlier reception.
Now, here is a clue to you and your buddy Richard Sycamore. Anyone who wants to can challenge the consensus. Anyone who does so outside the forum of peer-reviewed scientific literature is wasting his or her time. That is where the consensus was forged and that is the only place where it will change. Don’t like the consensus? Great. Then publish something that advances the understanding of the subject and challenges that consensus. Otherwise you are wasting your time and that of the people who actually come on here to learn the science.
dhogaza says
So why doesn’t he just download the source to one and study it? No one’s hiding anything.
Richard Sycamore says
dhogaza,
If the IPCC’s function as literature reviewer is legitimate and valuable – which I think it is – then why would a contribution synthesizing GCM science not be equally welcome? Sure, anyone can download the code and “study it”. I don’t see how that is likely to lead to a consensus on what the codes say. Do you? Finally, you say “no one is hiding anything” as though someone suggested that something is being hidden. Intent to hide is not required for errors to lie hidden undiscovered. I agree with Francois. I would like to see an attempt at the sort of summary that Spencer Weart suggests is impossible. Perhaps you disagree. However neither view merits the sort of treatment that Francois is getting.
Hank Roberts says
> gatekeeper
Bogus. A lie, from a sock puppet at CA (yours?).
A few fools believed it.
The ‘Contributors’ who manage RC are listed. I’m not one.
I’m a reader, with no ‘gatekeeper’ power on this or any weblog.
I urge people to learn for themselves, check what people claim against original sources, check references.
If that scares you away from learning, I’m doing it wrong. I’ll try to be gentler with you hereafter.
Hank Roberts says
“public policies” … “the last time evolutionary biologists did”
You don’t believe antibiotic resistance evolves? That caution is the most recent recommendation I find. Got some newer example?
Hank Roberts says
http://www.climatescience.gov/Library/sap/sap3-1/final-report/default.htm
Climate Models: An Assessment of Strengths and Limitations
Final Report, Synthesis and Assessment Product 3.1
See also press release (dtd 31 July 2008) and brochure from the Department of Energy.
Climate Models: An Assessment of Strengths and Limitations. A Report by the U.S. Climate Change Science Program. [Bader, D.C., Covey, C., Gutowski, W.J., Held, I.M., Kunkel, K.E., Miller, R.L., Tokmakian, R.T., Zhang, M.H. (Authors)]. U.S. Department of Energy, Washington, DC, USA.
Mark says
Richard, #558.
I think the problem is that this questionnaire will not change anything. Those who think AGW is right will see it proving that. Those that think AGW is wrong will see proof for them in it.
A better system would be to put up some of the statements pro and anti AGW and ask respondents to rate them for accuracy, irrelevancy or lunacy.
But they never ask me. And they have the gall to say “our survey showed that nobody wanted cheap medium sliced bread, so we got rid of it!”.
Peter Carbonetto says
Brilliant post.
Thank you, Spencer!
Richard Sycamore says
#561
Thank you for the reference, although I’ve read it before and it merely re-asserts what Weart asserts above. Specifically:
“Climate sensitivity is not a model input. It emerges from explicitly resolved physics, subgrid-scale parameterizations, and numerical approximations used by the models—many of which differ from model to model—particularly those related to clouds and ocean mixing. The climate sensitivity of a model can be changed by modifying parameters that are poorly constrained by observations or theory.”
Francois is not asking for an assessment of model strengths and limitations. He is asking for an exposition of the models’ core calculation of climate sensitivity. Spencer Weart asserts, not that this has already been done, but that this is not possible – due to the emergent nature of the calculation. So your reference very much sidesteps the question. But thank you all the same.
Richard Sycamore says
#555 Those, err “sources” fail to make your argument, unfortunately.
But I want to thank Hank #561 for the relevant reference, which clarifies the important difference between “equilibrium sensitivity” and “transient climate response”:
“Equilibrium sensitivity is defined as the long-term near-surface temperature increase after atmospheric carbon dioxide has been doubled from preindustrial levels but thereafter held constant until the Earth reaches a new steady state, as described in the preceding paragraph. Transient climate response or TCR is defined by assuming that carbon dioxide increases by 1% per year and then recording the temperature increase at the time carbon dioxide doubles (about 70 years after the increase begins). TCR depends on how quickly the climate adjusts to forcing, as well as on equilibrium sensitivity. The climate’s adjustment time itself depends on equilibrium sensitivity and on the rate and depth to which heat is mixed into the ocean, because the depth of heat penetration tends to be greater in models with greater sensitivity (Hansen et al. 1985; Wigley and Schlesinger 1985). Accounting for ocean heat uptake complicates many attempts at estimating sensitivity from observations, as outlined below.”
This helps answer my question of why Arrhenius’s calculation of equilibrium sensitivity might not be relevant to the transient climate response observed over a given short interval. More specifically:
“Equilibrium sensitivity depends on the strengths of feedback processes involving water vapor, clouds, and snow or ice extents (see, e.g., Hansen et al. 1984; Roe and Baker 2007). Small changes in the strengths of feedback processes can create large changes in sensitivity, making it difficult to tightly constrain climate sensitivity by restricting the strength of each relevant feedback process. As a result, research aimed at constraining climate sensitivity—and evaluating the sensitivities generated by models—is not limited to studies of these individual feedback processes. Studies of observed climate responses on short time scales (e.g., the response to volcanic eruptions or the 11-year solar cycle) and on long time scales (e.g., the climate of last glacial maximum 20,000 years ago) also play central roles in the continuing effort to constrain sensitivity. The quantitative value of each of these observational constraints is limited by the quality and length of relevant observational records, as well as the necessity in several cases to simultaneously restrict ocean heat uptake and equilibrium sensitivity. Equilibrium warming is directly relevant when considering paleoclimates, where observations represent periods that are very long compared to the climate’s adjustment time. The transient climate response is more directly relevant to the attribution of recent warming and projections for the next century.”
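Incidentally, the “(about 70 years after the increase begins)” in that definition is nothing more than compound-interest arithmetic, easy to check:

```python
import math

# Time for CO2 to double at a compounded 1% per year increase:
#   1.01 ** t = 2  ->  t = ln(2) / ln(1.01)
t_double = math.log(2.0) / math.log(1.01)
print(f"Doubling time at 1%/yr: {t_double:.1f} years")   # ~69.7 years
```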
The GCMs are loaded with assumptions that are, as the quotes above indicate, worthy of discussion and study.
As for the idea that “outsiders” do not have the necessary skill to overturn any of the assumptions amongst insiders, I think that conventional wisdom has been shown to be false in some exceptional cases. But I’ll spare you the examples that readily come to mind.
David B. Benson says
Francois Ouellette — See my comment #544. Other harms include to ski resorts, likely Nepalese peasants, and probably eastern Europe. The Tibetan glaciers are doomed which will result in harms to Chinese.
While no individual hurricane can be directly attributed to current global warming, it seems that increased average intensity can.
Hadley Centre has written that global warming will result in more extreme precipitation events. Chile had a drought last summer and two serious floods last winter, rather rare in the historical record.
Jonick Radge says
“Ray, i don’t see how I was agressive. Provocative, maybe.”
Translation: “What? Little ol’ me behaving like a troll [flutters eyelashes]?”
Lloyd Flack says
All the climate models that we create are wrong, both the simple back-of-an-envelope ones and the complicated GCMs. The question is which ones are useful and for what purposes?
The GCMs are designed to help give a detailed understanding of the climate and to allow useful long term predictions to be made. They are not needed to show that AGW is happening. The pattern of the trends is sufficient for that. They are needed to help put numbers to the CO2 sensitivity. They are not the only source of such figures. For example, we can make independent estimates from the current climate temperature record and from the Last Glacial Maximum.
Unfortunately these models are very complicated and it is hard for someone not highly familiar with the subject matter to check them and see whether they are doing something plausible. They look like black boxes even though they are not.
Simpler models can be created which show the general outline of what is going on. These are much easier to understand and check. They can include reasonable approximations of the effects of some of the main drivers of trends. But unfortunately, not of all important drivers. There will be emergent phenomena that will just not show up in simple models, for example circulation effects. We can put in our estimates of the average values of such effects but critics will quite reasonably question these estimates. No one has created a simple model that cannot be criticized as leaving out something important or including a challengable estimate of some parameter. And they are biased.
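To show what I mean by a simple model, and why its parameter choices are so easy to challenge, here is a zero-dimensional forcing-and-feedback sketch using commonly quoted round values (every one of them contestable):

```python
# Zero-dimensional forcing-and-feedback sketch. The values are commonly
# quoted round numbers, not outputs of any GCM, and each one can be disputed.
F_2X = 3.7      # W m^-2, radiative forcing for doubled CO2
PLANCK = 3.2    # W m^-2 K^-1, basic blackbody (Planck) response

feedbacks = {   # W m^-2 K^-1, positive values amplify the warming
    "water vapour + lapse rate": 1.0,
    "surface albedo": 0.3,
    "clouds (most uncertain)": 0.7,
}

no_feedback = F_2X / PLANCK                                  # ~1.2 K
with_feedbacks = F_2X / (PLANCK - sum(feedbacks.values()))   # ~3.1 K

print(f"2xCO2 warming, no feedbacks: {no_feedback:.1f} K")
print(f"2xCO2 warming, with the feedbacks above: {with_feedbacks:.1f} K")
# Shave 0.3 off the cloud term and the answer drops to ~2.5 K; add 0.3 and it
# rises to ~4.1 K. That sensitivity to challengeable parameters is exactly
# why such simple models settle nothing.
```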
For reliable predictions of temperature sensitivity we have to use complicated models. We have a complicated system.
And I agree with Francois that attributing harm happening now to AGW is a bit of a stretch. Maybe some of the droughts and floods are a result of AGW but I have not seen evidence that is more than strongly indicative of this. It is not a good idea to attribute current weather disasters to AGW. Most of them won’t be a result of AGW and this can lead to a boy-who-cried-wolf situation. We have serious problems heading towards us and have to act now because we are dealing with systems with a large inertia and that are not completely reversible. We have to act before the problems are serious and we have to get the message across that waiting for obvious problems is a bad idea. We cannot afford to wait till it is obvious even to those looking for reasons to believe it is not happening.
Rod B says
Dave (554), what do you make of the quote from one of your linked references, “…planktonic foraminifera diversified, and dinoflagellates bloomed. Success was also enjoyed by the mammals, who radiated profusely around this time” in the context of this discussion?
Barton Paul Levenson says
F.O. writes:
Tell it to the Australians.
Agricultural lands worldwide used to be 20% in drought at any given time during the ’50s and ’60s. That figure is now 30%. People die from reduced food production. That’s happening NOW.
Barton Paul Levenson says
Jim —
Yes, I got the name wrong! It is G.S. Callendar. Sorry about that.
Barton Paul Levenson says
Richard Sycamore writes:
You mean you want the math and the source code? Well, I’d start with some books on atmospheric radiation like J.T. Houghton’s “The Physics of Atmospheres” (3rd ed. 2002), Grant W. Petty’s “A First Course in Atmospheric Radiation” (2nd ed. 2006), and Goody and Yung’s “Atmospheric Radiation” (2nd ed. 1989). Then add Henderson-Sellers and McGuffie’s “A Climate Modelling Primer” (1987). And look up and read some classic papers on the subject, like Manabe and Strickler 1964, where they introduce the first modern radiative-convective model, and Manabe and Wetherald 1967, where they substantially improved the first model by adding changes such as assuming fixed relative humidity with altitude rather than fixed absolute humidity.
Oh, and, of course, do all the problems in the books mentioned above.
Some books on climatology would also be useful, as you need more than atmospheric radiation to write a GCM. Henderson-Sellers and Robinson’s “Contemporary Climatology” (1987) is a good place to start, and don’t neglect Sellers’s classic “Physical Climatology” (1965), though you’ll have to translate the English units into metric. Hartmann’s 1994 “Global Physical Climatology” is also a good one. And our own Raymond Pierrehumbert (“Raypierre,” il dit) has a pretty good climatology primer online:
http://geosci.uchicago.edu/~rtp1/ClimateBook/ClimateVol1.pdf
You’ll also need to know a procedural computer language such as Fortran, and C/C++ and Pascal/Delphi can also be used. Stay away from interpreted languages and GUI languages, as they concentrate mostly on providing a pretty interface, whereas for a simulation you want speed, speed, and more speed. Ray uses Python, which I personally wouldn’t touch with a ten-foot pole, but if you’re not doing an actual climate simulation it works well enough.
I’m writing a book on how to write RCMs, but it probably won’t be finished for another year or so.
Francois Ouellette says
Ray Ladbury,
This is getting tiresome. All you can repeat is that I have “profound ignorance of the culture of not just climate science but of most sciences in general”, despite having demonstrated that I worked for a number of years in academia, and attained quite a high level there. Even now, 10 years after I left and stopped publishing, I am still asked about once a month to review a paper. Some people somewhere must still believe that I have some understanding of science. I’m still waiting for a list of your credentials. Does being an engineer working on damage to electronic components give you such a special insight into the scientific process, and the culture of biologists?
And you make further interpretations of what I said. Where did I say that there was “fraud and incompetence”? I don’t see either of those two words in all the comments I made. I said that the academic system of peer-reviewed publications had flaws and virtues, something that is not very controversial, and not even provocative. Why would there be entire conferences that specifically deal with peer review if the process was perfect? So the question is: are the flaws in the process a potential problem when it comes to supplying scientific opinion on matters of public policy? We all know about the “consensus” mantra. But if the process that leads to a consensus is flawed, then the consensus is not worth anything. So we must pay attention to this too. There was probably a consensus amongst NASA engineers that the Space Shuttle was safe, and then it exploded, and Richard Feynman, who was totally independent and not an expert in aeronautics, discovered that there were many flaws in how that consensus was forged. The “culture” was wrong. So it is an issue. Jumping up and down and shouting “consensus! consensus!” does not help. There are many examples where the consensus was wrong (as there are many when it was right).
Do I have a solution? I wish I had one! I have proposed elsewhere that our governments should have set up an independent lab (independent from the academic publication system), hired the best minds in the field, implemented a vast program of detailed observations on climate, and put them to work. Ideally, you would even want a second lab, independently of the first, and see if they come to the same conclusion. So a sort of Manhattan project for climate. Would that work? Hell, I have no idea. But seeing that our range of estimates for CO2 sensitivity does not seem to narrow year over year, maybe it’s time to try something new.
You know, as part of my little experience that you despise so much, I was once part of a major corporation developing fiber optic networks. The founder had hired the “best minds in the field”, literally, and given them a big research budget and all the equipment they needed. Dare I say that I was among them? I only say it because you always question my credentials, so I feel the need to pile on. Anyway, after only a few months, we had results that far surpassed what you could find in “academic” papers. We could look at all those post-deadline papers and laugh at them. So that was the result of removing the constraints of academic research, whether it be budget constraints, the need to publish, the teaching requirement, the peer pressure. It was also achieved by focusing on a well-defined problem. And no one had time to run a blog… Now before you accuse me again of profound ignorance, let me just say that I do not pretend that it would work in this case. But why not think about it? Why not give it a try?
Hank Roberts says
> 554, 569, past warming good
Rod, you’ve let slip that you think and read and aren’t dumb.
If past warming meant the last 100 years, damage is in the pipeline.
If past warming meant the PETM and other extremes, examples aplenty.
Diverting the thread into confused blather? Priceless.
Richard Sycamore says
#572 And can you be a little more specific about the derivation of 450ppm CO2 as the alleged tipping point toward runaway feedback? We all know the source and the date of this estimate. The question is the arithmetic behind the derivation. The answer is not contained in your 1980s textbooks.
David B. Benson says
Lloyd Flack (568) — Sea level rise is due to AGW. Already adversely affecting some Pacific islanders. Glacier melt is due to AGW. Already adversely affecting peasants in Bolivia.
Marcus says
Re #575: Could you be specific about what recent credible scientific source states that 450 ppm is a tipping point toward runaway feedback?
450 ppm is certainly the value that many scientists have chosen to delineate “dangerous anthropogenic interference” (DAI), a choice which does involve some amount of judgment call. And also acknowledgment of uncertainty: after all, depending on feedback 450 ppm can lead to 1 degree C to 4 degrees C warming or more (see Ramanathan and Feng, PNAS, 2008 for an analysis of committed warming after we’ve stripped away aerosol cooling). And there isn’t yet a consensus estimate of how much damage a given warming causes, though again, 2 degrees C has been chosen by many scientists and politicians as a value to delineate DAI. I think what is widely agreed on is that damage increases with change in temperature from preindustrial above some minimal value (say 1 degree), and that it is likely to be a non-linear relationship, with the damage curve getting steeper the more you change the system from preindustrial.
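To make the 1 to 4 degrees C range concrete, here is the back-of-envelope version, using the standard simplified forcing expression of Myhre et al. (1998) and a purely illustrative spread of sensitivities:

```python
import math

# Back-of-envelope check, using the simplified CO2 forcing expression of
# Myhre et al. (1998): dF = 5.35 * ln(C / C0). The three sensitivities
# below are illustrative, not endorsed values.
C0, C = 280.0, 450.0                       # ppm CO2, preindustrial and target
forcing = 5.35 * math.log(C / C0)          # ~2.5 W m^-2

for sensitivity in (0.4, 0.8, 1.2):        # K per (W m^-2)
    warming = forcing * sensitivity
    print(f"sensitivity {sensitivity} K/(W m^-2) -> ~{warming:.1f} K at 450 ppm")
```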
But a threshold like that is very different from a “runaway feedback”, which has been out of vogue among climate scientists for longer than I’ve been in the field (about a decade).
Richard Sycamore says
#572
Whereas Spencer Weart suggests that an engineering quality exposition of the climate sensitivity derivation is not possible, you seem to assert that it is not only possible, but that it has already been done, it’s just distributed amongst a number of textbooks. Which is it? Supposing you are correct, do you think IPCC should be charged with the synthesis of this material? Or should each attempt his own synthesis?
[Response: You are misreading the entire post, the issue in which is whether ‘simple (quantitative) answers’ can be given to simple questions about complex systems. They can’t and so complexities are built into databases (HITRAN), radiative transfer models and GCMs. If that’s what you want, but you are not prepared to get stuck in to the details, you are perpetually going to be dissatisfied. – gavin]
Richard Sycamore says
Seems to me that Barton Paul Levenson and Francois O agree – that a synthesis of the climate sensitivity derivation ought to be possible … despite the significant challenges outlined by Dr. Weart.
Hank Roberts says
> alleged tipping point toward runaway feedback?
Straw man.
Hank Roberts says
http://scholar.google.com/scholar?sourceid=Mozilla-search&q=%2Bclimate+%2B%22tipping+point%22+%2B%22runaway+feedback%22
Lloyd Flack says
#576 David Benson,
Could you give me a reference to the Bolivian glacial melt? Mountain glacial retreat and its effects on freshwater supplies are among the first unequivocal adverse effects of AGW that I would expect.
Sea level rise is going to be blamed often when the true problem is local subsidence. We have to disentangle these effects. Which Pacific islands are affected, how much is the claimed local sea level rise, and what is the estimated rise in that part of the Pacific as a result of AGW? What are the IPCC estimates of sea level rise in that region?
#573 Francois Ouellette,
I wonder whether the reason why the range of sensitivity estimates has not been shrinking is not so much a problem with the models as it is a problem with the analysis of the output of the models.
The point estimate is near 3ºC for doubling of CO2. If the true value is near the initial point estimate then I would not expect the point estimate to change much. I think this is what has been happening.
As I understand it, the range of the estimates for sensitivity comes from the ensemble of estimates from models with different initial assumptions. Unless you narrow the range of model assumptions you are likely not to narrow the range of the predictions. For reasons of computational tractability the models are built on a coarser spatial scale than what would be required to give good estimates of the feedback from clouds. Nonlinearities in the models can lead to large changes in the output for small changes in the assumptions. In other words, as I see it there are two ways to narrow the range of the model output. One is to have a lot more computing power than we have available now so we can model phenomena on a finer scale. The other is to narrow the range of the initial assumptions for the models.
However there is another way to narrow the range of the sensitivity estimates. This is to combine the results from different analyses in a meta-analysis. Annan and Hargreaves have done this for three sensitivity estimates and come up with a narrower confidence interval than the initial estimates. Further extension of this should allow more precise estimates to be made of CO2 sensitivity. To do this we need people who are both familiar with the models and their properties and with the required statistical methods, not a common combination. To combine different analyses we need to know the degree of dependency among them, else we could get an over optimistic narrowing of the confidence interval by combining the data sources.
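As a toy version of the combination step, here is the textbook inverse-variance combination for idealised Gaussian estimates; the numbers are invented, and real sensitivity estimates are skewed and not fully independent, which is precisely the catch:

```python
# Toy inverse-variance combination of three *independent* Gaussian estimates
# of climate sensitivity (K per CO2 doubling). The (mean, std dev) pairs are
# invented; real constraints are skewed and are not fully independent.
estimates = [(3.0, 1.5), (2.5, 1.2), (3.5, 1.8)]

weights = [1.0 / sd ** 2 for _, sd in estimates]
mean = sum(w * m for (m, _), w in zip(estimates, weights)) / sum(weights)
sd = (1.0 / sum(weights)) ** 0.5

print(f"Combined estimate: {mean:.2f} +/- {sd:.2f} K")
# The combined spread is narrower than any single input -- but only because
# independence was assumed. Shared assumptions between analyses would make
# this narrowing over-optimistic, which is the dependency problem noted above.
```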
In summary, I think, to narrow the range of estimates of CO2 sensitivity, the best ways are to either improve the information fed into the models or to improve the statistical analysis of the output. Improving the models themselves will probably have less benefit. We probably could get better estimates from the available model output by doing improved analyses.
Mark says
Richard, #579.
Question: it is possible. Is it worth it?
Richard Sycamore says
Straw man?!
http://www.greenpeace.org.uk/files/pdfs/climate/hansen.pdf
“The single most pertinent number emerging from Cenozoic climate studies is the level of atmospheric CO2 at which ice sheets began to form as the planet cooled during the past 50 million years. Our research suggests that this tipping point was at about 450 ppm of CO2 (http://arxiv.org/abs/0804.1126 and http://arxiv.org/abs/0804.1135).”
“If humanity is so foolish as to burn all fossil fuels, thus more than doubling atmospheric CO2 from its pre-industrial level of 280 ppm, we will have set the planet on an inexorable course to an ice-free state, with all the disasters that such a course must hold for man and beast.”
According to my inexpert read of Hansen, the only thing preventing a GHG-driven runaway at the PETM was the smashing of India into Asia:
“The imbalance of carbon sources and sinks (thus the change of atmospheric CO2) depends upon plate tectonics (continental drift), because it is the rate of subduction of carbonate-rich ocean crust beneath moving continental plates that determines the rate of volcanic emission of CO2. Also the rate of weathering (the primary long-term sink of surface carbon) is a function of the rate at which fresh rock is exposed by mountain building associated with plate tectonics.
Specifically, during the period 60 My BP (60 million years before present) to 50 My BP India was plowing north rapidly (20 cm per year) through the Tethys Ocean and in the process subducting carbonate-rich ocean crust, causing atmospheric CO2 to increase. Global temperature peaked 50 My ago when India crashed into Asia. Available proxy measures of CO2 indicate that atmospheric CO2 reached 1000-2000 ppm at that time. The Earth was at least 12°C warmer than today, there were no ice sheets on the planet, and sea level was about 75 meters higher.
With the collision of India and Asia the subduction source for CO2 emissions declined, but the weathering sink increased as the Himalayas and Tibetan Plateau were pushed up. Thus the past 50 My have generally been a period of declining atmospheric CO2 and a cooling planet.”
Re #578 A complex answer is ok. I agree it would be silly to expect something one-page-simple. I’m willing to read a 300-page appendix if the answer is in it.
Dave Rado says
Rod B, #569, are you really advocating a repetition of an event that killed off 90% of animal and plant species on the basis that as a result the other 10% were able to prosper? Or are you just being argumentative for the sake of it?
Lloyd Flack says
Dave Rado,
Aren’t you confusing the PETM with the End-Permian mass extinction? That had an over 90% extinction rate (at least for marine life forms). The PETM, while it did increase the extinction rate, was nowhere near that severe.
Rod B says
That’s true, Hank (574); it was unclear what “past warming” they were referring to.
Rod B says
Dave (583), neither. I was just quizzical about a link referenced as solid support for the damage of warming containing phrases like “…mammals flourished…”
Richard Sycamore says
#583
51% of the population probably think so. But that’s just a guess.
I suppose you don’t like the idea of Bray & von Storch’s survey either?
Lloyd Flack says
#584 Richard Sycamore,
I think you are actually referring to the Eocene Optimum rather than the Paleocene-Eocene Thermal Maximum. The PETM was an earlier brief (
[Response: don’t use a raw < symbol, use & l t ; instead (with no spaces). – gavin]
Martin Vermeer says
#584 Richard Sycamore:
The “straw man” accusation refers to your use of the expression “runaway feedback”. Either you don’t know what you’re talking about, or you’re using the wrong terminology. A “runaway feedback”, i.e., one over 100% leading to a Venus-like loss of the oceans, is not going to happen on Earth for another billion years or so.
But Marcus in #577 explained that already.
Francois Ouellette says
Lloyd Flack,
I understand your point. I’ve read a few papers on models and all, but I will not claim here to have the most up-to-date expertise. I don’t want to engage in too technical a debate. So what follows is really just the opinion of someone who is scientifically literate, and is more about methodology than theory.
You say: “to narrow the range of estimates of CO2 sensitivity, the best ways are to either improve the information fed into the models or to improve the statistical analysis of the output.” That makes a lot of sense, but I guess as an experimental physicist, I would be inclined to go for the first choice. I would be very wary of using too much statistics just to study the output of models. At some point, all this has to relate to the real world. An external observer cannot but be surprised by such techniques as using ensemble averages of models as if they represented some kind of “better” estimate of the real world. Despite all the justifications, to me it remains a dangerous course. On the other hand, what I retain from my readings is how difficult it is to really compare the outputs of models to the actual world, and in large part this is due to a lack of comprehensive real world data, and also to the fact that the numbers that the models give you do not have a simple correspondence with what is actually measured. But in the end, it is essential to improve the correspondence between the models and the real world. That is the essence of modelling. That’s why I believe that there should be a much greater effort at collecting data, and using them to improve the models. There should also be more effort at figuring out better metrics to assess the performance of models. What I’ve seen is quite rudimentary. It’s not a simple problem, and maybe outside expertise is needed here.
But also, when I was talking in my earlier post of a “focused problem”, it does seem to me that the real sticking point is to have a more precise estimate of the feedback parameter, and in particular water vapor and cloud feedbacks. But we can’t just rely on models, we have to measure them! It seems to me that a “focused effort” here, to try and empirically measure those numbers, would be appropriate. Imagination is required to devise new techniques.
I think what is impeding this mostly empirical and experimental program is purely a budgetary constraint. It costs much more to send new satellites or set up some global monitoring system than to pay modellers and buy more powerful computers. And also models can churn out results much more rapidly, and appear to be more efficient. But that is pervasive in all sciences. As an experimental physicist, I know that my colleagues who were fiddling with purely theoretical problems, and solving them numerically, could publish many more papers. Setting up and debugging an experiment takes a lot more time than debugging software, and costs a lot more. But in the end nothing beats an experimental demonstration, if only because experiments always surprise you and more often than not make you revise your initial assumptions.
Marcus says
#584: Richard Sycamore, the 450 ppm threshold that Hansen is referring to should be considered a “tipping point” or a “threshold” more than a “runaway feedback”. It is a tipping point which also contains a positive feedback. So, hypothetically, there may exist an anthropogenic GHG forcing at which the Greenland ice sheet (GIS) is mostly stable, but where an additional 0.1 W/m2 of GHG forcing would lead eventually to the total disintegration of the GIS, and this disintegration of the GIS would contribute additional albedo forcing. So, hypothetically, there might exist a CO2 concentration (say 450 ppm) where temperature increase is one value (say 2 degrees). But at 455 ppm, due to crossing the GIS tipping point the melting would lead to albedo changes leading to a significantly larger global temperature increase (say 4 degrees). This isn’t actually a “runaway” feedback, despite occasional terminology confusion among one or two posters on this site (Lynn comes to mind).
That there should exist such a threshold for the GIS is, I believe, fairly well accepted: the GIS is only stable because the height of the ice combined with the lapse rate of the atmosphere ensures that the top of the sheet is cool enough not to melt, whereas if the GIS did not exist, the temperature at ground level would _not_ be sufficient to maintain an icesheet (see, also, hysteresis). I don’t personally know what the change in albedo feedback would be, and therefore what temperature change would result: however we do know that the disappearance of the GIS would directly lead to 7m of sea level rise, which is I think what most people worry about.
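To put rough numbers on the lapse-rate argument (round values chosen only for scale, not from any specific study):

```python
# Rough scale of the ice-sheet / lapse-rate argument (round numbers only):
ICE_THICKNESS_KM = 3.0          # central Greenland ice is roughly 3 km thick
LAPSE_RATE_K_PER_KM = 6.5       # typical tropospheric lapse rate

elevation_cooling = ICE_THICKNESS_KM * LAPSE_RATE_K_PER_KM
print(f"Top of the ice sheet is ~{elevation_cooling:.0f} K colder than the "
      "same spot would be at sea level")
# Remove the ice and that location sits ~20 K warmer -- too warm for a new
# sheet to regrow, which is the hysteresis referred to above.
```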
The exact value of the GIS threshold is not as well accepted. Hansen proposes 450 ppm based on historical evidence. Given that conditions today are not exactly equal to historical conditions (solar influence, oceanic currents, other GHGs, atmospheric patterns, etc) historical analogues can only be inexact proxies. The threshold could be more than 450 ppm (perhaps much more) though it could also be less. I think that our understanding of ice sheet dynamics is, unfortunately, not yet at the point where we can make a good “bottom-up” estimate of the critical temperature threshold for ice sheet stability.
Also, the timing of this event is important: it is likely to be quite slow (see the Pfeffer work) and therefore we might have a window, after seeing evidence that we have crossed the threshold, in which to reduce forcing quickly and cross back before it is too late (possibly using geoengineering).
Of course, geoengineering is likely to produce its own drawbacks. As would radical restructuring of the economy in a short time period. Therefore, judicious reduction of GHG emissions now would seem to be a preferred option to waiting until we are forced to make precipitous reductions later.
Hank Roberts says
> what is impeding this mostly empirical and experimental
> program is purely a budgetary constraint.
The problem was Triana did get built but did not get launched. It’s sitting in a warehouse.
Put an instrument in the right place and it will be able to measure, all the time, both sunlight reaching Earth and heat leaving Earth from the same point of view. NO multiple different satellite instruments in various orbits for various periods of time. One measurement of the whole sunlit face of the planet, constantly.
The instrument would have provided the information needed to resolve the open questions that we’ve been instead trying to piece together from fragmentary instrument records.
It wasn’t a budget problem that kept the satellite on the ground.
Look it up.
David B. Benson says
Lloyd Flack (582) — The Bolivian glacier problem was in a DailyScience news story; I didn’t keep a link. The SLR in the Pacific is real, primarily due to warmer water.
There are similar glacier problems already in Nepal, if I remember correctly.
Hank Roberts says
http://scholar.google.com/scholar?hl=en&lr=&safe=off&scoring=r&q=CHACALTAYA+glacier&as_ylo=2007
Richard Sycamore says
Could someone clarify for me the Hansen statement on the cooling effect of continental collision? Is he suggesting that runaway warming was prevented by this happy accident? And if so, does that not imply that that is what is necessarily in store for us beyond the 450ppm tipping point?
The more substantive the reply, the better.
Barton Paul Levenson says
The more new rock exposed, the more CO2 gets weathered out of the air and downstream to the sea.
G.R.L. Cowan, H2 energy fan ’til ~1996 says
Weathering those silicate rocks manually, or anyway with hand-made machinery, turns out to be a relatively low-energy-cost way of sequestering CO2, as previously discussed.
Martin Vermeer says
Richard Sycamore #597,
what Barton said. And this is your starting link:
http://maureenraymo.com/uplift_overview.php
…and no, no “runaway”. Just “tipping point”.