Guest commentary by Spencer R. Weart, American Institute of Physics
I often get emails from scientifically trained people who are looking for a straightforward calculation of the global warming that greenhouse gas emissions will bring. What are the physics equations and data on gases that predict just how far the temperature will rise? A natural question, when public expositions of the greenhouse effect usually present it as a matter of elementary physics. These people, typically senior engineers, get suspicious when experts seem to evade their question. Some try to work out the answer themselves (Lord Monckton for example) and complain that the experts dismiss their beautiful logic.
The engineers’ demand that the case for dangerous global warming be proved with a page or so of equations does sound reasonable, and it has a long history. The history reveals how the nature of the climate system inevitably betrays a lover of simple answers.
The simplest approach to calculating the Earth’s surface temperature would be to treat the atmosphere as a single uniform slab, like a pane of glass suspended above the surface (much as we see in elementary explanations of the “greenhouse” effect). But the equations do not yield a number for global warming that is even remotely plausible. You can’t work with an average, squashing together the way heat radiation goes through the dense, warm, humid lower atmosphere with the way it goes through the thin, cold, dry upper atmosphere. Already in the 19th century, physicists moved on to a “one-dimensional” model. That is, they pretended that the atmosphere was the same everywhere around the planet, and studied how radiation was transmitted or absorbed as it went up or down through a column of air stretching from ground level to the top of the atmosphere. This is the study of “radiative transfer,” an elegant and difficult branch of theory. You would figure how sunlight passed through each layer of the atmosphere to the surface, and how the heat energy that was radiated back up from the surface heated up each layer, and was shuttled back and forth among the layers, or escaped into space.
When students learn physics, they are taught about many simple systems that bow to the power of a few laws, yielding wonderfully precise answers: a page or so of equations and you’re done. Teachers rarely point out that these systems are plucked from a far larger set of systems that are mostly nowhere near so tractable. The one-dimensional atmospheric model can’t be solved with a page of mathematics. You have to divide the column of air into a set of levels, get out your pencil or computer, and calculate what happens at each level. Worse, carbon dioxide and water vapor (the two main greenhouse gases) absorb and scatter differently at different wavelengths. So you have to make the same long set of calculations repeatedly, once for each section of the radiation spectrum.
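To give a feel for the bookkeeping involved, here is a minimal sketch of such a level-by-level, band-by-band calculation — a toy gray-band model with invented absorptivities, not any historical or production code:

```python
import numpy as np

SIGMA = 5.67e-8          # Stefan-Boltzmann constant, W m^-2 K^-4
S_ABS = 240.0            # absorbed solar flux, W m^-2 (global mean)
N = 10                   # number of atmospheric levels

# Two illustrative spectral bands: the fraction of thermal emission in
# each band, and each layer's absorptivity there (made-up numbers).
band_weight = np.array([0.6, 0.4])
layer_abs = np.array([0.30, 0.05])   # an absorbing band and a "window"

def heating(T_s, T_a):
    """One up/down sweep of longwave radiation through the column.
    Returns net heating of the surface and of each layer (W m^-2)."""
    heat_s, heat_a = S_ABS, np.zeros(N)      # sunlight heats the surface
    for w, a in zip(band_weight, layer_abs):
        E = a * w * SIGMA * T_a**4           # layer emission, per direction
        heat_a -= 2.0 * E                    # layers emit both up and down
        up = w * SIGMA * T_s**4              # surface emission in this band
        heat_s -= up
        for i in range(N):                   # sweep upward, bottom to top
            heat_a[i] += a * up              # layer absorbs part of the stream
            up = (1.0 - a) * up + E[i]
        down = 0.0
        for i in reversed(range(N)):         # sweep downward, top to bottom
            heat_a[i] += a * down
            down = (1.0 - a) * down + E[i]
        heat_s += down                       # back-radiation warms the ground
    return heat_s, heat_a

T_s, T_a = 288.0, np.full(N, 250.0)          # first guesses
for _ in range(100_000):                     # relax toward radiative balance
    hs, ha = heating(T_s, T_a)
    T_s += 0.001 * hs
    T_a += 0.002 * ha
print(f"surface {T_s:.1f} K, lowest layer {T_a[0]:.1f} K, top {T_a[-1]:.1f} K")
```

Even this toy needs a hundred thousand relaxation steps and a loop over bands; a real calculation adds thousands of spectral lines, pressure broadening, and overlap between gases.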
It was not until the 1950s that scientists had both good data on the absorption of infrared radiation, and digital computers that could speed through the multitudinous calculations. Gilbert N. Plass used the data and computers to demonstrate that adding carbon dioxide to a column of air would raise the surface temperature. But nobody believed the precise number he calculated (2.5ºC of warming if the level of CO2 doubled). Critics pointed out that he had ignored a number of crucial effects. First of all, if global temperature started to rise, the atmosphere would contain more water vapor. Its own greenhouse effect would make for more warming. On the other hand, with more water vapor wouldn’t there be more clouds? And wouldn’t those shade the planet and make for less warming? Neither Plass nor anyone before him had tried to calculate changes in cloudiness. (For details and references see this history site.)
Fritz Möller followed up with a pioneering computation that took into account the increase of absolute humidity with temperature. Oops… his results showed a monstrous feedback. As the humidity rose, the water vapor would add its greenhouse effect, and the temperature might soar. The model could give an almost arbitrarily high temperature! This weird result stimulated Syukuro Manabe to develop a more realistic one-dimensional model. He included in his column of air the way convective updrafts carry heat up from the surface, a basic process that nearly every earlier calculation had failed to take into account. It was no wonder Möller’s surface had heated up without limit: his model had not used the fact that hot air would rise. Manabe also worked up a rough calculation for the effects of clouds. By 1967, in collaboration with Richard Wetherald, he was ready to see what might result from raising the level of CO2. Their model predicted that if the amount of CO2 doubled, global temperature would rise roughly two degrees C. This was probably the first paper to convince many scientists that they needed to think seriously about greenhouse warming. The computation was, so to speak, a “proof of principle.”
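Manabe's convective fix is, at heart, a simple loop: wherever radiation alone would leave the lapse rate steeper than convection allows, mix the offending levels back to a critical lapse rate while conserving energy. A minimal sketch of my own of that adjustment step, assuming equal-mass levels (real schemes weight by pressure thickness):

```python
import numpy as np

def convective_adjustment(T, z, gamma_crit=6.5e-3):
    """Simplified Manabe-style adjustment: wherever the lapse rate
    -dT/dz exceeds gamma_crit (K/m), relax the pair of levels back to
    the critical lapse rate while conserving their summed energy."""
    T = T.copy()
    for _ in range(500):                     # sweep until nothing changes
        changed = False
        for i in range(len(T) - 1):
            dz = z[i + 1] - z[i]
            if -(T[i + 1] - T[i]) / dz > gamma_crit + 1e-9:
                mean = 0.5 * (T[i] + T[i + 1])
                T[i] = mean + 0.5 * gamma_crit * dz      # lower level
                T[i + 1] = mean - 0.5 * gamma_crit * dz  # upper level
                changed = True
        if not changed:
            break
    return T

z = np.linspace(0.0, 10e3, 11)        # level heights, m
T = 330.0 - 20e-3 * z                 # 20 K/km: unstable, Moeller-style runaway
print(np.round(convective_adjustment(T, z), 1))   # relaxed to 6.5 K/km
```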
But it would do little good to present a copy of the Manabe-Wetherald paper to a senior engineer who demands a proof that global warming is a problem. The paper gives only a sketch of complex and lengthy computations that take place, so to speak, offstage. And nobody at the time or since would trust the paper’s numbers as a precise prediction. There were still too many important factors that the model did not include. For example, it was only in the 1970s that scientists realized they had to take into account how smoke, dust and other aerosols from human activity interact with radiation, and how the aerosols affect cloudiness as well. And so on and so forth.
The greenhouse problem was not the first time climatologists hit this wall. Consider, for example, attempts to calculate the trade winds, a simple and important feature of the atmosphere. For generations, theorists wrote down the basic equations for fluid flow and heat transfer on the surface of a rotating sphere, aiming to produce a precise description of our planet’s structure of convective cells and winds in a few lines of equations… or a few pages… or a few dozen pages. They always failed. It was only with the advent of powerful digital computers in the 1960s that people were able to solve the problem through millions of numerical computations. If someone asks for an “explanation” of the trade winds, we can wave our hands and talk about tropical heating, the rotation of the earth and baroclinic instability. But if we are pressed for details with actual numbers, we can do no more than dump a truckload of printouts showing all the arithmetic computations.
I’m not saying we don’t understand the greenhouse effect. We understand the basic physics just fine, and can explain it in a minute to a curious non-scientist. (Like this: greenhouse gases let sunlight through to the Earth’s surface, which gets warm; the surface sends infrared radiation back up, which is absorbed by the gases at various levels and warms up the air; the air radiates some of this energy back to the surface, keeping it warmer than it would be without the gases.) For a scientist, you can give a technical explanation in a few paragraphs. But if you want to get reliable numbers – if you want to know whether raising the level of greenhouse gases will bring a trivial warming or a catastrophe – you have to figure in humidity, convection, aerosol pollution, and a pile of other features of the climate system, all fitted together in lengthy computer runs.
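For the curious, the one-minute explanation even has a one-minute calculation behind it — the standard textbook slab estimate, which gets the sign and rough size of the effect but, as the single-slab discussion above makes clear, not a trustworthy number:

```python
SIGMA = 5.67e-8            # Stefan-Boltzmann constant, W m^-2 K^-4
S0, albedo = 1361.0, 0.3   # solar constant (W m^-2) and planetary albedo

# No greenhouse: the surface radiates straight to space.
T_eff = (S0 * (1 - albedo) / (4 * SIGMA)) ** 0.25   # about 255 K

# One perfectly absorbing "pane of glass" above the surface: the surface
# must warm until the slab can radiate the same 240 W m^-2 to space.
T_slab = 2 ** 0.25 * T_eff                          # about 303 K

print(f"{T_eff:.0f} K bare, {T_slab:.0f} K with one slab "
      f"(observed mean ~288 K: the slab model overshoots)")
```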
Physics is rich in phenomena that are simple in appearance but cannot be calculated in simple terms. Global warming is like that. People may yearn for a short, clear way to predict how much warming we are likely to face. Alas, no such simple calculation exists. The actual temperature rise is an emergent property resulting from interactions among hundreds of factors. People who refuse to acknowledge that complexity should not be surprised when their demands for an easy calculation go unanswered.
Hank Roberts says
> sensitivity estimates are completely based on
John Lang, you say you read that in the Commentary.
It’s not there. Search the words. Is it an interpretation of something you read? What?
David B. Benson says
Chris McGrath (49) wrote “Why is there ‘only’ a lag of 800 years in the icecore record between the earth coming out of a glacial period and the response of CO2?”
That's what is seen in the Antarctic ice cores, although there is a similar delay in Pacific Warm Pool data. It is a balance between the warming oceans expressing CO2 and the changes in the terrestrial carbon pool.
My amateur reading of the matter.
Fred Jorgensen says
Engineers work in the real world. When we build bridges, we make sure ‘they don’t fall down!’
But the real story is: 'Probably won't fall down for 100 years,' and then we schedule inspections and maintenance, because in the real world 'sh*t happens!' in spite of our elaborate models and building standards.
So when climate scientists and theoreticians show us computer projections and fancy statistics, well, 'We're from Missouri!'
The simple graphs of steady CO2 increase and up-and-down temperature over the last 100 years or so have this engineer slightly sceptical.
Sorry. We’re a tough crowd!
[Response: No. You are an impossible crowd (well, a crowd of one perhaps). How many times have you been told patiently that the impact of CO2 is not derived from correlations over the last 100 years, or 100,000 years? How many times have you been pointed to the physics of radiative transfer or estimates of climate sensitivity? It's clearly dozens. You haven't moved from your incorrect idea one iota. Why should anyone continue to discuss with you? – gavin]
mugwump says
RE John Mashey #34:
Part of my skepticism does indeed derive from my day job, which involves pretty heavy duty computer modeling, but not of the climate. I know how easy it is to overfit when you snoop the test data. In fact, we don’t consider a model validated until we’ve tested it against completely unseen data. Climate modelers have spent years tweaking heavily parameterized models against a very limited set of data. They are almost guaranteed to have overfit.
Another part of my skepticism is a kind of feedback. If you ask skeptical (but informed) questions of many prominent “mainstream” climate scientist bloggers you are often treated with disdain, or worse. That kind of sneering attitude only serves to make me more skeptical.
A third part of my skepticism arises from the extent to which the global warming issue has been hijacked by environmentalists who seek to use it to further their political goals.
The final part of my skepticism arises from the extent to which any published work that deviates from the “consensus” climate change view is immediately eviscerated by blogs such as this and by a flurry of negative “comment” responses in the open literature. The more rigorous the deviant research, the greater the attacks. In the words of WS himself, too often the lady doth protest too much.
At present, the only rational position (in my opinion, of course), is a skeptical one.
danny bloom says
If you want to see how the other side thinks, see my latest interview, in which I interviewed a man who does not believe in global warming. NSFW. But his views represent what a lot of Americans think and feel. It's good to know the other side. I mean, what we are up against. Forget polar cities. This comes first.
http://northwardho.blogspot.com
And while I am here: a professor named Jurgen Scheffran, a research scientist in the Program in Arms Control, Disarmament and International Security and the Center for Advanced BioEnergy Research at the University of Illinois, is among those raising concerns. In a survey of recent research published earlier this summer in the Bulletin of the Atomic Scientists, Scheffran concluded that "the impact of climate change on human and global security could extend far beyond the limited scope the world has seen thus far."
Scheffran's review included a critical analysis of four trends identified in a report by the German Advisory Council on Global Change as among those most likely to destabilize populations and governments: degradation of freshwater resources, food insecurity, natural disasters and environmental migration.
"Most critical for human survival are water and food, which are sensitive to changing climatic conditions," Scheffran said.
The degradation of these critical resources, combined with threats to populations caused by natural disasters, disease and crumbling economies and ecosystems, he said, could ultimately have "cascading effects."
"Environmental changes caused by global warming will not only affect human living conditions but may also generate larger societal effects, by threatening the infrastructures of society or by inducing social responses that aggravate the problem," he wrote. "The associated socio-economic and political stress can undermine the functioning of communities, the effectiveness of institutions, and the stability of societal structures. These degraded conditions could contribute to civil strife, and, worse, armed conflict."
In addition to global cooperation, Scheffran believes that those occupying Earth now can learn a lot about the future by studying the past.
"History has shown how dependent our culture is on a narrow window of climatic conditions for average temperature and precipitation," he said. "The great human civilizations began to flourish after the last ice age, and some disappeared due to droughts and other adverse shifts in the climate. The so-called 'Little Ice Age' in the northern hemisphere a few hundred years ago was caused by an average drop in temperature of less than a degree Celsius.
"The consequences were quite severe in parts of Europe, associated with loss of harvest and population decline," Scheffran said. "Riots and military conflicts became more likely, as a recent empirical study has suggested."
Guenter Hess says
#47, Gavin’s inline response
Gavin,
The Thermodynamic Atmosphere Effect just uses Ray Pierrehumbert's formula 3.8 from his online climate book as a model, don't you think?
Ts = (ps/prad)^(R/cp)*Trad
[Response: That’s the adiabat. But it isn’t the greenhouse effect. – gavin]
Magnus Westerstrand says
Gavin et al.
Is it possible to make the blog reaction always point to the shortest url: e.g. https://www.realclimate.org/index.php/archives/2008/09/simple-question-simple-answer-no and not e.g. https://www.realclimate.org/index.php/archives/2008/09/simple-question-simple-answer-no/langswitch_lang/tk
I think that would be helpful…
[Response: Click on the English flag on the sidebar. This behaviour is due to an intersection between the cache and the language-switch code. You can always remove the langswitch tag and reload. – gavin]
rxc says
I am now retired, but in my former career, I used to be in charge of the work done by the US government involved in evaluating computer models of nuclear power plants. These programs modeled core physics, fuel thermal-mechanical behavior, and thermal-hydraulics in the core and in the entire nuclear coolant system. These programs were developed over the past 60 years of the nuclear industry by the various nuclear vendors and by the government, as well, at a cost of billions of dollars.
These programs are quite complex, having to take into account a large number of variables and factors, from basic uncertainties in the underlying physics, to uncertainties in material properties, to uncertainties associated with human performance and manufacturing. A LOT of this work was done at the behest of environmentalists in the early 1970s, who complained that the analytical models used to "prove" the safety of the plants were not complete or understandable. Dr. Weart should be very familiar with this effort, inasmuch as he wrote a very interesting book on nuclear power and nuclear issues ("Nuclear Fear") in the 1980s.
All of the models that I used to evaluate, and the ones that I used in the government to evaluate those produced by the vendors, were extensively documented. The government developed a strict methodology to allow the quantitative evaluation of the uncertainty in the models, which is now used by all three reactor vendors for their loss-of-coolant accident models. The uncertainty methodology is sufficiently general that it can be applied to a wide range of computer models, including the atmospheric models described above. It starts with an evaluation of the state of understanding of the physics, and the mathematical models used to describe the physics, and then proceeds to use a strict process to introduce variations in the important parameters and combine the results of different "runs" to estimate the overall uncertainty of the model.
Use of this method produces documentation that can be scrutinized by other engineers to determine the suitability of the model. I am highly suspicious of modelers who say that their methods are too complex to document properly, especially if the consequences are as dire as they say. It is important for these models to be publicly available, on the internet, with full documentation of their underlying physics and all of the runs that have been made to produce the results.
These are extraordinary claims (the end of the world) being made by the modelers, and they demand extraordinary proof.
I am not a climate scientist, but if the climate scientists cannot produce documentation like this, then their claims cannot be believed.
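For what it is worth, the parameter-variation procedure rxc describes is straightforward to sketch in outline. The "model" and the parameter ranges below are placeholders, not any real reactor or climate code:

```python
import numpy as np

rng = np.random.default_rng(0)

def model(a, b):
    """Stand-in for an expensive simulation; returns one output of interest."""
    return a * np.exp(-b)

# Uncertain inputs with assumed ranges (uniform here; a real methodology
# uses distributions justified by experiment and expert review).
draws = rng.uniform(low=[0.8, 0.4], high=[1.2, 0.6], size=(1000, 2))
outputs = np.array([model(a, b) for a, b in draws])

lo, hi = np.percentile(outputs, [2.5, 97.5])
print(f"best estimate {outputs.mean():.3f}, 95% range [{lo:.3f}, {hi:.3f}]")
```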
Timo Hämeranta says
Re 4. Spencer & 3. Andrew,
the study is
Diffenbaugh, Noah S., Filippo Giorgi, and Jeremy S. Pal, 2008. Climate change hotspots in the United States. Geophys. Res. Lett., 35, L16709, doi:10.1029/2008GL035075, August 30, 2008, online http://www.purdue.edu/eas/earthsystem/Diffenbaugh_GRL_08.pdf (8,52 MB)
Timo Hämeranta says
About models please see
Reichler, Thomas, and Junsu Kim, 2008. How Well do Coupled Models Simulate Today’s Climate? BAMS Vol. 89, No 3, pp 303-311, March 2008, online http://ams.allenpress.com/archive/1520-0477/89/3/pdf/i1520-0477-89-3-303.pdf
"…, we note the caveat that we were only concerned with the time-mean state of climate. Higher moments of climate, such as temporal variability, are probably equally as important for model performance, but we were unable to investigate these. Another critical point is the calculation of the performance index. For example, it is unclear how important climate variability is compared to the mean climate, exactly which is the optimum selection of climate variables, and how accurate the used validation data are. Another complicating issue is that error information contained in the selected climate variables is partly redundant. Clearly, more work is required to answer the above questions…"
About climate sensitivity please see
Rind, David, 2008. The Consequences of Not Knowing Low- and High-Latitude Climate Sensitivity. BAMS Vol. 89, No 6, pp. 855-864, June 2008, online http://ams.allenpress.com/archive/1520-0477/89/6/pdf/i1520-0477-89-6-855.pdf
“Along with the continuing uncertainty associated with global climate sensitivity [2°–4.5°+, for doubled CO2 in the latest Inter-governmental Panel on Climate Change (IPCC) report], we have not made much progress in improving our understanding of the past/future sensitivity of low- and high-latitude climates. Disagreements in paleoclimate interpretations, and diverse results from the IPCC Fourth Assessment Report future climate model simulations suggest that this uncertainty is still a factor of 2 in both latitude regimes. Cloud cover is the primary reason for model discrepancies at low latitudes, while snow/sea ice differences along with cloud cover affect the high-latitude response. While these uncertainties obviously affect our ability to predict future climate-change impacts in the tropics and polar regions directly, the uncertainty in latitudinal temperature gradient changes affects projections of future atmospheric dynamics, including changes in the tropical Hadley cell, midlatitude storms, and annual oscillation modes, with ramifications for regional climates. In addition, the uncertainty extends to the patterns of sea surface temperature changes, with, for example, no consensus concerning longitudinal gradient changes within each of the tropical oceans. We now know a good deal more about how latitudinal and longitudinal gradients affect regional climates; we just do not know how these gradients will change…”
Current state in Climatology in plain English…
[Response: As always, what is your point? And why do you think papers that demonstrate a) that climate models have been getting better over time, and b) another that says that regional latitudinal gradients are hard to predict, have to do with the importance of complexity in explaining what is going on? Throwing in random citations with cherry-picked quotes is a well worn tactic, but please, try and be a little relevant. – gavin]
Mark says
Chris, #49.
The simplest answer is that in the ancient past there were no dino drilling platforms. So 10,000 years to sequester 7% of fossil fuels has nothing to do with the past CO2 expression.
Barton Paul Levenson says
I have a quick-and-dirty way to calculate planet surface temperatures here.
Mark says
Fred, #43. WHAT more pressing needs are there for insurance? And are you an economist to know what % of GDP would be needed? Have you worked on a model for it?
And as a % of GDP of the US, what’s spent on securing oilfields annually? Not even fazing the current administration.
Mark says
Gosta, #16
No.
The weather doesn’t have free will. It can’t change based on what it thinks will happen in the future.
People can. People do.
Barton Paul Levenson says
Dr. Farley,
I take on Cockburn here. It’s very reminiscent of shooting fish in a barrel. :)
Barton Paul Levenson says
Fred Jorgenson posts:
Fortunately, we don’t need all that — even though we do have one heck of a solid case! A carbon emissions trading scheme and a push for renewable sources of energy will work just fine.
Barton Paul Levenson says
Guenther Hess says:
Mankind will probably survive, yes. The current civilization may not. Athens had the best philosophers in the world, but it didn't stop Alexander the Great from annexing it. The Nazis had the most brilliant physicist in the world working for them — Werner Heisenberg — but their atom bomb program failed miserably. You can probably think of additional examples.
JohnnyB says
Dear Mr. Weart,
You say greenhouse gas warming is not possible to predict, but one piece of information I picked up watching an Australian scientific meeting was that a number of studies, including a University of Chicago study, predict CO2 has no/little warming ability past 22 ppm (parts per million).
This is based on CO2's well-known infra-red radiative profile, known and solid science for nearly a century.
Namely, once CO2 reaches 22 ppm in Earth's atmosphere it effectively reaches the limit of its warming ability, as CO2 can only reflect/force certain infra-red wavelengths; that is achieved/blocked at 22 ppm, and past that any additional CO2 (to 380 ppm or 600 ppm) does no additional 'work'.
Like sunglasses that block 2 of, say, 4 bands of light: it doesn't matter how many extra pairs you wear, 2 sections of light always get through and 2 are always blocked. Add 20 new pairs with the same light filters and it makes no difference!
Water vapour, the most powerful GHG, has a very broad infra-red wavelength forcing ability in contrast.
As I say, this has been known science for nearly a century and for me is the killer punch, amongst many, that KO's the global warming hoax. Can you confirm please?
[Response: None of this is true. Where is your source and why do you think it's trustworthy? For more details about why this is wrong, read these previous posts (here and here). – gavin]
Mark says
Johnny B, #56.
Put one blanket on the bed. You are now insulated 100% against losses from radiation and convection. So putting a second blanket on is a waste of time, isn’t it? After all, everything that a blanket stops is already stopped by the one blanket you have on now.
Fred Staples says
So, Spencer, it's Science but not as we know it. No assumptions which can be tested from first principles, no equations which can be checked, and nothing which can be tested against experiments or measurements. Prior experiments (Angstrom's saturation test in 1905) are dismissed as "botched" (by Gavin) but not repeated.
[Response: I said no such thing, and these experiments have been repeated hundreds of time at much higher accuracy (look up HITRAN). – gavin]
Quantum theory is invoked, with photons like little silver bullets random-walking their way into space.
At the long end of the spectrum the dominant behaviour in the wave-particle duality is the wave. Absorption is resonant. "We see, to our amazement, that the photon frequency associated with the transition is just an integer multiple of the classical resonant frequency of the harmonic oscillator" — from A First Course in Atmospheric Radiation, page 252, by Grant W. Petty, University of Wisconsin–Madison.
Not to my amazement, I have to say. I was regurgitating these equations 50 years ago.
Your comment “the air radiates some of this energy back to the surface, keeping it warmer than it would be without the gases” ignores the second law of thermodynamics. Heat cannot pass from a cooler to a warmer body. Check it out. Switch off the fridge and open the door.
[Response: Oh please. The 2nd law is for *net* transfers. We receive radiation from the big bang at a brightness temperature of about 3K. Your claim would preclude us detecting it because energy can’t come from a colder body (deep space) to a warm one. Nonsense. – gavin]
Robert Essenhigh is one senior engineer who rejects AGW out of hand because the dominant radiation absorber in the atmosphere must be water vapour. He calculates the height at which all the radiation which can be absorbed, will be absorbed, finds this to be relatively close to the surface, and concludes that additional CO2 can make no difference.
“The absorption coefficient is 1–2 orders of magnitude higher than the coefficient values for the CO2 bands at a concentration of 400 ppm. This would seem to eliminate CO2 and thus provide closure to that argument.”
[Response: Only because he doesn’t consider what happens higher in the atmosphere where the situation is very different. This idea that somehow radiative transfer is completely wrong is belied everytime you look at a picture derived from a satellite – if we were so wrong, none of those remotely sensed pictures of water vapour, or air pollution, or IR or aerosols or trace gases or ozone or anything would work. More nonsense. – gavin]
Chuck Wiese on Roger Pielke's website reaches the same conclusion:
“The spectral overlap (C02/H2O) is quite severe, and is precisely why the results (of doubling CO2) don’t produce much change.”
[Response: Whether 4 W/m2 is “much change” is a matter of opinion. I consider half the forcing associated with the transition from a glacial to an interglacial quite a lot. – gavin]
Barton Levenson (62) published an elegant paper (The Irrelevance of Saturation) on this site which accepts the premise but refutes the conclusion. He applies conservation of energy across space, an upper and a lower atmospheric layer, and the surface. The upper layer contains negligible water vapour, and is warmed by any additional CO2. Because energy is radiated downwards, the lower layer and the surface have to compensate by increasing their radiation which, via Stefan-Boltzmann, requires warming.
If the relative absorption in the upper atmosphere increases from 0.5 to 0.6 Barton generates an increase of 3 degrees C at the surface.
But if the relative absorption is a far more realistic 0.01 (zero water vapour, very low pressure), doubling the CO2 produces a temperature increase of only 0.2 degrees at the surface, which is not really measurable.
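The disagreement here can be made concrete with a generic two-layer gray model. This is a minimal sketch of my own, not Barton's actual equations: solve the steady-state energy balance for the surface and two atmospheric layers, then watch what happens when the lower layer is nearly saturated (f1 close to 1) while the absorptivity of the thin upper layer increases.

```python
import numpy as np

SIGMA, S = 5.67e-8, 240.0     # W m^-2 K^-4; absorbed solar flux, W m^-2

def surface_temp(f1, f2):
    """Solve for sigma*T^4 of the surface and two gray layers with IR
    absorptivities f1 (lower) and f2 (upper), transparent to sunlight.
    Unknowns x = [x_s, x_1, x_2] with x = sigma*T^4."""
    A = np.array([
        [1.0, -f1, -(1 - f1) * f2],  # surface: x_s = S + f1*x1 + (1-f1)*f2*x2
        [1.0, -2.0, f2],             # lower layer: x_s + f2*x2 = 2*x1
        [1 - f1, f1, -2.0],          # upper layer: (1-f1)*x_s + f1*x1 = 2*x2
    ])
    x = np.linalg.solve(A, np.array([S, 0.0, 0.0]))
    return (x[0] / SIGMA) ** 0.25

for f2 in (0.0, 0.3, 0.6):           # lower layer nearly saturated throughout
    print(f"f1=0.95, f2={f2:.1f}: Ts = {surface_temp(0.95, f2):.1f} K")
```

In this toy model, even with the lower layer 95% absorbing, raising the upper layer's absorptivity still warms the surface substantially — which is the substance of Gavin's reply above.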
Elsewhere on the site another of your contributors summarised neatly the (unquantified) “higher is colder” argument – “And as adding greenhouse gases to the atmosphere increases the optical thickness or depth of the medium, it raises the height from which photons escape, and assuming a roughly constant lapse rate this will imply a warmer surface”.
Possibly, but what evidence is there for a “roughly constant lapse rate”?
The UAH data from 1978 quotes:
Lower Troposphere – 1.3 degrees per century
Mid Troposphere – 0.5 degrees per century
These figures are not compatible with AGW, particularly since most of the lower troposphere warming took place between December 1999 and January 2002. Current temperatures are back to 1978 levels, themselves the trough of changes since the previous peak, in the forties.
I am, incidentally, pleased to see that the rise in temperature from the Little Ice Age to the forties is now accepted on the site, without any attribution to AGW.
[Response: “now”? please find some cites that indicates that our explanations have changed in any major respect. – gavin]
Bruce Tabor says
“These people, typically senior engineers, get suspicious when experts seem to evade their question.”
Before I became a statistician I was a chemical engineer and then a biomedical engineer.
It is perhaps the evasiveness that engineers are responding to. Engineers tend to be distrustful of pat answers. But I would have thought physicists would have been more of a problem, in that they prefer to see an explanation reduced to first principles. (Admittedly much of the physics engineers encounter at university is of this variety.)
Engineers are not strangers to complexity – although this may vary by discipline. Much of what they do cannot be reduced to first principles and is ultimately empirical. Chemical engineers (and others) often work with dimensionless equations: relationships of quantities that have been experimentally derived and that, since they are dimensionless, can be scaled to the situation at hand. Furthermore, engineers often resort to computer models in the design of chemical reactors, structures, circuits etc. They could no more summarise one of their own complex systems in 6 pages than you can summarise AGW.
I suspect it is a culture clash that is causing the problem, not an incapacity to accept that a simple derivation is not all that is required. For most engineers, Gavin's "The CO2 problem in 6 easy steps" *should be* an excellent start.
Maurizio Morabito says
I see that my point (#10) is more or less repeated by other commentators (eg #53 and #58).
Gavin: you reply to #53: “Why should anyone continue to discuss with you?”
[Response: It wasn’t rhetorical, it was serious. I spend way too much time replying to comments because I do genuinely want people to understand what it is we are talking about and why. But even my time is limited, and so one needs to prioritise. People that just want to declaim, rather than learn, just create noise and responding to their pseudo-questions is of limited use (other than to point out that they rarely make any sense). I judge the worth of engaging with people by their willingness to listen and modify their position. If they don’t change at all, why bother? I’m not doing this for my health. – gavin]
If you really want to communicate, then you better find a way of communicating. If on the other hand you don’t want to communicate, there is little point in replying to comments, really.
In fact you remind me of those English-speaking tourists arriving back home in frustration, convinced that the locals they visited are brainless idiots, after having shouted, yelled, huffed and puffed to make themselves understood…by people that simply do not speak English.
If you or Mr Weart want to speak to engineers, or anybody else, then you both better speak in a way that engineers can understand. And if they don’t appear to have understood, you cannot simply jump to saying “why are you people so slow to understand?”…the only sensible option is to see where the miscommunication is (yes, it can be with you too), and to work to fix that.
I have provided a few suggestions already.
People do have various degrees of skepticism about the nature and dangers of anthropogenic global warming. How difficult is it to recognize that? If you instead pooh-pooh their thoughts whenever expressed, you will win nobody's mind. Fine by me, but then what's a blog for?
mugwump says
I forgot to add the most significant scientific reason behind my skepticism at #54: the uncertainty in forcing due to aerosols and ocean heat uptake is almost as large as the forcing from human GHGs itself.
Every climate model, from a simple "single-box" model through to global circulation models with multiple layers and interactions, has the actual net forcing built into the denominator of climate sensitivity – i.e., the difference between CO2 forcing and the offsets due to aerosols and ocean heat uptake. Since the offsets are highly uncertain, the denominator of climate sensitivity has error bars that come close to encompassing zero, which means the climate sensitivity estimates themselves have error bars that approach infinity (in the extreme).
It is this enormous range in estimated climate sensitivity that is invoked by the likes of Stern and Al Gore (and the alarmist industry in general) to claim that there is a non-negligible probability of total climate catastrophe, and that therefore we must act now to drastically curb CO2 emissions.
But the reality is much more sanguine. We simply cannot attach a probability to wildly high estimates of climate sensitivity from climate models, borne as they are out of an uncertainty in the denominator.
To see this, consider an extreme boundary case: there is a small (but non-zero) probability that the net forcing has in fact been zero over the 20th century, once you account for aerosols, ocean heat uptake, clouds, etc. In that case the sensitivity required to reproduce the 20th century temperature increase is infinite. But we know there is zero probability of infinite climate sensitivity (otherwise we wouldn’t be here discussing this), so clearly it is not valid to convert uncertainty in forcing into a probability for climate sensitivity based on the output of climate models.
[Response: You touch on a real point, but you are not correct in your assessment. The issue of the forcing uncertainty is a problem for the twentieth century, and it does mean that high sensitivities cannot be ruled out from considering that period alone (Forest et al, etc). However, this isn’t anything to do with climate models. They have very well defined sensitivities since you can impose whatever forcing you want. But even better is to take a situation where the forcing is unambiguously non-zero – the last glacial period is the best bet for this and actually rules out with high probability both the very low and very high numbers (as we have frequently discussed). – gavin]
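The arithmetic behind this exchange — dividing an observed warming by an uncertain net forcing — can be shown in a few lines, with purely illustrative numbers:

```python
import numpy as np

rng = np.random.default_rng(1)

dT = 0.7                              # observed 20th-century warming, K
# Net forcing: GHG forcing minus a very uncertain aerosol/ocean offset.
F = rng.normal(loc=1.6, scale=0.7, size=100_000)   # W m^-2, illustrative
F = F[F > 0.1]                        # discard unphysical near-zero draws

sens = dT / F * 3.7                   # rescale to K per doubling of CO2
print(f"median {np.median(sens):.1f} K, "
      f"95th percentile {np.percentile(sens, 95):.1f} K per doubling")
```

The long upper tail comes entirely from forcing draws near zero — which is both mugwump's complaint and the reason Gavin points to the glacial period, where the forcing is unambiguously non-zero.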
Lloyd Flack says
#34 John Mashey,
Perhaps some of the difference between these two categories is in the number of components the systems that they deal with have, and how crucial individual components can be.
Category 1 deals with what could be described as serial systems. A few crucial components have many opportunities to affect the system, and the effects of individual components and processes are not swamped by the effects of other components and processes. Components are often in one of a few discrete states. The system is a dictatorship with the crucial components as dictators.
Category 2 deals with what could be described as parallel systems. Many components simultaneously affect the system. The effects of any single component can be swamped by the effects of many other components. Thus even if you have made an error about the effects of an individual component, your estimates will usually be fairly robust. Components usually are in one of many states, or the states have continuous values. The outcome is decided by a vote among many components.
Bruce Tabor says
mugwump at #54,
The last 3 of the 4 reasons for your skepticism do not relate to the technical accuracy of the scientific work on climate change, but rather to the political and emotional context of the debate. By “emotional” I include your own emotional response.
Surely the issue is whether AGW is a real and present danger. Are the predictions accurate? Are the changes observed in the world's climate illusory or ephemeral – e.g. the huge loss of Arctic sea ice – or are they part of a dangerous trend?
Your first point of skepticism concerns over-fitting of models. A valid question. These are not statistical models that are "fit" as such, but rather ones incorporating the physics of the underlying processes. I have no doubt that, despite claims to the contrary, some parameters are "tweaked" at the margins, as the relevant physics is not fully understood. But these are additive processes and, for the most part, leaving them out (e.g. aerosol effects) results in similar behaviour – Gavin could possibly confirm this. Older, simpler models show little deviation from current ones.
Skepticism becomes denial when you refuse to evaluate your position in light of the available evidence. Your last three points make me wonder if this is the case.
Magnus Westerstrand says
Gavin at 57,
Yes I know, however I guess it confuses a few that e.g. don’t blog themselves…
Timo Hämeranta says
Re 68. Johnny B, about Water vapour, the most powerful GHG, two interesting papers have been published:
1. Huang, Yi, and V. Ramaswamy, 2008. Observed and simulated seasonal co-variations of outgoing longwave radiation spectrum and surface temperature. Geophys. Res. Lett., 35, L17803, doi:10.1029/2008GL034859, September 4, 2008
“We analyze the seasonal variations of Outgoing Longwave Radiation (OLR) accompanying the variations in sea surface temperature (SST) from satellite observations and model simulations, focusing on the tropical oceans where the two quantities are strikingly anti-correlated. A spectral perspective of this “super-greenhouse effect” is provided, which demonstrates the roles of water vapor line and continuum absorptions at different altitudes and the influences due to clouds…”
2. Dessler, Andrew E., P. Yang, J. Lee, J. Solbrig, Z. Zhang, and K. Minschwaner, 2008. An analysis of the dependence of clear-sky top-of-atmosphere outgoing longwave radiation on atmospheric temperature and water vapour. J. Geophys. Res. – Atm., 113, D17102, doi:10.1029/2008JD010137, September 3, 2008
“We have analyzed observations of clear-sky top-of-atmosphere outgoing longwave radiation (OLR) … We also analyze the sensitivity of OLR to changing surface temperature Ts, atmospheric temperature Ta, and atmospheric water vapor q. We find that OLR is most sensitive to unit changes in Ta when that change occurs in the lower troposphere. For q, the altitude distribution of sensitivity varies between the midlatitudes, subtropics, and the convective region… In the tropical convective region, a rapid increase in q in the midtroposphere leads to a dramatic reduction in OLR with increasing Ts, which has been termed the “super greenhouse effect”.”
Timo Hämeranta says
Re 72. mugwump, about aerosols please see
Rosenfeld, Daniel, Ulrike Lohmann, Graciela B. Raga, Colin D. O’Dowd, Markku Kulmala, Sandro Fuzzi, Anni Reissell, and Meinrat O. Andreae, 2008. Flood or Drought: How Do Aerosols Affect Precipitation? Science Vol. 321, No 5894, pp. 1309-1313, September 5, 2008
“Aerosols serve as cloud condensation nuclei (CCN) and thus have a substantial effect on cloud properties and the initiation of precipitation. Large concentrations of human-made aerosols have been reported to both decrease and increase rainfall as a result of their radiative and CCN activities. At one extreme, pristine tropical clouds with low CCN concentrations rain out too quickly to mature into long-lived clouds. On the other hand, heavily polluted clouds evaporate much of their water before precipitation can occur, if they can form at all given the reduced surface heating resulting from the aerosol haze layer…”
About climate sensitivity please see first
Kiehl, Jeffrey T., 2007. Twentieth century climate model response and climate sensitivity. Geophys. Res. Lett., 34, L22710, doi:10.1029/2007GL031383, November 28, 2007
“Climate forcing and climate sensitivity are two key factors in understanding Earth’s climate. There is considerable interest in decreasing our uncertainty in climate sensitivity. This study explores the role of these two factors in climate simulations of the 20th century. It is found that the total anthropogenic forcing for a wide range of climate models differs by a factor of two and that the total forcing is inversely correlated to climate sensitivity. Much of the uncertainty in total anthropogenic forcing derives from a threefold range of uncertainty in the aerosol forcing used (i.e. tuned) in the simulations…
[20] Finally, the focus of this study has been on anthropogenic forcing. There is also a range of uncertainty in natural forcing factors such as solar irradiance and volcanic aerosol amount. It would be of value to reduce uncertainties in these forcing factors as well."
And finally
Allen, Myles R., and David J. Frame. Call Off the Quest. Science Perspective Vol. 318, No 5850, pp. 582-583, October 26, 2007, online http://www.eci.ox.ac.uk/publications/downloads/frame07-sensitivity.pdf
"…An upper bound on the climate sensitivity has become the holy grail of climate research. As Roe and Baker point out, it is inherently hard to find. It promises lasting fame and happiness to the finder, but it may not exist and turns out not to be very useful if you do find it. Time to call off the quest."
Well, time to call off the quest, and follow the suggestion of the authors:
“In reality, of course, our descendants will revise their targets in light of the climate changes they actually observe.”
mugwump says
RE Gavin @ 73:
Unfortunately, those pushing the alarmist agenda are using the higher sensitivity estimates from the models to further their political goals.
It also goes to the question of how likely we are to overfit when tuning the climate models. When you have knobs on your black box that generate such widely varying behaviour when rotated through relatively plausible ranges, reliable extrapolation becomes problematic.
Fair enough. Do you have a canonical RC reference? (There are a *lot* of posts on RC to wade through).
Without expecting an inline response here (I can read whatever you point me to), I am particularly curious about the very low numbers – what constitutes “very low” and how they are ruled out (very low for the IPCC seems to be 2C but from my reading I am starting to think that 2C is at the upper end of the plausible range).
[Response: Start here (more here). – gavin]
Kevin McKinney says
#77–
Sounds like more bad news for the Lindzen/Spencer postulate that water vapor will provide negative feedback, yes?
William says
You do engineers a disservice: we can understand the science, even if it is not our speciality. Physics is a key subject for engineers.
What engineering teaches us is to reconcile the theory with the concrete. Many "climatologists" do not seem to check their hypotheses against the observed data. Take the IPCC chart in AR4 that gives the projection of temperature against CO2. It starts at 280 ppm, and by the time CO2 gets to today's 385 ppm it indicates the temperature should be in the range +1.0C to +2.2C, but the actual temperature rise is only +0.6C. An engineer would ask, "Why?"
It looks like the models are wrong. Perhaps you need some engineers to ask the "Why" question.
Can you explain why the projections do not match the actual data?
[Response: But they do! (Fig SPM.4). I don't know what other figure you are talking about (reference), but the two factors that might be missing are the lag in the ocean response to forcing, and the net effect of all the forcings (including other GHGs, aerosols and land use etc.). – gavin]
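Gavin's first point, the lag in the ocean response, can be illustrated with the simplest possible energy-balance box model (illustrative parameter values; the real ocean has a deep reservoir and a much longer memory):

```python
C = 4.2e8        # heat capacity of ~100 m of ocean, J m^-2 K^-1
lam = 1.25       # feedback parameter, W m^-2 K^-1 (~3 K per CO2 doubling)
dt = 86400.0 * 365.25                    # one year, in seconds

T, F = 0.0, 0.0
for year in range(140):
    F = min(F + 3.7 / 140.0, 3.7)        # forcing ramps up over 140 years
    T += dt / C * (F - lam * T)          # ocean heat uptake delays the warming
print(f"realized {T:.2f} K vs {3.7 / lam:.2f} K at equilibrium")
```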
mugwump says
Bruce Tabor @ #75:
I elaborated on the scientific grounds for my skepticism at #73.
Leaving out aerosols most assuredly does change the output.
The models generate widely varying estimates of climate sensitivity from 20th century observations. I always wondered why, given that as you say they are all supposedly modeling the same physical processes. It was only recently that I looked into it sufficiently to realize that you have to divide by net forcing to get sensitivity (either explicitly as in a box-model, or implicitly via the underlying physics in a GCM), and the error bars on net forcing come perilously close to the origin.
dhogaza says
Mugwump has been told this before, as recently as a couple of weeks ago on Deltoid, but continues to make the “overfitting claim” regardless.
The first time is forgivable ignorance. Continuing to repeat it is [edit]
Hank Roberts says
rxc above wrote wishing for climate models to be as precisely documented as nuclear power plant models.
I’m sure the climate modelers would like to do that.
Each piece put into a nuclear power plant is specified in advance and you know what it’s made of.
I’m sure the climate modelers would like comparable information.
Suppose you were trying to model a nuclear power plant you couldn't take apart? And suppose that, as you vary one factor you've calculated will make a change in its behavior, the system is already changing on its own?
Willy Ley: “analysis is all very well but you can’t tell how a locomotive works by melting it down and analyzing the mess” — aren’t you glad fission plant modelers aren’t often faced with trying to do that? Do you understand why climate modelers have a difficult task? Do you understand why climate modelers raise concerns when they see CO2 increasing and can predict the climate system will be going outside its known safe performance parameters?
What’s the worst thing that can happen, with either physical system?
Right. Precautions are appropriate before fiddling with the inputs. Restrain those who would meddle, eh?
SecularAnimist says
mugwump wrote: “Unfortunately, those pushing the alarmist agenda are using the higher sensitivity estimates from the models to further their political goals.”
With all due respect, that statement — “pushing the alarmist agenda to further political goals” — is a dead giveaway that you are a pseudo-skeptic, someone who concludes that the overwhelming consensus of the scientific community that anthropogenic global warming is real, and dangerous, MUST be wrong because you dislike what you imagine to be the “political” consequences of that reality.
Timo Hämeranta says
Re 80. Kevin, well, Lindzen and Spencer are not studying water vapor; instead please see first Lindzen's latest
Rondanelli, Roberto, and Richard S. Lindzen, 2008. Observed variations in convective precipitation fraction and stratiform area with sea surface temperature. J. Geophys. Res. – Atmos., 113, D16119, doi:10.1029/2008JD010064, August 29, 2008
“This paper focuses on the relation between local sea surface temperature (SST) and convective precipitation fraction and stratiform rainfall area from radar observations of precipitation…
Although a dependence on temperature such as the one documented is consistent with an increase in the efficiency of convective precipitation (and therefore consistent with one of the mechanisms invoked to explain the original Iris effect observations) this is but one step in studying the possibility of a climate feedback. Further work is required to clarify the particular mechanism involved.”
[edit – limit your random quotes to at least peer reviewed papers]
Rod B says
One of the difficulties for sceptics (like me) is that, even if the complexities are understood and believed, when the analysis finally gets to the end, the conclusions turn into very simple and exact assertions with absolute certainty. The final simple answers are usually accompanied with words like certainty, absolute, consensus, incontrovertible, irrefutable, undeniable, etc. As an example, Spencer’s commentary, IMO, explained the reality of the science very well (though I agree he unfairly characterizes engineers — but that’s a nit point); then the very first post said, in essence, ‘and so X gives you exactly Y’ (followed by a ‘and don’t give me any back talk’ — again a minor point) as did a number of following posts. It seems like gaining an understanding of the complexity of the science is not desirable per se but just a necessary evil only if the simple conclusions are called into question. To a sceptic this is not very convincing.
As a bit of mitigation, I have to leave the above as a "difficulty" and can't quite raise it to a "criticism." Because: 1) I'm not sure if there is any other realistic and practical way to do it. I mean, a scientist can't just end his/her analysis with a "nobody knows anything". 2) While scientists in the field are guilty (IMO) of the above, it's considerably more evident in the scientist wannabes, who are also much less likely to backpedal when called than are the climatologists. 3) Some of my fellow sceptics (by happenstance, not choice) are guilty of the same thing. I don't have any helpful suggestions to offer, so I just have to live with my difficulty. But it is a real difficulty nonetheless.
Rod B says
PS: there is another factor that I think is not pure science but is nonetheless rational and expected. If a climatologist makes his/her best analysis with no more than, say, a 50-50 probability, but the 50% probability causes disastrous effects bordering on Armageddon, it's understandable (maybe even necessary) that the scientist will push that scenario. Though this really has no relevance to this discussion…
[Response: No it isn’t. It’s appropriate to say what the odds are. (though frankly, I wouldn’t get on a plane with 50:50 odds of crashing). – gavin]
TEBB says
John Mashey – I’m just bowled over by your explanation of the ways that different types of engineers look at climatology. My father is a retired ME engineer and I have always felt a huge gulf between us as to our ways of thinking, and this is food for thought. I have copied your comment and emailed it to several people – a friend who is trained as a geologist but is now a database programmer and who doesn’t believe in evolution, a long time friend whose husband is an ME engineer and with whom I have had many conversations about ME’s, and a few others with whom I have discussed how people solve problems differently.
Kevin McKinney says
Timo–thanks.
However, it still seems to me that this anti-correlation noted in the papers you cited would be essentially antagonistic to what Lindzen is trying to show: a negative feedback mechanism involving water vapor and cloud.
Further enlightenment is welcome… I can't claim to understand this area well at all.
Timo Hämeranta says
Re 88-89. Rod,
the IPCC correctly states:
“…the future is inherently uncertain…”
Ref: IPCC AR4 WG III SPM final sentence
Re Gavin: the TAR 95 % confidence level was OK, the AR4 90 % isn’t.
[Response: You think that that TAR was more certain that AR4? Read the reports again. – gavin]
mugwump says
“Your first point of skepticism concerns over-fitting of models. A valid question. These are not statistical models that are “fit” as such, but rather ones incorporating the physics of the underlying processes.”
[edit – no ad homs]
However, for the record, any model with free parameters that are estimated from data can be considered statistical models that are “fit” to the data. GCMs have many free parameters, including aerosol content at any point in time, various parameterizations of cloud processes, and many other physical constants that are only known to a fairly crude precision. climateprediction.net has a much more comprehensive list if you’re interested.
Now, GCM modelers don’t necessarily estimate the free parameters by directly inferring them from model behaviour, but they certainly adjust the parameters to get the model output to “fit” the 20th century instrumental record (it could be argued that the fit between models and the 20th century instrumental record is the sine qua non of progress in the field).
Once you have models with parameters than can be adjusted to fit data you have the risk of overfitting, regardless of how that fitting takes place.
[NB, I am not saying any individual climate model does or does not overfit, but the potential is certainly there. The wide range of climate sensitivity estimates produced by the models, and the underlying cause of that wide range (uncertain net forcing in the denominator), should make us very skeptical of the high sensitivity estimates]
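The overfitting effect itself is textbook material and easy to demonstrate on a toy problem. This shows the generic statistical phenomenon only; it is not a claim about any actual GCM:

```python
import numpy as np

rng = np.random.default_rng(2)
x = np.linspace(0, 1, 20)
y = np.sin(2 * np.pi * x) + rng.normal(scale=0.3, size=x.size)  # "training" data
x_new = np.linspace(0, 1, 200)                                  # unseen data
y_new = np.sin(2 * np.pi * x_new)

for degree in (3, 12):                       # few knobs vs. many knobs
    coeffs = np.polyfit(x, y, degree)        # tune the knobs to the data
    train_err = np.std(np.polyval(coeffs, x) - y)
    test_err = np.std(np.polyval(coeffs, x_new) - y_new)
    print(f"degree {degree}: train error {train_err:.2f}, "
          f"unseen error {test_err:.2f}")
```

The many-knob fit looks better on the data it was tuned to and worse on data it never saw, which is exactly the failure mode a held-out test guards against.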
danh says
Previously, if I’d encountered `people who are looking for a straightforward calculation of the global warming that greenhouse gas emissions will bring. What are the physics equations and data on gases that predict just how far the temperature will rise?’, then I would have been given to understand that they didn’t want two significant figure, or even one significant figure, precision – just the right order of magnitude. Therefore, I would have happily pointed them at a model without feedbacks or super-accurate absorption spectra – perhaps Arrhenius (1896, Philos. Mag. fifth series 41(251): 237-276) – making clear to them the approximate nature of the quantitative results, of course. Having read Spencer’s post, I’m worried I might have been taking the wrong approach. What do folks think?
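For readers who want the order-of-magnitude route spelled out: the modern simplified expression for CO2 forcing (Myhre et al. 1998) times an assumed sensitivity parameter gives Arrhenius-style numbers in two lines. The sensitivity parameter is the assumption doing all the work here, which is exactly Weart's point.

```python
import math

def warming(c, c0=280.0, lam=0.8):
    """Order-of-magnitude estimate: logarithmic CO2 forcing
    dF = 5.35 * ln(C/C0) W m^-2 (Myhre et al. 1998), multiplied by an
    ASSUMED sensitivity parameter lam (K per W m^-2, feedbacks included)."""
    return lam * 5.35 * math.log(c / c0)

print(f"{warming(560.0):.1f} K for doubled CO2")   # ~3 K, the right order
```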
Yoron says
Hi all. I'm surprised that anyone still believes that the Earth's behavior could be reduced to an elegant (?) algorithm. It's more of a chaotic system than a static one, even though on our time scale it may be perceived as rather stable and, ah, unchanging. Still, 'weather' is very much shaped by influences from all over the Earth acting in three dimensions (+ time). Consider a hurricane's creation, development and path. Can anyone give me the 'pinpointing' algorithm for that?
When we go down to quantum mechanics nothing seems to exist, and in the end all becomes probabilities. And when we try to 'slice up' weather, the best we will get is isolated approximate answers, mostly relating to those specific parameters we defined it by. The 'world' seems to me more than just its parts, so we definitely need complex simulations wherein we put all those patterns and what we think binds them together.
Maybe we will find some good algorithms, but I don't think it will be before we have created simulations that are proved to 'work out'. Math is a science as well as a 'hermetic' knowledge, but this thing about weather is non-linear math, not linear, as far as I see it.
"Non-linear equations can be extremely difficult, and are often insoluble. For example the long-term behaviour of three bodies moving under gravity cannot be solved mathematically, though we can work a long way into the future. Unfortunately, non-linear, partial differential equations are exactly the equations that describe most real-life situations, like weather systems, frictional or turbulent motion, and so on. Before computers came along these equations were handled by making sweeping simplifications – the trick was to know which simplifications were justified. Although we still can't SOLVE the equations we can run computer models which give a very good idea of what will actually happen."
the quote from http://mathforum.org/library/drmath/view/53603.html
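The sensitivity to initial conditions alluded to above is easy to demonstrate with Lorenz's famous three-variable system, itself derived (fittingly) from a drastically truncated model of convection. Two runs differing in the eighth decimal place part company completely within a few dozen time units:

```python
import numpy as np

def lorenz_step(s, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz (1963) system."""
    x, y, z = s
    return s + dt * np.array([sigma * (y - x),
                              x * (rho - z) - y,
                              x * y - beta * z])

a = np.array([1.0, 1.0, 1.0])
b = a + np.array([1e-8, 0.0, 0.0])    # differ in the eighth decimal place
for step in range(1, 3001):
    a, b = lorenz_step(a), lorenz_step(b)
    if step % 1000 == 0:
        print(f"t = {0.01 * step:4.0f}: separation {np.linalg.norm(a - b):.2e}")
```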
mugwump says
RE #85,
SecularAnimist, higher estimates of climate sensitivity from GCMs and the 20th century instrumental record are suspect for the reasons I gave. And there are a lot of prominent people who are using those higher estimates to push their policy agenda. It’s just a fact. I don’t see how that observation makes me a “pseudo skeptic”.
Anyway, I won’t respond to any more personal questioning of my motives. This is already degenerating. I’d rather spend my time investigating how the lower bounds on climate sensitivity are derived from the last glacial – starting with the links gavin kindly provided in #79.
Ray Ladbury says
Rod, I believe that I have recommended this essay to you in the past:
http://ptonline.aip.org/journals/doc/PHTOAD-ft/vol_60/iss_1/8_1.shtml
However, you bring up an interesting point – how do we wend our way through all the caveats to "scientific certainty"? You are no doubt familiar with Sherlock Holmes stating (in many different stories) some version resembling: "How often have I said to you that when you have eliminated the impossible, whatever remains, however improbable, must be the truth?" (Note: this one's from "The Sign of the Four".) That really is how the whole scientific consensus thing works – eventually an idea just becomes so central to progress in understanding that the science becomes unthinkable without it. Put another way, if you reject that idea, you won't get many publications, because your ideas won't advance understanding of the science.
However, not every aspect of a complicated theory will achieve this level of acceptance at the same time. Some aspects may remain uncertain or even controversial even as others are incontrovertible. When you accept that some aspects of the theory are incontrovertible, then regardless of the uncertainties, you may be able to draw some conclusions based only on the “certain” part of the theory.
CO2 greenhouse warming has some very special properties. The well-mixed nature of CO2 means that it acts as a greenhouse gas up to very high in the troposphere. The long life in the atmosphere means that it will keep on giving for a very long time. What is more, there is nothing in our understanding to suggest that the physics changes appreciably up to very high CO2 levels.
So it is a virtual certainty that increasing CO2 will change the climate, and since the climate affects all aspects of human civilization, it is also a virtual certainty that we need to be concerned.
Magnus says
Climate science is, though, not simple. But I cannot get away from the suspicion that RC et al. want to have it complicated. This is because effects (warming) are desperately being linked to the wrong cause (CO2) instead of a more realistic cause (GCR). At best this is due to normal scientific progress; at worst, to much vested interest coupled with political agendas and bias.
It is my strong opinion that once cause and effect are put together, free from anything other than science, it will all fall out beautifully.
[Response: There is no trend in GCR. It’s really that simple. – gavin]
Guenter Hess says
Gavin,
Sorry for my short version in #47 in the previous thread, but I don’t think it is that easy.
This is how Ray Pierrehumbert in his climate book describes the greenhouse effect:
“..In a nutshell, then, here is how the greenhouse effect works: From the requirement of energy balance, the absorbed solar radiation determines the effective blackbody radiating temperature Trad. This is not the surface temperature; it is instead the temperature encountered at some pressure level in the atmosphere prad, which characterizes the infrared opacity of the atmosphere, specifically the typical altitude from which infrared photons escape to space. The pressure prad is determined by the greenhouse gas concentration of the atmosphere. The surface temperature is determined by starting at the fixed temperature Trad and extrapolating from prad to the surface pressure ps using the atmosphere’s lapse rate, which is approximately governed by the appropriate adiabat. Since temperature decreases with altitude over much of the depth of a typical atmosphere, the surface temperature so obtained is typically greater than Trad, as illustrated in Figure 3.6. Increasing the concentration of a greenhouse gas decreases prad, and therefore increases the surface temperature because temperature is extrapolated from Trad over a greater pressure range. It is very important to recognize that greenhouse warming relies on the decrease of atmospheric temperature with height, which is generally due to the adiabatic profile established by convection. The greenhouse effect works by allowing a planet to radiate at a temperature colder than the surface, but for this to be possible, there must be some cold air aloft for the greenhouse gas to work with.
For an atmosphere whose temperature profile is given by the dry adiabat, the surface temperature is
Ts = (ps/prad)^(R/cp) * Trad. (3.8)
With this formula, the Earth’s present surface temperature can be explained by taking prad/ps = 0.67, whence prad ≈ 670 mb. Earth’s actual radiating pressure is somewhat lower than this estimate, because the atmospheric temperature decays less strongly with height than the dry adiabat.”
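As a quick plausibility check on Eq. 3.8, a few lines suffice. This is only a sketch with assumed Earth-like numbers (Trad = 255 K, ps = 1000 mb, R/cp = 2/7 for dry air):

R_OVER_CP = 2.0 / 7.0   # R/cp for dry air
T_RAD = 255.0           # effective radiating temperature, K
P_S = 1000.0            # surface pressure, mb

def surface_temp(p_rad):
    # Eq. 3.8: extrapolate T_RAD from the radiating pressure p_rad
    # down to the surface along the dry adiabat.
    return (P_S / p_rad) ** R_OVER_CP * T_RAD

print(surface_temp(670.0))   # ~286 K, close to the quoted example
print(surface_temp(650.0))   # lower prad (more greenhouse gas) -> warmer surface

Lowering prad by 20 mb warms the surface by about 2.5 K here, which is the quoted mechanism in miniature: more greenhouse gas raises the escape level, so the fixed Trad is extrapolated over a greater pressure range.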
Here is the link to the English version of Thieme’s discussion point:
http://freenet-homepage.de/klima/indexe.htm
I think Thieme mostly questions the back-radiation; everything else looks similar to Ray Pierrehumbert’s version.
[Response: The adiabatic equation is fine. But the greenhouse effect determines what prad is. Thieme effectively just fixes it to the ‘right’ number without acknowledging that it would be different with different amounts of greenhouse gases. – gavin]
mugwump says
OK, I followed gavin’s links in #79. Haven’t got to the glacial-derived lower bounds yet, but did get sidetracked by this: The certainty of uncertainty
That’s an RC article from last October that discusses this Science paper by Roe and Baker (RB): “Why is Climate Sensitivity so Unpredictable”, also published last October.
In short, RB model the “climate feedback factor” f as a gaussian, and show that the climate sensitivity, which is proportional to 1/(1-f), has a huge upper tail. Well, of course it does. Under their distribution, values of f at and beyond 1 carry non-zero probability, which means a non-zero probability of arbitrarily large (indeed infinite) sensitivity.
But this probability of high sensitivity is meaningless. Obviously, very large sensitivities are ruled out by the data (at least one person here is still alive). So at best they should be using a truncated gaussian. But even then, the distribution of f only makes sense if it implies a reasonable distribution on climate sensitivity, given everything we know. Whatever the individual uncertainties in various 20th century forcings, the range of plausible sensitivities is constrained by historical evidence from volcanic eruptions, changes in solar forcing, etc.
In other words, it is not valid to use a large uncertainty in f to infer a long tail for climate sensitivity. At best you can say “we don’t know what the likelihood of high climate sensitivity is”. You can’t say “we’re confident the probability of high climate sensitivity is non-negligible”.
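The point is easy to check numerically. A minimal Monte Carlo sketch (the mean and spread of f below are illustrative, roughly in the spirit of RB’s numbers, and S0, the no-feedback sensitivity, is an assumed value):

import numpy as np

rng = np.random.default_rng(0)
S0 = 1.2                                 # assumed no-feedback 2xCO2 sensitivity, K
f = rng.normal(0.65, 0.13, 1_000_000)    # illustrative gaussian feedback factor

f = f[f < 1.0]                           # truncate the unphysical f >= 1 tail
sens = S0 / (1.0 - f)                    # sensitivity S = S0 / (1 - f)

for q in (0.05, 0.50, 0.95, 0.99):
    print(f"{int(q * 100):2d}th percentile: {np.quantile(sens, q):5.1f} K")

Even after truncation the upper percentiles blow up, because values of f near 1 map to enormous sensitivities; the question is whether those f values should be assigned that much probability in the first place, given the historical constraints.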
So, what does RC say about the matter?
Read that last sentence carefully. Rather than saying “we don’t know what the probability of high climate sensitivity is”, they’re claiming “we’re confident the probability of high climate sensitivity is non-negligible”. And they want policy to reflect that.
Wrong.
[Response: You need to go a little further in your reading. We have never claimed here that these really high sensitivities are plausible. For instance here. You are welcome to criticise people for what they say, but do not attribute statements to me that I have not made, nor opinions I do not hold. I am very clearly on record as saying that the IPCC range (2 to 4.5 deg C) is what people should mostly be focussed on. – gavin]
TEBB says
Mr. Mashey: a list of your favorite science books for non-scientists (i.e. liberal arts majors who don’t work in science but find it interesting) would be wonderful to have. I have a lot of good evolution and biology books, but none on global warming, because I’m afraid of accidentally getting something outside the climatologists’ consensus.
I enjoy reading realclimate but have no understanding of most of the concepts, like forcing. I’m at a very basic level of understanding global warming – just that the feedback effect of melting ice means less reflection of the sun’s energy, which speeds up the melting of ice.
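That ice-albedo loop is simple enough to put in a toy model. Here is a zero-dimensional energy-balance sketch with purely illustrative numbers (EPS is a crude stand-in for the greenhouse effect, not a fitted value): the albedo falls as the planet warms and ice melts, and the same planet can settle into either an icy or an ice-free state depending on where it starts.

SIGMA = 5.67e-8     # Stefan-Boltzmann constant, W m^-2 K^-4
S = 1361.0 / 4.0    # globally averaged insolation, W/m^2
EPS = 0.65          # effective emissivity, a crude greenhouse stand-in

def albedo(T):
    # Illustrative albedo: ice-covered (0.70) when cold, ice-free (0.25)
    # when warm, with a linear ramp in between.
    if T >= 280.0:
        return 0.25
    if T <= 240.0:
        return 0.70
    return 0.25 + 0.45 * (280.0 - T) / 40.0

def equilibrate(T, steps=5000, dt=0.05):
    # Relax T toward balance between absorbed sunlight and emitted infrared.
    for _ in range(steps):
        net = S * (1.0 - albedo(T)) - EPS * SIGMA * T**4   # imbalance, W/m^2
        T += dt * net
    return T

print(f"cold start: {equilibrate(250.0):.1f} K")   # stays icy (~229 K)
print(f"warm start: {equilibrate(300.0):.1f} K")   # stays ice-free (~288 K)

The two answers differ only because of the reflection feedback described above: less ice means less reflection, which means more absorbed sunlight, which means less ice.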