Guest commentary by Spencer R. Weart, American Institute of Physics
I often get emails from scientifically trained people who are looking for a straightforward calculation of the global warming that greenhouse gas emissions will bring. What are the physics equations and data on gases that predict just how far the temperature will rise? A natural question, when public expositions of the greenhouse effect usually present it as a matter of elementary physics. These people, typically senior engineers, get suspicious when experts seem to evade their question. Some try to work out the answer themselves (Lord Monckton for example) and complain that the experts dismiss their beautiful logic.
The engineers’ demand that the case for dangerous global warming be proved with a page or so of equations does sound reasonable, and it has a long history. The history reveals how the nature of the climate system inevitably betrays a lover of simple answers.
The simplest approach to calculating the Earth’s surface temperature would be to treat the atmosphere as a single uniform slab, like a pane of glass suspended above the surface (much as we see in elementary explanations of the “greenhouse” effect). But the equations do not yield a number for global warming that is even remotely plausible. You can’t work with an average, squashing together the way heat radiation goes through the dense, warm, humid lower atmosphere with the way it goes through the thin, cold, dry upper atmosphere. Already in the 19th century, physicists moved on to a “one-dimensional” model. That is, they pretended that the atmosphere was the same everywhere around the planet, and studied how radiation was transmitted or absorbed as it went up or down through a column of air stretching from ground level to the top of the atmosphere. This is the study of “radiative transfer,” an elegant and difficult branch of theory. You would figure how sunlight passed through each layer of the atmosphere to the surface, and how the heat energy that was radiated back up from the surface heated up each layer, and was shuttled back and forth among the layers, or escaped into space.
When students learn physics, they are taught about many simple systems that bow to the power of a few laws, yielding wonderfully precise answers: a page or so of equations and you’re done. Teachers rarely point out that these systems are plucked from a far larger set of systems that are mostly nowhere near so tractable. The one-dimensional atmospheric model can’t be solved with a page of mathematics. You have to divide the column of air into a set of levels, get out your pencil or computer, and calculate what happens at each level. Worse, carbon dioxide and water vapor (the two main greenhouse gases) absorb and scatter differently at different wavelengths. So you have to make the same long set of calculations repeatedly, once for each section of the radiation spectrum.
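The flavor of such a level-by-level calculation can be sketched in a few lines of code. The toy model below is a hypothetical illustration, not anyone's actual radiative-transfer code: it treats the atmosphere as N "gray" layers, each transparent to sunlight but opaque to infrared. For that idealization the energy balance has a closed form, and adding layers warms the surface:

```python
# Toy N-layer "gray gas" column in radiative equilibrium.
# Illustrative only: a real model sums over many spectral bands
# and includes convection, humidity, clouds, and much else.
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S = 342.0         # globally averaged insolation, W m^-2
ALBEDO = 0.3      # planetary albedo

def surface_temp(n_layers):
    """Surface temperature with n IR-opaque, sunlight-transparent layers.

    Balancing absorbed sunlight against emitted infrared at every
    level gives T_surface^4 = (n + 1) * T_effective^4 for this
    idealized case.
    """
    absorbed = S * (1 - ALBEDO)
    t_eff = (absorbed / SIGMA) ** 0.25   # ~255 K with no atmosphere
    return ((n_layers + 1) ** 0.25) * t_eff

for n in range(3):
    print(f"{n} layers: {surface_temp(n):.0f} K")
```

Even this cartoon shows why the single-slab picture fails quantitatively: the answer depends on how the absorbing layers are stacked, and the real calculation must repeat the bookkeeping for every spectral band.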
It was not until the 1950s that scientists had both good data on the absorption of infrared radiation, and digital computers that could speed through the multitudinous calculations. Gilbert N. Plass used the data and computers to demonstrate that adding carbon dioxide to a column of air would raise the surface temperature. But nobody believed the precise number he calculated (2.5ºC of warming if the level of CO2 doubled). Critics pointed out that he had ignored a number of crucial effects. First of all, if global temperature started to rise, the atmosphere would contain more water vapor. Its own greenhouse effect would make for more warming. On the other hand, with more water vapor wouldn’t there be more clouds? And wouldn’t those shade the planet and make for less warming? Neither Plass nor anyone before him had tried to calculate changes in cloudiness. (For details and references see this history site.)
Fritz Möller followed up with a pioneering computation that took into account the increase of absolute humidity with temperature. Oops… his results showed a monstrous feedback. As the humidity rose, the water vapor would add its greenhouse effect, and the temperature might soar. The model could give an almost arbitrarily high temperature! This weird result stimulated Syukuro Manabe to develop a more realistic one-dimensional model. He included in his column of air the way convective updrafts carry heat up from the surface, a basic process that nearly every earlier calculation had failed to take into account. It was no wonder Möller’s surface had heated up without limit: his model had not used the fact that hot air would rise. Manabe also worked up a rough calculation for the effects of clouds. By 1967, in collaboration with Richard Wetherald, he was ready to see what might result from raising the level of CO2. Their model predicted that if the amount of CO2 doubled, global temperature would rise roughly two degrees C. This was probably the first paper to convince many scientists that they needed to think seriously about greenhouse warming. The computation was, so to speak, a “proof of principle.”
But it would do little good to present a copy of the Manabe-Wetherald paper to a senior engineer who demands a proof that global warming is a problem. The paper gives only a sketch of complex and lengthy computations that take place, so to speak, offstage. And nobody at the time or since would trust the paper’s numbers as a precise prediction. There were still too many important factors that the model did not include. For example, it was only in the 1970s that scientists realized they had to take into account how smoke, dust and other aerosols from human activity interact with radiation, and how the aerosols affect cloudiness as well. And so on and so forth.
The greenhouse problem was not the first time climatologists hit this wall. Consider, for example, attempts to calculate the trade winds, a simple and important feature of the atmosphere. For generations, theorists wrote down the basic equations for fluid flow and heat transfer on the surface of a rotating sphere, aiming to produce a precise description of our planet’s structure of convective cells and winds in a few lines of equations… or a few pages… or a few dozen pages. They always failed. It was only with the advent of powerful digital computers in the 1960s that people were able to solve the problem through millions of numerical computations. If someone asks for an “explanation” of the trade winds, we can wave our hands and talk about tropical heating, the rotation of the earth and baroclinic instability. But if we are pressed for details with actual numbers, we can do no more than dump a truckload of printouts showing all the arithmetic computations.
I’m not saying we don’t understand the greenhouse effect. We understand the basic physics just fine, and can explain it in a minute to a curious non-scientist. (Like this: greenhouse gases let sunlight through to the Earth’s surface, which gets warm; the surface sends infrared radiation back up, which is absorbed by the gases at various levels and warms up the air; the air radiates some of this energy back to the surface, keeping it warmer than it would be without the gases.) For a scientist, you can give a technical explanation in a few paragraphs. But if you want to get reliable numbers – if you want to know whether raising the level of greenhouse gases will bring a trivial warming or a catastrophe – you have to figure in humidity, convection, aerosol pollution, and a pile of other features of the climate system, all fitted together in lengthy computer runs.
Physics is rich in phenomena that are simple in appearance but cannot be calculated in simple terms. Global warming is like that. People may yearn for a short, clear way to predict how much warming we are likely to face. Alas, no such simple calculation exists. The actual temperature rise is an emergent property resulting from interactions among hundreds of factors. People who refuse to acknowledge that complexity should not be surprised when their demands for an easy calculation go unanswered.
GlenFergus says
Ike at #290:
As with most aircraft, the lift will be approximately proportional to the wing area. So the effect of an 8′ wingspan reduction is readily calculable, by hand. Your reduction is small, so it will certainly take off – but maybe not if the temperature is hot, the runway short, the load heavy, and one engine is down.
The point is well made though. I find it fantastic that one can design and build an A380 from scratch, without prototyping, and have the damn thing fly first time, near perfectly. That while the first production models are nearing completion! Those who think that complex computer models are inherently unreliable should never step aboard one.
Mark says
Re #289:
“As far as the Annan paper that you reference, it is interesting that the LGM evidence he uses has the same or higher probability for climate sensitivity of 1C as for 4.5C!”
But what is the upper bound? The lower bound is more than 0, is the upper bound less than 5.5? If not, the uncertainties are likely to take us to higher values than lower values, despite the instantaneous value of probability.
Also, the result we care about (how expensive the losses are under a given warming scenario) is not linear in temperature but grows with a power higher than 1, so you should fold that relationship in when weighing the expected cost.
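Mark's point about higher-than-linear costs is essentially Jensen's inequality, and a toy calculation makes it concrete (the quadratic damage law and the two equally likely outcomes below are purely illustrative assumptions, not numbers from any study):

```python
# With convex (faster-than-linear) damages, symmetric uncertainty in
# warming gives an expected cost *above* the cost of the mean warming.
def damage(dT, power=2):
    """Hypothetical damage law: cost grows as dT**power."""
    return dT ** power

# Suppose 1 C and 4.5 C of warming are equally likely.
expected_cost = 0.5 * damage(1.0) + 0.5 * damage(4.5)
cost_at_mean = damage((1.0 + 4.5) / 2)

print(expected_cost)  # 10.625
print(cost_at_mean)   # 7.5625
```

The high tail dominates the expected cost, which is why the upper bound matters more than the most probable value when weighing policy.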
Barton Paul Levenson says
Chris Dudley writes:
You don’t think it’s crackpot to believe that interstellar dust grains are bacteria, or that flu is delivered to the Earth from space? Sorry, I call ’em as I see ’em.
Mark says
Rod B, 292, again you miss answering.
Why do you maintain that the uncertainties make the system safer not worse?
You say in 292 you were saying merely that the paleo data was uncertain, but that uncertainty gives an answer from 2 to 4.5. Yet you maintain that the system must only edge on the 2 or lower side of this uncertainty.
The errors are already taken into account, so saying “your models are no good because the input has huge errors” is irrelevant: the models are run with the inputs varied within those error bars.
But answer the question asked: Why does this uncertainty only make the likely outcome of GW unimportant?
Barton Paul Levenson says
Rod writes:
No, the large majority of the warming is still from CO2. Venus has about half Earth’s atmospheric water vapor. It has three million times as much carbon dioxide.
Barton Paul Levenson says
Ron Taylor — yes, lowered diurnal temperature range is evidence for AGW.
Martin Vermeer says
mugwump #287:
Yes, precisely. You make separate attributions: part of the delta-T comes from ice sheet albedo, part from CO2 change (and part from Milankovich). You can do this because ice sheet area is (more or less) known.
What I meant is that the albedo feedback caused by CO2 warming — part of the independently known ice sheet area change — is directly accounted for, and not attributed to CO2 when defining doubling sensitivity, as a matter of convention.
Martin Vermeer says
Paul Middents #297:
…and you don’t even bring up the interesting question why Mugs and friends are so preoccupied with the lower bound.
As I see it, there are three interesting empirical values for doubling sensitivity that we’re trying to get a handle on: the most probable, the worst case, and the best case value. The first two are of obvious interest to responsible policy makers; the latter can only interest ostriches, not humans :-)
(Actually, ostriches sticking their heads in the sand is urban legend.)
Ellis says
Gavin, thank you for your reply. Is it your contention that the LGM is not a part of the “ice ages” referred to by Dr. Rind? He clearly speaks to the LGM in Rind 2008,
I grant you in the abstract of Rind 1998, he states,
However, the complete paper is not available for free, so I am unsure how he arrived at this conclusion, or to be honest, exactly what it means. As to (Rind, Peteet 1985), what I take from the 1985 paper-
is that, over 20 years later Dr. Rind is asking the same questions.
mugwump says
RE #297 Paul Middents:
Rubbish. I’ve been looking into the LGM for the past two days. My main question is still outstanding: why is it believed that climate sensitivity to 2XCO2 is linear? Note: this is just a question. I am not saying that it should or should not be linear, but on face value linearity needs to be justified (as Hansen himself notes – see the first quote in #287).
And if by “handwaving” you are referring to my snowball earth example, I knew at the time it was a risk with this crowd to include a hypothetical situation by way of illustration only. Such rhetorical devices help to clarify the point for those who are genuinely interested in the problem, but can easily be exploited for ridicule by those who are not.
All I am doing is reading the original papers and asking questions. That’s the way science works. Religion of course works differently: it requires unquestioning faith from its believers, who usually feel very threatened when outsiders start asking difficult questions.
mugwump says
RE #297 Paul Middents:
It certainly did. Which is why I don’t understand why it was deleted. Since you have linked to the original exchange, judge for yourself whether the last post was worthy of deletion:
mugwump says
RE #296 Ron Taylor:
You are correct. The basic (no-feedback) sensitivity is nonlinear in T, but over the range we are talking about (a few degrees K), it is close to linear.
However, the feedbacks are a function of the climate state, which is a highly nonlinear function of T (only a few degrees separates us from the last ice age – a very different climate), hence the contribution to climate sensitivity from the feedback processes may be very different today than it was in the LGM.
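The near-linearity of the no-feedback response is easy to check numerically. The sketch below is illustrative only; it uses the commonly cited ~3.7 W/m² forcing for doubled CO2 and a 255 K effective emission temperature, and compares the exact Stefan-Boltzmann response with its linearization:

```python
# Compare the exact Stefan-Boltzmann temperature response to the
# linearized (no-feedback) sensitivity dT = dF / (4 * sigma * T^3).
SIGMA = 5.67e-8   # W m^-2 K^-4
T0 = 255.0        # effective emission temperature, K
F0 = SIGMA * T0 ** 4

def exact_dT(dF):
    """Exact new-equilibrium warming for an added forcing dF."""
    return ((F0 + dF) / SIGMA) ** 0.25 - T0

def linear_dT(dF):
    """First-order (linearized) warming for the same forcing."""
    return dF / (4 * SIGMA * T0 ** 3)

for dF in (3.7, 7.4, 14.8):   # 1x, 2x, 4x a doubling's forcing
    print(f"dF={dF:5.1f} W/m^2: exact {exact_dT(dF):.3f} K, "
          f"linear {linear_dT(dF):.3f} K")
```

Over a few kelvin the two agree to within a couple of percent, which is the point being made: the nonlinearity that matters sits in the feedbacks, not in the bare radiative response.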
Lawrence McLean says
Re #296 and #253, Ron Taylor and Barton. Thanks Barton for your reply, however, your response was not precisely to do with what I was talking about.
I suspect there is no data dealing with the issue that I raised. My question is: for a given ambient night-time temperature, what has been the trend for the lowest achievable temperatures due to the night-sky radiation effect? That is not the same subject as the diurnal temperature range. Thanks…
Auste Redman says
I just saw an interesting documentary on climate Sceptics. Michael Mann is a contributor:
http://www.bbc.co.uk/iplayer/episode/b00dm7d5/b00dm7bf/
(I’m not sure if the link works in all regions).
Lawrence Coleman says
Read this from Science Daily, digested from a report from an Oregon university, re: the correlation between CO2 and temp over the past 70K years. Briefly, there is unequivocal correlation between cyclical fluctuations in CO2 concentrations in the atmosphere and a mirroring of temp in the northern and southern hemispheres. The level of CO2 now is twice that in the last ice age, which triggered a 15C change in temp. Another point mentioned, which partially answers my question on what effect changes to the North Atlantic current have on world climate, is that the main driver of global CO2 and temp was the relative speed of the N.A. current. As the current slows, Greenland and northern Europe get colder while the southern hemisphere and Antarctica get appreciably hotter.
So the most likely outcome of the rapid Arctic melt should be a rapid slowing down of the N.A. current, with a resultant cooling of the temperate regions of the northern hemisphere while the southern hemisphere cooks. As alluded to in the article, the historic climatic changes took effect within 20 years, and that was with CO2 at half the current levels. Makes you think!!
Lawrence Coleman says
the website for that report @ 315 is http://www.sciencedaily.com/releases/2008/09/080911150048.htm
Mark says
Lawrence, #313.
Uh, why is there a need for the difference? What if, because of GHG concentrations, there’s more dew-point moisture and therefore thicker water haze, causing more overcast skies even where there are no clouds per se?
Your question would miss that.
If it doesn’t happen, then what’s the difference between what you want and what actually happens as shown by measurements? As academic fluffery?
Mark says
Mugwump, #312, but a non-linear change like you admit means that the possibility of a 4.5 degree change is more “effective” than the possibility of a 1 degree change.
And a 1 degree change added to the current 1 degree over the long-term average before industrialisation makes 2 degrees, which was enough to turn interglacial into glacial. What will such a change do to an interglacial? Even at 1 degree.
Patrick Caldon says
Dan Hughes,
See “An introduction to three dimensional climate modeling”. The answer you’re looking for is in equation 3.31 (page 56) and repeated for emphasis as equation 3.53 (page 62). As you state/guess, option (2) is used. The text describes reasonably carefully how this is constructed. There’s then a discussion of how this is transformed into a sigma co-ordinate (generally given a SIG moniker in modelE) on pages 63 through 68. I’d recommend NCAR CCM as being more readable and better documented, however it’s not too difficult to pick your way through the modelE stuff.
If you’re interested in conditions describing where vertical hydrostatic balance is valid and invalid, this text refers me to J.R. Holton, “An Introduction to Dynamic Meteorology”, which it describes as having a discussion on this issue (which I’ve not read).
Hank Roberts says
Ron Taylor writes
> my reading indicates that the GCMs predict
Mugwump tells Ron
> You are correct.
Beware who you rely on for opinions.
Mugwump’s got strong opinions but without citation, it’s not science.
What reading, in each case, supports these statements?
Chris Dudley says
Barton Paul Levenson #303,
I think that the idea of viruses arriving from space has certainly received serious consideration. Returning humans have been isolated in the past to protect against possible infection. Viruses can survive in space very well. And, the environment Hoyle and Wickramasinghe considered, a star forming cloud, certainly seems to be wet enough to support life. So, the idea has some firm foundations.
I feel that it falls down on Occam’s razor: Given the extreme rapidity of viral evolution here on Earth and the efficiency of certain vectors such as birds, we don’t need a different global mechanism to spread viruses. That does not mean that the idea did not have sound components. Ideas can be wrong without being crackpot.
I certainly recall the 3.4 um absorption profile published by Hoyle and Wickramasinghe reproduced in a review article not too long ago: Fig 9 here: http://ads.ari.uni-heidelberg.de/full/1994ApJ…437..683P That certainly would not happen if the idea were not taken seriously in an historical sense by the community. Presently, most people in the field associate the 3.4 um feature with the diffuse interstellar medium rather than with molecular clouds so that the mechanism proposed by Hoyle and Wickramasinghe to produce the feature would not seem to be available for that dust. I worry that ice mantles may mask the presence of the material responsible for the 3.4 um feature in molecular clouds so that associating the material only with the diffuse medium may not be correct. But the fact that UV irradiated ice residues seem to reproduce the main characteristics of the feature and such conditions pretty much have to arise at cloud edges makes me think that a better explanation for the feature is available.
An umpire standing in another ballpark can call them as he sees them, but the calls won’t mean much since the balls and strikes would not really have been seen. I suggest that you want to look closely at the ideas and how they were presented and received at the time before passing judgement.
Paul Middents says
RE #311
Mugs, commenting on a two year old blog, can’t understand why James Annan deleted his comment. Perhaps it was James’ subtle way of directing him to the caveat at the bottom of his “Empty Blog”.
“COMMENTS ARE WELCOME, BUT I DON’T EXPECT (OR WISH) SUBSTANTIAL DEBATE HERE. PLEASE GO TO GLOBALCHANGE WHERE ALL CAN CONTRIBUTE.”
When we go to GLOBALCHANGE we find a one month old thread, “Estimating equilibrium climate sensitivity”.
This thread was conducted on a scientific level by professionals. Mugs might try out his theories and questions on that thread. I would advise checking the attitude and political bias at the door.
I think advancing science might involve just a little more than reading the original papers and asking questions. Maybe more than two days might be required to absorb the full impact of LGM studies on our overall understanding of the climate. Concentration in a relevant field, gathering data, conducting original analysis and peer reviewed publication just might be essential for the advancement of science.
Mick says
So why not say you are now not sure of the greenhouse effect?
If you cannot produce hard numbers then you’re still in the pre-experimental phase. Not to say there is anything wrong with proposing this hypothesis, just that if you can’t make predictions you can’t lay claim to scientific truth as regards the cause of the recent warming and even more recent cooling.
Brian Dodge says
re 290 Ike Solem – good point – bad example. The complexities of a 747 aren’t in the aerodynamics, but the control systems. Lift varies directly with wing area; with the same chord, wing section, takeoff weight, thrust, and air density (which varies a lot with temperature, altitude, and humidity), but eight feet less span, a 747 would require about 1.02 times the speed and 1.04 times the runway. Given that FAA braking distance regs for aborted takeoffs result in liftoff at about 80% of runway length, you could saw off about 20 feet of each wing and just barely (but not legally) lift off, as long as that clipping didn’t affect critical control systems (flaps, ailerons, hydraulics, sensors, fuel, etc.). A much smaller percentage change in the wing section could have a critical effect on the aerodynamics (duct-tape a 1×4 along the length of the wing just aft of the slat on the upper surface, and it would never get off the ground), and it would require extensive modelling to predict the behavior of seemingly subtle changes in camber, thickness, or form.
see http://adg.stanford.edu/aa241/performance/takeoff.html, and http://www.mh-aerotools.de/airfoils/javafoil.htm
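Brian's 1.02 and 1.04 factors can be reproduced from the lift relation L = ½ρV²SC_L: at fixed weight, takeoff speed scales as 1/√S, and the ground roll (distance covered at roughly constant acceleration) scales as V², i.e. as 1/S. Note the nominal 211 ft span and the constant-chord assumption below are my assumptions, not figures from his comment:

```python
import math

# Back-of-the-envelope check of the clipped-wing 747 numbers.
# Assumed: nominal 747 wingspan of 211 ft, constant chord (so wing
# area scales with span), and takeoff roll proportional to V^2.
SPAN = 211.0   # ft (assumed nominal span)
CLIP = 8.0     # ft of span removed

area_ratio = (SPAN - CLIP) / SPAN         # ~0.96
speed_ratio = 1 / math.sqrt(area_ratio)   # V ~ 1/sqrt(S)
runway_ratio = 1 / area_ratio             # distance ~ V^2

print(f"takeoff speed: x{speed_ratio:.2f}")   # ~1.02
print(f"runway length: x{runway_ratio:.2f}")  # ~1.04
```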
mugwump says
RE #322:
Pauly pops apparently can’t understand why deliberately diminutive dismissals of commenters’ noms de plume are offensive. Nor can he understand why it is questionable for someone to delete a non-offensive remark that illustrates a significant error of interpretation at the end of a long technical discussion.
Regardless, I am not interested in rehashing this. No more comments from me on the subject (unless you have a technical point to raise against my interpretation of the DK Pinatubo results).
mugwump says
Mark #318:
Not necessarily. It just means the climate response to an increased forcing today may not be the same as during the LGM, even after factoring out ice albedo effects.
Lawrence Brown says
Re:320 from Hank
“Ron Taylor writes
> my reading indicates that the GCMs predict
Mugwump tells Ron
> You are correct.”
Ron is referring to diurnal temperature range, while mugwump is referring to the non-linearity of sensitivity. There seems to be a disconnect in the dialogue.
Way back in an earlier post mugwump makes reference to Shakespeare in saying that proponents of AGW protest too much about contrarian opinions. I think the frustration and (sometimes) exasperation is more relevant to Banquo’s ghost. Some of the same points of denial resurface more than once even after they’ve been vanquished.
mugwump says
RE #320 referring to #312 referring to #296:
In my case Hank, try Principia Mathematica
[the bit about fluxions]
Hank Roberts says
Mick, ever wonder how a CO2 laser works?
Read anything under the Start Here or Science links?
Lasers use the same radiation physics basis — how CO2 behaves — as climate models, which address a more complicated physical system.
Don’t be fooled. You’re reposting someone’s talking point off some PR site. You can get better information.
Connor says
Sorry to go completely off-topic here but I have a question of my own that I would love to get an answer for… (Hopefully a lot simpler!)
Regarding the Greenland ice sheet and the estimates that it contains enough ice to raise sea levels by nearly 7 metres, I need to find out how the researchers came to that figure – the best I can find is that it was put forward by Church et al. (2001), but I can only access the abstract to that report, which isn’t much help.
Basically I am engaged in one of those pointless internet forum debates where some genius believes that it is a fraudulent figure, but I lack the understanding of physics and arithmetic to properly put him down.
His argument goes something like this:
“Your quoted figure is 2.85 m cubic Km for total ice associated with Greenland. I do not dispute that although it is only an estimation. It may well be more.
The sea surface is approx 350 m sq km – This is a fact that I have not seen disputed and although I have seen figures up to 361 sq km I am working on approx figures only as that is all that is needed.
Now the arithmetic.
A 1 (one) metre rise in sea level requires 350 m cubic kilometres of water (or melted ice) That is 350 m sq by 1 metre deep.
The Greenland ice you have quoted is only short by 347.15 m cubic kilometres.
Please explain why this arithmetic is wrong.”
If anyone can give me a quick rundown on why his assumptions are incorrect it would be a HUGE help. I can’t let a denialist win an argument, not on the internet!
:)
[Response: Area of the ocean is 3.5 x 10^14 m2, amount of ice 2.85 m cubic km = 2.85 x 10^6 km3 = 2.85 x 10^15 m3, therefore SLR= 2.85*10/3.5 = 8.1 m by these figures. Your interlocutor got the 1 km = 10^3 m conversion wrong. – gavin]
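The unit conversion in the inline reply is worth spelling out, since getting it wrong by a factor of a billion is exactly the trap. A quick check, using the same round numbers as the reply:

```python
# Redo the Greenland sea-level arithmetic with explicit units.
# Key fact: 1 km^3 = (10^3 m)^3 = 10^9 m^3, not 10^3 m^3.
ICE_VOLUME_KM3 = 2.85e6    # ~2.85 million km^3 of Greenland ice
OCEAN_AREA_M2 = 3.5e14     # ~3.5e8 km^2 of ocean, in m^2

ice_m3 = ICE_VOLUME_KM3 * 1e9       # 2.85e15 m^3
rise_m = ice_m3 / OCEAN_AREA_M2     # metres of sea-level rise

print(round(rise_m, 1))   # 8.1
```

(The commonly quoted ~7 m figure is lower than this round-number estimate partly because ice is less dense than seawater and some of the ice sheet already sits below sea level.)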
RichardC says
#3 – Head North. SW USA will fry and dry. Florida and the Gulf Coast will drown. Alaska will do grand, especially since they will profit from selling the rest of the country the oil which will improve Alaska’s climate! There will be a period of transition where humans will actively replace the current (dying) ecosystem, which means cheap wood in Alaska. North Dakota and other northern tier states have plenty of wind for use in a post-fossil world. Nebraska sits on the mother-lode of water. You’ve got plenty of options; just avoid the traditional retirement states like the plague, and stay 100 metres above sea level.
#4 – not sure I agree. So very much of the USA’s wealth is tied to locations close to sea level. I’ve always (OK, for the last 15 years) said that by 2020 the Arctic ocean would be ice-free in summer, and sea level will then destabilize. Once saltwater gets past Greenland’s sill as the glaciers retreat, tides will ravage the entire ice sheet. (Greenland’s glaciers flow up to the sea – at least at their bottom. They start perhaps 800ft below sea level!) Four metres per century (and accelerating) is reasonable.
#24 – Asking for a huge delay between predictions and policy changes is simply a call for massive atmospheric change. You guarantee a single outcome, and one which has a tiny probability of success. Truly a self-serving argument. Why do you ask whether outdated models worked? (They didn’t.) Since the data is still there, it’s better to ask how well the current versions “predict” past climate when fed past data. The data stream should be as long as possible and the techniques should be brand new. Besides, all the models underestimate the results of GHG injection into the atmosphere. I’d like to see a plot of the models’ expectations over time, as in 1980’s predictions using real data through 2008, then 1990’s, etc. Noting the trajectory of the evolution of predictions probably gives a good feel for the real future. As the Arctic darkens and methane skyrockets, the models will have a lot of catching up to reality to do. The models are still tremendously flawed. They are far too conservative, as the effects of feedbacks can’t be realistically modelled until after the fact. As each “Oops” happens, the models will improve, but by then it is too late! Heck, the IPCC predicted sea level rise based on ZERO, yes ZERO, ice melt. They didn’t know how severe it would be, so they guessed ZERO! That’s insanity driven by the naturally conservative nature of science as steered by the politics of the right. Unfortunately, the mindset which served science so well when it was an “it doesn’t matter except to academics” field breaks down when it is thrust into a political arena that is highly relevant to everyone. Scientists are used to cinderblock rooms and tend to screw up when thrust into spotlights. Unfortunately, the reality is that models are irrelevant from a practical standpoint. Arctic ice is giving us the real-world experimental data far faster than any computer could calculate it.
As the permafrost gets up to speed, the answer will be clear: “Oops, we’re screwed.” After that is obvious to everyone, then the models will be able to tell us what we will already know.
#43 – The cost of mitigating GHGs is negative. Alternative reality: The USA increased CAFE to 60 MPG, including light trucks, back in 1980. Oil remained a glutted commodity and prices stayed around $10 a barrel. Rules for mortgages for cars and houses were modified to include fuel costs, so buyers would qualify for the *slightly* larger mortgage a more efficient system would require. This reduced both energy usage and the buyer’s monthly total cost of ownership, while driving down energy prices and preventing the massive problems caused by the “oil curse” in the Mideast and other regions. Again, a NEGATIVE cost. The easiest way to increase wealth is to drive down energy prices by increasing efficiency. All it would take is a few small changes to start the process. The mortgage change I just posited would be a fine start. Saving energy is about REDUCING costs.
Sanafaye says
How can I convince my friend that global warming is REAL and not just a rise in temperature?
Phil Scadden says
mugwump needs credit for dealing with his/her skepticism in the right way – reading the papers and getting answers to the questions. It seems to me that the ~2 value for climate sensitivity comes primarily from GCMs, fits the observational record, and fits the paleo-record, but till we do the experiment (doubling CO2, which is proceeding quite well), you can’t have certainty. This is science after all. Hope it’s lower and pray it isn’t higher. However, there are sufficient lines of evidence pointing at 2 for this to be used as a basis for policy till better data or models come along. It’s wildly irresponsible to assume that it is lower.
Lynn Vincentnathan says
Been away, dealing with a niece who wouldn’t evacuate from Houston-Galveston area (her sister finally got her to leave)….like so many contrarians who say, well, what if there’s no GW….
Here’s something about a (perhaps) underestimation of the aerosol effect: “Earth already committed to 2.4-degree C rise from climate change [by 2030]”, a news story based on Ramanathan & Feng’s study in PNAS Online. See: http://www.climateark.org/shared/reader/welcome.aspx?linkid=106630
mugwump says
RE #330:
We’re probably fine if it is 2. We’ve had 1/3 of that already. My assessment FWIW (which may not be much) is 2 or less we can probably ignore, 4 or more (on a short time scale of the next century or so) we’d better do something about. In between those two bounds it is unclear which direction policy should go.
I don’t think we should worry about effects more than 150 years hence, unless we are really really certain they are going to be very bad. By then technology will be insanely more advanced than it is now, and we’ll probably be able to just suck the CO2 out of the atmosphere if we need to. As justification, compare the technology of today with that of 150 years ago, and then consider that technological development is still on an exponential curve and is likely to be so for at least another few hundred years.
Hank Roberts says
Phil, what’s the source for your statement?
> it seems to me that the ~2 value for climate
> sensitivity comes primarily from GCMs
I think you likely have a basis for this, from seeing other things you’ve written, but I don’t know where.
Steve Reynolds says
Mark: “But what is the upper bound? The lower bound is more than 0, is the upper bound less than 5.5?”
Did you read Annan’s paper? The probability from LGM data is higher at 0 than at 5.5.
The overall conclusion from all the data is an upper bound of 4.5C.
Rod B says
Mark, you have me confused with someone else, or we keep passing in the night. I never said anything about the “…uncertainties make the system safer not worse…” All I asserted was that the uncertainties of the data make the conclusions less certain.
Rod B says
BPL, wait a minute. Water vapor made Venus so hot that CO2 could not be sequestered by the normal weathering process (which also requires H2O) and so then CO2 built up, but H2O is not the primary culprit??
Hank Roberts says
Rod, seriously, read some of the Venus papers, Google Scholar would help a lot at this point. You need to follow the description through from the very early planetary atmospheres for, say, Mars, Venus, and Earth. All three were rich in hydrogen; Mars and Venus lost most of theirs, over a long stretch of time, for rather different reasons, but in both cases they couldn’t form and hang on to a whole lot of water as Earth did.
This happened for different reasons — location, location, location, as they say in planetary real estate — there’s a habitable zone around a sun — not too hot, not too cold, more or less survivable.
Ike Solem says
Brian Dodge, have you run your idea past Boeing? You claim that you can do a back-of-the-envelope calculation and come up with an accurate answer on that?
http://www.financialexpress.com/news/Boeing-to-test-Tata-supercomputer-in-India/298008/
Supercomputers are integral to modern airplane design – it couldn’t be done without them. The same goes for climate predictions.
http://www.msnbc.msn.com/id/19421415/
More to the point: grid size is also important, as in climate models:
aero-comlab.stanford.edu/fatica/papers/jameson_fatica_hpc.pdf
It is a very similar situation to the climate model – we know that shortening the wing will reduce lift, and we know that adding infrared-absorbing gases to the atmosphere will raise the surface temperature, but for quantitative estimates, supercomputer models are an absolute necessity in both cases.
RichardC says
#337 – The bounds of CO2 direct forcing is irrelevant. Doubling CO2 can’t be done in a vacuum, with no CH4 change and no ice sheet (albedo) change. The whole debate seems similar to a focus on the explosive power of a fuse, while ignoring that a nuclear bomb is connected to the fuse. It is senseless and immaterial at best. Yep, CO2 is so wimpy that doubling its concentration in the atmosphere would lead to minor changes in global temperatures. However, ignoring the huge changes in temperature which would be unleashed by those minor changes is not rational. Excluding feedbacks from the equation is ludicrous and irresponsible. Scientists might understand that the caveats exist, but POLICY is being made based on the statements. Scientists need to learn to speak HUMAN. Insisting on retaining the arcane speech patterns of traditional science is a recipe for disaster. “Oh yes, if doubling CO2 results in a ten-fold increase in CH4, then ……..” is something that needs to be brought up first, not excluded. The public expects that all variables will be included in the initial statement, so even though it tears holes in your scientific heart, you must include all variables. Welcome to the spotlight.
Ike Solem says
“By then technology will be insanely more advanced than it is now, and we’ll probably be able to just suck the CO2 out of the atmosphere if we need to.” – mugwump.
Yes – and we’ll all live twice as long as we do now, and aliens will have landed and given us even more technology, and we’ll be flying to the stars on fusion reactors… nice pipe dream.
Technological wish-fulfillment is not always possible… for example, in the 1950s many leading scientists believed that accurate weather control was just around the corner – they actually had an entire military program devoted to the concept, courtesy of John von Neumann.
http://swiki.cs.colorado.edu:3232/phd-intro-2007/uploads/34/Grcar_talk_03.4.pdf
Ed Lorenz showed that long-term weather forecasts were impossible, putting an end to that concept.
That’s why appeals to futuristic technology are not to be trusted.
You might just say what I think you really mean: “I don’t care what kind of world tomorrow’s children inherit, because I’ll be dead by then, and I’m certainly not going to make any sacrifices on their behalf.”
Mark says
Rod 338, you do however say “what if it’s less than 2?” and similar.
Mark says
Steve 337, I really don’t think that the probability of CO2 sensitivity being 0 is anything other than 0.0 recurring.
Barton Paul Levenson says
Chris Dudley writes:
I suggest that I already have and I stand by what I said. No matter how you argue for them, believing that the flu comes from space and that interstellar dust grains are bacteria are crackpot ideas. Period.
Has it occurred to you that the number of possible biochemicals is astronomical? How is it that all that flu forming in space just happens to have DNA or RNA for its genetic material, allowing it to infect human beings? And how does it survive raw ultraviolet radiation from the sun?
And are you aware that space is filled with ionizing radiation? There sure are complex chemicals in space, especially in nebulae, but if dust grains are bacteria, what the hell do they eat? The interstellar medium in a nebula isn’t much more than a hard vacuum, you know. In real life, if you look at a bacterial colony, about 25% of the bacteria are reproducing by fission at any given time, because their life spans are very, very short. Doesn’t happen to interstellar dust grains.
Your defense of Hoyle argues an irrational and emotional a priori attachment to his ideas that cannot be swayed by evidence or logic. So, if you don’t mind, I’m going to stop discussing this asinine controversy. You can go on if you want, of course.
Barton Paul Levenson says
Rod B writes:
When Venus had 300 bars of steam in its atmosphere water vapor was the major greenhouse gas there. Now that it has 89 bars of carbon dioxide and little else, CO2 is.
Ray Ladbury says
Phil Scadden, A value of 2 for climate sensitivity is a lower limit. The most probable value remains 3. However, you need to consider the asymmetry of the probability distribution for sensitivity values. 2 and 4.5 constitute lower and upper bounds at the 90% CL. The question then becomes whether we want to bet the future of civilization on a 90% CL. If not, the asymmetry becomes a serious issue.
Moreover, I would contend that Mugwump’s sanguinity about a warming of even 2 degrees may be misplaced. We don’t know when significant emissions of CO2 and CH4 from natural sources will kick in. True skeptics need to understand that uncertainty is not their friend here. If we cannot limit sensitivity, if we cannot better understand the onsets of natural tipping points, and if we cannot trust the models to guide our efforts, the only responsible course of action is to impose stringent limits even at the cost of economic health. The models and the climate science community are your best friends if you seek rational policies and healthy economies.
Oracle of ReCAPTCHA: 162nd prayer
mugwump says
RE #343 Ike:
That’s a great deal more far-fetched than large-scale CO2 sequestration.
I have young children. You think I don’t already make huge sacrifices for them? As I tell my wife, we are now genetically irrelevant. Our only biological purpose is to transfer our remaining life force to our children.
Their living standard is vastly greater than mine was as a child. Mine was vastly greater than my parents’, which was vastly greater than my grandparents’. This is due in large part to virtually unfettered growth built on very cheap energy for the past 150 years.
I know what is better for my descendants, and it is not halting growth on the basis of uncertain climate sensitivity projections.
Ray Ladbury says
Mugwump says: “I know what is better for my descendants, and it is not halting growth on the basis of uncertain climate sensitivity projections.”
Look, I don’t see much point in descending into a debate over who cares about their children more. However, I think you have to consider the implications of the uncertainty. You seem to be assuming that the sensitivity will shake out at 2 degrees per doubling rather than 3 or 4.5. This is not an evidence-based decision. Indeed, it’s far more likely that the sensitivity is greater than 5.5 than that it is below 1.5. Moreover, we know that at some point, natural sources of ghgs will swamp our own emissions. We don’t know how close we are to that tipping point.
Uncertainty is not your friend and it is definitely not the friend of your progeny. We are at a turning point wrt energy. All climate concerns do is tell us that we should look to sources other than coal to meet future needs–and petroleum is becoming too precious to burn.