These days, when global warming inactivists need to trot out somebody with some semblance of scientific credentials (from the dwindling supply who have made themselves available for such purposes), it seems that they increasingly turn to Roy Spencer, a Principal Research Scientist at the University of Alabama in Huntsville. Roy does have a handful of peer-reviewed publications, some of which have quite decent and interesting results in them. However, the thing you have to understand is that what he gets through peer review is far less threatening to the mainstream picture of anthropogenic global warming than you’d think from the spin he puts on it in press releases, presentations and the blogosphere. His recent guest article on Pielke Sr’s site is a case in point, and provides the fodder for our discussion today.
Actually, Roy has been pretty busy dishing out the confusion recently. Future posts will take a look at his mass market book on climate change, entitled Climate Confusion, published last month, and his article in National Review. We’ll also dig into some of his peer reviewed work, notably the recent paper by Spencer and Braswell on climate sensitivity, and his paper on tropical clouds which is widely misquoted as supporting Lindzen’s IRIS conjecture regarding stabilizing cloud feedback. But on to today’s cooking lesson.
They call it "Internal Radiative Forcing." We call it "weather."
In Spencer and Braswell (2008), and to an even greater extent in his blog article, Spencer tries to introduce the rather peculiar notion of "internal radiative forcing" as distinct from cloud or water vapor feedback. He goes so far as to say that the IPCC is biased against "internal radiative forcing," in favor of treating cloud effects as feedback. Just what does he mean by this notion? And what, if any, difference does it make to the way IPCC models are formulated? The answer to the latter question is easy: none, since the concept of feedbacks is just something used to try to make sense of what a model does, and does not actually enter into the formulation of the model itself.
Clouds respond on a time scale of hours to weather conditions like the appearance of fronts, to oceanic conditions, and to external radiative forcing (such as the rising and setting of the Sun). Does Spencer really think that a subsystem with such a quick intrinsic time scale can just up and decide to lock into some new configuration and stay there for decades, forcing the ocean to be dragged along into some compatible state? Or does he perhaps mean that slow components, like the ocean, modulate the clouds, and the resulting cloud radiative forcing amplifies or damps the resulting interannual or decadal variability? The latter sounds a lot like a cloud feedback to me — acting on natural variability whose root cause is in the ponderous motions of the ocean.
Think of it like a pot of water boiling on a stove. What ultimately controls the rate of boiling, the setting of the stove knob or the turbulent fluctuations of the bubbles rising through the water? Roy’s idea about clouds is like saying that you should expect big, long-lasting variations in the boiling rate because sometimes all the steam bubbles will decide to form on the left half of the pot leaving the right half bubble-free — and that things will remain that way despite all the turbulence for hours on end.
The only sense that can be made of Spencer’s notion is that there is some natural variability in the climate system, which in turn causes some natural variability in the radiation budget of the planet, which in turn may modify the natural variability. Is this news? Is this shocking? Is this something that should lead us to doubt model predictions of global warming? No — it is just part and parcel of the same old question of whether the temperature pattern of the 20th and 21st centuries can be ascribed to natural variability without the effect of anthropogenic greenhouse gases. The IPCC, among others, nailed that question, and nobody has demonstrated that natural variability can do the trick. Roy thinks he has, but as we shall soon see, it’s all a matter of how you run your ingredients through the food processor.
The impressive graph that isn’t
So here’s what Roy did. He took two indices of interannual variability: the Southern Oscillation Index (SOI), which is a proxy for El Nino, and the Pacific Decadal Oscillation Index (PDOI). He formed an ad hoc weighted sum of these indices, and then multiplied by an ad hoc scaling factor to turn the resulting time series into a time series of radiative forcing in Watts per square meter. Then he used that time series to drive a simple linear globally averaged mixed layer ocean model incorporating a linearized term representing heat loss to space. And voila, look what comes out of the oven!
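In equations, the whole model is a single ODE for the mixed-layer temperature anomaly T: C dT/dt = F(t) − λT, where C = ρ c_p h is the heat capacity of a mixed layer of depth h and λ is the linearized damping to space. Here is a minimal sketch of that kind of integration (my own variable names and parameter values; the sinusoidal forcing is a synthetic stand-in for Roy’s actual SOI/PDOI combination, whose weights I am not reproducing here):

```python
import math

RHO = 1025.0   # seawater density, kg/m^3
CP = 3990.0    # seawater specific heat, J/(kg K)

def run_mixed_layer(forcing_wm2, dt_years, depth_m, lam=2.0, t0=0.0):
    """Integrate C dT/dt = F(t) - lam*T, with C = RHO*CP*depth_m.

    forcing_wm2: radiative forcing series (W/m^2), one value per step
    lam: linearized radiative damping to space, W/(m^2 K) (assumed)
    t0: initial temperature anomaly, K
    """
    heat_cap = RHO * CP * depth_m          # J/(m^2 K)
    dt_sec = dt_years * 365.25 * 86400.0
    temp, series = t0, []
    for f in forcing_wm2:
        temp += dt_sec * (f - lam * temp) / heat_cap
        series.append(temp)
    return series

# Synthetic interannual forcing with the ~0.27 W/m^2 amplitude argued for below.
times = [0.1 * i for i in range(1100)]          # 110 years, 0.1-yr steps
forcing = [0.27 * math.sin(2.0 * math.pi * t / 3.5) for t in times]

shallow = run_mixed_layer(forcing, 0.1, depth_m=50.0)    # realistic mixed layer
deep = run_mixed_layer(forcing, 0.1, depth_m=1000.0)     # Roy's 1 km choice
```

The deep run responds far more sluggishly and with far smaller amplitude, which is exactly the knob that gets turned in the cooking lessons.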
Roy is really taken with this graph. So much so that he uses it as a banner near the top of his climate confusion web site under the heading "Could Global Warming Be Mostly Natural?" But is it as good as it looks? To find out, I programmed up his model myself, but chose the set of adjustable parameters based on compatibility with observations constraining reasonable magnitudes for these parameters. Here’s what I came up with:
So why does Roy’s graph look so much better than mine? As Julia Child said, "It’s so beautifully arranged on the plate – you know someone’s fingers have been all over it."
A Cooking lesson
Lesson One: Jack up the radiative forcing beyond all reason. Reliable data on decadal variability of the Earth’s radiation budget are hard to come by, but to provide some reality check I based my setting of the scaling factor between radiative forcing and the SOI/PDOI index on the tropical data of Wielicki et al. (2002) (as corrected in response to Trenberth’s criticism here). The data are shown below. On interannual time scales, it’s mostly the net top-of-atmosphere flux that counts, so the curve to look at is the green NET curve in the bottom-most panel.
Except for the response to the Pinatubo eruption (the pronounced dip during 1991), the fluctuations are on the order of 1 W/m2 or less once you smooth on an annual time scale. Based on this estimate and on the typical magnitude of Spencer’s combined SOI/PDOI index, I chose a scaling factor (Roy’s a) of 0.27 W/m2. In his article, Roy uses a value ten times as big, but then he partly covers up how large the annual radiative forcing is by showing only the five-year averages. With Roy’s value of the scaling coefficient, the annual radiative forcing looks like this:
which is clearly grossly exaggerated compared to the data. Moreover, in my own estimate of the scaling factor I tried to match the overall magnitude of the fluctuations, whereas restricting the estimate to that part of the observed fluctuation which correlates with the SOI/PDOI index could reduce the factor further. Finally, even insofar as some part of climate change could be ascribed to long term cloud changes associated with the PDOI and SOI, one cannot exclude the possibility that those changes are driven by the warming — in other words a feedback. Still, let’s go ahead and ignore all that, and put in Roy’s value of the scaling coefficient, and see what we get.
So here’s our cooked graph as of Lesson 1 of the recipe:
Lesson Two: Use a completely unrealistic mixed layer depth. OK, so we’ve goosed up the amplitude of the temperature signal to where it looks more impressive, but the wild interannual swings in temperature look completely unlike the real thing. What to do about that? This brings us to the issue of mixed layer depth. The mixed layer depth determines the response time of the model, since a deeper mixed layer has more mass and takes longer to heat up, all other things being equal. The actual ocean mixed layer has a depth on the order of 50 meters. That’s why we got such large amplitude and high frequency fluctuations in the previous graph. What value does Roy use for the mixed layer depth? One kilometer. To be sure, on the centennial scale, some heat does get buried several hundred meters deep in the ocean, at least in some limited parts of the ocean. However, to assume that all radiative imbalances are instantaneously mixed away to a depth of 1000 meters is oceanographically ludicrous. Let’s do it anyway. After all, as Julia Child said, "In cooking you’ve got to have a ‘What the Hell’ attitude." Here’s the result now:
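The reason mixed layer depth matters so much is that the model’s e-folding response time is just heat capacity divided by radiative damping, τ = ρ c_p h / λ. A quick back-of-envelope check (the λ = 2 W/m2 per K damping is my assumption for illustration, not necessarily Roy’s exact value):

```python
RHO, CP = 1025.0, 3990.0   # seawater density (kg/m^3) and specific heat (J/(kg K))
LAM = 2.0                  # assumed radiative damping, W/(m^2 K)

def efold_years(depth_m, lam=LAM):
    # e-folding time tau = rho * cp * h / lam, converted from seconds to years
    return RHO * CP * depth_m / lam / (365.25 * 86400.0)

print(efold_years(50.0))     # a realistic mixed layer: roughly 3 years
print(efold_years(1000.0))   # Roy's 1 km: roughly 65 years
```

With a time constant of a few years, interannual forcing produces fast, jittery temperature swings; stretch τ to many decades and the same forcing yields smooth, slow curves that look much more like the observed record.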
Lesson 3: Pick an initial condition way out of equilibrium. It looks better, especially in the latter part of the century. But it doesn’t get the trend in the early century right. Gotta keep cooking! The essential ingredient this time is the choice of initial condition for the model. If we initialize the anomaly at -0.4C, which amounts to an assumption that the system is wildly out of equilibrium in 1900, then this is what we get:
Now, it’s finally looking ready to serve up to the unsuspecting diners. Note that it’s the adoption of an unrealistically large mixed layer depth that allows Roy to monkey with the early-century trend by adjusting the initial condition. With a more realistic mixed layer depth, changing the initial condition on temperature anomaly only leads to a rapid adjustment period affecting the first few years.
My graph is not absolutely identical to Roy’s, because there are minor differences in the initialization, the temperature offset used to define anomalies, and the temperature data set I’m using as a basis for comparison. My point, though, is that this is not an exacting recipe: it’s hash — or Hamburger Helper — not soufflé. Following Roy’s recipe, you can get a reasonable-looking fit to data with very little fine-tuning because Roy has given himself a lot of elbow room to play around in: you have the choice of any two variability indices among dozens available, you make an arbitrary linear combination of them to suit your purposes, you choose whatever mixed layer depth you want, and you finish it all off by allowing yourself the luxury of diddling the initial condition. With all those degrees of freedom, I daresay you could fit the temperature record using hog-belly futures and New Zealand sheep population. Anybody want to try?
Postlude: Fool me once …
Why am I not surprised about all this shameless cookery? Perhaps it’s because I remember this 1997 gem from the front page of the Wall Street Journal, entitled "Science has Spoken: Global Warming is a Myth":
That’s not Roy’s prose, but it is Roy’s data over there in the graph on the right, which purports to show that the climate has been cooling, not warming. We now know, of course, that the satellite data set confirms that the climate is warming, and indeed at very nearly the same rate as indicated by the surface temperature records. Now, there’s nothing wrong with making mistakes when pursuing an innovative observational method, but Spencer and Christy sat by for most of a decade allowing — indeed encouraging — the use of their data set as an icon for global warming skeptics. They committed serial errors in the data analysis, but insisted they were right and models and thermometers were wrong. They did little or nothing to root out possible sources of errors, and left it to others to clean up the mess, as has now been done.
So after that history, we’re supposed to savor all Roy’s new cookery?
That’s an awful lot to swallow.
John P. Reisman (The Centrist Party) says
#97 Thanks Philippe
I used the elliptical to keep it in as simple an illustrative form as possible. The 41k-yr obliquity and 26k-yr precession cycles are certainly a factor.
People tend to understand the elliptical cycle faster based on the ratio of time spent near the sun at perihelion vs. the 2/3rds of the year spent farther from the sun as we approach aphelion.
The post was geared towards simplicity though for Mr. Will Nitschke.
#99 JCH
Yes, unfortunately I’m in simple mode today. The array of systemic and short-term thinking by multiple administrations is a serious factor. Subsidizing oil to generate economic growth in a misdirected Keynesian economic model significantly complicates the matter: the growth is driven by misdirected supply-side expansion based on resource usage and exploitation, rather than redirected toward resource allocation that could be leaned away from material consumption and toward greater intellectual value and productivity considerate of long- and short-term ramifications, all within a mixed market that is anything but a free market, given the absence of a gold standard and a dollar system regulated by the Federal Reserve and World Bank calculations. Then you add Buckley v. Valeo in 1976, thanks to the Supreme Court, and you end up with a diminishing middle class and oligarchical and plutocratic tendencies.
Barton Paul Levenson says
A small correction to Philippe Chantreau’s correction — the eccentricity cycle does slightly affect the total amount of year-round illumination. The factor is f = sqrt(1 – e^2), if I remember correctly, so that when the Earth has an eccentricity of zero, it gets 100% illumination, but when it has e = 0.05, it only gets 99.8749%. You can derive this from the definition of eccentricity and the inverse-square law.
Chris Colose says
Re BPL
Actually, I think that’s backwards. When eccentricity is highest you get a bit more annual global average solar insolation.
Philip Machanick says
It seems that Pielke Sr. is not allowing comments on anything.
This marks the demise of one of the few contrarian sites where there was some reasonable level of debate.
Barton Paul Levenson says
Chris,
No, you might think so, but remember that a planet’s orbital velocity is highest when it is near perihelion and slowest when it is near aphelion. A more eccentric orbit gets slightly less illumination, according to the equation I posted.
John P. Reisman (The Centrist Party) says
Regarding Spencer’s argument: actually, if one were to limit one’s view accordingly, one could still argue that the world is flat.
Hank Roberts says
> It seems that Pielke Sr. is not allowing comments on anything.
His site became so popular for a certain kind of posting that nobody goes there any more.
If you miss that sort of thing, DotEarth is now the trendy spot for witnessing by the inactivist crowd.
They’re all there all the time.
Geoff Wexler says
re #91
Marion Delgado
“internal radiative forcing”
“spontaneous atmospheric parameter value change.”…. or
“large banks of subterranean fudge ”
How about intelligent design? After all, the role of intelligent design in evolution is that of an unexplained biological forcing.
More seriously, Spencer is adopting the opposite standpoint on climate and biology: he assumes that the former is all natural while the latter requires intervention to be understood.
Geoff Wexler says
Raypierre
“even Lindzen has abandoned his earlier claims that water vapor would prove a stabilizing feedback.”
Does that mean he has been quiet about it or that he has announced that he has changed his mind?
(I am also confused about John Christy’s position on tropospheric warming; he is part-author of a peer-reviewed paper reversing the earlier anomaly, but then appeared in the Swindle programme saying the opposite. Of course, that could have been an out-of-date interview.)
Roger A. Pielke Sr. says
Ray – There are comments on your weblog with respect to why Climate Science does not permit comments (e.g. see #104). In the past when I did allow comments, many were not on the science (as exemplified also on Real Climate), yet required a lot of time to respond to. Moreover, the scientifically informative comments were buried within many other comments, and only the most interested would be able to wade through to see them.
Therefore, in the current version of Climate Science, I invite credentialed scientists, such as yourself, to post guest weblogs. Please consider this as an invitation.
I also would be willing to post a guest weblog on your site that overviews my perspective and your readers can then comment. If interested and open to this idea, let us know.
Best Regards Roger
Ray Ladbury says
#110. Roger, thank you for the clarification. I sympathize with your trials dealing with the anonymity of the Internet. I think you will agree that what you describe serves to demonstrate the value of RealClimate as a place where discussion of scientific evidence can still proceed in a civil and orderly environment.
Mark Hadfield says
A reference to Spencer & Braswell (2008) would be helpful.
Bryan S says
Re #69: Ray, thank you for the clarification on the graphs.
Now, I want to get back to the topic of mixed layer depth. The classic way to think about this is that the shallow layer down to around 100 meters, just above the thermocline, is mixed rapidly by wave action, and the deeper, colder, more stable layers (to several hundred meters) slowly absorb any additional heating of the mixed layer over decades to centuries. This is the idea of the lag in climate response to increased GHG forcing. Some of the additional heat absorbed in the ocean (from the TOA radiative imbalance imposed by the GHG forcing changes) must first go to increasing the temperature of the deeper ocean before it is available to increase the temperature of the sea surface and the atmosphere. Coupled models, as I understand, do not generally mix heat deeply on short time scales, so many modelers have argued for a long equilibrium response time. As Steve Schwartz demonstrates however, the equilibrium response time is important in calculating climate sensitivity to the increased GHG forcing. If the additional heat is mixed quickly into the deep ocean, thereby heating the entire integrated volume quickly, the climate sensitivity might be considerably different than it would be with a slow response time, where the deeper ocean volume heats very slowly over decades to centuries.
[Response: This is not correct. The mixing time impacts the transient sensitivity (i.e. the warming expected by 2050 say), but not the equilibrium value. It of course comes into play if you are trying to deduce the equilibrium value from a transient time series, but that is a different issue completely. – gavin]
You have made a statement in your above critique that this equilibrium response time is slow, lasting decades to centuries. I now want to point out some observational evidence that indicates that ocean mixing through vertical turbulent eddy motion (or whatever process) is much more dynamic than many believe. During the NOAA XBT Fall Rate Workshop NOAA/AOML, Miami, Florida March 10-12, 2008, Syd Levitus presented a talk which showed the yearly changes in ocean heat content down from 0-700 m, then from 0-1500 m (notice slides 16 and 17). Although the deep ocean measurements have a large associated sampling error, it may be significant that there are still sizeable yearly changes in heat content (in Joules) below 700 m. I ask you this. If the ocean is not significantly mixed deeply over these short time intervals, where are these annual variations in deep ocean heat content changes coming from? In another talk, Victor Gouretski shows several ocean heat time series for different volume integrals (slide 48). These clearly indicate that a significant fraction of the heat storage changes (annually) take place below 300 meters. The observation that the heat storage changes are in phase across the different depth volumes from year to year may point to a time-dependent systematic error in the measurements which are not being properly corrected, or it may also point to evidence that additional heating or cooling (from the TOA net radiative flux) is mixed quickly into the deeper ocean volume. Please take a look at these slides, and ponder the implication for mixing and system response time.
Chris Colose says
Re 105 BPL
check out Tamino’s post on this
http://tamino.wordpress.com/2007/11/12/ridiculous/
http://tamino.wordpress.com/2007/11/19/wobbles-part-1/
On the second post, he used the same expression you did, only with it in the denominator under 1, so that as the denominator decreases with larger e, the whole term gets larger.
I could be reading this all wrong; I’m not saying you’re incorrect, just that I’ve read differently.
Bryan S says
Gavin: I don’t want to get into the whole Schwartz paper again (Ray brought it up), but correct me if I’m wrong here (I really need to go back and read the Schwartz paper). Schwartz is attempting to deduce climate sensitivity from the transient surface temperature response using a simple energy balance model, so that the issue of thermal lag time due to the ocean mixing is of vital importance to his analysis (not the actual sensitivity). If the ocean mixing is slow, and there is a much longer temperature lag to forcing changes than he assumed, this dooms his calculations. If there is in fact a short (or no) lag time, then his calculations have more validity.
I think the confusion is that I was referring to Schwartz’s *calculated sensitivity*, not the actual sensitivity to a fully equilibrated forcing change (2X pre-industrial CO2), which is what it is and has nothing to do with mixing rate. Schwartz’s deduction of this sensitivity value does.
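Gavin’s point above, that mixing time affects the transient warming but not the equilibrium value, is easy to demonstrate with the same sort of simple mixed-layer model used in the main post. A sketch under assumed parameters (λ = 2 W/m2 per K damping; a 3.7 W/m2 step forcing as a rough stand-in for doubled CO2):

```python
RHO, CP = 1025.0, 3990.0   # seawater density and specific heat, SI units
LAM = 2.0                  # assumed radiative damping, W/(m^2 K)

def warm_under_step(depth_m, forcing=3.7, years=500, steps_per_year=10):
    """Response to a step forcing; returns (T at year 50, T at the end)."""
    heat_cap = RHO * CP * depth_m
    dt = 365.25 * 86400.0 / steps_per_year
    temp, t50 = 0.0, None
    for i in range(years * steps_per_year):
        temp += dt * (forcing - LAM * temp) / heat_cap
        if i == 50 * steps_per_year - 1:
            t50 = temp
    return t50, temp

shallow = warm_under_step(50.0)     # fast adjustment: near equilibrium by year 50
deep = warm_under_step(1000.0)      # deep mixing: still far from it at year 50
# Both converge to the same equilibrium, F/lam = 1.85 K; only the path differs.
```

The transient (year-50) responses differ by almost a factor of two, yet after 500 years both runs sit at the same F/λ equilibrium, which is Gavin’s point: mixing depth sets the approach rate, not the destination.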
Barton Paul Levenson says
Chris — you’re right. I got it backwards.
Brian Klappstein says
A minor point perhaps, but the negative initial state of Roy’s model makes sense in light of the extended volcanic forcing preceding 1900.
Simon Pope says
Re: Comment from Chuck Booth.
In intelligence work, the motivations for betrayal are known as MICE:
Money, Ideology, Coercion and Ego.
We’ve established that it probably isn’t Money. It’s unlikely that it’s Coercion. That leaves Ideology and Ego. Any takers? ;-)
P.S. I laughed all the way through your article (in a good way!)
Thanks
Leo says
Excellent, excellent post. Keep it up.
A letter I sent to Dr. Spencer:
What’s forgotten in your calculus on the supposed veracity and genuine-ness of mainstream climate researchers is that many, if not most, _don’t_ want this to be a crisis situation.
It is true that (scientific) politics affects science through and through. Ideas reach prominence only in part because of the weight of evidence behind them — the rhetorical, political, and oratorical skill of their advocates can make the difference between whether the “right” idea is accepted in 5 or 50 years.
I’m sure there are vested interests in the notion of human-caused (and preventable) climate change. What I don’t believe is that there is a massive cadre of scientists that _want_ this to be the case and have put their advocacy before their science. Not that climate scientists are now trying to disprove the thesis of human caused climate change, but contrast the sequence of behavior of mainstream science with that of those who subscribe to “right wing” ideas. In the “mainstream science” case the science is done first and only much later (if ever) is there an organized message campaign to gain acceptance. But in the case where the fringe opinion is seen as favorable by vested “right wing” interests (and $40 billion/year in profit is _very_ vested) we witness the opposite sequence: a well-funded and coherent message campaign is started first (cf. “creation science/intelligent design”, the lead up to the 2003 Iraq War, etc.). This is comparable to the existence of a monopoly in the market of ideas.
I am glad you have a contrary scientific opinion on the issue of climate change. Disagreement is vital to science, and whenever the science is new (as it is for climate change) there will always be people who disagree. This is a very good thing. But _because_ science is political, what does real violence to scientific debate is sophisticated and well-funded marketing campaigns which add a strong bias to one side. When those well-funded science-biasing campaigns are further aimed at propping up small-minded business interests things are even worse.
So keep up the good scientific work, [I wrote this before any evaluation of the goodness of his scientific work] but if you believe in a “market of ideas” and democratic science, you should accept that it is a very bad thing that one side in the climate change debate has for a long time had a coherent and expensive messaging strategy while the other has succeeded largely on its own merits. Therefore, if you acknowledge that from a policy perspective the correct thing to do is to listen to the wisdom of the (scientific) crowd, then politicians and the whole planet should be working very hard to curb emissions.
John Dodds says
Re 103, 114, 116.
BPL had it right the first time.
A MORE eccentric orbit results in LESS energy getting to the earth.
A LESS eccentric orbit results in MORE energy to Earth.
Try this: a circle has an area of pi*r^2 = pi*a*a.
An ellipse area is pi*a*b.
When the orbit gets more elliptical then b increases, making the area larger making the average distance to the earth on the circumference further away which means less energy impacts the earth. It still takes the same time (1 yr) to orbit the ellipse or the circle.
THIS is why we are currently INCREASING (very slightly- not enough to cause the observed warming) the incoming solar insolation. The Earth is in its getting LESS eccentric phase. The Milankovitch eccentricity is reducing and is approaching its minimum eccentricity point (since we were last here about 400,000 years ago) in ~25,000 years. The earth is warming and it will continue to warm for the next ~25,000 years, unless some other factors outweigh the sun’s influence (at least according to RC, but this is such simple high school physics it’s hard to argue against it.)
John Dodds says
Now a harder physics question:
In the response to 60, and Cooking Lesson 3, Ray says or implies that a closer-to-equilibrium condition applies, or that we are at equilibrium: energy-in equals energy-out of the earth. This agrees with my understanding that the Earth goes from a net radiating of energy at night to a net absorbing of energy in the day (it warms up), passing through and returning to equilibrium daily and continuously.
HOWEVER it is my understanding that the GISS model (& all the other GCMs) tell us that we are in an energy DIS-equilibrium state (Hansen et al 2005, figure 2e), caused by the long-term accumulation of added GHGs, and that we have “warming already in the pipeline”, and that we will not return to equilibrium until the cause of the warming (ie CO2) is removed in 50 to 100 years naturally (ONLY if we stop adding CO2 immediately). Hence the “tipping point” argument.
So Ray, & Gavin, WHICH IS IT?
Are we at an energy equilibrium or NOT.
It seems to me that if we were NOT, then you have to make the Stefan-Boltzmann law FAIL, since a warmer-than-equilibrium ground will just radiate more energy to a cooler-than-equilibrium stratosphere to recreate the equilibrium; it does it daily! Are we requiring the GCMs to violate the laws of physics?
Chris Colose says
John Dodds,
I stand by what I said on eccentricity… the main effect is not so much changing the average distance of the earth as changing the shape of the orbit, creating a net effect of a bit more solar insolation at higher ‘e’ value. Eccentricity also affects precession, such that when e=0 the precession has no effect. But eccentricity is not really the main thing going on in ice age cycles either.
As for your next post, you sound very confused. The answer is that we are not at equilibrium which is precisely why we are warming. The increase in the OLR is how the planet comes back to equilibrium, but it has to warm to do that.
Ray Ladbury says
John Dodds, I don’t think there is any contradiction. In the first place, the adiabatic cooling in the troposphere means that no single region in the atmosphere is ever quite at equilibrium. Rather, it makes sense to look at things in terms of local thermodynamic equilibrium and minor departures therefrom. Nonequilibrium statistical mechanics really means near-equilibrium statistical mechanics. Systems far from equilibrium are very hard to treat. Fortunately, most systems rarely depart far from equilibrium for long.
Hank Roberts says
John Dodds, you’ve posted this same thing repeatedly. It’s answered in https://www.realclimate.org/index.php/archives/2007/01/the-physics-of-climate-modelling/langswitch_lang/en
and at the NYT and as I recall at CA as well. Why not do the math?
Hank Roberts says
John Dodds, to give some specific examples from the past, these:
http://www.google.com/search?q=%2B%22john+dodds%22+%2B%22stefan-boltzmann%22
tamino says
Re: #120 (John Dodds)
You’re mistaken. A more eccentric orbit results in more energy getting to the earth; a less eccentric orbit results in less energy to Earth. See this.
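For anyone who wants to check the sign numerically: time-averaging the inverse-square flux over one Kepler orbit, parametrized by the eccentric anomaly E (so that dt is proportional to (1 − e cos E) dE while r = a(1 − e cos E)), gives exactly 1/sqrt(1 − e^2), i.e. slightly more annual-mean insolation at higher eccentricity. A small numerical sketch:

```python
import math

def mean_inverse_square(e, n=100_000):
    """Time-average of (a/r)^2 over one orbit of eccentricity e.

    With E the eccentric anomaly, dt ~ (1 - e*cos(E)) dE and
    r = a*(1 - e*cos(E)), so the time-weighted integrand reduces
    to 1/(1 - e*cos(E)).
    """
    total = 0.0
    for i in range(n):
        E = 2.0 * math.pi * (i + 0.5) / n   # midpoint rule over [0, 2*pi]
        total += 1.0 / (1.0 - e * math.cos(E))
    return total / n

for e in (0.0, 0.0167, 0.05):   # circular, present-day Earth, high eccentricity
    print(e, mean_inverse_square(e), 1.0 / math.sqrt(1.0 - e * e))
```

At e = 0.05 the numerical average and the closed form both come out to about 1.00125, confirming tamino’s direction: more eccentric means (slightly) more energy.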
Barton Paul Levenson says
John Dodds writes:
The Earth is roughly in radiative equilibrium. On a day to day basis, absorption = emission. In the long run, there is decreasing emission, due to increased greenhouse gases in the air, so the Earth is warming slightly (at about 0.2 K per decade). When the greenhouse gases have stabilized, and the warming “in the pipe” has settled down, the Earth will be much closer to perfect equilibrium.
[Response: Keep in mind that there are many different uses of the word “equilibrium.” Barton’s short reply is correct, and applies to equilibrium of the Earth’s top-of-atmosphere energy balance; the disequilibrium at the top of the atmosphere is largely the result of the disequilibrium of the ocean surface energy balance, which reflects the fact that energy is still leaking into the ocean and warming it up. There’s another important use of the word “equilibrium,” though, as in “local thermodynamic equilibrium.” That means that even though the planet as a whole is out of energy balance, there are sufficiently frequent collisions between molecules in limited size nearly-isothermal parcels of air, and sufficiently frequent interactions with infrared photons, that the mechanisms of equilibrium statistical mechanics can be applied. This is true up to extremely high altitudes in the Earth’s atmosphere, and applies even if the planet as a whole is out of equilibrium. The argument with regard to molecular collisions is straightforward, though the argument with regard to interaction with photons is very subtle and rightly confusing to most people. There’s a pretty good discussion of local thermodynamic equilibrium in Hunten’s book on atmospheric physics. –raypierre]
Chris Colose says
raypierre, would it not be more appropriate to say the ocean surface disequilibrium is the result of the TOA disequilibrium (which is the result of the GHG increase)… how can the ocean surface be out of balance independently?
[Response: Well, to clarify, it is the thermal inertia of the oceans that allows the surface budget to remain out of equilibrium for so long. If you had a planet without an ocean, the atmosphere and surface would adjust to the equilibrium corresponding to the new level of CO2 within under a year. –raypierre]
Bryan S says
Hi again Ray: Roger Pielke Sr. has posted a fascinating analysis on the last 4 years of ocean heat data by Josh Willis http://climatesci.org/2008/05/29/new-information-from-josh-willis-on-upper-ocean-heat-content/. What is your take on the magnitude of annual TOA net radiative flux in W/m2 that is inferred from these plots? It seems to me that the magnitude of the net TOA radiative flux inferred from the annual ocean heat storage change is significantly larger than is indicated by your above analysis (1 W/m2). Please comment.
Secondly, notice the seasonal oscillations of heat content anomaly in the upper 750 meter volume of ocean. Presumably these swings are due to the large seasonal heat storage changes taking place in the southern ocean (because of the large fraction of the total ocean mass). It is clear that the seasonal 0-700 meter heat content minimum correlates to the SH winter. If the same data were analyzed from 300-750 meters, and there is still a discernible seasonal signal, this might provide us some important information about the rate of heat uptake into the deep ocean in response to a change in forcing. In studying the other plots that I have referenced above, I think it can be shown that the signal from a net TOA imbalance on annual timescales penetrates deeply into the ocean (at least several hundred meters). Please comment.
John D M says
Whilst the greenhouse gases produced by the burning of fossil fuels and even by livestock are accounted for, where does all the heat generated by such activity, and indeed by each little furnace more commonly known as a human being, fit into the calculations? We have all been given a carbon footprint; would not a heat footprint be more appropriate and more relevant?
[Response: No. The direct heating is at least two orders of magnitude smaller than the greenhouse gas forcing at a global level. – gavin]
John D M says
[Response: No. The direct heating is at least two orders of magnitude smaller than the greenhouse gas forcing at a global level. – gavin]
I assume the direct heating is estimated only, rather than measured, as well as being substantially greater in the northern hemisphere. Can it be quantified, ideally for each hemisphere?
Barton Paul Levenson says
John,
Human core temperature averages about 37 degrees C (310 K), and skin temperature might average 35 C (308 K). Adult human skin area is about two square meters, so assuming perfect emissivity, humans emit about 5.6704 × 10^-8 × 308^4 × 2, or about 1,021 watts. But we also absorb heat from the environment. For surroundings at 20 C (293 K), we get back about 836 watts. Net human heat output — about 185 watts. That’s for a naked body; clothes retard heat exchange a bit, and between that and the fact that not everyone is an adult, human beings probably contribute about 130 watts of heat output per person — about the same as three incandescent light bulbs.
There are presently about 6.7 billion humans, so this adds up to some 871 billion watts (8.71 × 10^11 W). World energy use is about 20 terawatts (2 × 10^13 W), so human body heat is about 4% of that. And the sunlight absorbed by the climate system averages some 237 watts per square meter. Earth’s surface area is about 5.101 × 10^14 square meters, so the climate system runs on about 1.2 × 10^17 watts. Human body heat is about 7 millionths of this. If we had ten times as many people, the increase in world temperatures would still be undetectable. It’s not a problem.
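Barton's arithmetic can be checked directly with the Stefan-Boltzmann law; this sketch uses only the inputs stated in his comment.

```python
# Re-running Barton's back-of-envelope numbers with the
# Stefan-Boltzmann law, P = sigma * T^4 * area (emissivity ~ 1).

sigma = 5.6704e-8                 # W/m^2/K^4
skin_T, ambient_T = 308.0, 293.0  # K (35 C skin, 20 C surroundings)
area = 2.0                        # m^2 of adult skin

emitted = sigma * skin_T**4 * area       # radiated by the body, ~1021 W
absorbed = sigma * ambient_T**4 * area   # returned by surroundings, ~836 W
net = emitted - absorbed                 # ~185 W per naked adult

population = 6.7e9
per_person = 130.0                       # W, after clothing/children adjustment
total_body_heat = population * per_person      # ~8.7e11 W

climate_input = 237 * 5.101e14                 # ~1.2e17 W of absorbed sunlight
print(net, total_body_heat, total_body_heat / climate_input)
```

The last ratio comes out to a few millionths, which is the point of the comment: human body heat is utterly negligible against the solar energy the climate system processes.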
John Dodds says
Re 120, & 126 tamino — 7 June 2008 @ 12:12 PM
OK Tamino, please explain why my Ephemeris program shows that the earth's eccentricity has been continuously DECREASING for the last 300-plus years, while the TSI data from the IPCC show a net and fairly regular INCREASE in incoming solar energy.
Then how do we explain that your equation gives a divide-by-zero result when eccentricity = 1, for a parabolic orbit where the object never returns? A logical answer would be around 1/2, and even less when the orbit is a hyperbola just grazing the sun at ~1 AU.
Could your derivation have lost a minus sign in the integration, so that the real relationship is that energy is proportional to 1/(1+e^2) instead of 1/(1−e^2)? That is just as BPL said originally (i.e. increasing energy for a decreasing eccentricity).
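For reference, the standard Kepler-orbit result is that the time-averaged flux received from the sun scales as 1/sqrt(1 − e^2), which indeed blows up as e approaches 1. The sketch below (my own check, not tamino's derivation) verifies this numerically by averaging 1/r^2 over one orbit.

```python
# Numerical check of the standard result that annual-mean insolation
# scales as 1/sqrt(1 - e^2). We time-average the flux ~ 1/r^2 over a
# Kepler orbit using the eccentric anomaly E, with mean anomaly
# M = E - e*sin(E) (so dM/dE = 1 - e*cos(E)) and r/a = 1 - e*cos(E).

import numpy as np

def mean_inverse_r2(e, n=200000):
    E = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
    r = 1.0 - e * np.cos(E)        # r in units of the semi-major axis
    dM_dE = 1.0 - e * np.cos(E)    # time runs uniformly with M
    # time average of 1/r^2 = (1/2pi) * integral of (1/r^2) dM
    return np.mean(dM_dE / r**2)

for e in [0.0, 0.0167, 0.2, 0.5]:
    numeric = mean_inverse_r2(e)
    analytic = 1.0 / np.sqrt(1.0 - e**2)
    print(e, numeric, analytic)
```

On this standard result, a decreasing eccentricity gives a slightly decreasing annual-mean insolation, and the divergence at e = 1 comes from the orbit average itself, not from a lost minus sign.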
John D M says
Barton, sorry to push this point, but in my initial question I was referring to ALL the heat produced by human activity including the burning of fossil fuels and all other materials whether it is in the form of fuel or food.
We know the burning or consuming of such materials does give off greenhouse gases, but the materials are primarily burnt to produce heat energy. It was this total heat energy liberated that I was interested in having quantified. Obviously the heat given off by all humans and other living things would be a part of the total, small as it may be.
Richard Pauli says
Science Daily today asks:
Has Global Warming Research Misinterpreted Cloud Behavior?
And then tells us of recent research by Dr. Roy W. Spencer
“…To the extent that the cloud changes actually cause temperature change, this can ultimately lead to overestimates of how sensitive Earth’s climate is to our greenhouse gas emissions.
This seemingly simple mix-up between cause and effect is the basis of a new paper that will appear in the “Journal of Climate.” The paper’s lead author, Dr. Roy W. Spencer, a principal research scientist at The University of Alabama in Huntsville, believes the work is the first step in demonstrating why climate models produce too much global warming.”
http://www.sciencedaily.com/releases/2008/06/080611184722.htm
I wrote their editor at: editor@sciencedaily.com
And pointed out that Dr Roy Spencer has a reputation that might explain his confusion of cause and effect
http://www.exxonsecrets.org/html/personfactsheet.php?id=19
Who is the editor of the Journal of Climate?
And I found this interesting book review:
The Manufacture of Uncertainty
How American industries have purchased “scientists” to undermine scientific verities when those verities threaten their profits.
Doubt is Their Product: How Industry’s Assault on Science Threatens Your Health by David Michaels (Oxford University Press, 359 pages, $27.95)
http://www.prospect.org//cs/articles;jsessionid=anjGFkx1cf95IxGn6P?article=the_manufacture_of_uncertainty
http://www.rburton.com/work1.htm
I have not yet seen Dr Spencer’s paper or examined his science, so I will remain skeptical; unless an exciting discussion suggests this to be a breakthrough, it shall drop off my reading list. Again, here is a case where some sort of pre-publication discussion might have been warranted.
Richard Pauli says
I meant to paste the following book review:
http://www.prospect.org//cs/articles;jsessionid=anjGFkx1cf95IxGn6P?article=the_manufacture_of_uncertainty
The Manufacture of Uncertainty
How American industries have purchased “scientists” to undermine scientific verities when those verities threaten their profits.
Chris Mooney | March 28, 2008
Doubt is Their Product: How Industry’s Assault on Science Threatens Your Health by David Michaels (Oxford University Press, 359 pages, $27.95)
The sabotage of science is now a routine part of American politics. The same corporate strategy of bombarding the courts and regulatory agencies with a barrage of dubious scientific information has been tried on innumerable occasions — and it has nearly always worked, at least for a time. Tobacco. Asbestos. Lead. Vinyl chloride. Chromium. Formaldehyde. Arsenic. Atrazine. Benzene. Beryllium. Mercury. Vioxx. And on and on. In battles over regulating these and many other dangerous substances, money has bought science, and then science — or, more precisely, artificially exaggerated uncertainty about scientific findings — has greatly delayed action to protect public and worker safety. And in many cases, people have died.
Tobacco companies perfected the ruse, which was later copycatted by other polluting or health-endangering industries. One tobacco executive was even dumb enough to write it down in 1969. “Doubt is our product,” reads the infamous memo, “since it is the best means of competing with the ‘body of fact’ that exists in the minds of the general public. It is also the means of establishing a controversy.”
In his important new book, David Michaels calls the strategy “manufacturing uncertainty.” A former Clinton administration Energy Department official and now associate chair of the Department of Environmental and Occupational Health at George Washington University, Michaels is a comprehensive and thorough chronicler — indeed, almost too thorough a chronicler, at times overwhelming the reader with information.
But there’s a lot to be learned here. Even most of us who have gone swimming in the litigation-generated stew of tobacco documents (you can never get the stink off of you again) don’t have a clue about the extent of the abuses. For the war on science described in Doubt is Their Product is so sweeping and fundamental as to make you question why we ever had the Enlightenment. There aren’t just a few scientists for hire — there are law firms, public-relations firms, think tanks, and entire product-defense companies that specialize in rejiggering epidemiological studies to make findings of endangerment to human health disappear.
For Michaels, these companies are the scientific equivalent of Arthur Andersen. He calls their work “mercenary” science, drawing an implicit analogy with private military firms like Blackwater. If the companies can get the raw data, so much the better, and if they can’t, they’ll find another way to make findings of statistically significant risk go away. Just throw out the animal studies or tinker with the subject groups. Perform a new meta-analysis. Conduct a selective literature review. Think up some potentially confounding variable. And so forth.
They can always get it published somewhere. And if they can’t, they can just start their own peer-reviewed journal, one likely to have an exceedingly low scientific impact but a potentially profound effect on the regulatory process.
All of science is subject to such exploitation because all of science is fundamentally characterized by uncertainty. No study is perfect; each one is subject to criticism both illegitimate and legitimate — and so if you wish, you can make any scientific stance, even the most strongly established, appear weak and dubious. All you have to do is selectively highlight uncertainty, selectively attack the existing studies one by one, and ignore the weight of the evidence. Although Michaels focuses largely on the attempts to whitewash the risks that various chemicals pose to the workplace and public health, the same methods are also used to attack the scientific understanding of evolution and global warming.
And it happens virtually every time the government even dreams of regulating a substance. People know what’s going on, but they respond as if they’re simply shocked, shocked, to find science being tortured. And so the outgunned federal agencies that must consult science to take action — the Occupational Safety and Health Administration, Environmental Protection Agency, and Food and Drug Administration, among others — repeatedly capitulate to corporations that effectively purchase science on demand.
We used to have a regulatory system — that was the dream, anyway, of the 1960s and 1970s. But in significant part due to the manufacturing-uncertainty strategy, we now have the bureaucratic equivalent of clotted arteries. And mercenary science hasn’t just blinded federal agencies. It has also blinded the courts, where the same tactics apply. Indeed, recent changes to the role of science in the federal regulatory system and the courts have worsened the situation by making corporate sabotage of scientific research easier than ever.
The 1998 Data Access Act (or “Shelby Amendment”) and the 2001 Data Quality Act, both originally a glint in Big Tobacco’s eye, enable companies to get the data behind publicly funded studies and help them challenge research that might serve as the basis for regulatory action. Meanwhile, the 1993 Supreme Court decision in the little-known Daubert v. Merrell Dow Pharmaceuticals case further facilitates the strategy, unwisely empowering trial court judges to determine what is and what isn’t good science in civil cases. Under Daubert, judges have repeatedly spiked legitimate expert witnesses who were otherwise set to testify about the dangers demonstrated by epidemiological research. Often juries don’t even hear the science any more because the defense can get it thrown out pre-trial.
It’s all about questioning the science to gum up the works. The companies pose as if they are defending open debate and inquiry and are trying to make scientific data available to everyone. In reality, once they get the raw data, they spend the vast resources at their disposal to discredit independent research.
Michaels ends by proposing a series of reforms. He suggests giving citizens more access to the courts (since the regulatory agencies are broken), requiring full disclosure of all conflicts of interest in science submitted to the regulatory process (and discounting conflicted studies), getting rid of rigged reanalysis by promulgating scientific standards that forbid it, and returning to the practice of using the best available evidence to protect public health, rather than waiting for a degree of unassailable certainty that will never arrive.
With his extensive chronicling of just how many times the manufacturing-uncertainty strategy has been used to make our world more dangerous, Michaels has performed a great service. Moreover, because he’s a scientist himself and has seen these abuses up close in government, he can go much further than muckraking journalists who have often sought to expose this kind of malfeasance. (Full disclosure: Michaels cites my own book The Republican War on Science and mentions me in his acknowledgments.) I support Michaels’ regulatory solutions — his “Sarbanes-Oxley for Science” proposal, as he calls it — and would like to see them enacted into law or put into effect by administrative action. But if there’s a problem with Doubt is Their Product, it’s that Michaels is, in a way, too much of a scientist. Let me explain.
Michaels chronicles a long litany of outrageous abuses, nothing less than the undermining of reason itself from within. Yet despite just how vulnerable the book shows science to be, Michaels continues to have faith that the solution lies in science. No matter how many times we have seen the facts lose, he still writes as if he thinks the facts alone will win.
So Michaels slices and dices all the misinformation, as he’s ideally equipped to do. Anyone who grasps the nature of science well enough to follow him will not only be convinced but also deeply angered by what’s happening. But other readers will just feel dizzied by the complex analyses, confused and ready prey for the science sharks whom Michaels has worked so hard to expose. The manufacturing-uncertainty strategy works because it buries you in the facts, loses you in the woods of science. Sometimes, arguing back within that arena only makes it worse.
And so, while eminently rational critiques of the abuse of science have their place — and Michaels’ is excellent — I worry that the defenders of science sometimes delude themselves into thinking rational criticism is enough. It isn’t, however, because scientifically grounded argument will only persuade those inclined to defend science in the first place. In order to be protected from the kind of assault it now faces, science must do more than convince its own. Science needs the allied power of outrage, political will, and a fundamental commitment to fighting back that, as of now, simply doesn’t exist. So enough of being shocked, shocked. It’s time for the merry, rampaging science-abusers themselves to be shocked as the sleeping giant of American science awakens and finally decides it isn’t going to take it anymore.
———————-
Chris Mooney is a Prospect senior correspondent and a freelance writer living in Washington, D.C. He focuses on issues at the intersection of science and politics, and has been praised as a “revolutionary mind” by Seed Magazine, which recently commended his “trenchant brand of science-centered commentary.” His most recent articles include a Columbia Journalism Review feature story about the problem with “balance” in science coverage and a Boston Globe commentary.
tamino says
Re: #133 (John Dodds)
I’ve answered that question on my blog.
Hank Roberts says
JohnDM,
Google has developed a very good natural language search engine. Type your question in plain English as a sentence into the Search box, ending with a question mark, to see it work.
Using your question, I’ll do it for you below. In this example, the first hit answers your question.
http://www.google.com/search?q=how+much+heat+is+produced+by+all+human+activity%3F
“Is waste heat produced by human activities important for the climate? No. In any given period of time, the sun provides almost 10000 times as much energy to ….”
http://www.mpimet.mpg.de/en/presse/faq-s/ist-die-abwaerme-der-menschen-wichtig-fuer-das-klima.html
David B. Benson says
Richard Pauli (135) —
“But we really won’t know until much more work is done,” Spencer said.
http://www.physorg.com/news132251958.html
Richard Pauli says
#139
Oh… I think we’ll know very well in about 5 or 10 years. Pain is a great public teacher – leave aside your science for a moment – and when we feel heat waves, drought, floods, increased storms, climate refugees, sea level rise and other commonly noticed effects, it will be very clear what is happening. (It even sounds a bit like today, to those who want to see it.)
Right now we have more than enough information to make crucial industrial policy decisions.
I read your denialist link as trying to support continued unrestrained CO2 output. I have not seen how that will work; it is a dangerous policy. You can do all the science you want, but calling for continued CO2 business-as-usual is civil suicide, science treachery and delusional public policy.
If denialists believed in a flat earth, I could regard this as charmingly eccentric – unless they demanded we change navigation principles. Or if some folks believe the lunar landing was a hoax, what do I care – unless it restricts real space exploration? But advocating scientific suppression through confusion and the clouding of conclusions about a dangerous future, I cannot accept.
Barton Paul Levenson says
John Dodds writes:
Because the sun is getting brighter.
Richard Pauli says
New posting Mark Lynas 6-13-08
http://www.guardian.co.uk/commentisfree/2008/jun/12/climatechange.scienceofclimatechange
marklynas.org
Hank Roberts says
http://www.aps.org/units/fps/newsletters/200804/marsh.cfm
Climate Stability and Policy
By Gerald E. Marsh
“…In this essay, however, I will argue that humanity faces a much greater danger from the glaciation associated with the next Ice Age,
“Will Solar Cycle 25, mentioned earlier and predicted by NASA to be comparable to the Dalton Minimum, be the trigger for a new Ice Age?”
Oy.
David B. Benson says
Richard Pauli (140) — PhysOrg is hardly a ‘denialist’ web site.
Hank Roberts says
139, 140, 144
Richard, you quoted Spencer without a cite; then David gave you the cite for the Spencer article; then you replied calling it a denialist publication. All he did was fill in the information you didn’t give for the source of the quotation. Relax, notice that citing sources is not enemy action.
David B. Benson says
Richard Pauli (140) — Yesterday I attempted to post a longer comment regarding Roy Spencer’s work and why it is actually most unlikely to be of much significance. Somehow, the comment went into the bit bucket somewhere, so I just posted comment #144, being then in a rush.
Basically, previous interglacials have been warmer, much warmer, than the Holocene. So whatever Spencer may or may not have found, the climate system did not prevent those warmer interglacials, under the influence of only the weak orbital forcing.
I suppose we both would rather that Roy Spencer just stick to his science, hmmm?
[Response: Hi, David. I don’t know what happened to your comment. We have had continuing problems with comments posted via the popup box just disappearing. If your comment disappeared, I hope you’ll try again. Sorry about that –raypierre]
David B. Benson says
Raypierre — I didn’t use the popup box. It could be that somehow I used a word containing, as a substring, the name of a popular drug. Dunno. The other possibility is the growing unreliability of the internet as traffic continues to increase. Dunno.
Anyway, I didn’t save it but the essence is in comment #148.
Thanks for your response.
Alastair McDonald says
Re #146
I have lost posts when I do not wait until the new post is displayed as awaiting clearance or whatever.
I agree with your point that the climate can get warmer, so any negative feedback that Roy Spencer proposes is not going to keep us cool.
However, he is correct in two respects. The troposphere is not warming to the extent predicted by the climate models. The radiosondes prove that. And, the clouds ultimately act as a negative feedback, because they cut off the source of heat from the sun.
Just because his belief that God will stop us destroying the planet through global warming is ridiculous does not mean his MSU readings are wrong :-(
Cheers, Alastair.
[Response: Actually, I think you’ve jumped to conclusions both on clouds and the tropospheric warming. The IPCC has a very balanced assessment of this. The tropical mid-trop warming signal is still modest and there are formidable data problems. There is some hint of a mismatch, but it is too soon to conclude whether it’s a model problem or a problem in the data. In more cases than not in the past, mismatches of this sort have proved to be a data problem and the models were right. That was the case for the MSU lower trop data, and that was the case for the CLIMAP ice age tropical surface data. On cloud feedbacks, you are wrong to say that clouds are ultimately a stabilizing feedback, because high clouds can in principle have a sufficiently strong greenhouse effect to overwhelm the albedo effect. I have an example of how bad this could get in principle in my Kavli Institute cloud lecture, which you can find online by just googling “Kavli Pierrehumbert Clouds”. –raypierre]
Alastair McDonald says
Raypierre,
Your Cloud Thermostats and Anti-Thermostats PPT is very interesting. But what is a “Faux thermostat”? Is it a blind which closes when the incoming heat gets too great? If so, then presumably the cloud fraction problem is knowing how much the faux cloud has to close for an arbitrary planet.
Since cloud only forms in cooling air, and since all air that rises must descend, to a very rough approximation cloud cover will equal the 50% of the earth’s surface where the air rises. Of course there are many places in the atmosphere where air is neither rising nor falling. Clouds can also exist to the lee of mountains, where the air has been cooled by having to rise while passing over the mountain tops.
This may have been the reason for the Ordovician glaciation: the orogenies at that time caused the total cloud cover to exceed 50%, thereby cooling the planet.
Returning to my point about cloud increasing with temperature, I do not mean that clouds will increase monotonically with temperature. What I am arguing is that temperatures will rise until enough cloud forms to cut off the supply of solar radiation to the surface. On Venus that required a surface temperature high enough to melt sulphur. On Earth it only requires a surface temperature high enough to melt water.
But since the earth’s clouds are limited at 50%, then when the Arctic sea ice goes the planetary albedo will decrease and temperatures will rise. They may do so until the Earth’s climate switches into a Venus mode with total cloud cover rather than the 50% at present. The greenhouse effect of 100% cloud cover will raise surface temperature even higher, but to a stable level. This sounds similar to your description of the Eocene.
That would explain why tropical temperatures fell, and daily formation of clouds in the tropics would explain why the mid-trop warming signal is still modest. There was no need to find reasons to modify the MSU and CLIMAP data!
Cheers, Alastair.
Barton Paul Levenson says
Alastair —
Kiehl & Trenberth 1997 estimate Earth mean annual cloud cover at 62%. I have a table of other estimates somewhere if you’re interested. I think Hart (1978) used 47% and Houghton (1977) used 50%.