It’s worth going back every so often to see how projections made back in the day are shaping up. As we get to the end of another year, we can update all of the graphs of annual means with another single datapoint. Statistically this isn’t hugely important, but people seem interested, so why not?
For example, here is an update of the graph showing the annual mean anomalies from the IPCC AR4 models plotted against the surface temperature records from the HadCRUT3v and GISTEMP products (it really doesn’t matter which). Everything has been baselined to 1980-1999 (as in the 2007 IPCC report) and the envelope in grey encloses 95% of the model runs. The 2009 number is the Jan-Nov average.
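Baselining here just means subtracting each series’ mean over the common 1980-1999 reference period before plotting. A minimal sketch of the operation (with invented numbers, not the actual HadCRUT3v or GISTEMP data):

```python
import numpy as np

def baseline_anomalies(years, temps, base_start=1980, base_end=1999):
    """Express a temperature series as anomalies relative to its own
    mean over a chosen baseline period (1980-1999, as in AR4)."""
    years = np.asarray(years)
    temps = np.asarray(temps, dtype=float)
    mask = (years >= base_start) & (years <= base_end)
    return temps - temps[mask].mean()

# Made-up illustrative series -- not real observations or model output
years = np.arange(1970, 2010)
temps = 14.0 + 0.018 * (years - 1970)
anoms = baseline_anomalies(years, temps)
```

By construction, the anomalies average to zero over the baseline window, which is what puts differently-calibrated products on the same axes.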
As you can see, now that we have come out of the recent La Niña-induced slump, temperatures are back in the middle of the model estimates. If the current El Niño event continues into the spring, we can expect 2010 to be warmer still. But note, as always, that short term (15 years or less) trends are not usefully predictable as a function of the forcings. It’s worth pointing out as well, that the AR4 model simulations are an ‘ensemble of opportunity’ and vary substantially among themselves with the forcings imposed, the magnitude of the internal variability and of course, the sensitivity. Thus while they do span a large range of possible situations, the average of these simulations is not ‘truth’.
There is a claim doing the rounds that ‘no model’ can explain the recent variations in global mean temperature (George Will made the claim last month for instance). Of course, taken absolutely literally this must be true. No climate model simulation can match the exact timing of the internal variability in the climate years later. But something more is being implied, specifically, that no model produced any realisation of the internal variability that gave short term trends similar to what we’ve seen. And that is simply not true.
We can break this down a little more clearly. The trend in the annual mean HadCRUT3v data from 1998-2009 (assuming the year-to-date is a good estimate of the eventual value) is 0.06+/-0.14 ºC/dec (note that this is positive!). If you want a negative (albeit non-significant) trend, you could pick 2002-2009 in the GISTEMP record, which is -0.04+/-0.23 ºC/dec. The ranges of trends in the model simulations for these two periods are [-0.08,0.51] and [-0.14,0.55], and in each case there are multiple model runs with a lower trend than observed (5 simulations in both cases). Thus ‘a model’ did show a trend consistent with the current ‘pause’. However, that these particular models showed it is just coincidence, and one shouldn’t assume that they are better than the others. Had the real-world ‘pause’ happened at another time, different models would have had the closest match.
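For anyone wanting to reproduce this kind of calculation, an ordinary least-squares trend with a ~95% interval can be computed as below. This is only a sketch: it uses the 1.96σ normal approximation (a t-value would be stricter for a dozen points), makes no auto-correlation correction, and runs on synthetic data rather than the actual records:

```python
import numpy as np

def ols_trend(years, temps):
    """Ordinary least-squares trend in degC/decade with an approximate
    95% interval (1.96 sigma; no auto-correlation correction, which
    would widen the interval)."""
    x = np.asarray(years, dtype=float)
    y = np.asarray(temps, dtype=float)
    n = len(x)
    xc = x - x.mean()
    slope = (xc @ (y - y.mean())) / (xc @ xc)
    resid = y - y.mean() - slope * xc
    se = np.sqrt((resid @ resid) / ((n - 2) * (xc @ xc)))
    return 10.0 * slope, 10.0 * 1.96 * se  # per-decade units

# Synthetic example: a noise-free 0.2 degC/decade warming, 1998-2009
yrs = np.arange(1998, 2010)
trend, ci = ols_trend(yrs, 0.02 * (yrs - 1998))
```

On noisy 8-12 year windows like those above, the interval dwarfs the trend itself, which is the point about short periods being uninformative.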
Another figure worth updating is the comparison of the ocean heat content (OHC) changes in the models compared to the latest data from NODC. Unfortunately, I don’t have the post-2003 model output handy, but the comparison between the 3-monthly data (to the end of Sep) and annual data versus the model output is still useful.
Update (May 2012): The graph has been corrected for a scaling error in the model output. Unfortunately, I don’t have a copy of the observational data exactly as it was at the time the original figure was made, and so the corrected version uses only the annual data from a slightly earlier point. The original figure is still available here.
(Note that I’m not quite sure how this comparison should be baselined: the models are simply the difference from the control, while the observations are ‘as is’ from NOAA.) I have linearly extended the ensemble-mean model values for the post-2003 period (using a regression over 1993-2002) to get a rough sense of where those runs could have gone.
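The linear extension is nothing fancier than fitting a straight line to the 1993-2002 ensemble mean and evaluating it beyond 2003. A sketch (the OHC numbers here are invented purely for illustration):

```python
import numpy as np

def extend_linearly(years, values, fit_start, fit_end, out_years):
    """Fit a line over [fit_start, fit_end] and evaluate it at
    out_years -- the rough post-2003 extension described in the text."""
    yrs = np.asarray(years, dtype=float)
    vals = np.asarray(values, dtype=float)
    m = (yrs >= fit_start) & (yrs <= fit_end)
    slope, intercept = np.polyfit(yrs[m], vals[m], 1)
    return slope * np.asarray(out_years, dtype=float) + intercept

# Invented ensemble-mean OHC anomalies (10^22 J), not model output
yrs = np.arange(1993, 2003)
ohc = 0.6 * (yrs - 1993)
extended = extend_linearly(yrs, ohc, 1993, 2002, np.arange(2003, 2009))
```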
And finally, let’s revisit the oldest GCM projection of all, Hansen et al (1988). The Scenario B in that paper is running a little high compared with the actual forcings growth (by about 10%), and the old GISS model had a climate sensitivity that was a little higher (4.2ºC for a doubling of CO2) than the current best estimate (~3ºC).
The trends are probably most useful to think about, and for the period 1984 to 2009 (the 1984 date chosen because that is when these projections started), scenario B has a trend of 0.26+/-0.05 ºC/dec (95% uncertainties, no correction for auto-correlation). For the GISTEMP and HadCRUT3 data (assuming that the 2009 estimate is ok), the trends are 0.19+/-0.05 ºC/dec (note that the GISTEMP met-station index has 0.21+/-0.06 ºC/dec). Corrections for auto-correlation would make the uncertainties larger, but as it stands, the difference between the trends is just about significant.
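The auto-correlation correction alluded to is usually handled via an effective sample size. A sketch of the standard AR(1) adjustment (the exact inflation factor depends on the method, so treat this as indicative only):

```python
import numpy as np

def ar1_inflation(resid):
    """Lag-1 auto-correlation adjustment: with effective sample size
    n_eff = n (1 - r1) / (1 + r1), the trend's standard error grows
    by roughly sqrt(n / n_eff) = sqrt((1 + r1) / (1 - r1))."""
    r = np.asarray(resid, dtype=float)
    r = r - r.mean()
    r1 = (r[:-1] @ r[1:]) / (r @ r)  # lag-1 autocorrelation estimate
    return np.sqrt((1.0 + r1) / (1.0 - r1))

# White-noise residuals: essentially no inflation (factor close to 1)
rng = np.random.default_rng(42)
factor = ar1_inflation(rng.standard_normal(2000))
```

With positively auto-correlated annual residuals, the factor exceeds 1 and the quoted +/- ranges widen accordingly.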
Thus, it seems that the Hansen et al ‘B’ projection is likely running a little warm compared to the real world, but assuming (a little recklessly) that the 26 yr trend scales linearly with the sensitivity and the forcing, we could use this mismatch to estimate a sensitivity for the real world. That would give us 4.2/(0.26*0.9) * 0.19=~ 3.4 ºC. Of course, the error bars are quite large (I estimate about +/-1ºC due to uncertainty in the true underlying trends and the true forcings), but it’s interesting to note that the best estimate sensitivity deduced from this projection, is very close to what we think in any case. For reference, the trends in the AR4 models for the same period have a range 0.21+/-0.16 ºC/dec (95%). Note too, that the Hansen et al projection had very clear skill compared to a null hypothesis of no further warming.
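The arithmetic of that estimate, spelled out (the numbers are the ones quoted above, and the linear-scaling assumption is the ‘reckless’ one just flagged):

```python
# Back-of-envelope rescaling of the old model's sensitivity by the
# ratio of observed to simulated trend, after deflating Scenario B's
# trend by the ~10% forcings overshoot. All inputs quoted in the text.
model_sensitivity = 4.2    # degC per CO2 doubling, 1980s GISS model
model_trend = 0.26         # degC/dec, Scenario B, 1984-2009
forcing_factor = 0.9       # Scenario B forcings ran ~10% high
obs_trend = 0.19           # degC/dec, GISTEMP / HadCRUT3

implied = model_sensitivity / (model_trend * forcing_factor) * obs_trend
# comes out at roughly 3.4 degC per doubling
```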
The sharp-eyed among you might notice a couple of differences between the variance in the AR4 models in the first graph and the Hansen et al model in the last. This is a real feature. The model used in the mid-1980s had a very simple representation of the ocean – it simply allowed the temperatures in the mixed layer to change based on the changing fluxes at the surface. It did not contain any dynamic ocean variability – no El Niño events, no Atlantic multidecadal variability, etc. – and thus the variance from year to year was less than one would expect. Models today have dynamic ocean components and more ocean variability of various sorts, and I think that is clearly closer to reality than the 1980s-vintage models, but the large variation in simulated variability still implies that there is some way to go.
So to conclude: despite the fact that these are relatively crude metrics against which to judge the models, and despite a substantial degree of unforced variability, the matches to observations are still pretty good, and we are getting to the point where a better winnowing of models according to their skill may soon be possible. But more on that in the New Year.
John P. Reisman (OSS Foundation) says
497 TimTheToolMan
If you want to do a ‘simple’ experiment on ocean heating: in the winter, pour two glasses of cold water from your tap, then put one glass outside your house and one inside. Assuming you have your heater on, you will notice that the glass outside will get cooler, or turn to ice, depending on the temperature, while the glass inside will warm up to room temperature.
Oceans are more complex and this is an oversimplified explanation, but heat does move in and out of water bodies.
Matthew says
500, JasonB: I think it is worth noting that with an energy payback time of 1-4 years and consequently an Energy Returned on Energy Invested ratio of between 10 and 30, it does reduce the carbon footprint of California even while shifting it to China. It doesn’t really matter if the PV cells are manufactured using coal-fired electricity — it would still be a win to make them as quickly as possible (and therefore increase their contribution to the energy mix as rapidly as possible) rather than wait until there was sufficient “PV power” to manufacture them.
I think that you are overoptimistic about the payback time, or else the big subsidies would not be necessary. I could be wrong. The economics are changing rapidly, and if you are not right now, you will be in five years’ time or less.
It is true that China is investing in alternative/renewable energy, but if the question is when China will reduce its CO2 emissions, then the fact that they continue to increase CO2 emissions matters. Like the US, they see alternative/renewable energy partly in military terms, and they are vulnerable right now to having their fossil fuel supplies interrupted in time of war.
For both China and the US, it could turn out to be that CC&S is a more cost-effective intermediate-term strategy than solar. China and the US are building large scale installations at an unprecedented rate.
Ray Ladbury says
Matthew says, “It is true that China is investing in alternative/renewable energy…”
No, they are kicking our collective butts in alternative/renewable energy. They are doing the smart thing and developing production capability in advance of the huge increase in demand that will come as the rest of the world begins to realize just how deep we are in the soup. And the US… well, we’re still arguing over 50-year-old science.
Geoff Wexler says
Ray.
Thanks for your comment. My modified version of TRY’s thought experiment was too sloppy to take very seriously or to take any further. For example, an unintended consequence would have been that the part of the black-body spectrum emitted from the surface which falls in the atmospheric window (no absorption lines) would have to be moved into an absorbing region of the spectrum. This would lead to additional energy being supplied to the gases, and that would show up as an upward shift of the occupied gas levels, as you say.
The non-trivial point that remains is that the terms in the downward IR radiation which describe the emission after thermalisation should not depend on the nature of the exciting radiation, or on whether the original excitation was of CO2, say, or CH4. Thus the detailed flow path for the energy might be of little interest. Doesn’t that take care of most of the greenhouse emission? As an example, you might look at the spectrum and identify part of it, very approximately, as having come from methane at about 200K (invented value).
Rod B says
re 497: “[Response: Each term is calculated as a function of the temperatures, composition, cloud cover, etc at each time step in the model. Downward LW increases with higher CO2 and water vapour, but all of the terms vary with climate change and the net change is positive down. – gavin]”
A clarification question from way back basics: Is the downward LW radiation all and only from CO2 and H2O and other GHG molecules? Or is the LW downwelling for the atmosphere as a whole but triggered by CO2 and H2O?
[Response: From everything – H2O, clouds, CO2, aerosols, ozone etc. – gavin]
Steve Fish says
Comment by Matthew — 2 January 2010 @ 11:31 AM:
Embodied energy and cost estimates for PV are based on current electricity rates, but increasing costs of fossil fuels due to future scarcity and competition from emerging nations could reduce these dramatically. Depending upon one’s circumstances, PV can be a pretty good bet. I think the biggest problem in the U.S. has to do with the current attitude against buying quality up front in order to save in the long term: “I want it now” as opposed to delayed gratification. When carbon capture and sequestration from fossil power plants becomes available, PV will become an overwhelming bargain.
Steve
Matthew says
503, Ray Ladbury: They are doing the smart thing and developing production capability in advance of the huge increase in demand as the rest of the world begins to realize just how deep we are in the soup. And the US…well, we’re still arguing over 50-year old science.
I share your respect for China, and I think that we should, as they do, subsidize PV cell manufacture instead of PV cell purchase, and we should speed up construction of new nuclear power. As to payback time on investment, the payback time for nuclear power is much shorter than the payback time for solar.
But as long as we are discussing policies with respect to AGW, it should be noted that China is continuing to increase its CO2 production. According to their minister at the COP15 meeting, they will continue to do so for decades.
Mark C says
I have two questions:
Where do you get this HadCRUT3 data for 2009 from?
You state that the 2009 number is the Jan-Nov average. Are all the previous points on the HadCRUT3 and GISTEMP curves Jan-Nov averages as well? I don’t see how a Jan-Nov average can be representative of the year when it excludes one of the coldest months of the year.
[Response: It’s the anomaly value, it has nothing to do with whether Dec is cold in absolute terms or not. The data came from here, and when the Dec numbers get posted, I’ll update the figure. – gavin]
Brian Dodge says
“The oceans are potentially free to both cool and warm as they like and take the global temperatures along with them.”
I actually laughed out loud when I read this; It’s clear that TimTheToolMan — 1 January 2010 @ 5:54 PM either doesn’t understand cause and effect, or is choosing to ignore it for political/social/religious/trolling reasons. No doubt some people believe that the ocean can choose to cool or warm, just like the electorate can choose to vote Republican or Democratic*, but it’s probably not wise to let them control policy debates, or handle sharp instruments.
“TimtheTool, If you’ve ever been diving, you know that visible light penetrates the water to at least about 10 meters, right? You know that as you go deeper, the light is extinguished, indicating that it is being absorbed, right?
You know the skin surface effect occurs and about what magnitude it is, right? We know that decreased gradients decrease heat loss, right?
And we certainly know the ocean is warming, right?”
Ray, I wouldn’t count on Tim understanding this, or admitting it if he does; his head might explode.
[*] Of course, if the ocean did have an irrational mind of its own, like Wall Street investors, it would mean that climatology is as hamstrung as economic forecasting; maybe that’s the new denialist meme?
TimTheToolMan says
[Response: Each term is calculated as a function of the temperatures, composition, cloud cover, etc at each time step in the model. Downward LW increases with higher CO2 and water vapour, but all of the terms vary with climate change and the net change is positive down. – gavin]
Of course, but I am interested in the one term attributed to downward LW radiation and its assumed warming effect on the ocean, not so much interested in the calculation of its magnitude by clouds/water vapour/CO2 and so on at each step, which I’m sure science has quite a good handle on…
When the downward LW radiation is actually applied to the ocean to warm it, what is the mechanism used?
[Response: I still don’t know what you are after. The net heat flux is applied at the surface of the ocean and then the mixing in the ocean mixed layer determines how that is distributed. – gavin]
Rod B says
Gavin (505): [Response: From everything – H2O, clouds, CO2, aerosols, ozone etc. – gavin]
Thanks. Any from N2, Argon, or O2?
[Response: No. They aren’t LW emitters. – gavin]
Hank Roberts says
Tim, are you wondering how heat can go from the sky to the ocean?
How infrared photons heat water?
Can you ask a simple clear question?
Make a clear statement of what you believe is true?
Give us a pointer to whatever you’re reading?
What do you trust as a source, that leads you to ask what you’re asking?
It sounds like you’ve read something somewhere that uses words in a way you think everyone understands — but it appears your source may be using the words differently.
TimTheToolMan says
[Response: I still don’t know what you are after. The net heat flux is applied at the surface of the ocean and then the mixing in the ocean mixed layer determines how that is distributed. – gavin]
So are you saying that the LW radiation heats the “surface” and this is mixed into the upper layers?
That is not the mechanism described in “Why greenhouse gases heat the ocean” for LW radiation to heat the oceans, though, is it? Indeed, there is a recognition there that mixing from the surface is NOT the mechanism used to heat the oceans.
Can you explain how the science supports this change in mechanism and how the magnitude of its effect is justified?
JasonB says
502, Matthew:
“I think that you are overoptimistic on the payback time, or else the big subsidies would not be necessary. I could be wrong. The economics are rapidly changing, and if you are not right now, you will be in five years time or less.”
I said energy payback time, not investment payback time. Investment payback time is still in the order of 10 years or so, but that is irrelevant to your suggestion that California’s PV investments simply amount to shipping emissions to China; the fact that it takes much less energy to make the PVs than they will generate over their lifetime means that the emissions that are “shipped” to China are far less, even if coal-fired power is used to make them.
“Like the US, they see alternative/renewable energy partly in military terms, and they are vulnerable right now to having their fossil fuel supplies interrupted in time of war.”
That’s part of it, although their response to that vulnerability is primarily to diversify their portfolio rather than try to guarantee fossil fuel supplies. (In addition to coal, nuclear, natural gas, and wind, China also has 50% of the world’s small hydro capacity and has the largest hydroelectric capacity in the world — 172 GW, about 20 times their current nuclear capacity.)
Another part, however, is that they recognise that there’s a lot of money to be made in selling those technologies. Witness how quickly they have become manufacturing powerhouses in the wind and solar PV arenas, neither of which are indigenous technologies.
“For both China and the US, it could turn out to be that CC&S is a more cost-effective intermediate-term strategy than solar. China and the US are building large scale installations at an unprecedented rate.”
It could turn out to be, but it doesn’t seem likely, especially for existing power plants whose locations weren’t chosen with the convenience of implementing CCS in mind.
Also, don’t confuse the cost-effectiveness of PV with concentrating solar thermal, and, in the latter case, don’t forget there is a wide range of technologies from Stirling engines in dishes through to towers, parabolic troughs, and compact Linear Fresnel Reflectors — the latter being especially cost-effective and compact although less developed than the more expensive parabolic troughs.
507, Matthew:
“I share your respect for China, and I think that we should, as they do, subsidize PV cell manufacture instead of PV cell purchase, and we should speed up construction of new nuclear power.”
I haven’t seen any reports suggesting that China subsidises PV cell manufacture.
“As to payback time on investment, the payback time for nuclear power is much shorter than the payback time for solar.”
Care to cite references? The best figures I could come up with showed nuclear power actually being pretty similar to solar thermal (not PV) right now, with solar having the advantage that you could add to it incrementally and so start earning revenue from the investment very quickly with no real risk that you would make an enormous up-front investment and have your chance to earn income from it delayed or lost entirely. (It also has the advantage, for countries like the US, of making them completely independent of fuel imports.) This is despite the fact that nuclear has had half a century of prior development and total subsidies of ~$150 billion and solar thermal is still in its infancy.
Wind power, of course, is currently cheaper than both, and, as China demonstrates, you can add a lot of wind capacity very quickly while waiting for nuclear powerplants to come online. They look like exceeding the targets they set for 2020 just three years ago by the end of this year. The rate of growth of wind in China is truly remarkable. Although there aren’t enough wind resources in total for wind to be the primary long-term solution, it’s a very useful stop-gap.
Doug Bostrom says
TimTheToolMan says: 2 January 2010 at 6:32 PM
TimTheToolMan, earlier with regard to increasing ocean heat content you said “…any AGW “effect” could be a small increment to an otherwise naturally varying climate and the case for continued increases at the observed rates diminishes considerably.”
Presumably you have no problem with the data describing ocean temperature since you’re prepared to hypothesize a mechanism to explain why the oceans are becoming warmer. I’m wondering from where you believe that energy may be arriving?
Hank Roberts says
This, perhaps?
http://www.whoi.edu/science/PO/people/jprice/ekman/price86.pdf
That’s field work done with a research vessel built to actually stand itself in a vertical position; it was (or is) called RV FLIP, and I don’t know if it’s still in service. This paper is from the research into how warm surface water mixes into deeper ocean water.
Found by finding one of the early papers on this kind of work, clicking for citing papers and going through them.
http://www.scopus.com/results/citedbyresults.url?sort=plf-f&cite=2-s2.0-0020098315&src=s&imp=t&sid=GZHbmRmUePTuf4VU_I_Rvoc%3a30&sot=cite&sdt=a&sl=0&origin=inward&txGid=GZHbmRmUePTuf4VU_I_Rvoc%3a2
I’m just guessing this may be on point, I haven’t a clue about what Tim is trying to ask about.
Tim, if you can, please ask a clear question. Tell us what “mechanism” means to you? Do you mean in a computer model? In actual measured behavior in the ocean? Some kind of machine? Something else?
Dave Salt says
Thanks, John P. Reisman (#311).
I much appreciate your efforts to describe examples of positive feedbacks. However, my question was about their dominance within the Earth’s climate system and the real-world evidence that would support this assertion.
Identifying some jig-saw pieces is a first step; showing they can fit together to form a coherent picture is another; demonstrating that this picture is a true reflection of the real-world is the final step of this scientific puzzle. Your information relates to the first step while my question relates to the final one.
Concerning the type of evidence I’m looking for, my earlier post (#246) included the one and only example that I’m aware of (i.e. the predicted “hot spot” above the equatorial troposphere) and my hope was that someone here could provide more. Obviously, absence of evidence is not evidence of absence, so I’ll try to keep an open mind and continue to ask the question in the hope that someone out there can better enlighten me.
Thanks, dhogaza (#315)
I’m not making assertions but simply drawing a conclusion, based upon what I read in the IPCC reports and some of the references provided in this thread (see attached ‘clippings’ concerning model uncertainties).
Thanks, CM (#349)
I appreciate the spelling correction and the didactic comments. I hope my reply to Ray Ladbury (below) answers most of your concerns/criticism.
Thanks, Ray Ladbury (#320)
I took some time out between family celebrations to look at the “MOUNTAINS of evidence” that you and others here have referred to and was astonished by what I read.
In summary, there’s not one example of a ‘prediction’, let alone one verified by real-world evidence. However, there’s an absolutely HUGE amount of candidness about the uncertainties within all of the models and, more importantly, that the biggest and most important of them is the uncertainty over the effect of clouds (see attached clippings).
I did come across one link listing model predictions that it claimed were verified by real-world observations (http://bartonpaullevenson.com/ModelsReliable.html). However, this was rather unconvincing because it contained no references or supporting evidence and included the tropospheric ‘hot spot’, which current data seems unable to verify.
You say that the “climate sensitivities are constrained by evidence”, but the references all show that these constraints are rather wide and, more importantly, include a significant range that would imply non-catastrophic AGW (i.e. less than 2C). Moreover, as the models used by these sensitivity assessments contain ‘structural uncertainties’, the confidence in these derived constraints may well be questionable. Are you telling me that these uncertainties have now been resolved and that the effects of clouds are now sufficiently well understood that we can confidently remove any reasonable doubt about model results?
I’m actually very grateful for the pointers that everyone here has provided, although they weren’t exactly what I was expecting. Nevertheless, they’ve helped to clarify many of my previous ideas about this subject and lead me to draw the following set of conclusions:
1) The low-level IPCC documents provide a candid and thorough summary of the current status of climate science. They are an honest attempt to tell a complex story where some of the most significant details are still poorly understood.
2) Much of the ‘basic’ physics could be classed as “settled science” (e.g. the IR absorption of CO2), but it would be naive in the extreme to describe the system-level interactions that the models try to simulate as “settled science”, because many of the basic mechanisms have yet to be fully accounted for and/or their interactions are still not sufficiently well understood (e.g. clouds, aerosols, vegetation).
3) The controversy that rages over this issue appears to derive from the way that the low-level details are ‘interpreted’ for policy makers. Worst case scenarios seem to be highlighted, while the associated uncertainties appear to lack the necessary emphasis.
I suspect these conclusions will be unacceptable by many here and so attach the following selected ‘clippings’ for those who may be interested. If you feel they have been taken out of context or that I’ve omitted details that show they are not that important, please let me know.
——————————————————-
http://ipcc-wg1.ucar.edu/wg1/Report/AR4WG1_Print_Ch08.pdf
8.6.2.3 What Explains the Current Spread in Models’ Climate Sensitivity Estimates?
Using feedback parameters from Figure 8.14, it can be estimated that in the presence of water vapour, lapse rate and surface albedo feedbacks, but in the absence of cloud feedbacks, current GCMs would predict a climate sensitivity (±1 standard deviation) of roughly 1.9°C ± 0.15°C (ignoring spread from radiative forcing differences). The mean and standard deviation of climate sensitivity estimates derived from current GCMs are larger (3.2°C ± 0.7°C) essentially because the GCMs all predict a positive cloud feedback (Figure 8.14) but strongly disagree on its magnitude.
8.6.3.2 Clouds
In the current climate, clouds exert a cooling effect on climate (the global mean CRF is negative). In response to global warming, the cooling effect of clouds on climate might be enhanced or weakened, thereby producing a radiative feedback to climate warming.
…
Therefore, cloud feedbacks remain the largest source of uncertainty in climate sensitivity estimates.
8.6.3.2.1 Understanding of the physical processes involved in cloud feedbacks
The sign of the climate change radiative feedback associated with the combined effects of dynamical and temperature changes on extratropical clouds is still unknown.
…
The role of polar cloud feedbacks in climate sensitivity has been emphasized by Holland and Bitz (2003) and Vavrus (2004). However, these feedbacks remain poorly understood.
8.6.3.2.4 Conclusion on cloud feedbacks
Despite some advances in the understanding of the physical processes that control the cloud response to climate change and in the evaluation of some components of cloud feedbacks in current models, it is not yet possible to assess which of the model estimates of cloud feedback is the most reliable.
8.6.4 How to Assess Our Relative Confidence in Feedbacks Simulated by Different Models?
A number of diagnostic tests have been proposed since the TAR (see Section 8.6.3), but few of them have been applied to a majority of the models currently in use. Moreover, it is not yet clear which tests are critical for constraining future projections. Consequently, a set of model metrics that might be used to narrow the range of plausible climate change feedbacks and climate sensitivity has yet to be developed.
——————————————————-
http://ipcc-wg1.ucar.edu/wg1/Report/AR4WG1_Print_Ch09.pdf
9.6.4 Summary of Observational Constraints for Climate Sensitivity
Structural uncertainties in the models, for example, in the representation of cloud feedback processes (Chapter 8) or the physics of ocean mixing, will affect results for climate sensitivity and are very difficult to quantify.
——————————————————-
http://www.climatescience.gov/Library/sap/sap3-1/final-report/sap3-1-final-all.pdf
4.1.1 Equilibrium Sensitivity and Transient Climate Response
We understand in much more detail why models differ in their equilibrium climate sensitivities: the source of much of this spread lies in differences in how clouds are modeled in AOGCMs.
4.2.1 Cloud Feedbacks
Differences in cloud feedbacks in climate models have been identified repeatedly as the leading source of spread in model-derived estimates of climate sensitivity (beginning with Cess et al. 1990). The fidelity of cloud feedbacks in climate models therefore is important to the reliability of their prediction of future climate change.
——————————————————-
http://www.iac.ethz.ch/people/knuttir/papers/knutti08natgeo.pdf
Structural problems in the models, for example in the representation of cloud feedback processes or the physics of ocean mixing, in particular in cases in which all models make similar simplifications, will also affect results for climate sensitivity and are very difficult to quantify.
——————————————————-
http://chriscolose.wordpress.com/2009/10/08/re-visiting-cff/
Clouds are the largest source of uncertainty in quantifying the extent of climate feedbacks.
…
clouds have competing effects between reflecting sunlight (low clouds mostly) and influencing the outgoing infrared radiation (high clouds mostly). It is still not clear how these two effects will balance out, and thus the magnitude and even the sign of the feedback is not well constrained.
——————————————————-
David B. Benson says
TimTheToolMan (513) — Mixing certainly occurs in the upper ocean. For example, the top 10 or so meters are rapidly mixed by wave action. In addition, some tiny marine animals swim up at night to eat and then down during the day (I suppose to avoid being eaten); there is a significant amount of energy added to the ocean and it seems likely this activity promotes mixing as well as the means Gavin Schmidt mentioned in a reply to you.
phil c says
A couple of papers with some interesting statistical analysis of some of the temperature datasets (with a reference for those of us who aren’t statisticians: http://web.duke.edu/~rnau/411home.htm). The most interesting observation is that the temperature datasets are nonstationary. That means the data can’t be used for forecasting without transforming it to achieve at least approximate stationarity. I haven’t ever seen a reference to that process in any of the climate forecasting/models. The most typical “forecast” is a linear regression over a period, which is the worst possible predictor of the future. To quote Niels Bohr (and Yogi Berra): “Making predictions is very difficult, especially about the future.”
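To illustrate what differencing does (on a toy random walk, not on any of the cited datasets):

```python
import numpy as np

# A random walk is the canonical nonstationary series: its variance
# grows with time. First-differencing it recovers the stationary
# white-noise increments, which is the transform meant above.
rng = np.random.default_rng(0)
steps = rng.standard_normal(5000)
walk = np.cumsum(steps)       # nonstationary level series
increments = np.diff(walk)    # stationary; recovers steps[1:]
```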
Environmental Modeling and Assessment
Nonstationary, long memory and anti-persistence in several climatological time series data.
Luis A. Gil-Alana
Contact Information
Faculty of Economics, Edificio Biblioteca, Entrada Este, University of Navarra, E-31080 Pamplona, Spain
Received: 27 August 2004 Accepted: 26 July 2005 Published online: 2 November 2005
Abstract: In this article we examine the stochastic behaviour of several daily datasets describing the sun (total irradiance at the top of the atmosphere and sunspot numbers) and various climatological anomaly series by looking at their orders of integration. We use a testing procedure that permits us to consider fractional degrees of integration. The tests are valid under general forms of serial correlation and deterministic trends and do not require estimation of the fractional differencing parameter. Results show that the series are all nonstationary, with increments that might be stationary for those variables affecting the sun, and anti-persistent for those affecting air temperatures.
Journal of Geophysical Research vol. 107, no.0 2002
On nonstationarity and antipersistency in global temperature series.
O. Kärner
Tartu Observatory, Tõravere, Estonia
Abstract: Statistical analysis is carried out for satellite-based global daily tropospheric and stratospheric temperature anomaly and solar irradiance data sets. Behavior of the series appears to be nonstationary with stationary daily increments. Estimating long-range dependence reveals a remarkable difference between the two temperature series. The global average tropospheric anomaly behaves similarly to the solar irradiance anomaly. Their daily increments show antipersistency for scales longer than 2 months. This property points at a cumulative negative feedback in the Earth climate system governing the tropospheric variability during the last 22 years. … The observed global warming in the surface air temperature series (Jones et al. 1999) is more likely produced due to the overall nonstationary variability of the Earth climate system under anti-persistent solar forcing.
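The nonstationary-with-stationary-increments behaviour both abstracts describe is easy to sketch numerically. The following is a toy numpy illustration (a synthetic series with anti-persistent increments, not the fractional-integration tests used in the papers):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20000

# Anti-persistent increments: an up-step tends to be followed by a down-step
# (negative lag-1 correlation), mimicking the behaviour Karner reports.
incr = np.empty(n)
incr[0] = rng.normal()
for t in range(1, n):
    incr[t] = -0.4 * incr[t - 1] + rng.normal()

series = np.cumsum(incr)  # nonstationary level, stationary increments

def lag1_autocorr(x):
    """Lag-1 sample autocorrelation."""
    x = x - x.mean()
    return float(np.dot(x[:-1], x[1:]) / np.dot(x, x))

r_level = lag1_autocorr(series)          # near 1: the level is nonstationary
r_incr = lag1_autocorr(np.diff(series))  # negative: increments anti-persistent
```

The level series wanders (lag-1 autocorrelation near 1), while its first differences are stationary with negative lag-1 autocorrelation; differencing is the standard transform the Duke notes above describe.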
Hank Roberts says
> mechanism?
A Darwinian mechanism for biogenic ocean mixing
Kakani Katija
(Bioengineering, California Institute of Technology)
John Dabiri
(Bioengineering and Aeronautics, California Institute of Technology)
Recent observations of biogenic turbulence in the ocean have led to conflicting ideas regarding the contribution of animal swimming to ocean mixing. Previous measurements indicate elevated turbulent dissipation in the vicinity of large populations of planktonic animals swimming in concert. However, elevated turbulent dissipation is by itself insufficient proof of substantial biogenic mixing. We conducted field measurements of mixing efficiency by individual _Mastigias_ sp. (a Palauan jellyfish) using a self-contained underwater velocimetry apparatus. These measurements revealed another mechanism that contributes to animal mixing besides wake turbulence. This mechanism was first described by Sir Charles Galton Darwin and is in fact the dominant mechanism of mixing by swimming animals. The efficiency of Darwin’s mechanism (or drift) is dependent on animal shape rather than fluid length scale and, unlike turbulent wake mixing, is enhanced by the fluid viscosity. Therefore, it provides a means of biogenic mixing that can be equally effective in small plankton and large mammals.
http://meetings.aps.org/link/BAPS.2009.DFD.BE.5
Hank Roberts says
Phil C
— that was for a while a very popular paper in certain circles:
http://scholar.google.com/scholar?cites=6701307676645840349&hl=en&as_sdt=2001&as_vis=1
Hank Roberts says
Phil C —
See also:
http://www.google.com/search?q=site%3Atamino.wordpress.com+nonstationarity
David B. Benson says
phil c (519) — Unfortunately your link is broken. Here it is:
http://www.duke.edu/~rnau/411diff.htm
Proper use of statistics for geophysical time series is found at
http://tamino.wordpress.com/
with clear, concise expositions.
Ray Ladbury says
Dave Salt, Barton Paul Levenson has supplied references for his list–I think it was on this very post. I’m sure he’ll supply you with the references again if you like. And in any case, you ought to be familiar with most of the items on his list, and they provide pretty strong evidence.
I think that your dismissal of the constraints on CO2 sensitivity is entirely too cavalier. Yes, a few of the constraints are quite broad and do have reasonable probability below 2. Taken together, though, they strongly favor a sensitivity of around 3 degrees per doubling and all but preclude sensitivity below 2.1. What is more, if the best-fit value is wrong, it is far more likely to be too low than too high.
This brings me to yet another cavalier dismissal in your missive–this time of the models. The models actually are among the most effective constraints limiting sensitivity at the high end. If you don’t believe in them, then warming could well be above 4 degrees per doubling, and since the consequences effectively scale exponentially with the warming, that would be grave indeed. If you look at Knutti and Hegerl figure 5, this starts getting us into pretty risky territory even for a CO2 concentration of 450 ppmv, which we will almost surely exceed.
See, this is the point I don’t understand. You guys are so eager for the models to be wrong–as if you could make the problem go away if you could falsify the models. It doesn’t work that way. There is sufficient evidence to demonstrate a credible threat just from the temperature record and the paleoclimate. Without the models, we have a threat where we can’t bound the risk–and the only appropriate response in those situations is risk avoidance.
Fortunately, though, I think you are wrong about the models. Your continual emphasis on uncertainties in cloud feedbacks doesn’t take into account that the best evidence indicates such feedbacks are, if anything, positive, rather than the strong negative feedback you would require for your position to have any validity. In fact there is simply no evidence backing your sanguine attitude. The best you can come up with are statements about “structural uncertainties” that do not in fact invalidate the known science.
It is not enough to say “Climate is too complicated to understand.” That is an unscientific attitude. The fact is that we must understand it. The consensus model we have today really does a pretty good job, and it suggests clear guidance for policy. The fact that there remain some uncertainties in the model certainly does not invalidate its successes, and it most certainly does not sanction doing the exact opposite of what the models indicate we should do.
In the end, Dave, the choice is between following the guidance of science or going 180 degrees against it–science or anti-science. Pick!
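The point above about consequences scaling exponentially with warming can be made concrete with a toy expected-loss sketch. Both the sensitivity distribution and the damage function below are invented for illustration only:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy sensitivity distribution and an exponential damage function.
# Neither is from the literature; they only illustrate tail dominance.
warming = rng.normal(3.0, 1.0, 200_000)   # deg C per doubling, illustrative
warming = warming[warming > 0.0]
damage = np.exp(warming)                  # damages grow ~exponentially

tail = warming > 4.5
tail_frequency = tail.mean()              # rare high-warming outcomes...
tail_damage_share = damage[tail].sum() / damage.sum()  # ...dominate expected loss
```

Under these made-up numbers, outcomes above 4.5 degrees are a small fraction of the cases but account for a disproportionate share of the expected damage, which is why bounding the high end matters so much for risk assessment.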
Rattus Norvegicus says
phil c
One thing you are forgetting is that climate models are *NOT* statistical models. AFAIK, nobody has tried to make climate projections in the last couple of decades or so using a statistical model.
As for the two papers you provide, the first one has not been cited by anyone. Nobody, nada, nothing. The second has been cited 15 times. One is a self cite and most of the others are by the likes of Bob Carter, McKitrick, Veizer: you should get the picture. Some of the other cites were in papers completely unrelated to climate.
Ray Ladbury says
Timthetool, In heating the oceans, there are several processes that are important. Sunlight is certainly the dominant heating source. However, under normal circumstances, the oceans would remain in equilibrium by radiating away heat at the surface to the atmosphere.
However, IR radiation produces a skin effect: a temperature gradient right at the surface that limits heat loss. So yes, back radiation contributes to heating the oceans by decreasing the energy lost. Because of turbulence, the skin effect is transient, but continual mixing between the sunlight-heated upper portion of the ocean and the portion down to several meters distributes the heat. There’s nothing that controversial here. Did you look at the webpage I provided?
Don Shor says
Hank: “That’s field work done with a research vessel built to actually stand itself in a vertical position; was or is called “RV Flip” — don’t know if it’s still in service.”
Yes, FLIP is still part of the Scripps research fleet.
http://scripps.ucsd.edu/voyager/flip/
jyyh says
Assuming there are causes and consequences (such as melting of ice in increased heat), statistics may be useful. However, it cannot predict one-time events in a given system. But then again, it might be useful in predicting the timing of a one-time event not included in the description of the said system (yes, I like tamino’s site).
john byatt says
The Aussies here will have heard of Senator Steve Fielding, the anti-science loony. His forum is a laugh a minute; the latest post linked to “it’s the scientists, stupid” at http://www.rense.com/general89/easta.htm: global warming created by evil scientists.
Steve Fish says
Comment by Dave Salt — 2 January 2010 @ 7:25 PM:
Dave:
Having read your questions and the responses to them I think I understand what you are asking. I think you want to know of some simple proof of positive feedback to heating of the earth by the CO2 greenhouse effect. If so, check out the following.
The cycle of the ice ages, consisting of a regular recurrence of glacial and interglacial periods, is coordinated with different length cycles of orbital distance of the earth from the sun, axial tilt of the earth relative to the sun, and precession of tilt, called Milankovitch cycles. When the separate cycles align to reduce solar input to the northern hemisphere a glacial period ensues. When they align to maximize solar input to the northern hemisphere the ice melts in an interglacial. You can check this out on Wikipedia or search the Real Climate site for articles.
The actual changes in solar input have been calculated and the forcing is not nearly enough to cause glacial and interglacial periods, so there must be positive feedback that enhances the sun’s warming to cause interglacial periods, and positive feedback that enhances the sun’s cooling to cause glacial periods. A positive feedback enhances a forcing whether it is hotter or cooler.
There are a lot of different feedback mechanisms, but two big ones are the CO2 greenhouse effect and ice albedo. When insolation decreases a little, snow cover stays longer at lower latitudes and it reflects sunlight to enhance (positive feedback) cooling. Over time the oceans cool a little and this enhances solubility of CO2 in the ocean which, in turn, reduces the greenhouse effect to cause further cooling of the planet (also a positive feedback). Eventually glaciers and snow cover a large portion of the earth.
Similarly, when the orbital and tilt cycles align to warm the northern hemisphere a little the snow recedes to reduce albedo to warm the earth further (again a positive feedback). After the ocean warms, CO2 outgases from the ocean and warms the earth by the greenhouse effect. CO2 varies with temperature, but the time to warm the ocean explains why there is a lag when it is a feedback mechanism. Eventually the earth warms for an interglacial period.
The earth appears to be in a delicate balance such that a small change in insolation combined with net positive feedbacks can make major changes in climate. The ice ages would not have been possible without positive feedback, and this is why the anthropogenic increase in CO2 is so potentially disrupting.
Steve
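The amplification logic Steve describes can be summarised in one line of algebra: if a fraction g of any temperature change is fed back as further change, the equilibrium response is dT0/(1 − g), and the same gain amplifies both warming and cooling. A minimal sketch with made-up numbers:

```python
# Equilibrium response with a net feedback gain g: dT = dT0 / (1 - g).
# dT0 and g below are illustrative, not measured values.

def amplified_response(dT0, g):
    """No-feedback response dT0 amplified by net feedback gain g (g < 1)."""
    if g >= 1.0:
        raise ValueError("g >= 1 means runaway feedback")
    return dT0 / (1.0 - g)

dT0 = 1.2                                    # toy no-feedback warming, deg C
no_feedback = amplified_response(dT0, 0.0)   # 1.2 C
with_feedback = amplified_response(dT0, 0.6) # ~3 C: same forcing, amplified
cooling = amplified_response(-dT0, 0.6)      # feedbacks amplify cooling too
```

The symmetry (positive feedback amplifies a small push in either direction) is exactly why modest Milankovitch forcing can drive both glacials and interglacials.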
Hank Roberts says
Don, thanks for the link!
http://scripps.ucsd.edu/voyager/flip/
Folks, if you don’t know about this one, don’t miss it.
TimTheToolMan says
Tim, if you can, please ask a clear question. Tell us what “mechanism” means to you? Do you mean in a computer model? In actual measured behavior in the ocean? Some kind of machine? Something else?
It’s not the kind of question that can be asked or answered “simply” in the absence of background information, but with reference to the RC article “Why greenhouse gases heat the ocean”, I was hoping Gavin would understand fairly quickly.
As far as the answer in practical terms for the models, I was expecting downward longwave radiation to eventually become some X W/m^2 term that is applied to the ocean for each time step and subsequently mixed down into the ocean.
Where I’m coming from is that Real Climate did an article some time ago explaining how anthropogenic CO2 might heat the oceans “Why greenhouse gases heat the ocean” found in the index under oceans – and this describes a completely different theory as to how the heating may occur.
As far as I’m aware the theory described in the RC article has never been followed up and the heating effect described has not been quantified so that proper values may be used in the models.
From my other reading on the ocean skin, it seems that the Real Climate theory in “Why greenhouse gases heat the ocean” is truly needed, because warming of the skin by downwelling LW radiation and mixing won’t, on average, warm the ocean. It may simply cool it more or less quickly, because on average (Schluessel et al., 1990) the skin temperature is less than the bulk ocean temperature.
Anyway, if someone has a reference to a paper that verifies that the observed ocean temperature increases can be attributed to anthropogenic effects, I’d like to see it.
Chris Colose says
Dave Salt (#246, 517)–
The uncertainties in climate sensitivity are not independent of the uncertainties in cloud feedbacks. So while you are correct that the sensitivity estimate uncertainties are quite large (about +/- 50% from the 3 C central estimate), that is still the uncertainty range, and there’s very little chance that the right answer is outside that range. This is still consistent with my statement that you quoted, “Clouds are the largest source of uncertainty in quantifying the extent of climate feedbacks.”
Concerning climate models, all models can be shown to be “wrong” in some sense, but being wrong may not take away from their usefulness. For instance, most (all?) models are too conservative in simulating the NH sea ice loss, but there are many other variables to climate than just sea ice, and not too many sane people will use this fact to say “well, models are wrong, there’s nothing to worry about, might as well throw a party.” Similarly, there’s much more to climate than the tropical “hot spot.” Currently, the observational data and signal-to-noise ratio are such that it is difficult to tell with high confidence whether models capture expected or real-world behavior very well. There’s no evidence that an obvious discrepancy exists, but if it did, it would have several interesting implications (and scientists like new things!). It would matter quite a bit to the hurricane community. The implications for climate sensitivity would likely be small, but could mean an even stronger sensitivity, since the lapse rate feedback would be less negative with a stronger vertical temperature gradient. These types of issues appear in all fields of science. If all the issues were “settled” then there would be no reason to have researchers in the field or students interested.
edrowland says
>> Fortran is simple, it works well for these kinds of problems, it compiles efficiently on everything from a laptop to massively parallel supercomputer, plus we have 100,000s of lines of code already. If we had to rewrite everything each time there was some new fad in computer science (you know, like ‘C’ or something ;) ), we’d never get anywhere. – gavin]
—
No, really. This is unbelievably shocking stuff. This is a coding style that’s almost 40 years out of date, and even then not done to then-current 1960s standards. It violates every single coding principle used by modern programmers, without exception.
In modern programming languages, 100,000 lines of code isn’t really that much. But 100,000 lines of code in that coding style is 100,000 lines of unsupportable code. Embedded constants all over the place, not so much as a comment as to what the data is, let alone the provenance of the data, or what’s being done with it. No structuring of data. Insane variable names. Global data being set on an all-but-arbitrary basis.
For something so important, this is not well done, at all.
[Response: We have a serious refactoring exercise going on to better structure the code, and that will help. But the language will still be Fortran. – gavin]
Rod B says
re [Response: No. They aren’t LW emitters. – gavin]
333 Watts/m^2 seems like an awful lot to be coming from clouds, water vapor and CO2. Is there any other contributor to the downward LW radiation? Or is it what the GHGases actually emit whether it seems like a lot to me or not?
Ernst K says
Tim the TM said: “I was expecting downward longwave radiation to eventually become some X W/m^2 term that is applied to the ocean for each time step and subsequently mixed down into the ocean”
That’s basically what the article is doing, except it’s putting far more attention on the “applied to the ocean” part than you apparently expected and taking it more or less for granted that the reader already understands the “downward longwave radiation to eventually become some X W/m^2 term” part.
There’s a figure in that article that breaks it all down into 5 steps.
PeteB says
Dave Salt, here is my understanding
Dave Salt said : ” The low-level IPCC documents provide a candid and thorough summary of the current status of climate science. They are an honest attempt to tell a complex story where some of the most significant details are still poorly understood.”
Agreed up to the point of ‘the most significant details’. The areas that are not well understood are significant but not ‘the most significant’
Two well understood elements that make up the climate sensitivity are the CO2 ‘greenhouse effect’ and the combined water vapour / lapse rate feedback (that I think brings you to around 2.4 Deg C). These are now well understood (see Box 8.1 in http://ipcc-wg1.ucar.edu/wg1/Report/AR4WG1_Print_Ch08.pdf in section 8.6) and confidence has significantly improved since the TAR (see 8.6.3.1.2).
The other areas have varying degrees of uncertainty associated with them, with clouds being perhaps the most uncertain. These uncertainties give rise to quite a big range for climate sensitivity (the often quoted 2 – 4.5 Deg C with around 3 Deg C the most likely)
Remember there are two approaches to this, one is to come up with a climate sensitivity based on understanding and modelling each contributing factor and running the model and seeing what you come up with, the second is to observe overall what effect on temperature an increase/decrease in a known climate forcing has (taking into account error bars in the forcing and the resultant temperature change). From this you can calculate the overall feedback and what the likely amplification of the CO2 ‘greenhouse’ forcing is. This is covered in chapter 9 http://ipcc-wg1.ucar.edu/wg1/Report/AR4WG1_Print_Ch09.pdf
As mentioned in chapter 9, if you combine the probability distributions from several independent observation events this can improve the estimate of climate sensitivity e.g. http://www.jamstec.go.jp/frcgc/research/d5/jdannan/GRL_sensitivity.pdf
Point 2) – There are certainly some areas of the model that are not well understood, but there are a lot of areas that are well understood and conform well to the observations (see Gavin’s posts here on models)
Dave Salt said “3) The controversy that rages over this issue appears to derive from the way that the low-level details are ‘interpreted’ for policy makers. Worst case scenarios seem to be highlighted, while the associated uncertainties appear to lack the necessary emphasis.”
But these are covered very fairly in the Summary for Policymakers, with each statement having a ‘likelihood’. I think you would have a hard time arguing these are minimising uncertainties or emphasising the ‘worst case’, I would say this is a very conservative (small c) document.
http://ipcc-wg1.ucar.edu/wg1/Report/AR4WG1_Print_SPM.pdf
http://ipcc-wg1.ucar.edu/wg1/Report/AR4_UncertaintyGuidanceNote.pdf
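The “combine independent constraints” idea PeteB mentions can be sketched on a grid: multiply the likelihoods and renormalise. The two input curves and their widths below are invented for illustration (see the Annan and Hargreaves paper linked above for the real analysis):

```python
import numpy as np

# Combining two independent sensitivity constraints by multiplying
# likelihoods on a grid. All shapes and numbers are invented.
S = np.linspace(0.5, 10.0, 2000)   # sensitivity grid, deg C per doubling
dx = S[1] - S[0]

def gauss(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2)

def normalize(p):
    return p / (p.sum() * dx)

def std_dev(p):
    m = (S * p).sum() * dx
    return float(np.sqrt((((S - m) ** 2) * p).sum() * dx))

L1 = normalize(gauss(S, 3.0, 1.5))   # e.g. a 20th-century warming constraint
L2 = normalize(gauss(S, 2.8, 1.2))   # e.g. a volcanic-response constraint
combined = normalize(L1 * L2)        # product of independent likelihoods

sd_single = std_dev(L1)
sd_combined = std_dev(combined)      # narrower than either input alone
```

The combined distribution is tighter than either constraint on its own, which is the basic reason multiple independent lines of evidence narrow the sensitivity range.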
FurryCatHerder says
I’m going to finish reading the 4 pages of comments I’ve not read yet, but I can no longer restrain myself in regards to the carbon tax proposals.
The critical fact that bears keeping in mind is that 100% of what we “buy” goes all the way back to “energy”. If you say “but raw materials don’t!”, wrong — raw materials require energy to extract from wherever they are hiding. “Services” — people use energy to show up for their service sector job. And so on. It’s all energy, which is why renewable energy is such a fantastically wonderful idea.
In short, a “carbon tax” is a flat tax scheme, and as such is regressive. No one is really going to suggest that. A “carbon tax with quotas” — starting to make more sense, but that just gets us to tax brackets as we have them today. And instead of “loopholes”, you’d have the rich buying “offsets”.
As regards the comment that even Liberal Democrats don’t believe in AGW, I have to agree 100%. If Al Gore actually BELIEVED what he’s saying, no one would have found those 22,000 kWh a month electric bills. I used about 3,800 kWh last year (yeah, that’s the entire year — it comes out to about 320 kWh per month, or about 1/70th of what Al Gore uses) and I’m far from a Liberal Democrat. And I live in Central Texas. And run the A/C. And have a house full of computers that run 24/7.
When Al Gore cuts his electric consumption by 98.5%, then I think people will look at AGW more seriously. Because if Mr. Power Point can’t actually reduce his actual use of fossil fuels, why should I?
Jacob Mack says
First of all, Happy New Year to all! Gavin, a very clear, accurate and honest piece. I like how straightforward the graphs are, the explanations and admissions of margins of error. The models have greatly improved since the 1980s and they are continuing to be improved upon. AGW remains a serious issue based upon sound science and mathematical analysis.
Richard Palm says
You’ve said several times that the models are not fitted to the temperature data. Would it be correct to say that the model outputs are derived solely from calculations based on physical laws?
Rob says
@226
What would the output of the GCMs look like if there were no CO2 increase? Any links to a graph? thanks!
[Response: Assuming you mean no forcings of any kind, then the ensemble mean would be flat, but you’d still see excursions of the same magnitude as the grey bands above. – gavin]
No, I mean just without an increase in CO2, all others stay the same. Thanks. (Sorry for the delayed follow-up, forgot I asked…)
Ray Ladbury says
Timthetool:
I have repeatedly suggested that you go to Minnett’s website:
http://www.rsmas.miami.edu/personal/pminnett/Complete_List/complete_list.html
There are several papers there that characterize the skin effect. Once you know the thermal profile of the skin, it is just thermodynamics to calculate heat transfer across the skin layer. Keep in mind that this is a transient effect. The skin is established and breaks on a regular basis. This would make it a sub-grid effect, so you would average the effect over time and space subject to prevalent conditions within the grid square. This is an educated guess on my part, based on my (very) general knowledge of the models. I’m sure Gavin will correct me if I am wrong. However, we know that oceans are warming and that the models reproduce the effect pretty well. What is more, the physics is pretty well known.
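For a sense of scale, conduction across a thin skin layer is just Q = k·ΔT/δ. A back-of-envelope sketch with rough, assumed numbers (not taken from Minnett’s papers):

```python
# Back-of-envelope conduction through the ocean skin layer: Q = k * dT / delta.
# All three input values are illustrative order-of-magnitude guesses.

def skin_heat_flux(k, dT, delta):
    """Conductive heat flux (W/m^2) across a layer of thickness delta (m)."""
    return k * dT / delta

k_water = 0.6     # thermal conductivity of sea water, W/(m K), approximate
dT = 0.25         # assumed bulk-minus-skin temperature difference, K
delta = 1.0e-3    # assumed skin thickness, ~1 mm

Q = skin_heat_flux(k_water, dT, delta)  # ~150 W/m^2 with these guesses
```

A fraction of a degree across a millimetre already supports a flux comparable to the ocean’s net surface cooling, which is why small back-radiation-driven changes in the skin gradient can modulate the heat loss of the whole mixed layer.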
Damien says
re #534
“We have a serious refactoring exercise going on to better structure the code, and that will help. But the language will still be Fortran. – gavin”
Putting aside the legacy code issue, why Fortran? With Fortran 2003, some of the interoperability issues with C have been sorted out, and it’s going to be very hard to argue that Fortran will run more efficiently than C.
Just to venture even more off topic, why not profile the code and re-write a small percentage of critical routines in assembly? It’s routine in both games programming and embedded systems and can boost performance over vanilla C by quite a bit.
Curious, how valuable would it be to cut (say) 10% of the time off a model run?
grumpy software architect says
edrowland #534 – another hysterical coder… what exactly is awful? Throwing around phrases like “every single coding principle” just displays your lack of connection with modern programming. Unless of course you think that modern programming is defined by VB.Net.
grumpy software architect says
#543 – skills do not have zero cost. If a team knows how to be productive in a language you will get more bugs and less productivity by switching to something else. Adding assembler to the mix (when I suspect the critical routines are doing arithmetic that Fortran has been VERY good at for some decades) is a recipe for producing garbage.
dhogaza says
Those insisting that C is more readable than FORTRAN may be forgetting that it is C, not FORTRAN, that comes with its own puzzle book …
Hank Roberts says
For Dave Salt, I wonder if you’re not overly focused on “uncertainty” — which you talk a lot about above and point to in the IPCC references. I don’t know how much science background you have, but it’s important to realize that “uncertainty” in science isn’t a problem or a failure–it’s the whole focus of the enterprise, and it gets narrowed down but not eliminated.
The uncertainty is talked about all the time, it’s where the excitement and fun and challenge is.
The very best science can do is say “probably” — probability is what science does. Proof is done in mathematics, not in science.
Compare: http://imgs.xkcd.com/comics/science_montage.png nails this nicely.
Hank Roberts says
Oh, I see the problem I think:
> warming of the skin by downwelling LW radiation and mixing wont on
> average warm the ocean, it may simply cool it more or less quickly
> because on average (Schluessel et al., 1990) the skin temperature is
> less than the bulk ocean temperature.
That paper was published 20 years ago, and I don’t think it means what you think it does. Where are you finding a reference to it? Have you come across something by Gerlich and Tscheuschner that you’re relying on?
Look at the actual paper and the citing papers, and the author’s extensive subsequent work, and watch out for blogs arguing that the “second law of thermodynamics” proves warming can’t happen. That line of thinking is proposed, debunked, and rebunked quite often.
If I’ve guessed wrong about where your questions arise, my apology, but I’m still trying hard to figure out what assumptions you’re basing your whole line of questioning on. This seems the most likely recent guess.
Hank Roberts says
For example, you might have been confused by something like this; it appears to be the first in a series of articles claiming to disprove global warming, also featuring references to Piers Corbyn and other familiar names:
http://juxte.files.wordpress.com/2009/11/climate-change-series-for-reservoir-magazine-jan-may-2009-dr-a-neil-hutton.pdf
Ken W says
FurryCatHerder (538):
The “Al Gore is XYZ, therefore AGW is false” argument doesn’t sway anyone with even the slightest understanding of global warming. Al Gore could be a cannibalistic pedophile that’s burning coal in his basement and that wouldn’t have the slightest effect on the scientific evidence for AGW.
FYI:
http://www.snopes.com/politics/business/gorehome.asp