Gavin Schmidt and Stefan Rahmstorf
John Tierney and Roger Pielke Jr. have recently discussed attempts to validate (or falsify) IPCC projections of global temperature change over the period 2000-2007. Others have attempted to show that last year’s numbers imply that ‘Global Warming has stopped’ or that it is ‘taking a break’ (Uli Kulke, Die Welt). However, as most of our readers will realise, these comparisons are flawed since they basically compare long term climate change to short term weather variability.
This becomes immediately clear when looking at the following graph:
The red line is the annual global-mean GISTEMP temperature record (though any other data set would do just as well), while the blue lines are 8-year trend lines – one for each 8-year period of data in the graph. What it shows is exactly what anyone should expect: the trends over such short periods are variable; sometimes small, sometimes large, sometimes negative – depending on which year you start with. The mean of all the 8-year trends is close to the long term trend (0.19ºC/decade), but the standard deviation is almost as large (0.17ºC/decade), implying that a trend would have to be either >0.5ºC/decade or much more negative (< -0.2ºC/decade) for it to obviously fall outside the distribution. Thus comparing short trends has very little power to distinguish between alternative expectations.
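The behaviour of these short trends is easy to reproduce with synthetic data. Here is a minimal sketch – not the actual GISTEMP record; the trend and noise levels are merely illustrative – showing how 8-year trends scatter around the true long-term trend:

```python
import numpy as np

# Synthetic annual anomalies: a 0.19 C/decade forced trend plus
# Gaussian 'weather' noise (both numbers illustrative).
rng = np.random.default_rng(0)
years = np.arange(1975, 2008)
anom = 0.019 * (years - years[0]) + rng.normal(0, 0.1, years.size)

# Fit an ordinary least-squares trend to every 8-year window.
window = 8
trends = []
for i in range(years.size - window + 1):
    slope = np.polyfit(years[i:i+window], anom[i:i+window], 1)[0]
    trends.append(slope * 10)  # convert to C/decade
trends = np.array(trends)

print(f"mean 8-yr trend: {trends.mean():.2f} C/decade, "
      f"spread (std): {trends.std():.2f} C/decade")
```

The mean of the windows recovers something close to the built-in trend, while the spread of individual 8-year trends is comparable to the trend itself – exactly the situation in the figure.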
So, it should be clear that short term comparisons are misguided, but the reasons why, and what should be done instead, are worth exploring.
The first point to make (and indeed the first point we always make) is that the climate system has enormous amounts of variability on day-to-day, month-to-month, year-to-year and decade-to-decade periods. Much of this variability (once you account for the diurnal cycle and the seasons) is apparently chaotic and unrelated to any external factor – it is the weather. Some aspects of weather are predictable – the location of mid-latitude storms a few days in advance, the progression of an El Niño event a few months in advance etc, but predictability quickly evaporates due to the extreme sensitivity of the weather to the unavoidable uncertainty in the initial conditions. So for most intents and purposes, the weather component can be thought of as random.
If you are interested in the forced component of the climate – and many people are – then you need to assess the size of an expected forced signal relative to the unforced weather ‘noise’. Without this, the significance of any observed change is impossible to determine. The signal to noise ratio is actually very sensitive to exactly what climate record (or ‘metric’) you are looking at, and so whether a signal can be clearly seen will vary enormously across different aspects of the climate.
An obvious example is looking at the temperature anomaly in a single temperature station. The standard deviation in New York City for a monthly mean anomaly is around 2.5ºC, for the annual mean it is around 0.6ºC, while for the global mean anomaly it is around 0.2ºC. So the longer the averaging time-period and the wider the spatial average, the smaller the weather noise and the greater chance to detect any particular signal.
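The effect of averaging can be illustrated with independent noise. Real monthly anomalies are autocorrelated, so the actual reduction (2.5ºC down to 0.6ºC for New York) is less dramatic than the idealised square-root-of-12 factor shown here:

```python
import numpy as np

# Illustrative only: independent monthly 'weather' noise with std 2.5 C
# (the NYC monthly figure quoted above). Averaging 12 independent months
# would cut the standard deviation by sqrt(12).
rng = np.random.default_rng(1)
monthly = rng.normal(0, 2.5, size=(10000, 12))   # 10000 simulated years
annual = monthly.mean(axis=1)

print(f"monthly std: {monthly.std():.2f} C")
print(f"annual  std: {annual.std():.2f} C "
      f"(expected ~{2.5/np.sqrt(12):.2f} C)")
```

Spatial averaging works the same way: the more independent samples that go into the mean, the smaller the residual noise.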
In the real world, there are other sources of uncertainty which add to the ‘noise’ part of this discussion. First of all there is the uncertainty that any particular climate metric is actually representing what it claims to be. This can be due to sparse sampling or it can relate to the procedure by which the raw data is put together. It can either be random or systematic and there are a couple of good examples of this in the various surface or near-surface temperature records.
Sampling biases are easy to see in the difference between the GISTEMP surface temperature data product (which extrapolates over the Arctic region) and the HADCRUT3v product which assumes that Arctic temperature anomalies don’t extend past the land. These are both defensible choices, but when calculating global mean anomalies in a situation where the Arctic is warming up rapidly, there is an obvious offset between the two records (and indeed GISTEMP has been trending higher). However, the long term trends are very similar.
A more systematic bias is seen in the differences between the RSS and UAH versions of the MSU-LT (lower troposphere) satellite temperature record. Both groups are nominally trying to estimate the same thing from the same data, but because of assumptions and methods used in tying together the different satellites involved, there can be large differences in trends. Given that we only have two examples of this metric, the true systematic uncertainty is clearly larger than simply the difference between them.
What we are really after is how to evaluate our understanding of what’s driving climate change as encapsulated in models of the climate system. Those models though can be as simple as an extrapolated trend, or as complex as a state-of-the-art GCM. Whatever the source of an estimate of what ‘should’ be happening, there are three issues that need to be addressed:
- Firstly, are the drivers changing as we expected? It’s all very well to predict that a pedestrian will likely be knocked over if they step into the path of a truck, but the prediction can only be validated if they actually step off the curb! In the climate case, we need to know how well we estimated forcings (greenhouse gases, volcanic effects, aerosols, solar etc.) in the projections.
- Secondly, what is the uncertainty in that prediction given a particular forcing? For instance, how often is our poor pedestrian saved because the truck manages to swerve out of the way? For temperature changes this is equivalent to the uncertainty in the long-term projected trends. This uncertainty depends on climate sensitivity, the length of time and the size of the unforced variability.
- Thirdly, we need to compare like with like and be careful about what questions are really being asked. This has become easier with the archive of model simulations for the 20th Century (but more about this in a future post).
It’s worthwhile expanding on the third point since it is often the one that trips people up. In model projections, it is now standard practice to do a number of different simulations that have different initial conditions in order to span the range of possible weather states. Any individual simulation will have the same forced climate change, but will have a different realisation of the unforced noise. By averaging over the runs, the noise (which is uncorrelated from one run to another) averages out, and what is left is an estimate of the forced signal and its uncertainty. This is somewhat analogous to the averaging of all the short trends in the figure above, and as there, you can often get a very good estimate of the forced change (or long term mean).
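The effect of ensemble averaging can be sketched with a toy 'model': every run shares the same forced signal but has its own realisation of uncorrelated noise (all numbers here are illustrative, not taken from any actual GCM):

```python
import numpy as np

# 20 toy 'runs', each the same forced trend plus independent noise.
rng = np.random.default_rng(2)
years = np.arange(50)
forced = 0.02 * years                      # 0.2 C/decade forced warming
runs = forced + rng.normal(0, 0.15, size=(20, years.size))

# The ensemble mean recovers the forced signal; a single run does not.
ensemble_mean = runs.mean(axis=0)
err_single = np.abs(runs[0] - forced).mean()
err_mean = np.abs(ensemble_mean - forced).mean()
print(f"mean error, one run: {err_single:.3f} C; "
      f"ensemble mean: {err_mean:.3f} C")
```

Because the noise is uncorrelated from run to run, averaging 20 runs shrinks the noise by roughly a factor of sqrt(20), leaving a much cleaner estimate of the forced change.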
Problems can occur though if the estimate of the forced change is compared directly to the real trend in order to see if they are consistent. You need to remember that the real world consists of both a (potentially) forced trend and a random weather component. This was an issue with the recent Douglass et al paper, where they claimed the observations were outside the mean model tropospheric trend and its uncertainty. They confused the uncertainty in how well we can estimate the forced signal (the mean of all the models) with the distribution of trends+noise.
This might seem confusing, but a dice-throwing analogy might be useful. If you have a bunch of normal dice (‘models’) then the mean point value is 3.5 with a standard deviation of ~1.7. Thus, the mean over 100 throws will have a distribution of 3.5 +/- 0.17 which means you’ll get a pretty good estimate. To assess whether another die is loaded it is not enough to just compare one throw of that die. For instance, if you threw a 5, that would be significantly outside the distribution expected for the mean of 100 throws, but it is clearly within the expected distribution for a single throw.
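The arithmetic of the analogy is easy to check by simulation:

```python
import numpy as np

# A fair die has mean 3.5 and std sqrt(35/12) ~ 1.71, so the mean of 100
# throws is distributed as 3.5 +/- ~0.17, while any single throw still
# ranges over the full 1-6 distribution.
rng = np.random.default_rng(3)
throws = rng.integers(1, 7, size=(10000, 100))  # 10000 sets of 100 throws

set_means = throws.mean(axis=1)
print(f"std of a single throw: {throws.std():.2f}")
print(f"std of the 100-throw mean: {set_means.std():.3f}")
# A throw of 5 is ~9 standard deviations from 3.5 on the *mean's* scale,
# yet entirely ordinary on the single-throw scale -- the confusion at issue.
```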
Bringing it back to climate models, there can be strong agreement that 0.2ºC/dec is the expected value for the current forced trend, but comparing the actual trend simply to that number plus or minus the uncertainty in its value is incorrect. This is what is implicitly being done in the figure on Tierney’s post.
If that isn’t the right way to do it, what is a better way? Well, if you start to take longer trends, then the uncertainty in the trend estimate approaches the uncertainty in the expected trend, at which point it becomes meaningful to compare them since the ‘weather’ component has been averaged out. In the global surface temperature record, that happens for trends longer than about 15 years, but for smaller areas with higher noise levels (like Antarctica), the time period can be many decades.
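A toy calculation shows why something like 15 years is roughly where global-mean trends become meaningful. The noise level here is illustrative, not fitted to the observations, but the scaling of trend uncertainty with record length is general:

```python
import numpy as np

# Fit trends of increasing length to many synthetic series
# (0.2 C/decade forced warming plus independent annual noise)
# and watch the spread of the recovered trends shrink.
rng = np.random.default_rng(4)

def trend_spread(length, n_series=2000, noise=0.1):
    t = np.arange(length)
    series = 0.02 * t + rng.normal(0, noise, size=(n_series, length))
    slopes = np.array([np.polyfit(t, s, 1)[0] for s in series])
    return slopes.std() * 10  # C/decade

for years in (8, 15, 30):
    print(f"{years:2d}-yr trends: spread {trend_spread(years):.3f} C/decade")
```

With these numbers the spread of 8-year trends is comparable to the signal itself, while by 15 years it has dropped well below it – and for noisier regional records the crossover takes correspondingly longer.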
Are people going back to the earliest projections and assessing how good they are? Yes. We’ve done so here for Hansen’s 1988 projections, Stefan and colleagues did it for CO2, temperature and sea level projections from IPCC TAR (Rahmstorf et al, 2007), and IPCC themselves did so in Fig 1.1 of AR4 Chapter 1. Each of these analyses shows that the longer term temperature trends are indeed what is expected. Sea level rise, on the other hand, appears to be under-estimated by the models for reasons that are as yet unclear.
Finally, this subject appears to have been raised from the expectation that some short term weather event over the next few years will definitively prove that either anthropogenic global warming is a problem or it isn’t. As the above discussion should have made clear this is not the right question to ask. Instead, the question should be, are there analyses that will be made over the next few years that will improve the evaluation of climate models? There the answer is likely to be yes. There will be better estimates of long term trends in precipitation, cloudiness, winds, storm intensity, ice thickness, glacial retreat, ocean warming etc. We have expectations of what those trends should be, but in many cases the ‘noise’ is still too large for those metrics to be a useful constraint. As time goes on, the noise in ever-longer trends diminishes, and what gets revealed then will determine how well we understand what’s happening.
Update: We are pleased to see such large interest in our post. Several readers asked for additional graphs. Here they are:
– UK Met Office data (instead of GISS data) with 8-year trend lines
– GISS data with 7-year trend lines (instead of 8-year).
– GISS data with 15-year trend lines
These graphs illustrate that the 8-year trends in the UK Met Office data are of course just as noisy as in the GISS data; that 7-year trend lines are of course even noisier than 8-year trend lines; and that things start to stabilise (trends getting statistically robust) when 15-year averaging is used. This illustrates the key point we were trying to make: looking at only 8 years of data is looking primarily at the “noise” of interannual variability rather than at the forced long-term trend. This makes as much sense as analysing the temperature observations from 10-17 April to check whether it really gets warmer during spring.
And here is an update of the comparison of global temperature data with the IPCC TAR projections (Rahmstorf et al., Science 2007) with the 2007 values added in (for caption see that paper). With both data sets the observed long-term trends are still running in the upper half of the range that IPCC projected.
Barton Paul Levenson says
AEBanner writes:
[[So it is agreed that when the CO2 molecule which had absorbed a photon next decays by collision, its excited internal energy is transferred to the atmosphere. But this is where the energy came from in the first place! So nothing has changed. The temperature of the atmosphere has not been changed as a result of this process. It is just as if nothing had happened.]]
Your idea violates conservation of energy. If a molecule transfers energy to the atmosphere, the atmosphere heats up. Layers of atmosphere may transfer energy to one another, but the energy originally comes from the sun, and while energy is conserved, temperature is not.
Ray Ladbury says
Rod, Of course radiation from the surface cools the surface, but that radiation depends mainly on surface temperature (ignoring emissivity for simplicity). It isn’t changed by adding more CO2. Adding CO2 merely decreases the probability of energy leaving the system (as outgoing IR, the only way it can), whether the IR is emitted from the ground or from excited CO2 in the atmosphere.
pat n says
Timothy, the link you provided in your comment (547) supports my earlier point that the 1920s-1940s was a warm bubble (367, 514).
Thanks.
Douglas Wise says
Thank you, Raypierre, for commenting on my post #550. I will, indeed, read your book to which you made reference although whether I’ll be able to assimilate the maths is debatable.
I did feel your somewhat irritable comment that “you were tired of these endlessly recurring drawn out exchanges about how the greenhouse effect works” was something of a put down. Essentially, your advice boils down to the fact that I should either accept the establishment view because of the expertise of those proffering it or become an expert on the subject myself. To an extent, I can sympathise with your point of view. For you, the exchanges may be proving repetitive. For me, a newcomer to the subject, the only questions I have repeated are those which haven’t been answered or that have been answered in a manner that I regard as not totally satisfactory. I can assure you that I read very extensively on this website before seeking to ask any questions. Essentially, the only area of your Archive that caused me to have any doubts related to the two parts of the Saturation Argument which I have read over and over again and remain somewhat unhappy with. I believe that answers to the questions I have posted would probably remove my residual doubts. Perhaps reading your book will achieve the same objective and will save you or others time and trouble.
Hank Roberts says
> if the verbal explanations are unsatisfying, you just have to
> go read the textbooks or go take a course. –raypierre]
Nominating Ray’s comments along this line, in this and other threads: recommend compiling under the “Start Here” button, to point people to.
Including a “this much math is needed” prerequisite explanation.
Leif Svalgaard says
Response to ‘response’ to 284: “Solar reconstructions back to AD 1610 are based on sunspot data, not the cosmogenic isotopes”
But see http://www.leif.org/research/TSI%20From%20McCracken%20HMF.pdf
(TSI Reconstruction 1428-2005, Santa Fe, SORCE 2008)
Abstract: We have used a recent reconstruction of the long-term solar wind magnetic field based on cosmic ray data [McCracken, 2007; McCracken and Beer, 2007] to infer the secular variation in TSI, based on a sensitivity of 0.5 Wm2/nT. The TSI inferred in this manner increases by ~3 W/m2 (~0.22%), from 1428-2005. Even more remarkably, the TSI reconstruction based on McCracken’s deduced HMF strength increased by nearly 3 W/m2 in the first half of the 20th Century, a result that finds no support in other recent TSI reconstructions which show at most a ~0.5 W/m2 increase over this period. This result casts further doubt on the McCracken [2007] solar wind magnetic field reconstruction which is at strong variance with other recent HMF reconstructions. Alternatively, the sensitivity of TSI to HMF could be much smaller than the 0.5 Wm2/nT suggested by Fröhlich [2008].
Ray Ladbury says
Douglas Wise,
Perhaps if you told us what your background in math and physics was, we (or more likely Ray Pierrehumbert) could suggest an appropriate reference. I personally found Ray’s book quite helpful, but I do have a physics background. Basically, what it comes down to is that adding greenhouse gases decreases the probability that an IR photon emitted from below will escape. Fewer photons escaping means rising energy and temperature. And if you have energy in a particular mode or place, it always finds a way to distribute itself throughout the rest of the system (position and degrees of freedom). I’d be happy to try to answer questions off-line. My email is not hard to find and I do hereby give Ray my permission to forward it to you in the interest of not hijacking discussions.
Chris Colose says
I do have a question regarding the observed lack of trend in the decrease of diurnal temperature range (DTR). As I understand it, this decrease is caused mainly by increased cloud cover (rather than some night-time effect of the greenhouse gases themselves?). UHI affects DTR, as well as aviation. However, from many feedback papers which propose a positive cloud feedback, the low clouds which control albedo more than any other kind should decrease. My questions are 1) how exactly do increasing GHGs reduce the DTR, 2) what is going on recently, and 3) are trends of increased cloud cover responsible for a decline in DTR consistent with models which show declining low level cloud cover in the future as a response to warming?
Chris
Howard Hawhee says
I was thinking about how to graphically illustrate the way signal/noise relationships mesh with “common-sense” perceptions, and I came up with a crude little excel sheet and graph. What it does is to simulate daily temperatures over many years with a cyclic variation overlain by a long-term upward trend similar to that in IPCC forecasts.
The way it works is to assume a cyclic variation of some dependent variable over several thousand time periods (if one makes the length of each cycle be about 365, then one could interpret this as daily temperatures over 30 years in say the northern or southern temperate zone). Then I introduce a long-term increase factor that you can tune (and one could perhaps interpret this as high, low, and middle IPCC projections for global temperature changes over the next century) and “pro-rate” that increase into an incremental amount for each smaller discrete time period (or “day”). So now we have a smooth cycle gradually tending upwards. Then, I introduce some random variation factor for any given period’s temperature (within a certain tuneable range above or below any “day’s” underlying cycle value).
We graph that, and we get something that looks pretty “noisy”, and that even looks like some of the lowest lows are toward the end of the period and some of the highest highs are toward the beginning, but that demonstrably tends upwards over time.
I know it’s pretty crude, but it does make the point that eyeballing a bunch of values that have a steady trend won’t always show that the trend is obvious. I think it would be very hard to spot any obvious trend by just looking at the graph.
I am just a computer programmer without any formal background in statistics. What I’d like to know from someone with more formal training in statistics, is whether or not this is really a valid way to illustrate signal/noise problems to people.
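For readers who want to try this without Excel, here is a minimal Python sketch of the experiment described above. All parameters (cycle amplitude, trend size, noise level) are invented for illustration, not taken from Howard's spreadsheet:

```python
import numpy as np

# A seasonal cycle, a small built-in upward trend, and tunable random
# 'weather' on each simulated day.
rng = np.random.default_rng(5)

n_years, period = 30, 365
t = np.arange(n_years * period)
cycle = 10 * np.sin(2 * np.pi * t / period)   # seasonal swing, amplitude 10
trend = 3.0 * t / t.size                      # +3 degrees over the record
noise = rng.normal(0, 3, t.size)              # daily weather
temps = cycle + trend + noise                 # looks like pure noise by eye

# Averaging to annual means removes the cycle; regression then recovers
# the trend that eyeballing the daily series cannot see.
annual = temps.reshape(n_years, period).mean(axis=1)
fitted = np.polyfit(np.arange(n_years), annual, 1)[0] * n_years
print(f"built-in rise: 3.0; recovered rise: {fitted:.2f}")
```

Setting the trend parameter to zero, as suggested further down the thread, shows the converse: apparent short-term "trends" in pure noise that shrink as the series lengthens.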
Fair weather cyclist says
Howard,
I think you’re right, but the converse is also true. Random sequences can look like they’re showing a trend.
tamino says
Re: #559 (Howard Hawhee)
It seems to me that your approach will correctly make the points, namely that when the signal-to-noise ratio is low: 1) even with steady increase we can’t expect a new record every year, or even every decade; and 2) the eye can fool ya.
Your approach sounds similar to the one I employed in this post.
Howard Hawhee says
tamino (#561), thanks for pointing me to your post — very nice. My intent is somewhat different, though very close to yours — I build an increase into artificial data and then (also artificially) give it some random short-term variance. In other words, we know beyond any doubt that there is a real upward trend in the “temperature” of my fake model — yet it is still very hard to make out by any “common-sense” approaches.
Also, Fair weather cyclist (#562) is of course right that random variation can look like it shows a trend, but the longer the time series, the less likely that random variation will look like a trend. One could use my spreadsheet model to show this too: set the upward trend parameter to 0 so that there is only random variation. The trend extracted from the series will diminish as time goes on.
Howard Hawhee says
oh, sorry tamino (561), I skimmed your post quickly & at first didn’t realise that you were also using randomly generated data — it looked so real!
tamino says
Re: 563 (Howard Hawhee)
Yes, it does look real! You’re not the first to mistake it for actual temperature data. In fact, one reader objected strongly to my stating that “we know without doubt that the trend is increasing,” insisting that we know no such thing — so I had to reiterate that it’s artificially generated data with a built-in non-stop upward trend. And as the post mentions, I didn’t have to work hard to make this happen. The very first artificial series I created with a realistic trend and noise level is the one featured in the post.
CobblyWorlds says
Chris Colose #558,
Re the evolution of DTR changes with time. I am just an amateur, but here goes…
Have you read Martin Wild “Impact of global dimming and brightening on global warming”?
http://www.iac.ethz.ch/people/mknut/2006GL028031.pdf
See sect 3.3. Also this conference report may be of use: http://www.gewex.org/5thGEWEXConf_E.Dutton.pdf
1) Consider the energy fluxes in the day and night. In terms of the 24 hour temperature changes (which lead to daytime max/night time minimum): During the day incoming shortwave radiation (solar) dominates, at night it’s outgoing longwave. Increase trapping of outgoing LW and you should increase temperature. If incoming (surface incident) SW remains steady as you increase the “greenhouse” trapping of longwave then the daytime trend should be a bit less than the night – hence a reduction of diurnal range which is:
DTR = (Daytime Max Temp) – (Night Min Temp). I think evaporation/evapotranspiration acts as a brake on the daytime maximum more so than the night. If the daytime surface insolation has been reduced by “dimming”, then as the dimming abates the daytime warming takes off, and “catches up” with the night-time warming trend, so reducing the decrease of DTR.
2) In the above paper Wild et al note they’ll be doing a regionally detailed study, but I’ve not tracked that down yet as I’m otherwise engaged. Googling BSRN as suggested below may help with up to date data, but I never work off datasets, I only use primary literature (I don’t know enough to trust any results I may come up with using raw data).
3) Levelling in DTR seems likely to be down to Sulphate Aerosols, not the same as warming driven cloud cover variance. I think!
Try googling: Baseline Surface Radiation Network BSRN. If you need more ask and I’ll have a dig around. I think I have more on this, but am moving onto other matters so my climate change stuff is mainly on CDs.
Douglas Wise says
Re Ray Ladbury #557.
It is extremely kind of you to offer to help sort my muddled thoughts in a private e-mail exchange. I realise that my questions are somewhat off topic on this thread and am particularly grateful to you. I have attempted to communicate with you direct but could only find an obsolete e-mail address on Google such that my message was returned undelivered. I thought we could get the ball rolling by my sending you my e-mail address. This is: douglas.wise@gmail.com.
stevenmosher says
howard, tamino and cyclist,
here is an old paper you’ll might enjoy. On internal climate variability
http://ams.allenpress.com/archive/1520-0469/35/6/pdf/i1520-0469-35-6-1111.pdf
enjoy
Wim Benthem says
Re #550
There is indeed no such thing as downwards convection, but upward convection depends on the difference in temperature between the surface and higher up. If the top of the atmosphere gets warmer this will reduce convection and this can also warm the surface.
Ru Kenyon says
Hi, excuse me if this is off topic, but can anyone offer their thoughts on the following-
1. Is James Lovelock right when he says that we are living in a ‘Fool’s Climate’ with aerosols currently masking about 3ºC of warming?
2. If so, do we now need to get started on geo-engineering if we’re to have any chance of stabilising the climate?
If this isn’t a good place to ask this question, can anyone suggest where I could get some informed debate on this?
Cobblyworlds says
#569 Ru Kenyon,
Here’s a RealClimate commentary on Lovelock:
https://www.realclimate.org/index.php/archives/2006/02/james-lovelocks-gloomy-vision/
Also if you look at the top right of this page you’ll find some categorised links, under “Climate Science” click on “Aerosols” and you’ll find more there.
I don’t remember Lovelock saying 3degC of warming is masked, but from my reading this is much bigger than seems reasonable. Above I referenced a paper by Martin Wild (et al), they “estimate that, over the past decades, the greenhouse forcing alone has enhanced land surface temperatures by certainly more than 0.2degC per decade, but unlikely much more than 0.38degC per decade.” The observed change in global average temperature has been 0.2degC/decade for the last 3 decades, so they’re saying this would be higher but for “dimming” due to aerosols. That’s from only one paper, but it’s not a bad ball-park figure from what I’ve read.
With regard to geo-engineering I agree with Gavin Schmidt as posted here:
QUOTE Think of the climate as a small boat on a rather choppy ocean. Under normal circumstances the boat will rock to and fro, and there is a finite risk that the boat could be overturned by a rogue wave. But now one of the passengers has decided to stand up and is deliberately rocking the boat ever more violently. Someone suggests that this is likely to increase the chances of the boat capsizing. Another passenger then proposes that with his knowledge of chaotic dynamics he can counterbalance the first passenger and indeed, counter the natural rocking caused by the waves. But to do so he needs a huge array of sensors and enormous computational resources to be ready to react efficiently but still wouldn’t be able to guarantee absolute stability, and indeed, since the system is untested it might make things worse. ENDQUOTE
So I agree with those who don’t think we know enough to attempt geo-engineering.
Sorry, but for myself I’d say we have to shape up, sort ourselves out, and massively reduce emissions.
CobblyWorlds says
#570 correction.
I missed off the link to the RC article from which I quoted, see here:
https://www.realclimate.org/index.php/archives/2006/06/geo-engineering-in-vogue/
David B. Benson says
Ru Kenyon (569) — The IPCC AR4 report (linked on the sidebar) references another IPCC report section in which the forcings for each of the causes is estimated. This should be of some help for your question.
As for geo-engineering, I know a completely safe method: use hydrothermal carbonization to produce biocoal from biomass. Then bury the biocoal in abandoned mines or carbon landfills.
Since nature has managed to keep most fossil coal out of the biosphere for millions of years, we can be sure this will work. I estimate the cost to be about US $100 per tonne. Removing about 350 billion tonnes of carbon from the active carbon cycle ought to be enough, at least for this century, assuming the additional 8.5 billion tonnes added yearly is either forgone or else removed each year, in addition, via biocoal sequestration.
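For scale, the arithmetic implied by the commenter's own figures (his cost estimate, not an established number):

```python
# Rough cost arithmetic from the comment above (illustrative only):
tonnes = 350e9          # tonnes of carbon to remove
cost_per_tonne = 100    # US$ per tonne, the commenter's estimate
total = tonnes * cost_per_tonne
print(f"total: ${total/1e12:.0f} trillion")   # -> $35 trillion
```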
Dick Lawrence says
I wanted to make a comment pertinent to models and this (11-Jan) posting is the closest I could find. With 571 comments already down, I’m not optimistic this will be seen, but …
Pete Best made some comments about IPCC scenario assumptions re. future quantities of FF – primarily oil and gas, but some new estimates on coal are pertinent also. I think he has some valid points, that FF reserve value estimates used in most IPCC scenarios are unrealistic and unlikely. Kjell Aleklett and Jean Laherrere have commented on this, which you may be familiar with.
First question is: from what sources do IPCC model scenarios get their FF burn rates (oil, gas, coal)? I know the modelers are not likely to be petroleum geologists, so they presumably go to whoever seems to be the authority in that realm – EIA? IEA? USGS?
I think James Hansen has recently investigated scenarios with reduced FF consumption rates but I can’t cite the specific news item that reported that – you may be able to confirm that, or show it didn’t happen after all.
Second question: this is a big enough topic – the intersection of Climate Change and oil/gas depletion – that it probably deserves a thorough and open dialog of its own, probably face-to-face for a couple of days, so that we could have a chance to see each other and realize that the rational science-based parties on both sides are not doomster wackos and they have something to learn from each other. We (ASPO-USA) have our national meeting in Sacramento in September – perhaps we could convene a dialog on the side to explore this in an open-minded way.
I am fully aware that some will jump to the conclusion “ASPO says oil and gas depletion will prevent global warming!” and similar stupid responses. We want to be clear that nobody associated with us is making that claim; but we do want to ensure that your models have the best available data, and then let the science and the models do what they’re supposed to do – give us the most accurate possible projections of future climate, to inform the public and policy decisions.
We believe the public and policy-makers must be well-informed on both Peak Oil and Climate Change issues; there is a huge overlapping (common) solution space and we must avoid being blindsided by (for example) a rush toward Coal-to-Liquid technology, as a consequence of oil becoming unaffordable to most, or unavailable.
You can email me directly at dlawrence (at) aspo-usa (dot) com if you want to communicate offline.
Thanks,
Dick Lawrence
ASPO-USA
CobblyWorlds says
#573, Dick Lawrence.
You are correct about the Hansen paper…
Kharecha & Hansen, “Implications of “peak oil” for atmospheric CO2 and climate”. Abstract & downloadable pdf available from NASA GISS: http://pubs.giss.nasa.gov/abstracts/submitted/Kharecha_Hansen.html You may want to first jump to figure 1.
You say the “intersection of Climate Change and oil/gas depletion”.
I’d suggest it’s a three factor issue, Climate Change, Oil/Gas Depletion, Population Pressures. I specify population because it’s a key factor in direct impacts on the land biosphere that demonstrably cause climatic changes, and due to feedbacks – such as energy costs and the regional eco-system implications of a switch to bio-fuels, or even just wood from the local forest becoming the cheapest way to heat and cook. Take any one of those alone and I am confident we’d muddle through, it’s all three together that (to me) raises the possibility that we may lose our “civilisation” by the 22nd century.
What I mean by civilisation is the organisation and common cause that allows many different cultures to cooperate together across the globe to achieve the “marvels” we do, and the technological and scientific progress that results. Something as mundane as a cell-phone is the “tip of the iceberg”, with a mass of supportive infrastructure behind its conception and final production. It was with the move to agriculture thousands of years ago that sufficient food was available to allow people to administer and study, rather than their lives being centred on food production. The three stressors I state above could threaten this. I am not suggesting a “Lovelock” style outcome (the few remaining breeding pairs etc etc), but I am suggesting that over this century the forward thrust of human and technical development may stutter and fail. When considering EROEI I fear this could be a permanent failing, unless we can crack fusion.
As an aside (possibly skewed by a UK perspective): if, as seems likely, we are about to slip into a recession on a (more or less) global basis, the next few years will be very interesting from the point of view of current efforts to reduce emissions. We should be able to see how determined we really are, and to what degree the steps taken so far (e.g. carbon credits) are a mere dalliance we can afford without costs we will seriously feel. How determined will we be to invest in renewables in a financially constrained situation, where fossil fuels are a cheaper option?
Ray Ladbury says
Cobblyworlds, I’d add to your 3 factors a fourth that will be equally if not more important than the other 3–economic development. India and China have reached takeoff. Several Asian nations are not far behind. Africa probably has a couple of decades yet. Once it happens, though, the competition for resources–especially energy–will only increase.
pete best says
Reply #573: conventional oil is said by some people who have worked with M. King Hubbert to have peaked. Oil will therefore become more expensive while oil companies attempt to bring alternative oil sources online and convert coal, and maybe gas, to oil in the future. The very fact that tar sands, oil shale and heavy oil are already being looked into and produced, as well as deep-sea oil, means that oil is becoming scarce. Take a look at the oil-reserve figures: 300 billion barrels were added as of the 1980s by OPEC members in order to pump more oil under their quota system, and 200 billion barrels are heavy tar sands from Alberta, Canada.
Now, oil is used in producing gas and coal, and hence these two energy sources also become more expensive and harder to extract as it becomes more expensive to produce oil.
The bottom line is that within a decade conventional oil is going to cost a lot more than it does now (as we will have consumed another 300 billion barrels of the stuff) and its scarcity will threaten many things. Therefore I reckon that AGW will be the least of our worries as nation states such as the USA drive to retain their status in the world.
It could be very scary.
Cobblyworlds says
#575 Ray,
I specifically didn’t include economic development as without the 3 factors I mentioned it would not cause such a problem. Although I accept that it underlies the others, and perhaps I am being too pedantic.
As an aside, from the BBC: http://news.bbc.co.uk/1/hi/sci/tech/7223663.stm
“An Epoch in the making…
Writing in the house journal of the Geological Society of America, GSA Today, Britain’s leading stratigraphers (experts in marking geological time) say it is already possible to identify a host of geological indicators that will be recognisable millions of years into the future as marking the start of a new epoch – the Anthropocene. “
Hank Roberts says
http://gristmill.grist.org/images/user/8/subtraction_slide2.jpg
From: http://gristmill.grist.org/story/2007/12/16/222024/73
Illustrating: http://www.ecoequity.org/GDRs
Why?
“… To those peoples in the huts and villages across the globe struggling to break the bonds of mass misery, we pledge our best efforts to help them help themselves, for whatever period is required—not because the Communists may be doing it, not because we seek their votes, but because it is right. If a free society cannot help the many who are poor, it cannot save the few who are rich.
To our sister republics south of our border, we offer a special pledge—to convert our good words into good deeds—in a new alliance for progress—to assist free men and free governments in casting off the chains of poverty. …
To that world assembly of sovereign states, the United Nations, our last best hope in an age where the instruments of war have far outpaced the instruments of peace, we renew our pledge of support—to prevent it from becoming merely a forum for invective—to strengthen its shield of the new and the weak—and to enlarge the area in which its writ may run.
Finally, to those nations who would make themselves our adversary, we offer not a pledge but a request: that both sides begin anew the quest for peace, before the dark powers of destruction unleashed by science engulf all humanity in planned or accidental self-destruction…
…
Now the trumpet summons us again—not as a call to bear arms, though arms we need; not as a call to battle, though embattled we are—but a call to bear the burden of a long twilight struggle, year in and year out, “rejoicing in hope, patient in tribulation”—a struggle against the common enemies of man: tyranny, poverty, disease, and war itself.
Can we forge against these enemies a grand and global alliance, North and South, East and West, that can assure a more fruitful life for all mankind? Will you join in that historic effort?
In the long history of the world, only a few generations have been granted the role of defending freedom in its hour of maximum danger. I do not shrink from this responsibility — I welcome it. I do not believe that any of us would exchange places with any other people or any other generation. The energy, the faith, the devotion which we bring to this endeavor will light our country and all who serve it — and the glow from that fire can truly light the world.
And so, my fellow Americans: ask not what your country can do for you—ask what you can do for your country.
My fellow citizens of the world: ask not what America will do for you, but what together we can do for the freedom of man.
Finally, whether you are citizens of America or citizens of the world, ask of us the same high standards of strength and sacrifice which we ask of you. With a good conscience our only sure reward, with history the final judge of our deeds, let us go forth to lead the land we love, asking His blessing and His help, but knowing that here on earth God’s work must truly be our own.
http://www.bartleby.com/124/pres56.html
Ray Ladbury says
Re 578. Damn! They don’t make ’em like they used to, do they?
Hank Roberts says
http://images.ucomics.com/comics/db/2008/db080206.gif
Alexander Harvey says
Weather vs. Climate
Where does natural variability end and climate begin?
Whatever one might think of the temperature record of the last 5-10 years, to dismiss it as natural/random behaviour is not very scientific, unless one believes that the temperature record is largely a drunkard’s walk.
I believe the “apparent” stagnation “could be” hugely significant if we can understand or “at least” model it.
Currently I am looking at its implications for the “rate” (not the sign) of enthalpy uptake by the earth. Initial calculations seem to indicate that although the flux into the earth (primarily the oceans) is positive, the rate of change has turned down significantly and is currently negative. Obviously this will not last. The continuing accumulation of CO2 should dictate that the trend will eventually turn upwards once more, but the apparent “stagnation” is now of sufficient duration that I believe it deserves a rational (physical) explanation.
I have done calculations for both a slab ocean (which has a severely negative trend) and a more reasonable diffusive ocean that although much more moderate also shows a negative rate over the last 5 or so years.
Even just a flat trend must pose questions of the following sort.
If warming due to CO2 is continually upward (as I suspect), what has the world been doing in the last 5-10 years to “stall” the rise of the last 30-40 years?
I think that it is more than plausible that some sort of change of climatic regime has taken place.
I also think that we should do our utmost to try and understand what has happened and what it implies for the future.
Even if this is “just” “natural variability”, it is vital that we understand its causes.
As I understand it, the “models” are essentially weather models, so they should hopefully be adept at predicting just this sort of variability. If so, what do they have to say about the climate of the last 10 years?
Best Wishes
Alexander Harvey
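Harvey’s question of whether a 5-10 year “stall” demands a physical explanation can be probed with a toy calculation. The sketch below is illustrative only: it assumes a 0.019ºC/yr trend and 0.1ºC of white interannual noise (not real data, and ignoring autocorrelation), then measures the spread of the 8-year trends such a series produces. Flat or negative short windows turn out to be routine even with a perfectly steady underlying trend.

```python
# Toy check: does a pure linear trend plus white noise routinely produce
# flat or negative 8-year windows? Numbers below are assumptions, not data.
import random

random.seed(0)
trend = 0.019    # C/yr, ~0.19 C/decade (assumed)
noise_sd = 0.1   # C of year-to-year scatter (assumed)
years = list(range(1975, 2008))
temps = [trend * (y - years[0]) + random.gauss(0.0, noise_sd) for y in years]

def ols_slope(x, y):
    """Ordinary least-squares slope of y against x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    den = sum((xi - mx) ** 2 for xi in x)
    return num / den

# Trend over every 8-year window, converted to C/decade
slopes = [10 * ols_slope(years[i:i + 8], temps[i:i + 8])
          for i in range(len(years) - 7)]
mean_slope = sum(slopes) / len(slopes)
sd_slope = (sum((s - mean_slope) ** 2 for s in slopes) / len(slopes)) ** 0.5
print(f"8-year trends: mean {mean_slope:.2f} +/- {sd_slope:.2f} C/decade, "
      f"min {min(slopes):.2f}")
```

With these assumed numbers the scatter of the 8-year trends comes out roughly comparable to the trend itself, much like the 0.19 ± 0.17ºC/decade figures quoted in the post.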
Barton Paul Levenson says
Alexander Harvey writes:
[[If warming due to CO2 is continually upward (as I suspect), what has the world been doing in the last 5-10 years to “stall” the rise of the last 30-40 years?]]
It hasn’t stalled. Check here:
http://members.aol.com/bpl1960/Ball.html
Hank Roberts says
There’s your problem: start with a wrong assumption:
>If warming due to CO2 is continualy upward
> (as I suspect)
Try pasting your belief into the Google search box, followed by a question; their natural language search is rather good by now. They’ll first check your spelling, then give you a search with the words spelled right.
Imagine how profitable Google would suddenly become if they could convince people to look up their beliefs to check whether there was any good science on the questions.
Ah, but they’d first have to agree they were questions, wouldn’t they.
Pity.
“The predictability of timescales of seasonal to decadal averages is evaluated. The variability of a climate mean contains not only climate signal arising from external boundary forcing but also climate noise due to the internal dynamics of the climate system, resulting in various levels of predictability that are dependent on the forcing boundary conditions and averaging timescales….”
Volume 10, Issue 6 (June 1997) Journal of Climate
Atmospheric Predictability of Seasonal, Annual, and Decadal Climate Means and the Role of the ENSO Cycle: A Model Study
Wilbur Y. Chen and Huug M. Van den Dool
Climate Prediction Center, NOAA/NWS/NCEP
http://ams.allenpress.com/perlserv/?request=forward-links&doi=10.1175%2F1520-0442%281997%29010%3C1236%3AAPOSAA%3E2.0.CO%3B2
CobblyWorlds says
Alexander Harvey,
I agree: unless you use GISS, there’s an apparent short-term levelling of the warming trend in both Hadley/CRU and GHCN.
From what I can see (I can’t find research papers to back me up), the reason for the difference between GISS and the other two is that GISS seems to give more weight to the Arctic than the others. As to whether there really is marked warming in the Arctic, I think observations of ice conditions there suggest there will be.
I’ve commented on this here previously:
https://www.realclimate.org/index.php/archives/2008/01/the-debate-is-just-beginning-on-the-cretaceous/#comment-80047
This current situation is interesting, but measured against the sort of issue raised by RayPierre in the article at the top of that page, it’s not the most crucial gap in knowledge.
By the way, climate models are climate models, not weather models. Each run of a climate model is like an individual realisation of the planet’s climate. If we had a set of identical Earths you wouldn’t expect them to have the same weather (short term), but their climates (long term) would be much more similar.
Furthermore, here is another reason not to view climate models as weather models:
https://www.realclimate.org/index.php/archives/2007/05/climate-models-local-climate/langswitch_lang/in
When I was a sceptic I used to reassure myself every time there was a downward blip – i.e. I wasn’t being critical or sceptical at all. CO2 forcing changes from year to year are tiny in comparison to internal climate variability; it’s only on a decadal and greater scale that the trend becomes robust.
In closing, have a look at this image: http://cnx.org/content/m13859/latest/#fig2
For the bottom graph (note the -1 to -0.8 interval): if that whole X axis covered 2 seconds, we’d not get into an argument about what the underlying signal was, because in the time it’d take to walk off and make a cup of tea it’d be apparent what was going on. Alternatively, knowing that the original signal in graph A was a significant component of B would allow us to be very confident that, despite the activity between -1 and -0.8, the following long-term behaviour would be very different.
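The point about a trend only becoming robust on decadal scales can be put in rough numbers. The sketch below (assuming, for illustration only, a 0.019ºC/yr trend and 0.1ºC of independent year-to-year noise) uses the standard error of an ordinary-least-squares slope to show how many years of data it takes for the trend to stand clear of the noise; real temperature series are autocorrelated, which stretches this out further.

```python
# Back-of-envelope signal emergence: ratio of an assumed trend to the
# sampling uncertainty of an OLS slope fitted to n noisy annual points.
import math

trend = 0.019   # C/yr underlying trend (assumed)
sigma = 0.1     # C of interannual noise (assumed)

def slope_se(n, sigma):
    """Standard error of an OLS slope over n unit-spaced points
    with independent Gaussian noise of standard deviation sigma."""
    return sigma * math.sqrt(12.0 / (n * (n * n - 1)))

for n in (8, 15, 20, 30):
    ratio = trend / slope_se(n, sigma)
    print(f"{n:2d} years: trend / slope-uncertainty = {ratio:.1f}")
```

With these assumed numbers an 8-year window gives a trend barely larger than its own uncertainty, while 20-30 years puts it many standard errors clear of zero.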
Ru Kenyon says
#570&572
James Lovelock does say that aerosols are masking ‘2-3 degrees’ of warming. http://royalsociety.tv/dpx_live/dpx.php?dpxuser=dpx_v12
His lecture is worth a watch.
His main argument is that the Earth is now in positive feedback & moving ‘ineluctably’ towards a hot state. The albedo feedback from the absence of ice at both poles he says would be of a similar magnitude to the forcing by all man made emissions. Anyone have a second opinion on this?
Given the shocking 2007 Arctic melt, we now know the Arctic will soon be ice-free in summer. It now seems unlikely that the Greenland ice will avoid passing its threshold, committing us to 5 metres of sea level rise, much of which could happen by 2100.
My point is that creating a new low carbon economy & stabilising atmospheric CO2eq concentration at say 430ppm by 2030 will not be enough to stop runaway global warming. The positive feedbacks are already coming into play. Geo-engineering is now imperative if we are to avoid runaway, dangerous climate change.
JCH says
There are several posts above, mostly by Bryan S, having to do with ARGO. This article is making the rounds:
The Mystery of Global Warming’s Missing Heat
http://www.npr.org/templates/story/story.php?storyId=88520025
Russ Willis says
Hasn’t a Hungarian scientist, Ferenc Miskolczi, who used to work at NASA, just published a paper claiming that there is a major flaw in the basic equations that are still being used to model global climate? Is this paper valid, and if not, why not? This seems, at least to the interested observer, to be an important issue, which would be a perfect discussion point for this forum.
blue7053 says
1. During the decades of the ’80s and ’90s, volcanoes at 10 degrees north (Philippines), 20N (Mexico), and 40N (Oregon) spread aerosols around the world for ten years. I’ve never seen this discussed.
2. During the 150 years of coal burning, we have engaged in the massive process of dam building. The amount of water held on the land to evaporate is not insignificant. It must figure into the equation some way.
3. Due to population growth, 6(?) of the 10(?) largest rivers in the world no longer reach the sea.
4. One of the changes we are seeing is ‘stratification’ of the ocean’s ‘thermoclines’, diminished vertical mixing. Given the failure to ‘mix’, increased glacial melt (cold water), and a diminished input from rivers (hot water), is it any wonder the oceans are getting colder?
Doug says
#585
“Given the shocking 2007 arctic melt, we now know the arctic will soon be ice free in summer.”
We do?
#588
1. During the decades of the ’80s and ’90s, volcanoes at 10 degrees north (Philippines), 20N (Mexico), and 40N (Oregon) spread aerosols around the world for ten years. I’ve never seen this discussed.
Really good point. Something I’ve never seen discussed either.
Hank Roberts says
> is it any wonder the oceans are getting colder?
It would be if it were, but as it isn’t, it ain’t.
(I think Lewis Carroll said that originally.)
> never seen discussed
We can help you figure out why you failed to find it, if you’ll describe how you looked and failed to find it.
Or you could try Google: aerosols volcanic decade
finds this comprehensive resource among much else:
http://www.volcano.si.edu/reports/bulletin/contents.cfm?issue=atmospheric
When you’ve read down that extensive page you’ll find this illustration, for example:
http://www.volcano.si.edu/reports/bulletin/atmoseff/2612atm1.png
Also helpful:
The “Start Here” link at the top of the RC page
The search box near it, type in the term you want
The first link under Science in the right margin
Leonard Herchen says
Two questions for Gavin:
I understand that an 8-year time frame is still not conclusive with respect to AGW. However, let’s say that, in the absence of any major event like a volcano, and with ENSO cycles in the typical range etc., the same flat or slightly downward trend is maintained for a few more years. After how many years would you be willing to admit that the models have significantly over-predicted the impact of CO2 emissions? Would 1 more year do it, 5, 10, 100?
Second question: I noticed in the posts above that you suggest the 2003 heat wave in Europe and the 2007 NH ice melt are each “extremely unlikely to be on its own just another fluctuation”.
I’ve seen some numbers showing Antarctic sea ice to be unusually high. Is the Antarctic sea ice “just another” fluctuation or something else?
Thanks for your response to so many posts.
John Burgeson says
The web site
http://yosemite.epa.gov/ee/epa/wpi.nsf/09133da7fb9a95db85256698006641d1/7a5516152467a30b85257562006c89a6!OpenDocument
contains a paper by Nicola Scafetta to the EPA recently.
Does this paper contain anything of interest to the IPCC climate scientists? Or is he just off course?
[Response: It’s worth noting that this is nothing to do with EPA or any official submission, it is simply placed in an online archive (like Arxiv). I’ll have a look to see if there is anything of note. – gavin]
John Burgeson says
Thanks, Gavin. The presentation is about stuff far beyond my own expertise; several of my ASA colleagues think it may be worthwhile. We’d appreciate an analysis very much.
Thanks
Burgy
Marcus says
Leonard Herchen: On Antarctic sea ice, go to http://nsidc.org/sotc/sea_ice.html and scroll about 3/4 of the way down to the graph showing Antarctic and Arctic sea ice trends plotted in standard deviations. You can see that the 12-month mean of Antarctic ice is still about average for the satellite period, and has mostly stayed within a couple of standard deviations of the long-term average; the 12-month mean of Arctic sea ice, on the other hand, is almost 4 standard deviations below the long-term trend, and 2007 was almost 8 full standard deviations below the long-term mean.
Also see: http://nsidc.org/seaice/characteristics/difference.html
I’m not Gavin, but I’d personally be quite surprised if we haven’t seen global mean temperatures back at at least 2005 levels within 3 years – the negative ENSO index is unlikely to last that long, and we’ll be moving up the solar cycle. If we don’t have a clear, new global record within 6 or 7 years (absent volcano, massive continuing negative MEI, or Maunder minimum type solar drop) then I’d personally be reevaluating my assumptions about climate sensitivity. Somewhere in the realclimate archives I remember a post showing how often a “new record” would be expected based on an underlying trend + noise, but I can’t find it right now…
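The trend-plus-noise “new record” calculation Marcus describes can be sketched as a quick Monte Carlo. Everything below is an assumption chosen for illustration: a 0.019ºC/yr trend, 0.1ºC of Gaussian noise, and a standing record set one standard deviation above the trend line (a 1998-style warm outlier year).

```python
# Toy Monte Carlo: how many years until a trend-plus-noise series beats
# a record that sits 1 sigma above the trend line? Parameters are assumed.
import random

random.seed(42)
trend = 0.019      # C/yr underlying warming (assumed)
noise_sd = 0.1     # C of interannual noise (assumed)
n_years = 30       # give up after this many years
n_runs = 2000      # number of simulated realisations

waits = []
for _ in range(n_runs):
    record = noise_sd          # the record year was a 1-sigma warm excursion
    wait = n_years             # default if no new record within n_years
    for year in range(1, n_years + 1):
        if trend * year + random.gauss(0.0, noise_sd) > record:
            wait = year
            break
    waits.append(wait)

waits.sort()
median_wait = waits[n_runs // 2]
print(f"median wait for a new record: {median_wait} years")
```

The design point is that waiting times of several years after an unusually warm record year are entirely expected, so a record-free stretch of that length says little on its own.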
Hank Roberts says
The charts here would be very much worth bringing up to the present date. They’re very helpful.
Chris Colose says
592: John, Gavin,
Not much physics or analysis to address. It’s just an array of fancy pictures and curves.
It really doesn’t matter if you use PMOD or ACRIM, since the differences are extremely small and are evident over only a small portion of a solar cycle. There is no physical evidence that solar changes explain over 50% of the 20th century warming, and the forcing from solar changes is expected to be very small compared to the GHG changes. I don’t have the knowledge to make a judgment on which product is better, but it isn’t a huge deal for attribution efforts.
Slides 25-32 severely misrepresent the spatial and temporal scales of historical paleoclimatic events and the distribution of early 20th century warming. The “Expectation: A significant fraction of the warming observed during the last decades is natural (sun or something else)” on slide 34 does not follow from anything Scafetta provides, and his phenomenological work as well as the Loehle paleoclimate reconstruction have all been addressed at RC before. Scafetta is (intentionally?) not accurately assessing the modern literature on the MWP, and attribution conclusions do not follow from those events having happened.
Timothy Chase says
Marcus wrote in 594:
I believe you are thinking of the chart titled “How long might you wait for a new record?” which is available in the post:
11 May 2008
What the IPCC models really say
https://www.realclimate.org/index.php/archives/2008/05/what-the-ipcc-models-really-say/
Jeff Masters expanded on this over at WunderBlog:
Is the globe cooling?
4 Feb 2009
http://www.wunderground.com/blog/JeffMasters/comment.html?entrynum=1187
You might also find the following two posts by Tamino to be of interest…
Global Temperature from GISS, NCDC, HadCRU
January 24, 2008
http://tamino.wordpress.com/2008/01/24/giss-ncdc-hadcru/
You Bet!
January 31, 2008
http://tamino.wordpress.com/2008/01/31/you-bet/
Marcus says
I love Scafetta’s graph that looks like it attributes the collapse of the Inca civilization to a “cold period” in the 1400s… I think most historians think that the spread of disease from the Europeans landing in Mexico was a more likely cause of the Incan collapse (and civil war, and then the arrival of Pizarro was the straw that broke the llama’s back).
But it really seems that once someone starts down the Skeptic Path, they begin to lose all connection with reality – see Pielke Jr’s blog, where he attempts to claim that cap-and-dividend won’t reduce emissions, Spencer’s blog and “non-anthropogenic CO2 rise”, etc.
Mark says
Ray Ladbury #575.
However, China has population controls that will counter the change. China is also building more renewable power sources than anyone else. It makes sense for them: China is not resource rich. And even if they were, it’s better economically to sell your resources rather than use them yourself as long as you can get away with it.
And both China and India will be able to buy newer and better technologies with less disruption than the first world would see, for much the same reason why Africa has managed to skip past the expensive and unscalable wired phone network and gone to the cheaper and more easily rolled out wireless (mobile) phone network.
They hadn’t sunk cost into a landline system and didn’t need to ensure their stockholders get their ROI on that landline investment and so didn’t have a need to make sure that wireless rollout was expensive enough to make up for landline revenue loss.
Timothy Chase says
John Burgeson wrote in 592:
If you put Nicola Scafetta’s last name in the search box at the top of any webpage of Real Climate you will find that several of his papers have been reviewed here in the past.
In particular:
Another study on solar influence
31 March 2006
https://www.realclimate.org/index.php/archives/2006/03/solar-variability-statistics-vs-physics-2nd-round
… which critiques:
Scafetta, N., and B. J. West (2006), Phenomenological solar contribution to the 1900–2000 global surface warming, Geophys. Res. Lett., 33, L05708
How not to attribute climate change
10 October 2006
https://www.realclimate.org/index.php/archives/2006/10/how-not-to-attribute-climate-change
… which critiques:
Scafetta, N., and B. J. West (2006), Phenomenological solar signature in 400 years of reconstructed Northern Hemisphere temperature record, Geophys. Res. Lett., 33, L17718
… and:
A phenomenological sequel
27 November 2007
https://www.realclimate.org/index.php/archives/2007/11/a-phenomenological-sequel
… which critiques:
Scafetta, N., and B. J. West (2007), Phenomenological reconstructions of the solar signature in the Northern Hemisphere surface temperature records since 1600, J. Geophys. Res., 112, D24S03