Another week, another ado over nothing.
Last Saturday, Steve McIntyre wrote an email to NASA GISS pointing out that for some North American stations in the GISTEMP analysis, there was an odd jump in going from 1999 to 2000. On Monday, the people who work on the temperature analysis (not me) looked into it and found that this coincided with the switch between two sources of US temperature data. There had been a faulty assumption that these two sources matched, but that turned out not to be the case. There were in fact a number of small offsets (of both signs) between the same stations in the two different data sets. The obvious fix was to make an adjustment based on a period of overlap so that these offsets disappear.
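For anyone curious what that kind of fix looks like in practice, here is a minimal sketch (Python with made-up station values, not the actual GISTEMP code) of estimating a constant offset from a period of overlap and removing it when splicing two versions of a record:

```python
import numpy as np

def merge_with_offset(old_record, new_record, overlap_years):
    """Splice two versions of a station record by removing their mean
    offset over a period of overlap.  Records are dicts of {year: temp}.
    Illustration only -- not the actual GISTEMP code."""
    diffs = [new_record[y] - old_record[y] for y in overlap_years
             if y in old_record and y in new_record]
    offset = float(np.mean(diffs))
    # Keep the old source where it exists; shift the new source onto it.
    merged = dict(old_record)
    for year, temp in new_record.items():
        if year not in merged:
            merged[year] = temp - offset
    return merged, offset

# Hypothetical station: the two sources disagree by a constant 0.15 ºC
old = {1997: 11.2, 1998: 12.0, 1999: 11.5}
new = {1998: 12.15, 1999: 11.65, 2000: 11.9, 2001: 12.3}
merged, offset = merge_with_offset(old, new, overlap_years=[1998, 1999])
print(round(offset, 2))            # 0.15
print(round(merged[2000], 2))      # 11.75 -- the new value minus the offset
```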
This was duly done by Tuesday: an email thanking McIntyre was sent, and the data analysis (which had been due in any case for the processing of the July numbers) was updated accordingly, along with an acknowledgment to McIntyre and an update of the methodology.
The net effect of the change was to reduce mean US anomalies by about 0.15 ºC for the years 2000-2006. There were some very minor knock-on effects in earlier years due to the GISTEMP adjustments for rural vs. urban trends. In the global or hemispheric mean, the differences were imperceptible (since the US is only a small fraction of the global area).
There were however some very minor re-arrangements in the various rankings (see data [As it existed in Sep 2007]). Specifically, where 1998 (1.24 ºC anomaly compared to 1951-1980) had previously just beaten out 1934 (1.23 ºC) for the top US year, it now just misses: 1934 1.25ºC vs. 1998 1.23ºC. None of these differences are statistically significant. Indeed in the 2001 paper describing the GISTEMP methodology (which was prior to this particular error being introduced), it says:
The U.S. annual (January-December) mean temperature is slightly warmer in 1934 than in 1998 in the GISS analysis (Plate 6). This contrasts with the USHCN data, which has 1998 as the warmest year in the century. In both cases the difference between 1934 and 1998 mean temperatures is a few hundredths of a degree. The main reason that 1998 is relatively cooler in the GISS analysis is its larger adjustment for urban warming. In comparing temperatures of years separated by 60 or 70 years the uncertainties in various adjustments (urban warming, station history adjustments, etc.) lead to an uncertainty of at least 0.1°C. Thus it is not possible to declare a record U.S. temperature with confidence until a result is obtained that exceeds the temperature of 1934 by more than 0.1°C.
More importantly for climate purposes, the longer term US averages have not changed rank. 2002-2006 (at 0.66 ºC) is still warmer than 1930-1934 (0.63 ºC – the largest value in the early part of the century) (though both are below 1998-2002 at 0.79 ºC). (The previous version – up to 2005 – can be seen here).
In the global mean, 2005 remains the warmest (as in the NCDC analysis). CRU has 1998 as the warmest year but there are differences in methodology, particularly concerning the Arctic (extrapolated in GISTEMP, not included in CRU) which is a big part of recent global warmth. No recent IPCC statements or conclusions are affected in the slightest.
Sum total of this change? A couple of hundredths of degrees in the US rankings and no change in anything that could be considered climatically important (specifically long term trends).
However, there is clearly a latent and deeply felt wish in some sectors for the whole problem of global warming to be reduced to a statistical quirk or a mistake. This led to some truly death-defying leaping to conclusions when this issue hit the blogosphere. One of the worst examples (but there are others) was the ‘Opinionator’ at the New York Times (oh dear). He managed to confuse the global means with the continental US numbers, he made up a story about McIntyre having ‘always puzzled about some gaps’ (what?), declared that the error had ‘played havoc’ with the numbers, and quoted another blogger saying that the ‘astounding’ numbers had been ‘silently released’. None of these statements are true. Among other incorrect stories going around are that the mistake was due to a Y2K bug or that this had something to do with photographing weather stations. Again, simply false.
But hey, maybe the Arctic will get the memo.
Barton Paul Levenson says
[[Could a model now predict the mean global temperature for 2010-2015 accurately? If it cannot, it is completely worthless.]]
Classic non sequitur. A model can predict the temperature in 2015 given that everything else (including trends) stays the same. Nobody can predict the future exactly, but we don’t have to in order to know that global warming is a bad idea.
Barton Paul Levenson says
[[ I find it ridiculous that many people assume that GW is a net negative across the globe.]]
You’d be surprised by how many people consider trillions of dollars in property damage and millions of deaths to be a bad thing.
Nick Gotts says
Re #448 [Nick Gotts — Actually, this technique IS the best scientific practice. Rather than taking the (large amount of) time to polish and document the program, it is more productive to write and polish the paper.
I have never heard of a case in which a referee wanted to see the computer code to complete his task of checking that the submitted paper was, at least on the surface, correct.]
I think we’ll have to agree to disagree. I don’t dispute that yours is currently the majority opinion, but one journal I write and review for (Journal of Artificial Societies and Social Simulation) now includes in its form for reviewers a question about whether code is made available. I believe (I’d need to check this) there has also been discussion recently in Nature about whether statistical programs used to generate results should be required to be open source. Also, it’s not just a matter of publication: if code is impenetrable and undocumented, even the author themselves cannot reasonably be confident it is doing what it is supposed to do.
Hank Roberts says
Carl G. writes: “I find it ridiculous that many people assume that GW is a net negative …”
Carl, rate of change. We know quite well how fast natural systems adapt to changes.
The rate of global change from human activity is 100x faster than any event in the past other than a major asteroid strike.
http://www.sciencemag.org/cgi/content/abstract/303/5659/827?ijkey=5S8jAr5Xb.x0A
From that page look at the Supplemental Material (PDF), the second chart on p.6. That’s the change in the rate of change — very little change for many tens of thousands of years, then rapid change in the ecology as the climate changed naturally since the end of the last ice age. And we’re now driving change 100x faster.
We know how fast natural systems can change from paleoecology work — because they either moved along with the climate or they died out if the rate of change was too fast. Rates of change well within what has happened in the past, naturally, suffice to drive species out of areas where they can live. When it happens one continent at a time, life goes on elsewhere.
100x faster, this century, globally.
Timothy Chase says
Hank Roberts (#431) wrote:
For example, if one models the Cretaceous period, one should be better able to identify the regions where oil shale will have formed. This would cut down on the cost of exploratory drilling, or better yet, identify areas which are likely to contain oil deposits – prior to obtaining the rights to drill there.
David B. Benson says
Nick Gotts — That is interesting and new to me.
I meant to write that the code is impenetrable by anybody but the author. For the sort of one-off that I am considering, no documentation is required because it is written in a very clear programming language (Standard ML) and I know what each part is supposed to do. But it was not written as a product, any more than a typical lab bench experiment is intended to be used, or understood in detail, by anyone but the experimenter.
Computer programs written for distribution are another matter. There, attention is paid to the readability of the code, and there is some documentation provided.
DavidU says
#450
The current models have done a good job. For the accuracy of state-of-the-art models you will have to ask someone else, e.g. Gavin; I’m a computational physicist, not a climate modeler, myself. But if you look at the paper Gavin mentions at the end of #440 you can see how close the models of the late ’80s got to the current actual climate. Needless to say both models and computers have advanced a lot since then.
Most skeptics seem to object to far less concrete things than the ones you mention, but the things you mention are what they, at least indirectly, challenge.
Alastair McDonald says
Re #457
David, you don’t really think that Gavin is going to cite papers where the modeled climate is not close to the observations, do you?
Every financial document supplied in the UK has to contain the phrase “Past performance is not a guide to future performance.” Yet people still have the simplistic idea that if scientists can find a projection which fits the past performance of the climate, then that is a guide to the future performance of the climate.
Both the economic system and the climate system are chaotic. Need I say more?
Lawrence Brown says
Re 440: “Could a model now predict the mean global temperature for 2010-2015 accurately? If it cannot, it is completely worthless.”
My understanding is that models make projections of possible future outcomes based on choices that society makes in the economic, social and political arenas. There’s a time lag between choices made today and their future effects, which are not yet affecting short term projections.
As for your feeling that if they can’t accurately predict the warming for 2010-2015 then they’re completely worthless, it falls short on two grounds. First, just because something isn’t completely efficient doesn’t mean it’s unnecessary. If this were so we’d have scrapped our Dept. of Defense long ago. Second, do the words ‘risk assessment’ mean anything to you? If you own a home and carry fire insurance on it, is it because you have accurately predicted that your home will burn down? Of course not. You’ve made an assessment that the cost of paying the insurance company is worth the protection, against the worst possible outcome, should the uncertain occur. It’s preferable to doing nothing even though uncertainty exists.
There are uncertainties in AGW for sure. But look around. Check out the difference in extent of most mountain glaciers compared to 30 years ago, or the extent of sea ice in the Arctic compared to a few decades ago. There are many manifestations all around us. Should we stand around whistling Dixie? Or should we be doing what the wise and practical homeowner does and start protecting our one and only world?
Timothy Chase says
Alastair McDonald (#458) wrote:
The weather is chaotic, Alastair. The climate system is stable except at certain branching points – if for example you were balanced on the edge of an albedo flip. A switch in bimodal ocean circulation would be another. This is all physics. No tulip craze here.
Matt says
#429 Nick Gotts: It’s your analogy that’s flimsy. The oil companies (primarily Exxon) that are funding and supporting denial that AGW is a serious problem, must be assumed to want to convince governments and publics of this position.
Your belief is that if another fuel source arrived tomorrow then Exxon et al. would be out of business? Exxon is in the business of moving energy from point A to point B. Today, point A is the Middle East, and point B is your local gas station. Where point A begins in the future is largely irrelevant. Someone will need the skills to broker deals and ensure supplies are flowing. If it’s biodiesel you can bet Exxon will still play a major role, and even if it’s locally produced hydrogen you can bet the major oil companies will still play a role. Never underestimate the power of a brand. Dell could be selling lawn tractors starting tomorrow and they’d quickly figure out the supply chain and beat John Deere at their own game.
The same can be said for electric cars. GM wants to sell you a car, with a belief attached to that car that the car makes your life better. If it’s an electric car that folks want, then GM will make an electric car. If it’s a squirrel-powered car, then GM will make a squirrel-powered car.
As the old saying goes, the railroad companies failed because they believed they were in the business of trains. They weren’t. They were in the business of moving goods from point A to point B. They viewed trucks as competition rather than as a way to shore up their weaknesses.
Walmart, FedEx, Microsoft, Exxon all exist NOT for the reason most believe they exist. The rules can change, and they will still be on top. Why? Because few can do what they do as efficiently. Their strength is understanding the complex. Feeding a country of several hundred million the trillions in $ they require in fuel is not a task for anyone except a large corporation. The fuel can be whatever you want it to be.
Matt says
#460 Timothy Chase: The weather is chaotic, Alastair. The climate system is stable except at certain branching points – if for example you were balanced on the edge of an albedo flip. A switch in bimodal ocean circulation would be another. This is all physics. No tulip craze here.
And you think financial markets aren’t chaotic?
Chaotic systems are also usually not very well understood. After all, if you really understood the knobs that mattered, then slight changes in initial conditions wouldn’t result in big changes at the output.
Matt says
#457 DavidU: But if you look at the paper Gavin mentions at the end of #440 you can see how close the models of the late ’80s got to the current actual climate.
Here’s a fun exercise (unfortunately I can’t post graphics). Take a historic global temperature record expressed as an anomaly. Import that into your favorite graphics package, and build a triangle shape that graphically shows Hansen’s prediction: in 20 years, the warming will be a max of +0.7 ºC or a min of 0.17 ºC.
Now slide that triangle around the record and get a feel for just how often that prediction would be correct. The “prediction” he made is correct about 50% of the time. If you limit the prediction to 15 years, that prediction is correct about 65% of the time since 1850. Hansen’s prediction was also true from 1905 all the way through 1948. In fact, if you believe the temp will increase, find a period in time where Hansen’s prediction was NOT true! There weren’t that many times.
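Here is a minimal sketch of that sliding-window tally (Python, with a made-up series standing in for a real anomaly record); it is a purely statistical count and involves no physics:

```python
def window_hit_rate(anoms, window=20, lo=0.17, hi=0.7):
    """Fraction of all `window`-year spans in an annual anomaly series whose
    net change falls between `lo` and `hi` ºC.  A purely statistical count."""
    changes = [anoms[i + window] - anoms[i] for i in range(len(anoms) - window)]
    hits = sum(1 for c in changes if lo <= c <= hi)
    return hits / len(changes) if changes else float("nan")

# Made-up series standing in for a real anomaly record (0.01 ºC/yr trend
# plus an alternating wiggle); substitute actual annual anomalies here.
toy = [0.01 * i + 0.1 * ((-1) ** i) for i in range(150)]
print(window_hit_rate(toy))   # 1.0 for this toy trend: every 20-yr change is 0.2
```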
I could make a statement today about the position of the DJIA in 20 years quite easily. Does that mean I understand the DJIA? Not really. All it really shows is that I was able to look at past performance, and come up with a window that would fit about 50% of the time in a bull or bear market. Make a prediction, then wait.
[Response: You confuse statistical forecasting which knows nothing about the underlying physics (and in your case is simple linear extrapolation) with physical modelling based on first principles. How is a linear extrapolation going to help assess a prudent level of emissions? Or assess the likelihood of changes in rainfall or storminess? Physics-based modelling is both much harder, and more useful. – gavin]
Neil B. says
You folks should know about this if you don’t already and have a response. It’s not an amateurish looking product:
http://xxx.lanl.gov/PS_cache/arxiv/pdf/0707/0707.1161v2.pdf
Falsification Of The Atmospheric CO2 Greenhouse Effects Within The Frame Of Physics
Authors: Gerhard Gerlich, Ralf D. Tscheuschner
[edit]
[Response: It’s garbage. A ragbag of irrelevant physics strung together incoherently. For instance, apparently energy balance diagrams are wrong because they don’t look like Feynman diagrams and GCMs are wrong because they don’t solve Maxwell’s equations. Not even the most hardened contrarians are pushing this one…. – gavin]
Aaron Lewis says
Re 434
One of the things that Darwin did was write letters asking people to go, look, and measure for him. When he was writing his book on worms, he went and talked to guys with mud on their boots. He was my kind of guy.
Chuck Booth says
Re # 432, 439 CarlG: “Finally, I am curious how these models are cross-validated (or if they are validated at all) and what type of model is typically used to fit the data.” ““…the modeling process” may differ widely from model to model, in which case I’d appreciate hearing some things about the variety of what’s out there.”
You might check these references (links provided to the abstracts):
D. Sornette, A. B. Davis, K. Ide, K. R. Vixie, V. Pisarenko, and J. R. Kamm (2007) Algorithm for model validation: Theory and applications. PNAS 104, 6562-6567
http://www.pnas.org/cgi/content/abstract/104/16/6562
R. K. Heikkinen, M. Luoto, M. B. Araujo, R. Virkkala, W. Thuiller, and M. T. Sykes (2006) Methods and uncertainties in bioclimatic envelope modelling under climate change. Progress in Physical Geography 30, 751-777
http://ppg.sagepub.com/cgi/content/abstract/30/6/751
G. A. Fine (2006) Ground Truth: Verification Games in Operational Meteorology. Journal of Contemporary Ethnography 35, 3-23
http://jce.sagepub.com/cgi/content/abstract/35/1/3
M. Lahsen (2005) Seductive Simulations? Uncertainty Distribution Around Climate Models. Social Studies of Science 35, 895-922
http://sss.sagepub.com/cgi/content/abstract/35/6/895
M. Donatelli, M. Acutis, G. Bellocchi, and G. Fila (2004) New Indices to Quantify Patterns of Residuals Produced by Model Estimates. Agron. J. 96, 631-645
http://agron.scijournals.org/cgi/content/abstract/96/3/631
N. Oreskes, K. Shrader-Frechette, and K. Belitz (1994) Verification, Validation, and Confirmation of Numerical Models in the Earth Sciences. Science 263, 641-646
http://www.sciencemag.org/cgi/content/abstract/263/5147/641
Steve Reynolds says
Barton Paul Levenson> You’d be surprised by how many people consider trillions of dollars in property damage and millions of deaths to be a bad thing.
While I will agree that sea level rise will eventually cause a lot of property damage, where is the evidence that AGW would cause more deaths than would be caused by mitigation efforts?
That includes extending the time required for the people of developing nations to rise out of poverty, and the likely reduction of resources supplied from developed nations to help provide clean water and reduce disease.
Hank Roberts says
Chuck’s right about those, and those are just a few examples. The literature on comparisons of the various large climate models, and the work done comparing their results, is just astonishing to me as an outside reader, as soon as I start looking for it.
The Pew Trust had addressed this a while back, I just found their page here, with a pointer to the Lawrence Livermore climate model comparison and validation program (that I’d just found a few days ago from another angle).
Pew covered this in describing Crichton’s errors: http://www.pewclimate.org/state_of_fear.cfm
Excerpt follows:
Crichton seems unaware that the discussion of climate model validation is a common feature of publications utilizing these models and model errors and biases are often explicitly quantified and described. Similarly, Crichton appears unaware of the various model comparison, evaluation, and validation projects that currently exist. For example, the Program for Climate Model Diagnosis and Intercomparison at Lawrence Livermore National Laboratory has been conducting model comparison and validation tests since 1989 (including the climate models used by the IPCC), and published a publicly available report of its research in the summer of 2004. [see http://www-pcmdi.llnl.gov/ ]
SomeBeans says
As far as I know environmental organisations fund very little *research*, they’re substantially PR organisations. In the UK I’d expect climate research to be mainly funded by the NERC, I assume there are some direct funding mechanisms for the Met. Office.
I don’t see any problems with climate modelling results coming out of the oil industry, they do a lot of science in other areas rather competently. If it goes into the peer-reviewed literature and stands the test of time then it’s a useful contribution.
DavidU says
#458
Yes, Alastair, if you want to do more than mention buzzwords you need to say a lot more!
If you take some time to learn the mathematics behind chaotic systems you will learn that even though they are unpredictable in some ways (this is the part that is emphasised in the popular science books), they are also highly predictable when it comes to averages. This kind of stability is studied in an area called “ergodic theory”, and it is a theory in the mathematical sense of the word, not in the sense “theory = hypothetical”.
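A minimal sketch of that point (Python, with parameters chosen only for illustration): the logistic map is chaotic, so two trajectories started a hair apart soon disagree completely, yet the long-run average of either trajectory is essentially the same:

```python
def logistic_trajectory(x0, r=4.0, n=100000):
    """Iterate the chaotic logistic map x -> r*x*(1-x)."""
    xs = [x0]
    for _ in range(n):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.3)
b = logistic_trajectory(0.3 + 1e-10)       # nearly identical starting point

# Individual values soon disagree completely (sensitive dependence)...
print(abs(a[60] - b[60]))                  # typically of order 0.1, not 1e-10

# ...yet the long-run (time) averages agree closely -- the ergodic point.
print(sum(a) / len(a), sum(b) / len(b))    # both very close to 0.5
```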
You also seem to be repeating the exact mistake that I have already talked about. Climate models are not statistical fits to the past! They are first-principles models, or simulations if you like that word better. If you do not believe that systems with chaotic behaviour can be modeled in this way then you have far bigger things to worry about. For example, the air flow models used to design modern aeroplanes are exactly this kind of model, including the chaotic turbulent flows around the wings of the aeroplane.
Financial systems are not hard to model because they are chaotic, they are hard to model because we do not know any mathematical laws underlying them. Because of this we have to make guesses and introduce random components in the models, and that leads to a completely different kind of unpredictability.
Regarding your first point, yes I expect that if there were papers where a model made a bad prediction, and the reason for the problem was not already well understood, Gavin would mention it. Gaps like that, in any model, normally point the way to good improvements and are the kind of things scientists really keep an eye open for.
Jeffrey Davis says
re: 442 and axes
An objection about axes (axises?). In this thread? Well, at least there’s no irony shortage.
Alastair McDonald says
Re #470
I should have written that both the financial markets and the climate are dynamical systems. Chaos is just one state of a dynamical system when there is strong positive feedback. The global climate system has been relatively stable for the last 10,000 years because negative feedbacks have dominated. Now, however, with the positive feedbacks from the ice albedo effect out of control in the Arctic, we can expect the whole climate system to become unstable and a runaway warming to follow.
It is possible to build a computer model which reproduces chaos, but since the climate has been stable, the climate scientists have not needed to incorporate that sort of code. In other words they have matched their models to the climate they know, not the real one!
When they have tried to reproduce the abrupt changes which happened before 10,000 years ago they have failed, but they don’t like to admit that.
There were about twenty models used in the IPCC report and all gave different results for climate sensitivity. If Gavin and Jim Hansen’s model gave the right results then the other nineteen must have given the wrong results. Why didn’t Gavin cite them? They outnumbered his model 19 to 1. They are the consensus!
Hank Roberts says
Gavin, I understand what you said — the two graphs at the top came from GISS, and illustrate
“… different quantities, with different levels of noise. Why should they be on the same scale? Global means have much less variability than regional ones, or local ones.”
It might be useful to help people get a better idea how their region is varying, compared to other regions.
I know where I live, in the Pacific coastal fog belt, we’re not experiencing much local change.
If it weren’t for the science being done, we locally wouldn’t have a clue anything was changing in the world.
It might help people be a bit more aware that change is happening elsewhere, beyond their horizon.
There’s a song called “Before Believing”
____
“How would you feel if the world was falling apart around you
Pieces of the sky were falling in your neighbor’s yard
But not on you
Wouldn’t you feel just a little bit funny
Think maybe there’s something you oughta do ….”
————-
——— http://www.lyricsdepot.com/emmylou-harris/before-believing.html
Nick Gotts says
Re #467 [While I will agree that sea level rise will eventually cause a lot of property damage, where is the evidence that AGW would cause more deaths than would be caused by mitigation efforts?
That includes extending the time required for the people of developing nations to rise out of poverty, and the likely reduction of resources supplied from developed nations to help provide clean water and reduce disease.]
Aside from sea-level rise, which if more than minimal will deprive many millions of homes and livelihoods, likely effects include disappearance of high-altitude glaciers and snows on which around 1/6 of the world’s population depend for water supply; drought-affected areas increasing; flood events increasing; disruption of fisheries; increase in malarial areas. Tropical areas and the poor are likely to be disproportionately affected – most of what benefits there are will accrue in higher latitudes. (All this from the IPCC AR4 WGII Summary for Policymakers). These are only the most likely and most direct effects. Disruption of the Asian monsoon, if it happens, will be devastating. Increased variability and unpredictability of conditions year to year, likely in a dissipative system being pushed out of its current basin of metastability, could make things very difficult for farmers worldwide. Less directly, many of the above effects would carry the risk of refugee crises, disease outbreaks, state breakdown, and what I’d call “wars of distraction” launched by ruling elites under pressure.
On the other side, while there will undoubtedly be high costs to any serious attempt at mitigation, this would also require something like a global agreement (covering at least the rich world, India and China, and probably other states with large and currently poor populations) which would inevitably have to bring in issues other than greenhouse gas emissions – such as those you mention – if only because these states will say, reasonably enough, that they cannot bring their populations on board without serious help in those other areas.
In other words, if you really care about clean water and reducing disease, get behind climate change mitigation. Not only is it directly essential to solving those issues (tropical glacier melt, malaria); it could also serve as a catalyst for wider international cooperation on more equal terms than in the past.
Lynn Vincentnathan says
RE #427 & If AGW were true, then you can bet every government in the world would be falling all over themselves to ensure it didn’t blossom into a larger problem. But in spite of the *trillions* that world governments have on tap, they have opted to do nothing about it.
See, this is how it works. Politicians need money to run for office. The oil companies are happy to oblige. For instance, they gave about the same amount to both the Clinton and Bush Sr. campaigns. That way, no matter who won, they had a door into the Oval Office.
Then there are the media — who not only report on global warming (or fail to do so), but also on elections. Well who funds them? Advertisers. I remember when ComEd was coming up for a long term contract in Chicago, all of a sudden I saw lots of ads on TV for ComEd electricity. Since IL was not deregulated and ComEd was a natural monopoly there, I thought, that’s strange that they would advertise when everyone has to buy from them. Then it dawned on me, yes, their contract is coming up & what they are buying is media support more than public support.
So, actually, we probably never will get a president who does anything about global warming. They’ve all sold their souls to the dev-oils.
Matt says
Gavin says in #463: [Response: You confuse statistical forecasting which knows nothing about the underlying physics (and in your case is simple linear extrapolation) with physical modelling based on first principles. How is a linear extrapolation going to help assess a prudent level of emissions? Or assess the likelihood of changes in rainfall or storminess? Physics-based modelling is both much harder, and more useful. – gavin]
Exactly! Predicting with a highly simplified model of a not-well-understood complex system amounts to little more than guessing. We can put some simple physics behind it to make it look impressive, but in the end it’s guessing.
This paper, to me, really underscores this: http://www.cgd.ucar.edu/ccr/publications/meehl_cmip.pdf
Here we have a bunch of models in very close agreement on CO2 sensitivity, but not even close on precipitation rates. How can that be?
How do you rule out that each modeler went in with a pre-conceived notion of what the CO2 sensitivity should be (and remarkably all models got very close to that) but at the same time the models were all over the place for rain? Is there really a rigorous physical model at the heart of all this, or is it a bunch of small physical systems strung together with feedbacks and feedforwards tweaked by the modeler to get the output they expect (confirmation bias)?
If the former, then we’d expect independently derived models to have very similar output for CO2, rain, clouds, etc. If the latter, then at the heart of each model is really nothing more than intuition (a la Hansen’s prediction).
[Response: Your reading is based on an incorrect assumption that climate sensitivity is something you can easily fix. That’s just not so. The range in the CMIP simulations wasn’t all that narrow in any case (2 to 4.1ºC at equilibrium I think), and yes, precipitation is more variable (1 to 3% increase per degree of warming). But the tweaking is not done on the sensitivity to CO2 but to the present day climatology – of rainfall, clouds etc. All of the interesting sensitivities are emergent properties, and I assure you that they are not simply tunable based on intuition. – gavin]
Steve Reynolds says
Nick Gotts> …disappearance of high-altitude glaciers and snows on which around 1/6 of the world’s population depend for water supply; drought-affected areas increasing; flood events increasing; disruption of fisheries; increase in malarial areas.
Water supply issues can be addressed (and improved over present conditions) by building dams.
Most of your other concerns are speculation, with no consensus that they are likely to be severe (or even exist for some, such as malaria).
If human life is your standard, aggressive AGW mitigation is likely a very counter-productive approach.
James says
Re #461: [GM wants to sell you a car, with a belief attached to that car that the car makes your life better. If it’s an electric car that folks want, the GM will make an electric car.]
This is at best a half truth, because GM has invested large amounts of money in a) persuading large segments of the buying public that they want a particular sort of car; and b) establishing a design & manufacturing infrastructure to build the sort of car that they’ve persuaded their market to want. Those establish a feedback cycle with a lot of inertia, so that they keep on building & selling particular sorts of cars, even when – as we’ve seen over the last 40 years or so – large parts of the market clearly prefer something different. Undoubtedly any company caught in such a cycle could break out of the loop, but few actually manage to do so.
I see this same pattern at all levels, from individuals who seemingly can’t alter self-destructive behaviors on up to entire societies. Indeed, I expect this is at the root of a lot of denialism: the denialists have established a set of habits, and reject any information that suggests a need to change them.
DavidU says
#472
Yes, they are both dynamical systems; if they were not, then both of them would be static in time. Feedbacks are not in general what gives rise to chaos, though in some dynamical systems they do. The mathematically simplest chaotic systems are composed of just two linear functions; no fancy stuff needed.
The fact that you think that one needs to input some kind of special “chaos code” into the models just shows that you lack a basic understanding of chaotic systems. First-principles models used to model both climate and fluid flows are in themselves chaotic on some scales unless there is something artificially dampening the system.
I can only repeat the basic fact that you again ignore. These models are _not_ constructed by fitting them to climate in the past. They are based on setting up the laws of physics together with our best knowledge of the state of the earth at _one_ point in time, and are then allowed to run.
This is exactly the kind of first-principles modelling used to build aeroplanes, cars, boats, computer chips, and used by e.g. NASA and ESA to compute orbits for space missions to other planets.
Hank Roberts says
Hansen distinguishes between the captains of industry and the jesters.
He’s right; there are very, very few people right now whose opinions matter — the big top decision-makers.
Timothy Chase says
Steve Reynolds (#477) wrote:
What we are speaking of is years of drought over large areas occasionally interrupted by periods of severe flooding. These will include Australia, large parts of Asia (with over a billion facing severe water shortages in the latter half of this century due to the disappearance of glaciers on the Tibetan Plateau), and severe droughts in both the US southwest and southeast as the result of the expansion of the Hadley cell to the south.
You might try dams and irrigation – assuming you can find the fresh water to dam and have means to transport it, and can somehow irrigate a large enough area to make a dent in the problem, but then you face a much higher rate of evaporation from a more arid climate. The soil will dry out more quickly. Then there is the increasing heat stress which already appears to be having a measurable effect in terms of the atmosphere upon the ability of plants to sequester carbon dioxide during the hotter, drier years.
Steve Reynolds (#477) continued:
No doubt it is possible to cherry-pick the more speculative. However, hemorrhagic dengue is already in Taiwan and establishing itself in Mexico.
Lethal type of dengue fever hits Mexico
By Mark Stevenson
Sunday, April 1, 2007 – Page updated at 02:04 AM
http://seattletimes.nwsource.com/html/nationworld/2003645837_dengue31.html
In fact hemorrhagic dengue is in the process of becoming endemic to Taiwan due to warmer winters…
Second dengue fever patient dies in Taiwan
(November 1, 2006)
http://www.sciencedaily.com/upi/index.php?feed=Science&article=UPI-1-20061101-13131500-bc-taiwan-dengue.xml
… and then there is India:
More dengue fever cases reported in India
(October 17, 2006)
http://www.sciencedaily.com/upi/index.php?feed=Science&article=UPI-1-20061017-10114900-bc-india-dengue.xml
But lets look at some more and what climatology has to say about this sort of thing:
Steve Reynolds (#477) continued:
Not much more speculative than my assumption that I will be able to get to work tomorrow without becoming a casualty in a fatal car accident, that the company I am working for will still be in business by the time I arrive, and that the building I work in will not have burned to the ground as the result of arson. In fact, I am quite hopeful that I will receive a paycheck by the end of the pay period. With regard to malaria, see above.
Steve Reynolds (#477) continued:
I suppose it depends upon what you mean by “aggressive AGW prevention.” If you meant getting rid of all internal combustion engines tomorrow, I believe you might be right. But when one considers the droughts and famine which are virtually a given with the current business-as-usual approach, changing our trajectory over time while we have the time would seem the proper course. Mitigation will be an almost futile exercise if we don’t.
Philippe Chantreau says
RE Steve 477: I find that view simplistic at best. Let’s push the reasoning: water supplies threatened, let’s build dams. In the process of doing so, let’s have a few more species compromised or extinct. The ecosystems including these species are then changed and will inevitably become less productive and less generous of their services to us humans. So we will have to devise other (energy consuming) engineering solutions to compensate for that: no fish, let’s do aquaculture, and industrially produce the pellets to feed the fish. Of course, this will require more work and energy than the natural systems (check out how much energy and organic matter is necessary to make a pound of farmed salmon). From one engineering solution to another, we progressively change this whole planet into an exclusive support system for humans. When that is all done, what kind of quality of life can these humans expect? How much work will they have to shoulder? How “rich” will any one of them actually be? I remember something similar to that being considered in a SF book called “The Godwhale” (T.J. Bass). It was not a very optimistic view of how satisfying, or even viable, such a world would be.
We have no precise idea of how our societies will be affected by large scale collapses of ecosystems. However, it looks like we are willing to do that experiment.
DWPittelli says
Timothy Chase (#322):
Perhaps they also taught you in Statistics 101 not to assume a normal distribution in your errors.
Such an assumption is especially unwarranted in the case of temperature readings, given that siting problems almost always will increase temperature readings compared to clean sites (e.g., UHI, asphalt, buildings; I have not yet heard of a thermometer placed 10 feet from a tank of outgassing liquid nitrogen, although such is possible).
[Response: Think about this. If a tree grows, or a station is moved from the south side to the north side of a building, if you go from a city centre to an airport, if you ‘do the right thing’ and get rid of the asphalt etc… all of these will add a cooling artifact to the record. The assumption that all siting issues are of one sign is simply incorrect. – gavin]
Ike Solem says
Matt, I think you’re missing Gavin’s point about the difference between statistical and physical modeling, so let’s take a simpler example: planetary orbits.
If you were to sit out in the backyard every night with a good telescope and collect data every night on the positions of all the planets, after a while you’d have a dataset that you could then use to predict the future positions of the planets – no physics required at all. How long would you have to do this in order to get a comprehensive predictive dataset? (Hint – it relates to the longest orbital period… for example, Saturn takes 29.4 years to circle the sun…)
On the other hand, you could use the physics developed by Kepler and Newton to mathematically predict the positions of the planets into the future – but could you make an absolute deterministic prediction? No, you could not – and see http://www.pnas.org/cgi/content/full/98/22/12342 for a discussion as to why not.
Now, let’s try and make the leap to the climate system. Just as one simple example of the statistical effort to look at climate, there’s a historical record known as “The Farmer’s Almanac”. This is a simple compilation by farmers of planting dates and crop yields – and in the past, when the climate was stable, this was actually a good guide for farmers. In today’s increasingly unstable climate, no one pays any attention to the old almanacs. This should in itself be a qualitative indication that the climate is becoming unstable – again, that’s a purely statistical argument.
So, what about deterministic modeling of the climate system? There’ve been many, many discussions of this on RC, and perhaps you should go look at Learning from a simple model, as just one of many examples. Or see the links in #385 above.
It all depends on what kind of questions you want to ask. For weather predictions, accuracy disappears within a few weeks – but ocean forecasts seem to retain accuracy on decadal scales – and when you go to climate forcing effects, the timescale moves toward centuries, with the big uncertainties being ice sheet dynamics, changes in ocean circulation and the biosphere response.
Thus, in climate science, both the statistical and the ‘deterministic’ methods are coming up with similar results. Most scientists would view that as pretty good confirmation of predictions made back in the 70s.
Steve Reynolds says
Timothy Chase > But when one considers the droughts and famine which are virtually a given with the current business-as-usual approach…
Timothy, your sources seem to be mostly news articles and interest groups. Where is the peer-reviewed science that indicates these dire predictions are likely? Even the IPCC admits that they do not know if the cost of mitigation is less than the cost of doing nothing.
There is a similarity in ignoring real science in both the extreme alarmist and denialist arguments.
Ike Solem says
Dodo,
You are the one who seems to be guilty of cherry-picking here. Why did you choose that one graph in #362 to emphasize? Why not link to the whole collection at: http://data.giss.nasa.gov/gistemp/graphs ?
Anyway, no one is trying to hide problems under the carpet – with the possible exception of NOAA, whose remarkable decision to switch to a 1971-2000 baseline for their temperature anomaly calculations has been utterly ignored by press outlets. Here’s the background:
“As far as the NOAA issue goes, the use of a baseline to calculate temperature anomalies relates to the issue of what is meant by ‘anomaly’. Now, in 2000 NOAA decided to start using the time period 1971-2000 as the baseline for calculating their anomalies, in contrast to the widely accepted use of the 1961-1990 time period for their baseline.
The differences in the two anomalies are fairly dramatic; see for example using NOAA’s 1971-2000 baseline, summer 2006: http://www.emc.ncep.noaa.gov/research/cmb/sst_analysis/images/archive/monthly_anomaly/monanomv2_200606.png
Using the 1961-1990 baseline, summer 2006: http://www.bom.gov.au/bmrc/ocean/results/SST_anals/SSTA_20060625.gif
Also, NOAA uses the 1971-2000 baseline for their 2005 Arctic Climate Report, but does not explicitly discuss this. Obviously, this gives a perception that warming in the Arctic is much less severe than it actually is.”
So, why did NOAA do that? I’ve emailed them repeatedly asking for an explanation, with the constant response of “We’ll get back to you on that”. They’ve held workshops on this issue, and it turns out that NOAA data plays a different role than NASA-GISS data:
“NOAA’s climate data and forecast products are key elements of the weather-risk market. The lack of bias in NOAA’s products and NOAA’s provision of equal access are of utmost importance in providing a “level playing field” for the weather-risk market. Most of the major participants in the weather-risk market employ specialists with an expertise in meteorology and climatology. Broadening the weather-risk market to a wider range of companies requires that NOAA’s climate data and forecasts evolve and become easier to understand and use.”
Let’s walk through this: NOAA shifts to a 1971-2000 baseline in 2000 (they are claiming that this is the ‘normal’ period). If you buy weather-risk insurance, deviations from ‘degree-day-normal’ conditions are used to calculate payouts. Thus, if more recent conditions are considered ‘normal’, that creates a) a perception that the warming is less than it is, and b) less financial risk for the weather insurance industry.
This is deliberate data manipulation by a government agency for very questionable purposes (and they still won’t go on record about their rationale for doing this!). Obviously, they should switch back to the generally accepted 1961-1990 baseline, shouldn’t they?
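The arithmetic of a baseline switch is easy to illustrate with a minimal sketch (Python with made-up temperatures, not NOAA data): an anomaly is just a temperature minus the mean over the chosen base period, so a warmer base period shifts every anomaly down by the same constant.

```python
import numpy as np

# Made-up annual mean temperatures with a warming trend of 0.012 ºC/yr
years = np.arange(1951, 2007)
temps = 14.0 + 0.012 * (years - 1951) \
        + np.random.default_rng(0).normal(0.0, 0.1, years.size)

def anomalies(years, temps, base_start, base_end):
    """Anomalies relative to the mean over a chosen base period."""
    base = temps[(years >= base_start) & (years <= base_end)].mean()
    return temps - base

a_6190 = anomalies(years, temps, 1961, 1990)   # widely used baseline
a_7100 = anomalies(years, temps, 1971, 2000)   # the later NOAA baseline

# The later (warmer) base period lowers every anomaly by the same constant:
# the difference between the two base-period means, roughly 0.12 ºC here.
print(round(float((a_6190 - a_7100).mean()), 3))
```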
catman306 says
This kind of spinning makes me gag.
http://www.boston.com/news/globe/editorial_opinion/oped/articles/2007/08/19/warming_debate_scene_1_take_2/
Steve Reynolds says
Philippe Chantreau> From one engineering solution to another, we progressively change this all planet into an exclusive support system for humans.
While I think that is a little overstated, I understand your point.
Please notice that I qualified with ‘If human life is your standard,…’. If human life is not your standard, then you probably can justify aggressive AGW mitigation.
I’m defining ‘aggressive AGW mitigation’ as something more than a small (less than $20/tC) carbon tax or equivalent.
Dave says
US landmass is 2% of total Earth surface, about 8% of dry land, right?
What percentage of GISS-cited temp gauges are in the US versus the rest of the world?
How many temp gauges are sited in oceans? What percentage of the total?
Ron Taylor says
Re #467 Steve Reynolds says:
“While I will agree that sea level rise will eventually cause a lot of property damage, where is the evidence that AGW would cause more deaths than would be caused by mitigation efforts?
That includes extending the time required for the people of developing nations to rise out of poverty, and the likely reduction of resources supplied from developed nations to help provide clean water and reduce disease.”
I would be curious to know what commitments to these worthy efforts would be undermined by programs to mitigate CO2 emissions? Somehow, I doubt that commitments in those areas are likely to increase dramatically in the midst of the mounting problems created by climate change. I believe that a major source of funds for CO2 mitigation would come from resources otherwise committed to business-as-usual energy production.
The cost of climate change in agriculturally productive areas, plus damage from sea level rise is likely to vastly exceed the cost of mitigation. Just a personal estimate, of course.
Timothy Chase says
Steve Reynolds (#485) wrote:
You state, “Even the IPCC admits that they do not know if the cost of mitigation is less than the cost of doing nothing.”
So you stated in the 16 May 2007 thread A bit of philosophy, comment #30:
I responded in #57:
You state, “Timothy, your sources seem to be mostly news articles and interest groups. Where is the peer-reviewed science that indicates these dire predictions are likely?”
Well, lets look at drought.
Here is a brochure from the Hadley Centre – a little more readable than some of the more technical papers, but it has references to the peer-reviewed papers if you decide you want to look them up.
Effects of climate change in developing countries
November 2006
http://www.metoffice.gov.uk/research/hadleycentre/pubs/brochures/COP12.pdf
It describes some of the recent drought conditions, compares observed drought and modeled drought conditions from 1950 (observed was roughly 20%) to 2000 (observed was roughly 30%), then makes projections based upon climate models and the business as usual SRES A2 scenario where roughly 50% of the world’s land will be experiencing drought by 2100 at any given time. But this is probably on the conservative side.
Now lets look at glaciers – since they are what feed many of the world’s major rivers – such as the six major rivers of China.
Here is something from the beginning of a paper in Current Science:
Now just limiting ourselves to what has gone on so far:
To get a sense of what is happening worldwide, you can check the global glacier mass balance chart at the bottom of this page:
SOTC: Glaciers
http://nsidc.org/sotc/glacier_balance.html
That is from the National Snow and Ice Data Center’s:
State of the Cryosphere
http://nsidc.org/sotc/
You can also check here to see a map of glacier advance and retreat throughout the world:
http://www.globalwarmingart.com/wiki/Image:Glacier_Mass_Balance_Map_png
What little blue you see is where glaciers are advancing. All the yellow and brown represent retreat.
Now remember, glaciers feed many of the world’s major rivers. Once the glaciers of the Tibetan Plateau are gone, the six major rivers running through China will dry up. Incidentally, if you are concerned only with climate change insofar as it affects US agriculture I can look that up, too.
Steve Reynolds wrote:
I am referring to the science, and as far as I can tell you haven’t even made an effort to bring any of that to the table.
Timothy Chase says
PS to #489
I should mention that Steve Reynolds #485 was responding to #481. Somehow he only remembered to mention who he was responding to rather than what he was responding to. I thought I would help.
… and now for something completely different:
Well, not really.
Here is an interesting paper from earlier this month on the Asian Brown Cloud. Locally it is supposed to amplify the greenhouse effect in the lower troposphere as much as greenhouse gases. Reducing the pollution will significantly reduce the rate at which glacier mass balance is lost in the Himalayas. Anyway, the bit I will quote indicates the importance of these glaciers to the fresh water supply of Asia.
However, there is a price of sorts – while locally the Asian Brown Cloud amplifies the greenhouse effect, globally it masks the greenhouse effect due to the aerosol-induced global dimming.
ziff house says
Belated response to Steve Bloom (394) and snrjon (399): I was at the head of Tanquary Fiord, an area sheltered by two mountain ranges, and I think in recent years those temps are not unusual. Streams that were trickles 11 yrs ago were running high [uncrossable]. On the other hand, glacier toes were in the same position as they were on the 75 topo. From the air every stream had good flow from permafrost melt; there is little snow as the Arctic is quite dry. The ice sheets looked desiccated, like ice cubes left in the freezer too long.
Timothy Chase says
DWPittelli (#483) wrote:
I was speaking of random errors – and given the nature of the “error” which was discovered, this seemed quite appropriate, but you are speaking of a systematic bias on the part of a given site or sites.
If you are speaking of a poorly installed site, this will not create a continually rising trend. If you are speaking of the urban heat island effect, we can use just rural sites and we get virtually an identical trend. If you are speaking of the installation of a barbecue or air conditioner right next to a site, this will produce a jump, not a continually rising trend.
If you are thinking about urban growth, this is something which can be and is now tracked by satellite measurements of night lights. Likewise, we can perform satellite measurements of the lower troposphere which produce trends with virtually the same slope, although somewhat higher variability. Not much room for the Urban Heat Island effect there.
*
Given the nature of the world, conclusions are rarely guaranteed by any given set of evidence. But the more evidence one acquires and/or the more independent lines of inquiry which lead to the same conclusion (e.g., that the average global temperature is rising at a given rate), the more justification that conclusion receives.
Yet there are those who think that if it is possible to cast some doubt upon any given piece of evidence, no conclusion can ever be justified. Such an approach is roughly comparable to the obsessive-compulsive disorder in which one has to keep checking that the door is locked because, even though one has already checked five times, it is at least possible that one is misremembering. If someone with such a disorder keeps this up, they will never leave the house.
If your house is burning down, you better hope that the firemen who are getting ready to leave and put out that fire aren’t suffering from this disorder.
I believe we have a fire to put out.
PS
Gavin does an inline to #483 which is worth checking out and deals with some other aspects, namely, whether errors in site temperature measurements will generally be positive.
Dylan says
Is anyone aware of a website somewhere hosting versions of the GHCN monthly mean data in CSV format, or at least in a format easily read into Excel etc.? Something like the format of the GISS data would be perfect.
Steve Reynolds says
Timothy Chase (not using numbers because they change)> Here is a brochure from the Hadley Centre…
I looked at your referenced PR material, but did not see much in the way of peer-reviewed studies supporting your dire predictions. They did have projections of increased food production due to higher CO2 concentrations, though.
For all the glacier concerns: I still have not seen a good reason why building dams is not a complete solution. It is what we already do where glaciers do not exist.
Dodo says
Re 483. “[Response: Think about this. If a tree grows, or a station is moved from the south side to the north side of a building, if you go from a city centre to an airport, if you ‘do the right thing’ and get rid of the asphalt etc… all of these will have the add a cooling artifact to the record. Assumptions that all siting issues are of one sign is simply incorrect. – gavin]”
Isn’t this a misunderstanding? What Gavin describes are just measures for removing a warm bias from the microclimate, not adding a cooling artefact. A thermometer is supposed to be in the shade anyway, so trees and south side-north side issues will not cause artificial cooling in well-mixed air.
Julian Flood says
Ike Solem: quote ocean forecasts seem to retain accuracy on decadal scales unquote
Do the ocean forecasts for the NH correctly display the late thirties/early forties temperature blip? Better still, do they post facto track the blip when it is not smeared by the Folland Parker adjustment?
I see there’s an interesting model that uses SSTs as the driver for climate change — I’ve wondered why no-one uses lighthouse records (westerly facing lighthouses would give a fair proxy for SSTs) to clean up the SST record.
JF
Barton Paul Levenson says
[[ where is the evidence that AGW would cause more deaths than would be caused by mitigation efforts?]]
Common sense?
Barton Paul Levenson says
[[For all the glacier concerns: I still have not seen a good reason why building dams is not a complete solution. It is what we already do where glaciers do not exist.]]
Less glacial runoff = less water to dam
More drought = less water to dam
Hotter, drier conditions = faster evaporation from dam reservoirs.