With the blogosphere all a-flutter with discussions of hundredths of degrees adjustments to the surface temperature record, you probably missed a couple of actually interesting stories last week.
Tipping points
Oft-discussed and frequently abused, tipping points are very rarely actually defined. Tim Lenton does a good job in this recent article. A tipping ‘element’ for climate purposes is defined as
The parameters controlling the system can be transparently combined into a single control, and there exists a critical value of this control from which a small perturbation leads to a qualitative change in a crucial feature of the system, after some observation time.
and the examples that he thinks have the potential to be large scale tipping elements are: Arctic sea-ice, a reorganisation of the Atlantic thermohaline circulation, melt of the Greenland or West Antarctic Ice Sheets, dieback of the Amazon rainforest, a greening of the Sahara, Indian summer monsoon collapse, boreal forest dieback and ocean methane hydrates.
To that list, we’d probably add any number of ecosystems where small changes can have cascading effects – such as fisheries. It’s interesting to note that most of these elements include physics that modellers are least confident about – hydrology, ice sheets and vegetation dynamics.
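For the mathematically inclined, here is a minimal toy sketch of what a tipping element looks like in Lenton’s sense. The system below (dx/dt = c + x − x³, a textbook fold bifurcation) is purely illustrative – it is not taken from Lenton’s paper or from any of the systems listed above – but it has exactly the advertised property: a single control c with a critical value beyond which a small further push produces a qualitative change in the state.

```python
# Toy illustration (not from Lenton's paper) of a "tipping element":
# a single state variable x with dynamics dx/dt = c + x - x**3.
# For |c| < 2/(3*sqrt(3)) ~ 0.385 there are two stable equilibria;
# once the control c crosses that critical value, the lower branch
# disappears and x jumps abruptly to the upper branch.
import numpy as np

def integrate(c_values, x0=-1.0, dt=0.01, steps_per_c=500):
    """Slowly ramp the control c and track the equilibrated state x."""
    x = x0
    trajectory = []
    for c in c_values:
        for _ in range(steps_per_c):          # let x settle for each value of c
            x += dt * (c + x - x**3)
        trajectory.append(x)
    return np.array(trajectory)

c_ramp = np.linspace(-0.6, 0.6, 121)          # control slowly increased
x_state = integrate(c_ramp)

c_crit = 2 / (3 * np.sqrt(3))                 # ~0.385, the tipping value
jump = np.argmax(np.abs(np.diff(x_state)))    # where the abrupt shift occurs
print(f"critical control ~ {c_crit:.3f}; state jumps near c = {c_ramp[jump]:.3f}")
```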
Predictions vs. projections
As we discussed recently in connection with climate ‘forecasting’, the kinds of simulations used in AR4 are all ‘projections’, i.e. runs that attempt to estimate the forced response of the climate to emission changes, but that don’t attempt to estimate the trajectory of the unforced ‘weather’. As we mentioned briefly, that leads to a ‘sweet spot’ for forecasting a couple of decades into the future, where the initial-condition uncertainty has died away but the uncertainty in the emission scenario is not yet large enough to dominate. Last week there was a paper by Smith and colleagues in Science that tried to fill in those early years, using a model that initialises the upper-ocean heat content – with the idea that the structure of those anomalies controls the ‘weather’ progression over the next few years.
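Purely as a schematic illustration of that ‘sweet spot’ (all the curves and numbers below are made up for illustration, not taken from any model or from the Smith et al. paper):

```python
# Toy illustration (numbers are invented, purely schematic) of the forecasting
# 'sweet spot': initial-condition uncertainty decays over a decade or so,
# while emission-scenario uncertainty grows with lead time, so their sum is
# smallest a couple of decades out.
import numpy as np

years = np.arange(0, 51)                        # forecast lead time (years)
init_cond_unc = 0.25 * np.exp(-years / 12.0)    # dies away on ~decadal timescale
scenario_unc = 0.004 * years                    # grows as emission paths diverge
total_unc = init_cond_unc + scenario_unc

sweet_spot = years[np.argmin(total_unc)]
print(f"toy 'sweet spot' lead time: ~{sweet_spot} years")
```

The point is only the shape of the trade-off: one term shrinks with lead time while the other grows, so their sum has a minimum a decade or two out.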
They find that their initialisation makes a difference for about a decade, but that at longer timescales the results look like the standard projections (i.e. 0.2 to 0.3ºC per decade warming). One big caveat is that they aren’t able to predict El Niño events, and since those account for a great deal of the interannual global temperature variability, that is a limitation. Nonetheless, this is a good step forward, and people should be looking out for whether their predictions – for a plateau until 2009 and then a big ramp up – materialise over the next few years.
Model ensembles as probabilities
A rather esoteric point of discussion concerning ‘Bayesian priors’ got a mainstream outing this week in the Economist. The very narrow point in question is to what extent model ensembles are probability distributions: i.e., if only 10% of models show a particular behaviour, does this mean that the likelihood of it happening is 10%?
The answer is no. The other 90% could all be missing some key piece of physics.
However, a bit of confusion has been generated through the work of climateprediction.net – the multi-thousand-member perturbed-parameter ensembles that, notoriously, suggested in a paper a couple of years back that climate sensitivity could be as high as 11ºC. The very specific issue is whether the histograms generated through that process can be considered a probability distribution function or not. (‘Not’ is the correct answer.)
The point in the Economist article is that one can demonstrate this very clearly by changing the variable you perturb (in their example, by using its inverse). If you evenly sample X, or evenly sample 1/X (or any other function of X), you will get a different distribution of results. So instead of 10% of model runs showing a particular behaviour, maybe 30% will – and all this is completely independent of any change to the physics.
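To make that concrete, here is a minimal sketch (the ‘model’ below is a trivial stand-in function, nothing from climateprediction.net or any real GCM) of how the sampling choice alone changes the apparent percentages:

```python
# Minimal sketch of the ensemble/sampling point: the fraction of runs showing
# a given behaviour depends on how you choose to sample the perturbed
# parameter. The 'model' here is just a stand-in function of a parameter
# lambda; everything is illustrative, not any real climate model.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

def model_output(lam):
    """Stand-in for a model diagnostic, e.g. a sensitivity."""
    return 3.0 * lam            # any monotonic function makes the point

# Ensemble 1: sample lambda uniformly on [0.5, 2.0]
lam1 = rng.uniform(0.5, 2.0, n)
# Ensemble 2: sample 1/lambda uniformly on the corresponding range
lam2 = 1.0 / rng.uniform(1 / 2.0, 1 / 0.5, n)

# The fraction of runs whose output exceeds some threshold differs between the
# two ensembles, even though the underlying 'physics' (the model) is identical.
for name, lam in [("uniform in lambda", lam1), ("uniform in 1/lambda", lam2)]:
    frac = np.mean(model_output(lam) > 4.5)
    print(f"{name}: {100 * frac:.0f}% of runs exceed the threshold")
```

The underlying ‘physics’ is identical in both ensembles; only the (arbitrary) choice of which variable to sample uniformly has changed.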
My only complaint about the Economist piece is the conclusion that, because of this inherent ambiguity, dealing with it becomes a ‘logistical nightmare’ – that is incorrect. What should happen instead is that people stop treating counts from finite samples of model ensembles as probabilities. Nothing else changes.
Eric (skeptic) says
John (#148), I don’t know who data360 is, but their numbers are off. The Economist article that they link to doesn’t contain the numbers they claim. The numbers are flat; see for example: http://www.eia.doe.gov/emeu/25opec/sld020.htm Over that same period our economic growth was about 70%. I agree with your ideas on farming. In general the US pays farmers without consideration of self-sufficiency or energy usage. Perhaps you thought I implied cause and effect for globalization. What I meant is that increasing incomes for the poorest of the poor lead to stagnating incomes for the rich (i.e. middle-class Americans), as fallout from a more efficient division of labor.
Nick Gotts says
take a look at the papers linked to at http://climate.envsci.rutgers.edu/nuclear/, and particularly at:
NUCLEAR WINTER REVISITED WITH A MODERN CLIMATE
MODEL AND CURRENT NUCLEAR ARSENALS:
Re #125 [re 121
The nuclear winter scenario involves the nuclear destruction of cities, and consequently large amounts of soot lofted into the stratosphere. Individual nuclear tests don’t generally do that. You got some of it with Hiroshima and Nagasaki, but not enough to be significant.
=============
I recall reading recently that the amount of particulate matter tossed up into the atmosphere by Krakatoa in 1883 was on the order of 13,000 times that of Little Boy…]
The attacks on Hiroshima and Nagasaki did not cause firestorms. In a modern nuclear war, using many much more powerful weapons, if cities were targeted, a considerable number of firestorms would be almost inevitable. These would loft large amounts of black carbon into the stratosphere, where it would remain for years to decades. I’ve no idea whether your figure for Krakatoa is accurate, but the nature of the particulate matter would have been completely different. In a discussion in April this year, I posted the following:
[Take a look at the papers linked from http://climate.envsci.rutgers.edu/nuclear/
especially:
NUCLEAR WINTER REVISITED WITH A MODERN CLIMATE MODEL AND CURRENT NUCLEAR ARSENALS: STILL CATASTROPHIC CONSEQUENCES
Alan Robock, Luke Oman, and Georgiy L. Stenchikov (J. Geophys. Res, in press)
This indicates that global nuclear war would have utterly devastating atmospheric and climatic effects – global dimming and cooling (to mean temperatures below those at the LGM, 18,000 years ago) and a sharp drop in precipitation, largely wiping out agricultural production for perhaps a decade, plus extensive destruction of the ozone layer. Still, at least we wouldn’t need to worry about AGW any more.]
Nick Gotts says
Re #152 Sorry, my last got rather garbled – please ignore everything before “Re #125”.
Hank Roberts says
Nick, fact check: http://www.atomicarchive.com/Effects/effects11.shtml
———— begin excerpt ————
Firestorms
Under some conditions, the many individual fires created by a nuclear explosion can coalesce into one massive fire known as a “firestorm.” The combination of many smaller fires heats the air and causes winds of hurricane strength directed inward toward the fire, which in turn fan the flames. For a firestorm to develop:
* There must be at least 8 pounds of combustibles per square foot.
* At least one-half of the structures in the area are on fire simultaneously.
* There is initially a wind of less than 8 miles per hour.
* The burning area is at least 0.5 square miles.
In Hiroshima, a firestorm did develop and about 4.4 square miles were destroyed. Although there was some damage from uncontrolled fires at Nagasaki, a firestorm did not develop. One reason for this was the difference in the terrain. Hiroshima is relatively flat, while Nagasaki has uneven terrain.
http://www.atomicarchive.com/Effects/Images/WE09.jpg
Firestorms can also be caused by conventional bombing. During World War II, the cities of Dresden, Hamburg, and Tokyo all suffered the effects of firestorms.
Hank Roberts says
> Eric 27 August 2007 at 5:54 AM
You’re comparing kilowatt-hours to British thermal units there, and different time spans, but the charts don’t look inconsistent. I emailed the data360 guy; maybe he’ll check in.
Jim Bouldin says
If ensembles cannot be used to create probability distributions, and thus likelihoods of particular outcomes, as Gavin argues here, then I presume their use must be for evaluation against empirical data, in order to determine that same likelihood. Can anybody clarify the main purpose of running ensembles?
[Response: Most ensembles are ‘initial conditions’ ensembles and are performed to average out ‘weather’ to get to the forced response. The perturbed parameter ensembles, I think, are useful in exploring phase space and finding good solutions that might not otherwise have been found, but whether they can be usefully averaged to get robust results is still being explored. – gavin]
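As a purely schematic sketch of the first kind of ensemble (toy numbers, not output from any actual GCM): each member below shares the same assumed forced trend but carries its own realisation of ‘weather’ noise, and the ensemble mean recovers the forced response much more closely than any single member does.

```python
# Toy sketch of an 'initial conditions' ensemble (schematic, not any real GCM):
# every member shares the same forced trend but has its own realization of
# internal 'weather' noise; averaging the members recovers the forced response.
import numpy as np

rng = np.random.default_rng(1)
years = np.arange(1900, 2101)
forced = 0.02 * (years - 2000)                    # assumed forced trend, degC
n_members = 20

members = forced + rng.normal(0.0, 0.15, size=(n_members, years.size))
ensemble_mean = members.mean(axis=0)

# The spread of individual members around the forced signal is ~0.15 degC,
# but the ensemble mean is much closer to it (error shrinks like 1/sqrt(N)).
print("typical member error:", np.abs(members - forced).mean().round(3))
print("ensemble-mean error: ", np.abs(ensemble_mean - forced).mean().round(3))
```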
Rod B says
The feds, dhogaza, can regulate almost anything manufactured within the US, regardless of where it is sold. Also, sometimes the regulation takes the form of quite successful arm-twisting and extortion rather than being based on legal code. [edit]
(maybe I can sneak this past Gavin …[;-) [nope – G.]
Nick Gotts says
Re #154 Hank, thanks for the correction. All: apologies, I relied on memory in saying that there was no firestorm at Hiroshima, and it let me down.
James says
Re #147: […but they *still* do hard physical labor all day.]
Whereas I, having the advantage of living in a technological civilization (and being, if you’ll forgive the immodesty, amongst the technical elite), sit at a desk all day, then spend most of my evenings doing hard physical exercise to compensate for the effects of that sitting. Forgive me if I fail to see that the benefits are quite so one-sided as you imply :-)
This seems to touch on a broader philosophical question. People tend to discuss wealth in connection with AGW and various mitigation options, but what exactly is it? Just having more money? But what if the prices of the goods you want increase even faster than your income? Am I getting richer, because I can afford more consumer goods (most of which I don’t want), or poorer, because the world has changed in such a way that I’m less able to afford what I do want?
FurryCatHerder says
Re: #147
John —
My use of computers as an example was because you and I are both in that industry. However, there are advances in technology that are outside of the computer biz.
Sticking strictly with “energy”, since that’s where you are focusing: the cost of 100% renewable power is falling below that of non-renewable power, for example with solar concentrating photovoltaics — http://www.solar.unlv.edu/amonix_system.php . Parabolic reflectors can be used as well, and simpler single-axis automatic trackers can be used rather than the two-axis variety used by Amonix. This is an example of wealth produced by technological advances.
It is the nature of technological advancement that there aren’t a lot of hard numbers. For example, until the first super-scalar processors were made, it was considered impossible to issue more than one instruction per cycle. My recollection is that Gene Amdahl (and now I’m way into geek history) is the source of that “rule”. I don’t know anyone who thought phone modems would ever exceed 9,600 bps, and a number of us were absolutely amazed when Telebit introduced the “Trailblazer” line of equipment. The last modem I bought, before getting broadband, was capable of 118,000 bps on an ordinary phone line.
Non-geek examples: Airplanes were supposed to be impossible. The electric starter for the automobile — impossible. Space flight — impossible. Each of these problems was solved by technology of some sort or another and there is simply no evidence that “energy” won’t also be solved by technology.
Andrew says
“Each of these problems was solved by technology of some sort or another and there is simply no evidence that “energy” won’t also be solved by technology.”
On the other hand, “perpetual motion” is a problem that was not solved by technology, and hoping that technology will “solve” it, is not that good a bet.
You see, there is this small matter of thermodynamics. And, oddly enough, any time you are going to play with large amounts of energy, thermodynamics gets to be the referee.
John Mashey says
re: #151 Eric
Thanks for the useful pointer; in fact, it is well worth reading the whole presentation, www.eia.doe.gov/emeu/25opec/sld000.htm, even though it is 10 years old and has some “amusing” elements, such as slide 4, showing the price of gasoline, with a horrific peak around 1982 of ~$2, before subsiding again to its nice stable price of ~$1.20 :-) It’s well worth looking at different ways to slice all this, but I do urge you to read the economics references I gave back in #127.
Note that per capita use of energy declined over 25 years, because after the OPEC embargo, there was more emphasis on efficiency. The chart notes that since 1983, energy use has increased at 0.8 percent per year. Efficiency always matters, and of course, there are big differences in GDP/energy consumption by the economic mix: one would expect areas with big software companies to have high GDP/energy, for instance.
Here are a few more interesting graphs:
http://www.fi.edu/guide/hughes/topic5.html
http://www.nationmaster.com/graph/ene_usa_per_per-energy-usage-per-person
http://www.eia.doe.gov/emeu/cabs/carbonemiss/chapter4.html is especially useful, since it describes/compares Asian economies in various phases of usage, showing that both energy supply & efficiency matter.
re: #159 james
I’d never say the benefits are one-sided; I grew up on a small family farm, and there are many merits to that [outdoors, early responsibility, learn about ecology/sustainability before you’re 10, parents do comprehensible work that you can actually help with, more coherent community relationships, and some people really love the independence of running their own business.]
Nevertheless, it’s hard work, bad weather can wipe you out, and (especially with livestock) it never stops. Unsurprisingly, a lot of farm kids decide they’d rather do something else.
I’ve often seen urbanites rhapsodize about how nice it would be to have a farm in the country, and for some people, that’s a good move. I’ve heard people talk about spending a wonderful week, say on a Swiss farm. My suggestion is, before you buy one, try it for a year, *for real*, and make sure you like it.
A lot of people in rich countries really don’t understand how food happens. [My favorite was a NYC guy at grad school with me, whose only office decoration was a NYC subway map. He liked chocolate milk, so we took him to the Penn State fields to show him the dark cows that gave it… He wasn’t sure we were joking. How was he to know? He’d never gotten his milk by milking a cow, but by going to the store.]
There are also big differences among:
1) First-world farms, embedded in high-tech societies with good transport and communication. [Even *I* might like a week on a Swiss farm].
2) Developing-world farms [like some of those India & China ones], organized around cooperative villages. Compared to 3), these are incredibly prosperous. However, in China, you may have noticed that many people have been leaving their lovely villages for “better prospects” in coastal manufacturing, whose conditions would not thrill many.
3) Real third-world subsistence farms, say in sub-Saharan Africa.
====
Exactly what wealth is, I don’t know. The economists I referenced use specific definitions for production, but that isn’t necessarily wealth.
However, I suspect a typical Kansas farmer would be viewed as unimaginably wealthy by most Chinese farmers…
Eric (skeptic) says
Hank (#155), thanks, I knew the units were different, but the trend should be the same. John (#162), I downloaded and read “reintegrate” and I’ll get to the rest soon. Interesting perspective; I always thought that more integrated models would be a good way to find common ground on policy issues. They would help prevent political debacles like ethanol and get people thinking about ways to reduce waste that they would never have thought of.
Barton Paul Levenson says
[[Airplanes were supposed to be impossible. The electric starter for the automobile — impossible. Space flight — impossible.]]
None of the arguments against these depended on physical principles; they were arguments about engineering. Heavier than air flight is obviously possible, since birds do it. But using a steam engine you couldn’t build an airplane light enough. The arguments against space travel were just wrong, and could be shown to be wrong ever since Newton published the Principia.
FurryCatHerder says
Re 141:
Andrew writes:
On the other hand, “perpetual motion” is a problem that was not solved by technology, and hoping that technology will “solve” it, is not that good a bet.
You see, there is this small matter of thermodynamics. And, oddly enough, any time you are going to play with large amounts of energy, thermodynamics gets to be the referee.
Nothing about meeting our current energy consumption requires violating any of the laws of thermodynamics. All of the electricity requirements of the 120 homes that make up my neighborhood could be satisfied on the currently unused land within my neighborhood. The subdivision east of me, which I think is about 2,000 homes, is the same way.
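As a rough back-of-envelope sketch of that claim (every number below is an assumed "typical" value — household consumption, insolation, panel efficiency — not a measurement of this particular neighborhood):

```python
# Rough back-of-envelope sketch of the claim above; all numbers are assumed
# typical values, not measurements of this particular neighborhood.
homes = 120
daily_use_kwh = 30          # assumed average household electricity use, kWh/day
peak_sun_hours = 5          # assumed equivalent full-sun hours per day
panel_watts_per_m2 = 200    # assumed ~20% efficient panels at 1000 W/m2 peak

total_kwh_per_day = homes * daily_use_kwh
kwh_per_m2_per_day = panel_watts_per_m2 * peak_sun_hours / 1000.0
panel_area_m2 = total_kwh_per_day / kwh_per_m2_per_day

print(f"daily demand: {total_kwh_per_day} kWh")
print(f"panel area needed: ~{panel_area_m2:,.0f} m2 (~{panel_area_m2 / 4047:.1f} acres)")
```

On those assumed numbers the required panel area comes out to well under a hectare, which is the kind of arithmetic behind the claim.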
The obstacles aren’t physics, they are things like “I don’t want to use those lights because then I can’t use my dimmer”, or “I don’t want to see a wind turbine in my neighborhood”.
FurryCatHerder says
Re #164:
None of the arguments against these depended on physical principles; they were arguments about engineering.
And all the arguments against what I and others have said are likewise not about physical principles; they are based on politics, economics, personal nature, or just plain crankiness.
It’s like American car companies — they claim they can’t build fuel-efficient cars, but companies in other countries have been doing it for years, have higher corporate average fuel economies, and seem to stomp our butts in the marketplace. Quit it with the excuses, already.
Andrew says
“The mathematical physicist v. Neumann once said to his young collaborators: ‘If you allow me four free parameters I can build a mathematical model that describes exactly everything that an elephant can do. If you allow me a fifth free parameter, the model I build will forecast what the elephant will say.’
Isn’t the same problem that portrays relying on Climate Modeling as predictors?”
I don’t think von Neumann said that; it’s a very old story with many forms. And, of course, if you think like an early-twentieth-century engineer, you might believe it. But it’s obviously false: no mammalian genome can be accurately described with a small number of parameters, so that anecdote should be retired.
More to the point for climate, an early attempt to estimate the number of parameters required for atmospheric prediction is Ed Lorenz’s work on “analogs” (Lorenz, E. N., 1969: Atmospheric predictability as revealed by naturally occurring analogues. J. Atmos. Sci., 26, 636–646), which gives a lower bound of about 500 parameters. Yes, when you go from weather to climate you average a lot of stuff out, but it’s going to be tough to whittle that down to the proverbial 6-parameter charging rhinoceros.
Many fields now routinely confront problems with thousands of parameters; this is the twenty-first century. Consider, for example, a less politically charged subject, say wine making: these guys (http://dynopt.cheme.cmu.edu/papers/preprint/paper19.pdf) are working with over 33,000 variables. That’s not even that big compared to other complex systems.
The bottom line is that yes, it is the same for climate modelers – they have a lot of variables and they have to take some care with that. But the same is true for a lot of fields these days: you can take appropriate care and get good results. The fact that there are a lot of parameters does not mean that models will not perform well.
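As a schematic illustration of what “taking appropriate care” can mean in practice (toy data, no connection to any actual climate model): with many adjustable parameters the fit to the data you tuned on keeps improving, so you judge the model by held-out, out-of-sample error instead.

```python
# Schematic sketch (toy data, nothing to do with any climate model) of
# "taking appropriate care" with many parameters: training error keeps
# falling as parameters are added, so models are judged on held-out error.
import numpy as np

rng = np.random.default_rng(3)

def truth(x):
    return np.sin(np.pi * x)                    # the 'real' system

x_train = rng.uniform(-1, 1, 40)
y_train = truth(x_train) + rng.normal(0, 0.2, x_train.size)   # noisy observations
x_test = np.linspace(-1, 1, 200)                # held-out evaluation points

for degree in (1, 3, 6, 12):
    coeffs = np.polyfit(x_train, y_train, degree)
    train_rmse = np.sqrt(np.mean((np.polyval(coeffs, x_train) - y_train) ** 2))
    test_rmse = np.sqrt(np.mean((np.polyval(coeffs, x_test) - truth(x_test)) ** 2))
    print(f"degree {degree:2d}: train RMSE {train_rmse:.2f}, held-out RMSE {test_rmse:.2f}")

# Typically the training error only falls as parameters are added, while the
# held-out error is smallest at an intermediate number of parameters.
```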