Peter Doran, the lead author of an oft-cited, but less-often read, 2002 Nature study on Antarctic climate, had an Op-Ed in the NY Times today decrying the misuse of his team’s results in the ongoing climate science ‘debate’. As we discussed a while back (Antarctic cooling, global warming?), there is a lot of interesting stuff going on in Antarctica: the complexities of different forcings (ozone in particular), the importance of dynamical as well as radiative processes, and the difficulties of dealing with very inhomogeneous and insufficiently long data series. But like so many results in this field, it has become a politicized ‘talking point’, shorn of its context, that is mis-quoted and mis-used by many who should (and often do) know better. Doran complained about the media coverage of his paper at the time, and with the passage of time the distortion has predictably increased. Give it another few years, and maybe we’ll be having congressional hearings about it…
Rhampton says
Senator Inhofe makes clear why he is motivated to combat the global warming “myth.” IMHO — the primary sponsor is Exxon-Mobil.
I guess the Senator is unaware that corporate giants like Wal-Mart (Fortune, July 27 2006 “Wal-Mart sees green”) have not only accepted anthropogenic warming, but have started to take action.
Roger Smith says
“Roger, we already have a buried charcoal-like material derived from plant matter . . . it’s called coal. I’m afraid that digging up old coal and burning it for the energy to make new charcoal would just exacerbate the problem of AGW.”
While this seems at least in part in jest, it’s hard to tell, as it makes no sense. We can take *new* plant matter that stores carbon while it is alive and make it a permanent carbon sink by turning it into bio-char and burying it for good. Otherwise it dies and the carbon ends up back in the air. This will not provide any energy, but the stored carbon won’t go anywhere or, unlike gaseous CO2, dissolve underground mines through acidity.
It isn’t an either/or situation. I work for an NGO to reduce GHG emissions and this must be the bulk of the solution.
We’re also not thrilled by geoengineering ideas like increasing SO2 pollution with other obvious disbenefits.
Hank Roberts says
>80, 100
I can’t help you with the info, aside from suggesting you ask Science or the author for it.
As an academic yourself, I imagine the ISI database should be available to you, and searchable; have you tried replicating the result, without changing the search terms as Peiser did, to see what you get?
There are a great many studies _of_ the ISI product, for example
http://www.ala.org/ala/acrl/acrlpubs/crljournal/crl2004/crlmarch04/nisonger.pdf
My impression is that given the date range and search term, people simply look this stuff up using the same terms, to get the same information.
If you don’t, you have a publication right there — although don’t make Peiser’s error!
Dano says
RE 103 (Roberts):
I imagine the ISI database should be available to you…have you tried replicating the result, without changing the search terms as Peiser did, to see what you get?
I did this and commented at Lambert’s old site. I was able to easily replicate the result.
As to whether Wacki can look at how Oreskes categorized the abstracts, I have no problem asking a researcher for info like this (not as the Dano character, but me as an applied researcher asking) and getting a reply. E-mail her, give your creds, and ask.
—–
RE 101 (Rhampton):
I guess the Senator is unaware that corporate giants like Wal-Mart…have not only accepted anthropogenic warming, but have started to take action.
Oh, I’m confident he’s well aware.
Best,
D
Wacki says
Re #103 “As an academic yourself, I imagine the ISI database should be available to you, and searchable; have you tried replicating the result, without changing the search terms as Peiser did, to see what you get?”
If I only provided my conclusions in my line of work I’d be in some serious trouble very quickly. I honestly can’t believe publishing raw data isn’t standard practice in any statistical study. I have e-mailed Oreskes and she says she will probably post the list on her website as others have suggested it. As of now it only exists in note form in her lab.
shargash says
Re: #99 – “Oil companies reckon that there is a lot more Oil out there to be found”
Some oil companies claim that; others (BP, frex) do not. However, discoveries of new oil have been declining since the 1960s, and there were several periods of high oil prices during that period. The vast majority of places where oil might be found have already been searched, and searched thoroughly. No new elephant fields have been discovered since the 60s. Most “big” new fields discovered contain less than 10 billion barrels. Consumption is 84 million barrels a day. Do the math.
What’s worse is that the oil reserves of all OPEC nations are almost certainly inflated. In fact, Kuwait was caught cheating late last year and has downgraded its reserves. Saudi Arabia inexplicably inflated its reserves when OPEC was formed, and almost no one believes the kingdom has as much oil as its inflated estimates.
For oil to last until the end of the current century at current consumption rates, several times more new oil would have to be discovered than has ever been discovered in history. That’s fantasy.
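The “do the math” invitation above is easy to take up. Here is a minimal sketch of that arithmetic, using only the figures given in the comment (a 10-billion-barrel “big” field and 84 million barrels/day of consumption); the point is just how quickly even a large discovery is consumed:

```python
# How long does a "big" new field last at current world consumption?
FIELD_BARRELS = 10e9        # a "big" new discovery: 10 billion barrels
DAILY_CONSUMPTION = 84e6    # world consumption: 84 million barrels/day

days = FIELD_BARRELS / DAILY_CONSUMPTION
print(f"A 10-billion-barrel field covers {days:.0f} days "
      f"(about {days / 365:.2f} years) of world demand.")
# → A 10-billion-barrel field covers 119 days (about 0.33 years) of world demand.
```

So a once-in-a-decade find buys roughly four months of supply, which is the force of the comment’s argument.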
Peak oil is vital to climate change, because declining oil production will force a rapid and large expansion of the use of coal. If the conversion to coal has a panicky nature, as it will if oil prices shoot up fast enough, there is very little chance the conversion to coal will be accompanied by carbon sequestration methods.
Brian Gordon says
Re: peak oil
Some folks point to Alberta’s oil sands as the saviours of our oil-based civilisation, but there are some major problems with that:
1. Because of the greatly increased cost of extracting oil from the tar sands, Alberta agreed to a 1% royalty until capital costs are repaid. Alberta currently gets ~25% (or more) from conventional oil extraction, so this is a huge loss of income.
2. Conventional oil and natural gas reserves in Alberta have been declining.
Add these two together and you can see that Alberta is in big trouble financially. Add the colossal environmental destruction caused by extracting oil from the tar sands. Add global concern about global warming. The net result may well be that Alberta/Canada voluntarily limit or even end digging through the tar, much as the US chose not to drill in the ANWR.
http://www.cbc.ca/news/background/realitycheck/20060705sheppard.html
“…production from so-called conventional oil and gas reserves is dwindling noticeably, and has been for a few years…. Resource revenue alone totalled a record $14.3 billion last year, slightly more than half of all government spending.
…
A barrel of $50 bitumen earns the province only about a quarter, the Globe and Mail calculated recently. And this looks to be a real concern over the next several years, because as the Canadian Association of Petroleum Producers points out, the oilsands are expected to account for 90 per cent of Alberta’s oil production within the next 10 years.
…
In the past, Alberta has expected to reap roughly 25 per cent as the province’s share of oil and gas revenues. But according to the Calgary Herald, even Alberta Energy has acknowledged that rate has slipped to 19 per cent as of 2004, while other analysts have put it closer to 15 per cent.
Glen Fergus says
Re 10:
Thanks for that. I’ve been looking for an update on Svalbard, but with that and a few deft clicks I can do my own, and no Norwegian language skills required! See http://members.optusnet.com.au/~gsfergus/Svalbard%20to%206-06.png
After all the drama this winter-spring, June came in at a mere +4.1°C. But it’s the trend that’s impressive, particularly once you read the temperature scale … +6°C in 100 years; +4°C of that in the last 30 years. But as Rasmus said, it’s just one point.
[Monthly anomalies are relative to the station average of all means for the month. I’ve moved Luft up by 2°C and Isfjord down by 1°C so that the regression lines meet at the station change, and so that they cross zero near 1975. So anomalies are roughly relative to the usual 1960-90 interval.]
Chris Rijk says
Re: #86
Those models are being tuned with new observations however. Wide-scale pumping of sulphates into the atmosphere would be harder to model accurately due to lack of data. The longer AGW goes on for, the better the models will become, due to better data, improvements in the models themselves, and far greater computer resources. Extreme counter-measures will get easier to model for similar reasons – except they won’t have more data. So the uncertainties will still be large.
With something as complex as the climate of the entire planet, there are massive difficulties – like the sheer amount of data needed, and all the “edge cases” which are particularly important for positive and negative feedbacks. However, the planet is effectively computing the problem for us – it is reacting to increased greenhouse gases (and other human activity) and giving us an “answer”. One of the current problems for modelers is the lack of high-quality, high-resolution data for the whole planet (eg satellite measurements), since we only have a few decades’ worth (or less) in many cases. In 50/100 years’ time, that’d be much less of an issue.
While general understanding of the climate will improve over time, doing something new will still have a lot of uncertainty.
I already said “The sad thing is, smart solutions to reducing CO2 output would have many short/medium-term benefits.”
Countries have been able to both grow their economies and reduce consumption of crude oil in the past – eg with the efficiency drives after the second round of oil shocks. While a long term efficiency drive for energy usage would have transition costs, they would mostly be temporary, and the benefits would be long term.
With the rise in oil prices (and similarly for other sources of energy), businesses are already reacting – with energy costs rising as a percentage of costs, energy efficiency is becoming a higher priority. For example, the computer industry has become a lot keener on this in the last year or so.
Besides, if your opinion of politicians is so low, why bet on “radical” solutions? If, in 50-100 years’ time, one country or small group of countries decided to unilaterally “fix” the global warming problem with drastic measures (such as blasting lots of sulphates into the atmosphere, or bio-engineering the world’s oceans), that could cause a major war. Putting it another way, major countries would not be willing to risk the fallout unless there was a global consensus on the “remedies”. If things have already gotten so bad, do you really want to assume politicians over the whole world will suddenly come to their senses?
In comparison, for reducing greenhouse gas emissions, each country can act by itself without having to worry about how others might react. After all, as I have already argued, improving energy efficiency dramatically would help long-term economic growth. Not many politicians have realised this yet though, unfortunately, which is why many of the proposed solutions to date are rather clunky.
Eric (skeptic) says
Chris, thanks for your detailed response. As you might suspect, I am much less of a believer in global consensus than many on this site. But if we agree that raising the price of oil has led to efficiency and raising the price of carbon would do the same, would not raising the price of warming itself lead to the most efficient solutions for warming? In that case we would have to rely on models with sufficient accuracy to predict the effects of, for example, converting a large amount of cattle farming back to rainforest. Would that outweigh the benefits of shutting down a coal power plant or would it cost more than sequestering carbon? Again, back to the models. If I had to trust one thing in 10-20 years time, it would be the ability of the models to accurately predict warming instead of the ability of politicians to come to a sensible consensus.
In our representative democracy lobbyists on both sides will be pushing their solutions (or pushing for no action) and the compromises could easily combine their worst aspects.
Regarding the time frame, I don’t think we need 50-100 years of measurements, but rather some very accurate snapshots to initialize the models and a lot of computing power to get the required resolution. I think that’s only a decade away or two at the most.
Hank Roberts says
Eric, it’s true getting that information is a decade away, but you have the sign wrong on your time arrow.
http://www.google.com/search?q=Hansen+climate+dangerous&start=0
Steve Bloom says
Re #111: Let’s take the example of the Amazon rain forest, where cattle ranching is a major factor in deforestation. So, we pay the famously rapacious Brazilian ranchers to remove their cattle and let the rainforest grow back in exchange for not having to reduce some quantity of coal power plant emissions. Unfortunately, rain forest grows slowly and AGW-induced drought continues, with the consequence that the Amazon converts to savanna anyway and all of those billions of tons of biomass end up in the air anyway. The one guarantee in all of this is that said ranchers will enjoy a lovely retirement. I imagine them laughing about it while sipping iced drinks on Ipanema.
Chris Rijk says
Re #110
On a personal note, I don’t see it as a matter of belief – more a case of trust, or verifiability. A theory that explains the facts is good – one which made predictions which turned out to be correct is better. “May the best theory win”, as it were, and they are tested to destruction. Though sometimes it takes a while for a theory proven wrong to die away completely. Several decades ago, the general consensus among scientists was that the climate changes slowly and had only ever changed slowly – ie like geology. However, a large accumulation of facts showing otherwise forced a re-think.
Unfortunately, modelling the Earth, which we only have one of, is rather hard. Can’t “reproduce the problem” or “replicate the experiment”. So I’m not surprised the error bars on predictions are large. Still, any alternative to the consensus still has to explain the known facts.
I don’t know if existing “carbon trading” schemes allow for other greenhouse gases. No matter how you do that, though, there would have to be “fudge factors”. Basically, the relative effect of CO2 vs methane varies over time, since methane doesn’t last very long in the atmosphere. So any “price of warming” would depend on what time scale you use. You might as well simply set a conversion rate between CO2 and methane – 1 ton of methane = xx tons of CO2.
In other words, I think “price of carbon” and “price of warming” would be pretty similar in the end.
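The conversion rate suggested above is exactly how “CO2-equivalent” accounting works in practice. A minimal sketch, assuming an illustrative 100-year global-warming-potential factor for methane (the exact factor is an assumption here and, as the comment notes, depends strongly on the time horizon chosen — methane’s short atmospheric lifetime makes its short-horizon factor much larger):

```python
# Convert mixed greenhouse-gas emissions to CO2-equivalent tons.
# GWP factors are time-horizon dependent; these are illustrative
# 100-year values, not authoritative figures.
GWP_100YR = {"co2": 1.0, "ch4": 25.0}

def co2_equivalent(emissions_tons: dict) -> float:
    """Sum emissions (tons, keyed by gas) weighted by 100-year GWP."""
    return sum(GWP_100YR[gas] * tons for gas, tons in emissions_tons.items())

print(co2_equivalent({"co2": 1000, "ch4": 10}))  # 1000*1 + 10*25 = 1250.0
```

Changing the dictionary to 20-year factors would change every price derived from it, which is the “fudge factor” problem in a nutshell.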
The good thing about carbon trading is that it tackles the problem at the source, and simply leaves organisations to decide how best to adjust. But can it work well at an individual level? For example, can you do carbon trading on cars directly, or just on the fuel? Can you get a “carbon discount” by buying a more power-efficient TV?
I certainly think that in general, more information would help – even if it requires more legislation and more laws. For example, if every car (van, truck, bus etc) had a real-time miles-per-gallon and gallons-per-trip meter on the dashboard, then drivers could see how their driving affects it – eg, how much a quick 2 mile drive costs. Most drivers would then adjust their driving to be more efficient to save themselves money. Similar things could be done for electrical consumption: instead of simply calling on consumers (and businesses) to save wasted electricity, if you make sure they have detailed info on what they’re spending right now, they’ll adjust anyway.
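The dashboard meter described above is just running arithmetic on distance and fuel burned. A minimal sketch of the gallons-per-trip calculation (the mpg figure and fuel price below are illustrative assumptions, not measured values):

```python
# What a quick errand costs: gallons and dollars per trip.
def trip_cost(miles: float, mpg: float, price_per_gallon: float):
    """Return (gallons used, fuel cost) for a trip of the given length."""
    gallons = miles / mpg
    return gallons, gallons * price_per_gallon

# A "quick 2 mile drive" on a cold engine averaging an assumed 15 mpg,
# at an assumed $3/gallon:
gallons, cost = trip_cost(2.0, 15.0, 3.0)
print(f"{gallons:.3f} gal, ${cost:.2f}")  # → 0.133 gal, $0.40
```

Seeing that figure tick up in real time, rather than buried in a monthly fuel bill, is the behavioural point the comment is making.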
What kind of “models” are you talking about here? Pure climate models will not give you such answers. You’d need economic models. It would also be rather silly to do a climate model for the differences of a single small event like the above.
This week’s Economist has an article on carbon offsetting/trading:
http://www.economist.com/opinion/displaystory.cfm?story_id=7252897
(including some of the problems with schemes to date…)
Such problems will vary from place to place – in the US, some states are taking independent action, even at an international level (eg: http://news.bbc.co.uk/1/hi/world/5233466.stm ), and most people expect the next US presidential election to feature such issues prominently from the major candidates.
A sanely done international carbon trading scheme would be very nice – and most businesses expect this to happen sooner or later. Personally, I’d like to see a more general effort to improve energy efficiency – punish (eg with taxes) inefficient companies, buildings, products etc, and reward the efficient ones (with the taxes collected from the inefficient ones), creating an overall revenue/tax-neutral scheme apart from overheads. This would be done on efficiency bands rather than arbitrary values – eg bottom 10% get taxed most, top 10% get most reward. The companies that are better at creating efficient products would then out-grow and out-compete the bad ones. This creates a perpetual drive to improve efficiency (so long as the official measures of efficiency are strongly relevant to the real world).

It would also be nice to see prizes for particular technical achievements, in a similar style to the X-Prize – they don’t have to be government funded. That would drive the “bleeding edge” of technology – improving the state of the art.
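The banding idea can be expressed as a tiny redistribution function. This is only a sketch under stated assumptions — a single flat levy, bands defined by rank decile, and a toy product list; a real scheme would band by measured efficiency and scale the levy:

```python
# Revenue-neutral efficiency banding: charge the least-efficient decile,
# hand the proceeds to the most-efficient decile. All names and the levy
# amount are illustrative assumptions.

def band_adjustments(efficiencies: dict, levy: float = 100.0) -> dict:
    """Return per-product charge (+) or rebate (-); totals sum to zero."""
    ranked = sorted(efficiencies, key=efficiencies.get)  # worst first
    decile = max(1, len(ranked) // 10)
    worst, best = ranked[:decile], ranked[-decile:]
    rebate = levy * len(worst) / len(best)  # redistribute the full take
    adjustments = {name: 0.0 for name in efficiencies}
    for name in worst:
        adjustments[name] = levy
    for name in best:
        adjustments[name] = -rebate
    return adjustments

# Ten hypothetical products with efficiency scores 0 (worst) to 9 (best):
products = {f"product_{i}": float(i) for i in range(10)}
adj = band_adjustments(products)
print(adj["product_0"], adj["product_9"], sum(adj.values()))  # 100.0 -100.0 0.0
```

The scheme nets to zero apart from overheads, which is the revenue-neutrality property the comment calls for; everything in the middle bands is untouched.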
The reason I gave 50-100 years was because by then, things might have gotten bad enough for some countries to contemplate drastic action. It wasn’t about how long it’d take models to improve. ie, by the time things could get really bad, the models will have improved a lot as well (most likely well before).
Eric (skeptic) says
Chris,
I think it would be quite useful to run climate models on millions of small events like the cattle-farmer example. I would not ever suggest linking it to an economic model, because I believe that is what the market is for. The models might ultimately need to be leased or loaned to the Wal-Marts of the world, who might otherwise create unintended side effects from their actions. The Economist is being unduly pessimistic; there’s very little that can go wrong with planting trees that grow for 100 years, but the full energy used to do it, and the impact of it, must be modeled.
To Steve, I have been planting part of my 7 acres that were bulldozed by the previous owner, first a cover crop to reduce erosion, now quick shade from locust along with native grasses, ultimately a mature native forest. There’s plenty of incentive for me to raise the value of my land without even considering the carbon benefits. But only focussing on carbon loses the rest of the potential cheaper climate solutions.
Looking at the total energy impact is one potential solution; at least we would stop subsidizing hybrids and ethanol that use up more energy over their production and full life cycle than conventional alternatives. A better approach is your X-Prize or similar, tied to full climate models to achieve the greatest efficiency. Likewise keeping “green” companies truly green. I can get a free permit to harvest newly fallen trees in the national forest next to me; imagine if those trees were RFID-tagged and I got a small credit for removing the ones most likely to produce methane. The alternative, I’m afraid, is going to be tweaking the tax code to subsidize green-looking power generation while taxing my wood stove.
Hank Roberts says
>hybrids
Is that the study that assumed a 300k mile useful life for the gas engine?
There was one rather bogus study out; did you have a reference?
Chris Rijk says
Re: #114
You seem to be suggesting that each type of activity that could have an impact on the climate be modelled individually. I don’t see how this would give different results from conventional models, which look at the components (gases, aerosols, effects of soil erosion etc).
I don’t think the Economist is being pessimistic about trees – the problem is not so much things like disease, drought or forest fires, but humans. ie, if you pay someone to plant some trees to offset your carbon emissions and a few decades later those trees are cut down, then those offsets are gone.
Semi-related: One reason why I push energy efficiency in general is I think that it should result in better economic growth. More economic growth means more profits / disposable income, which means further investment to improve efficiency can be better afforded. Of course, any investment to improve energy efficiency would have short term costs – it’s the medium/long term gains that are important (start with low hanging fruit first of course). Policies which permanently hurt growth, profits and disposable income make it harder to afford further investment (though might make sense on grounds of health and safety). Policies which are neutral in terms of growth/profits but reduce global warming should still be done.
In many places, petrol/gasoline is already subsidised (directly or indirectly). I agree with the general sentiment that the best solution (in terms of economics and climate change) should win, not politicians’ personal pet projects. In general, I think hybridisation is a useful tool to improve efficiency, and that volume production and more R&D should help improve the benefits while reducing the costs. However, I strongly dislike the thinking that the only green car is a Prius (or similar), or that a 4×4 car is automatically evil. On safety grounds, other things being equal, a 4×4 car (you can get compact 4×4s for urban driving) has better handling – while the thin tires on a Prius hurt its handling. Basically, cars should be compared on actual fuel usage / CO2 emissions, rather than on whether they’re a hybrid or whether they’re a 4×4.
On the political policy side of things, unfortunately some political parties use global warming as an excuse to push policies which just advance their standard prejudices – and then claim to have the moral high ground. Which is why it would be best for all political parties to agree that AGW is a serious problem that needs to be addressed now.
Going back to cars for a bit: given the rate at which battery technology is improving, the technology being developed today, and the advantages of electric motors (max torque at minimum revs, and they get more efficient the larger they are, unlike fuel-powered motors), I think cars like the Tesla Roadster might be a better indication of the future than the Prius. Apparently, Tesla have only spent $25m developing the car to date, which is pretty darn cheap for a completely new car. Mass volume of advanced lightweight composite materials (such as, for maximum irony, carbon fiber) could not only reduce manufacturing costs but also improve efficiency and safety.
Eric (skeptic) says
I saw that study, and 300k miles for a truck vs 100k for the hybrid was clearly bogus. My conclusions are from watching hybrid users here in VA; many are not driving in the city or in traffic, where hybrids make sense, but on the open highway to get the HOV exemption. That means batteries and electric motors that do nothing but add weight most of the time.
Grant says
Re: #116
I think perhaps this is a myth. Can you give me specific examples?
Eric (skeptic) says
Hi Chris, I don’t understand why the models of aerosols, erosion, etc. can’t be combined with the climate models. Although weak, they do create climate feedback, and millions of actions combined will make a difference. Erosion, for example, will allow quicker moisture evaporation, less soil moisture and warming. Wouldn’t that add up to something on a worldwide scale? The key is models complete enough to see all the effects and possible unwanted side-effects, and creating the incentives to use the most complete models.
I’m still not sure I agree on the trees. I have no incentive to touch my 50-100-year-old walnuts unless they blow down, at which point I cut some into firewood and others into beams that I am air-drying. I have a bigger investment value in leaving the other ones to grow; it’s part of my retirement plan (I am 43).
On Tesla I agree wholeheartedly. Imagine the difference in energy used to design and build it versus any antifreeze cooled gasoline competitor.
Brian Gordon says
Re: Eric(s): “would not raising the price of warming itself lead to the most efficient solutions for warming?”
Yeah, baby! I am entirely in favour of this – it’s part of sustainability. You can provide any (legal) good or service you wish, provided you do it sustainably. Meaning, feed the tailpipe of the car into the passenger compartment. If the exhaust isn’t safe enough for the driver to breathe, it’s not safe enough for the pedestrians and locals you drive by to breathe.
Sustainability means no longer externalizing problems – it means accepting responsibility for what you do.
Chris Rijk says
Re #118
The best example I can think of is the desire in the UK by some “left” groups and politicians to put massive taxes on “Chelsea Tractors” (ie SUVs) using AGW as an excuse, which looks more like a retroactive rich tax to me. Given the massive taxes on fuel in the UK, it’s not as if poor fuel efficiency isn’t already being punished.
The same types also tend to forget that buses and trains could be made a lot more fuel efficient too.
On the “right” side, some seem overly keen to subsidise next-gen nuclear power on the grounds that nuclear power generation doesn’t generate CO2. However, mining and refining uranium is energy intensive, nuclear power stations are much more expensive (and energy intensive) to build and decommission, and waste disposal is hardly cheap. Nuclear power stations are also not very economical once current government subsidies are removed. It seems to make much more sense to spend the same amount of money on wind, solar or tidal power, and on general improvements to the production of electricity.
Chris Rijk says
Re #119
Here’s a graph showing the effects of the top 5 forcing effects on global temperatures:
http://en.wikipedia.org/wiki/Image:Climate_Change_Attribution.png
Certainly things like soil erosion are modelled too, but I think the overall effect is too small compared to the above five to be given that much attention. With regard to feedbacks, which have a very big effect, the effects of clouds and water vapour have big error bars, and something like soil erosion would be lost in the noise.
However, my point in the previous post was not that such things shouldn’t be modelled but that you seemed to be saying each industry should be modelled separately, not the actual components – CO2 is CO2, the source doesn’t matter.
I’m not doubting your situation. But if you consider carbon-offsetting programs using trees world-wide, what’s the chance that a number of them in the next 100 years will go bankrupt and get taken over by someone who decides to chop the lot down? Or get hit by forest fires, or pollution, or disease – all of which become bigger problems with climate change.
Basically, a few well publicised bad apples could undermine such programs. Better to avoid the potential problems in advance.
Hank Roberts says
Eric(S), aren’t you asking for a 1:1 map before going forward? Unless NASA gets really, really good at distributed networks of instruments at the “smart dust” level
http://robotics.eecs.berkeley.edu/~pister/SmartDust/
nobody’s going to be able to model accurately enough and update quickly enough to satisfy that.
We know enough to do the easy and immediately financially profitable work now — despite the complaints of the Western Fuels (coal industry) lobby. Yes, it’s going to pinch them to do this. But it requires the early and obvious choices be made now.
Read Hansen, not secondary sources. Try the pages linked to this one, is my suggestion. Others may know better and more recent summaries, but I found this very good:
http://www.giss.nasa.gov/research/features/altscenario/discussion.html
“Discussion of “An Alternative Scenario” By James E. Hansen
Expanded from an MIT Workshop at Johns Hopkins University, Oct. 16, 2000
“I appreciate this opportunity to clarify our paper. Some thoughtful people did not understand very well what we were trying to say, so I accept the blame for not being clear enough. Even usually reliable sources such as The New York Times had either inaccuracies or emphases in their description of the paper that were misleading about its thrust. I have found it difficult to correct the mischaracterizations…..”
————————–
Ten years from now is after hundreds more old tech big coal-burning plants go into operation — guaranteeing the short term profits the Western Fuels people want to book, and also guaranteeing the CO2 output that guarantees a big problem in the longer term for the next generation.
Check what you hear and check who is telling you, and ask why you believe what they are telling you. You’ll hear and you’ve said that acid rain wasn’t a problem. Not true. You’ll hear that mercury pollution from coal is not a problem. Not true. You’ll hear that particulates are not a problem. Not true.
You’ll hear that there’s no better technology. It’s being tested:
http://www.technologyreview.com/printer_friendly_article.aspx?id=17053
“The Alternative — Catastrophic climate change is not inevitable. We possess the technologies that could forestall global warming. Why can’t we use them? —-By Jason Pontin
Quote:
“Technology Review is sunnily confident that technology is the single greatest force for expanding human possibilities. But honesty compels us to confess that technology has created the prospect of catastrophic climate change. Technology, too, must provide a solution.
“This month, in a package of stories edited by our chief correspondent, David Talbot, we argue that “It’s Not Too Late”.
http://www.technologyreview.com/read_article.aspx?id=17055&ch=biztech
We believe the energy technologies that could forestall the worst effects of global warming already exist. Rather than waiting for futuristic alternatives like the much-bruited “hydrogen economy,” the nations of the world could begin to control the growth of greenhouse-gas emissions today. What is lacking is any considered strategy….”
———- end quote——
If we pull together, a decade of conservation, energy efficiency and smart development will buy us a future worth leaving to the next generation.
Eric (skeptic) says
Hank, I don’t think I’m asking for anything that grand. I am more interested in laying down a challenge: if models or the analysis of models can be used to predict bad consequences then they also can evaluate adjustments of parameters that might lead to beneficial consequences. One potential downside is that economics is not modeled (I would deliberately leave that out) and it could lead to “tyranny by model” where details of daily life are prescribed and proscribed by the model. The upside is it gives a continuous global perspective to any locality contemplating global-affecting action and can ultimately be extended to every individual.
A carbon credit or license precludes more economically efficient ways of preventing environmental problems. Taxes and licenses should not be a starting point, but a finishing point or safety net. People don’t always donate to charity just because of tax advantages, and the environment wouldn’t necessarily require it either, although a government “safety net” would obviously be a good thing. As just one example, instead of contributing to the propane and natural gas fuel fund like I did last year, I could contribute to a green heating fund for the poor.
Pianoguy says
#117 – re: hybrids
It’s a common misconception that hybrids don’t get better highway mpg than conventional cars. There are two reasons why this is wrong, especially for the Prius, the Camry hybrid, and the Ford Escape:
1. The engines in these cars use the Atkinson cycle (often loosely called the Miller cycle) instead of the traditional Otto cycle. This means the power stroke is effectively longer than the compression stroke, with a resultant increase in efficiency into the diesel range. (The defect of the Atkinson cycle is low torque at low rpm, but in a hybrid, the high-torque electric motor compensates for that.)
2. In addition, all hybrids have regenerative braking, which means that on any downhill – which even freeways have – they can convert kinetic energy into electricity.
As a result, I routinely get around 60 mpg at 60 mph in my Prius. None of my diesel-powered friends comes close to that.
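The regenerative-braking point lends itself to a back-of-the-envelope check. The sketch below is my own illustration; the vehicle mass, height drop, and round-trip efficiency figures are assumptions, not measured data for any particular hybrid:

```python
# Rough sketch: electricity recoverable by regenerative braking on a descent.
# All figures (vehicle mass, height drop, round-trip efficiency) are
# illustrative assumptions, not measured data for any particular hybrid.

G = 9.81  # gravitational acceleration, m/s^2

def regen_energy_wh(mass_kg, drop_m, efficiency=0.6):
    """Potential energy released on a descent, converted to battery Wh."""
    joules = mass_kg * G * drop_m * efficiency
    return joules / 3600.0  # 1 Wh = 3600 J

# A 1400 kg car descending 100 m, recapturing ~60% of the energy:
print(round(regen_energy_wh(1400, 100), 1), "Wh")  # about 229 Wh
```

A conventional car dissipates all of that as brake heat; the hybrid banks a useful fraction of it for the next acceleration.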
Hank Roberts says
Eric/S — even better, we could contribute to attic insulation for poor people. California tried to set that up back in the late 1970s when Jerry Brown was governor, and some combination of the lead-paint and landlords’ lobbies defeated it, as I vaguely recall, because the work would have had to be done with awareness of the lead hazard and would have meant improving the actual energy efficiency of the private property.
Thirty years ago (sigh)…. turns not taken.
Globalcooler says
Hi Friends,
We need to better educate the public on global warming. I am impressed with Al Gore.
Chris Rijk says
Re #124
Climate models cannot possibly dictate daily life. Climate models don’t say how to solve the problem – the problem is CO2 (and similar). If two different programs would reduce CO2 emissions at the same rate, then from a climate modelling point of view, they are equivalent. Hence, economics, politics, market forces etc would decide which program to go with (or whether to do both).
Certainly climate models can be used to project the effect of cutting global CO2 emissions by a certain amount or to a certain amount, or the relative long term difference between say CO2 concentrations increasing by x% per year vs double that. Those sorts of things are already being done.
What becomes tricky is costing the differences. Certainly economics would be involved with that – but it’s not so simple. For example, if sea levels rise 5m, most countries could cope by building bigger, higher sea barriers, which is relatively predictable to cost. Countries and areas that rely on their current coastline (eg beaches) would suffer a lot, though. However, the more land moves below sea level, the higher the cost of major storms will be – because a breach of the sea walls would lead to massive flooding. With higher sea temperatures, fiercer storms would also be expected, and possibly in new areas. Sea walls would also make a terrorist target. Local weather patterns could change significantly, which could significantly affect what crops can be grown, and how economically. At a bigger scale, how would you cost the effect of massive changes to the whole monsoon season, for example?
Timeline would also matter a lot. The difference between having a 50 year and 200 year timeline for cost/benefit analysis could be huge (more than 10x).
As a side note, I certainly expect some people to get increasingly interested in climate change models – and to do modelling themselves. A good example would be insurance companies – the effects and risk from AGW would not be evenly distributed. I doubt you’d be able to get AGW insurance (would be rather meaningless I think) but rather, the insurance companies would be concerned about long term risks and insurance rates.
jhm says
To laugh or to cry, that is the question.
Click if you dare.
Laurence Williams says
In An End To Global Warming (ISBN 0-08-044045-2) I proposed a way to terminate the burning of fossil fuels. Yes, it will be hard work, but accomplishments of value require hard work. Yes, it will cost a lot of money, but the world currently spends 6 to 8 trillion (yes, trillion) per year on energy. To put us on a path to stop the combustion of fossil fuels will take about 30 billion for 8 or 10 years, and 20 to 30 years to complete the conversion. We need to take action soon to prevent a disaster for our children and grandchildren.
Brian Allen says
Going way back to #4 above: Lindzen (professor of Atmospheric Science at MIT, a very impressive badge) stated that “satellite data showed no warming since 1979”. I am very interested in this and have not seen a reference to this study, if it is one. Thanks, BA
Jim Dukelow says
Re #133
Well, it depends a bit on WHEN Lindzen said/wrote that.
It is easy enough to check on the trends. Simply download the Christy/Spencer monthly anomalies from the MSU website at http://www.nsstc.uah.edu/data/msu/t2lt/tltglhmam_5.2 and turn your favorite statistical software loose on it.
Using MS Excel, I calculated linear trends from December 1978 to the end of each year since 1980 and found:
Negative trends for end of 1982, 1983, 1984, 1985, 1986, 1987, 1989, 1993, and 1994. Of those negative trends, only the trends for 1984, 1985, and 1986 were statistically significantly different from zero (using the F-test).
All other years had positive trends, with the trends for 1980, 1981, 1982, 1991, 1998, and all years since 1998 being statistically significantly larger than zero.
The observant will note the coincidence of the negative trends of 83-85 with the eruption of El Chichon and the negative trends of 93-94 with the global cooling caused by the eruption of Mt. Pinatubo.
The current trend (December 1978 to July 2006) is +0.134 deg C per decade (with an approximate 95% confidence interval from +0.11 to +0.158 deg C per decade), somewhat smaller than the trends in the surface data sets.
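For readers without Excel, the trend calculation is easy to reproduce in a few lines. The sketch below is a minimal illustration of the method, not the actual MSU analysis: a synthetic monthly anomaly series with a known trend (0.134 deg C/decade plus noise) stands in for the downloaded data, and the code recovers the slope with a naive 95% confidence interval (ignoring the autocorrelation that would widen it):

```python
# Minimal sketch of the trend calculation: OLS slope of a monthly anomaly
# series, scaled to deg C per decade. A synthetic series with a known
# trend stands in for the downloaded MSU anomalies (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
months = np.arange(332)                       # Dec 1978 .. July 2006
true_trend = 0.134                            # deg C per decade
anoms = true_trend / 120 * months + rng.normal(0, 0.15, months.size)

x = months - months.mean()                    # centred predictor
slope = (x @ anoms) / (x @ x)                 # OLS slope, deg C per month
resid = anoms - anoms.mean() - slope * x
se = np.sqrt(resid @ resid / (len(x) - 2) / (x @ x))

trend_per_decade = slope * 120                # 120 months per decade
ci95 = 1.96 * se * 120                        # naive CI, ignores autocorrelation
print(f"{trend_per_decade:+.3f} +/- {ci95:.3f} deg C/decade")
```

Running the same recipe on the real anomaly file from the MSU site, with an autocorrelation correction, is what produces figures like the +0.134 +/- ~0.02 deg C/decade quoted above.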
Every other group that has analyzed the raw MSU data has calculated higher trends than those produced by the Christy/Spencer algorithms.
If the Lindzen quote is recent, it can be left to the reader to decide if he is negligently unobservant or aware but disingenuous.
Best regards.
Jim Dukelow
Jim Dukelow says
Re #133 and #134
Curioser and curioser.
I did a bit of Googling to see when Lindzen wrote “satellite data showed no warming since 1979”. It was quite recent, in his July 2006 op-ed whine in the Wall Street Journal. He wrote: “The models imply that greenhouse warming should impact atmospheric temperatures more than surface temperature, and yet satellite data showed no warming in the atmosphere since 1979.”
Similarly, in a Geophysical Research Letters paper with Constantine Giannitsis (also at MIT), GRL, v. 29, no. 0, pp. X-1 to X-3, they write “The surface data suggests a warming of about 0.25C, while the satellite data shows no significant increase (more precisely, the trend in the satellite data through 2001 is 0.035 +/- 0.06C/decade).”
As I noted in Comment #134 above, the trend from December 1978 to December 2001 is positive and statistically significantly different from zero, a result inconsistent with the Lindzen et al. GRL quote. Specifically, the 78-2001 trend is +0.0944 with a s.d. of +0.016 and an F-statistic value of 32.4032 with 276 d.f. A 95% CI for the trend slope will be somewhat greater than +/- two standard deviations, because of the autocorrelation of the temperature anomaly time series.
What can account for the discrepancy between Lindzen and Giannitsis’ 2002 GRL paper and my analysis of current MSU temperature anomalies?
As it happens, I have among my souvenirs MSU data from 2001 and 2002. My analysis of the 2002 data agrees roughly with Lindzen and Giannitsis (I get a slope of 0.03779 deg C/decade for Dec 78 to Jan 2002, with s.d. of 0.01704). Why the difference between a 2002 analysis and a current analysis?
It is a bit like the old joke about the Econ professor who gave the same final exam every year. The questions were the same but the correct answers changed every year. MSU data downloaded today differs substantially from MSU data downloaded early in 2002. The Christy/Spencer algorithms for converting the 50-60 GHz screams of oxygen molecules into average temperatures for large blocks of the atmosphere are on approximately Version 7, with numerous changes required from the initial circa-1990 version as C/S and others discovered various errors in the algorithms.
That conversion is a bit like making sausage or legislation — not something you want to be watching too closely.
All of this supports the possibility that Lindzen was simply too busy (?) to verify that things he had written, which were almost true once, continued to be almost true as the scientific process worked on the issue.
Best regards.
Jim Dukelow
Brian Allen says
Jim Dukelow: Thank you very much for your response in # 134 above. I appreciate the accurate information on this.
BA
Gavin says
I deleted some comments that got way off topic. This is not the place to debate terrorism.
Alastair McDonald says
Re #135 It is true that if you look at the upper troposphere, then there is warming, especially in the Northern Hemisphere where the Asian Brown Cloud and other dark industrial aerosols absorb solar radiation, but even so the amount of warming is less than that predicted by the computer models. They predict that the upper troposphere should warm more than the surface, not less. The other two regions of the lower atmosphere are both cooling or have not warmed. See Christy and Spencer’s site: http://www.ghcc.msfc.nasa.gov/MSU/atmos_layers.html
Other groups have calculated higher trends, as there has been a desperate struggle to prove the satellite results wrong because they contradict the models. Now that they have nearly succeeded, they are using their new results to recalibrate the readings from the radiosondes, which also disagree with the computer models. See: Sherwood et al. “Radiosonde Daytime Biases and Late-20th Century Warming” http://www.sciencemag.org/cgi/content/abstract/1115640v1
For an opposite POV see: Sherwood at https://www.realclimate.org/index.php/archives/2005/08/the-tropical-lapse-rate-quandary/
Jim Dukelow says
Re #138
Alastair McDonald writes, “They predict that the upper troposphere should warm more than the surface, not less. The other two regions of the lower atmosphere are both cooling or have not warmed. See Christy and Spencer’s site: http://www.ghcc.msfc.nasa.gov/MSU/atmos_layers.html ”
Sigh!
If you follow Alastair’s link, you see a schematic view of the atmosphere that has layers for lower stratosphere, upper troposphere, and lower troposphere. Somehow, Alastair gets four layers out of this, including two layers of lower atmosphere that are “both cooling or have not warmed”. The C/S graphic shows the lower atmosphere (0 to 5 miles) as showing a “slight cooling”. But current C/S data shows the same region with a fairly strong statistically significant “current trend (December 1978 to July 2006) is +0.134 deg C per decade (with an approximate 95% confidence interval from +0.11 to +0.158 deg C per decade).”
What’s going on?
Well, if you look at the bottom of the web page that Alastair sent us to, it says, “Last updated: July 31, 1997”.
The C/S assertion of slight cooling up to July 1997 was not true then (although it might have been “true” according to their data as it was then). Since 1998, the assertion has been ludicrously false.
Best regards.
Jim Dukelow
John L. McCormick says
RE #138,
Just when I believe it is coming together in my way of thinking, a new point of view about
[warming especially in the Northern hemisphere where the Asian Brown cloud and other dark industrial aerosols absorb solar radiation].
I believed the aerosols in the NH atmosphere were actually (in their limited way) shielding the surface from warmer temperatures due to their albedo. At least, that is what Dr. Hansen has been offering.
And, Mt. Pinatubo released about 10 million tons of S. Hansen calculated a radiative cooling of 4.5 W/m2 caused by 6 Tg S, the amount of S that remained in the stratosphere as sulfate six months after the initial eruption (of the original 10 million tons of S).
Perhaps I am confusing dark with gray aerosols. Or maybe it is warming from the surface more than from solar insolation?
Hank Roberts says
John, type “aerosol” in the Search box, top of page.
For example, this will help: https://www.realclimate.org/index.php/archives/2006/04/global-dimming-and-climate-models/#comment-11848
John L. McCormick says
Hank, thank you. That helps. And, I’ll go back to the Hansen paper. So much to understand. So little time.
wayne davidson says
Within the last 4 years there has been significant upper-air warming; that is what I gathered from several hundred observations (more than 700) using a refraction method. The method in question is a direct measurement of refraction boosts at several z.d.’s (zenith distances). In my experience it is an extremely precise and accurate method, best suited to measuring mostly the lower troposphere. I am not aware of the satellite techniques, but if they don’t show a significant yearly trend, then there is a disagreement. The method will hopefully be reviewed one of these days, but replication of the technique at other key locations (independent of me) is more important. I compared refraction boosts from year to year since 2002, at the same polar and temperate locations; both show significant, consistent warming. Results should be published after peer review.
pat neuman says
re 143
Wayne, in an earlier post I mentioned my concern that the growing world drought conditions could be related to upper air warming.
Yesterday the government declared the northern half of Minnesota as a disaster area due to drought.
Also, besides greater warming in higher-latitude areas than in mid or low latitudes, there is evidence of greater warming with altitude. Excerpts:
… According to general circulation models of future climate in a world with double the preindustrial carbon dioxide (CO2) concentrations, the rate of warming in the lower troposphere will increase with altitude. …
… An analysis of 268 mountain station records between 1°N and 23°S along the tropical Andes indicates a temperature increase of 0.11°C/decade (compared with the global average of 0.06°C/decade) between 1939 and 1998; 8 of the 12 warmest years were recorded in the last 16 years of this period (3). Further insight can be obtained from glaciers and ice caps in the very highest mountain regions, which are strongly affected by rising temperatures. In these high-altitude areas, ice masses are declining rapidly (4-6). Indeed, glacier retreat is under way in all Andean countries, from Colombia and Venezuela to Chile (7).
In Science 23 June 2006
Threats to Water Supplies in the Tropical Andes
Alastair McDonald says
Re #139 I have to admit I did not realise that the figure was from 1997. If you follow the links from that page to the Lower Troposphere and Lower Stratosphere, they run until 2006, and I had assumed the figure was also up to date.
I was trying to defend Lindzen’s claim that “satellite data showed no warming since 1979”. By averaging the results from the lower troposphere, upper troposphere, and lower stratosphere, that seemed to be true, but it is now obvious it was only true until 1997. However, it strikes me as strange that if a scientific statement is made in 1997 then it is true, yet the same statement made today is untrue. Will it still be true in another ten years’ time?
Re #140 What the scientists who are trying to disprove Christy & Spencer’s MSU results don’t tell you is that nearly all the warming is happening in the northern hemisphere. By averaging the results from the northern and southern hemispheres they get the result they want, but in fact neither set of results fits the models. One is too high and the other is too low! The point is that they have made up their minds that the models are correct, and so for them those results prove that their models are correct. That is why they have gone through the MSU codes with a fine-tooth comb. If the MSUs were giving results too high, how long do you think it would take them to blame the Asian Brown Cloud (ABC)?
BTW, the ABC is dark and so it absorbs solar radiation and warms, and heats the surrounding air by conduction. If we assume a fixed lapse rate, then the warming upper troposphere also leads to a warmer surface. So you get global dimming but the surface temperature still rises, although not as fast as without the dark aerosol.
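The fixed-lapse-rate reasoning above can be made concrete with a toy calculation (my own, with illustrative numbers): if temperature falls off linearly with height at a fixed rate, any warming aloft maps one-for-one onto the surface.

```python
# Toy illustration of the fixed-lapse-rate argument: with a constant lapse
# rate, a warmer upper troposphere implies an equally warmer surface.
# Numbers are illustrative, not observations.

LAPSE = 6.5e-3  # deg C per metre, the standard-atmosphere value

def surface_temp(t_aloft, altitude_m, lapse=LAPSE):
    """Extrapolate a temperature measured aloft down to the surface."""
    return t_aloft + lapse * altitude_m

before = surface_temp(-45.0, 10_000)  # -45 C at 10 km implies 20 C surface
after = surface_temp(-44.0, 10_000)   # 1 C of warming aloft
print(before, after - before)         # the surface warms by the same 1 C
```

Whether the lapse rate actually stays fixed as the atmosphere warms is of course exactly what is in dispute in the lapse-rate discussions linked above.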
My point is that the models are wrong, but as Steve Sherwood says in https://www.realclimate.org/index.php/archives/2005/08/the-tropical-lapse-rate-quandary/
“The non-warming troposphere has been a thorn in the side of climate detection and attribution efforts to date. Some have used it to question the surface record (though that argument has won few adherents within the climate community), while others have used it to deny an anthropogenic role in surface warming (an illogical argument since the atmosphere should follow no matter what causes the surface to warm). The most favored explanation has been that the “lapse rate,” or decrease in temperature as you go up in the atmosphere, has actually been increasing. This would contradict all of our climate models and would spell trouble for our understanding of the atmosphere, especially in the tropics.”
I take that to mean that they do not want to admit, even to themselves, that their models may be wrong because it would give comfort to the sceptics.
I have been forbidden from discussing the models here, but since my ideas have advanced since the ban, and I will not be repeating what I already wrote, I will take a chance and hope it gets past the censors :-)
The source function used to calculate the radiation re-emitted by the greenhouse gas molecules is wrong. They are using Planck’s function for continuous black body radiation, but greenhouse gas molecules actually emit line fluorescence. The main problem is that Planck’s function depends on temperature and so provides a negative feedback as temperature rises. Fluorescence is independent of temperature, and so the warming from an increase in greenhouse gas concentration produces a stronger effect than is currently being calculated.
In a very recent paper co-authored by Ray it states “We nevertheless still do not succeed in simulating warm enough polar temperatures and a definitive theory still waits for an integrated approach explicitly accounting for each factor influencing the thermal gradient (ocean dynamics, stratospheric clouds, and vegetation).” In other words the current models do not work!
See: “Modelling the primary control of paleogeography on Cretaceous climate” Earth and Planetary Science Letters, Volume 248, Issues 1-2, 15 August 2006, Pages 411-422
Y. Donnadieu, R. Pierrehumbert, R. Jacob and F. Fluteau
http://www.sciencedirect.com/science?_ob=GatewayURL&_method=citationSearch&_uoikey=B6V61-4KCGJ25-2&_origin=SDEMFRASCII&_version=1&md5=bbd3ee3b6aa2eea7c43d9d1f22f78356
In another paper, co-authored by Gavin, the abstract concludes “However, neither the GISS-S nor the HadCM3 models are able to reproduce the observed temperature changes, suggesting that this explanation for the impact of the inclusion of a stratosphere in the model may be incomplete.” In other words, two GCMs do not reproduce the climate correctly.
See: Gillett, N.P. et al. “How linear is the Arctic Oscillation response to greenhouse gases?” JGR vol. 107, NO. D3, 10.1029/2001JD000589, 2002
http://climate.uvic.ca/people/gillett/publications/ps/gcm_aochange.pdf
HTH,
Cheers, Alastair.
Dan says
re: 145. Rather than drawing assumptions, I would certainly defer a reply to the authors of “How linear is the Arctic Oscillation response to greenhouse gases?”, as should you, unless you intend to publish your comments in a peer-reviewed journal in response. However, their saying “However, neither the GISS-S nor the HadCM3 models are able to reproduce the observed temperature changes, suggesting that this explanation for the impact of the inclusion of a stratosphere in the model may be incomplete.” does not in any way mean the GCMs do not reproduce the climate correctly! “Incomplete” does not equate to “incorrect”.
Alastair McDonald says
Re #146 There seems little chance of my ideas being published since they are diametrically opposed to current thinking – no computer modeller, who would inevitably be the peer reviewer since any other scientist would not feel competent to comment, is likely to give my paper a (fair) reading. I have submitted a paper once, and it did not get past the editor!
You are right about incomplete not equating to incorrect. Incorrect does not mean incomplete, but incomplete does mean it is incorrect.
Dan says
There are many peer-reviewed journals where you can submit legitimate comments. Perhaps the issue is that you are a layman questioning the expertise of thousands of climate modeling experts around the world? The idea that a layman knows more or thought of something that these experts did not is more than just a bit arrogant.
And no, incomplete does not mean incorrect.
Jim Dukelow says
Re #145
Alastair McDonald wrote: “However, it strikes me as strange that if a scientific statement is made in 1997 then it is true, yet the same statement made today is untrue. Will it still be true in another ten years time?”
I tried to explain why this is the case. The C/S algorithm for converting the temperature-dependent GHz emissions of oxygen molecules into temperature anomalies is so complicated and fraught with uncertainties that it has been significantly revised about a half-dozen times. Thus, the temperature anomalies for the lower troposphere that you got if you downloaded data from the MSU site in 2001 or 2002 are substantially different from the anomalies you get if you download the “same” data today. These changes in the data arose as Christy and Spencer have responded to a number of errors in their algorithm, a few that they identified themselves, but most identified by other scientists looking at their raw data and at the algorithm.
It is also the case that the July 1997 assertion about the temperature trend in the lower troposphere is false, whether you use the 2002 data or the current data. I don’t have the July 1997 MSU data, so I can’t assess whether it was a “false” statement at the time it was made.
Since the topic of this thread is how misleading talking points propagate, I will assert that most of what is said and written on the right about the MSU data is not only misleading, it is outright false.
Alastair’s point about the Southern and Northern Hemispheres is interesting. Using the current MSU data, the SH has a statistically significant positive trend of 0.5866 deg C per decade with s.d. of 0.01298.
The NH trend is a statistically significantly positive 0.21039 with s.d. of 0.1485.
It should not be too surprising that the trends are different, because the hemispheres are very different. The SH is primarily the Southern Ocean and a high altitude ice-covered polar region that is essentially isolated from the rest of the hemisphere by a strong polar vortex. The NH is primarily land, with a polar region that is mixed water and ice and provides a strong albedo positive feedback to temperature increases. Because the polar air is so dry, GCMs uniformly predict stronger greenhouse gas warming in the northern high latitudes, which we see strongly confirmed in the surface temperature records, in the MSU data, and in effects on the ground that are reshaping the lives of inhabitants of the northern polar regions.
Best regards.
Jim Dukelow
Jim Dukelow says
Re #145
Alastair McDonald wrote: “I was trying to defend Lindzen’s claim that ‘satellite data showed no warming since 1979’. By averaging the results from the lower troposphere, upper troposphere, and lower stratosphere, then that seemed to be true, but it is now obvious it was only true until 1997.”
I am puzzled why Alastair would want to average together temperature trends for the troposphere and the stratosphere and think that he had established anything useful.
GCMs predict cooling of the stratosphere as a consequence of increasing concentrations of greenhouse gases. On top of that, there is the additional anthropogenic cooling of the stratosphere due to buildup of CFCs and the resulting loss of stratospheric ozone. So we have strong anthropogenic cooling in the stratosphere and anthropogenic warming in the troposphere, but we can’t average them together and say that nothing’s happening.
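The averaging fallacy is easy to demonstrate with made-up numbers: a warming troposphere and a cooling stratosphere can average out to nearly nothing, a figure that describes neither layer.

```python
# Illustrative only: the trend values below are invented to show why
# averaging layer trends of opposite sign is misleading. They are not
# the actual MSU numbers.
trends = {
    "lower troposphere": +0.13,    # deg C/decade, warming
    "upper troposphere": +0.05,    # warming
    "lower stratosphere": -0.35,   # strong anthropogenic cooling
}

average = sum(trends.values()) / len(trends)
print(f"layer average: {average:+.3f} deg C/decade")
# The near-zero average hides both a real warming and a real cooling signal.
```

Each layer's trend is a physically meaningful quantity; their unweighted mean is not.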
The simplest course is to simply recognize that Lindzen’s recent statements about satellite data are false and ask him whether he was aware they were false when he made them.
Best regards.
Jim Dukelow