It’s not often that blogs come up in congressional hearings, but RealClimate was mentioned yesterday in the Energy and Commerce hearings on the ‘Hockey Stick’ affair. Of course, it was only to accuse us of being part of a tight-knit social network of climate scientists, but still, the public recognition is nice.
There is much that could be said about the hearings (and no doubt will be) and many of the participants (Tom Karl, Tom Crowley, Hans von Storch, Gerry North) did a good job in articulating the big picture on climate change independently of the ‘hockey stick’ study as we’ve highlighted before. But it seems to us that there was a missing element in the discussions. That element was the direct implication of the critique that was the principal focus of Wegman’s testimony and that was mentioned periodically throughout the day.
Wegman had been tasked solely to evaluate whether the McIntyre and McKitrick (2005) (MM05) criticism of Mann, Bradley and Hughes (1998) (MBH) had statistical merit. That is, was their narrow point on the impacts of centering on the first principal component (PC) correct? He was pointedly not asked whether it made any difference to the final MBH reconstruction and so he did not attempt to evaluate that. Since no one has ever disputed MM05’s arithmetic (only their inferences), he, along with everyone else, found that, yes, centering conventions make a difference to the first PC. This was acknowledged way back when and so should not come as a surprise. From this, Wegman concluded that more statisticians should be consulted in paleo-climate work. Actually, on this point most people would agree – both fields benefit from examining the different kinds of problems that arise in climate data compared with standard statistical problems and coming up with novel solutions, and like most good ideas it has already been thought of. For instance, NCAR has run a program on statistical climatology for years, and the head of that program (Doug Nychka) was directly consulted for the Wahl and Ammann (2006) paper.
But, and this is where the missing piece comes in, no-one (with the sole and impressive exception of Hans von Storch during the Q&A) went on to mention what effect the PC centering changes would have had on the final reconstruction – that is, after all the N. American PCs had been put in with the other data and used to make the hemispheric mean temperature estimate. Because, let’s face it, it was the final reconstruction that got everyone’s attention. Von Storch got it absolutely right – it would make no practical difference at all.
This is what MBH would have looked like using centered PC analysis:
Red is the original MBH emulation and green is the calculation using centered PC analysis (and additionally removing one of the less well replicated tree ring series). (Calculations are from Wahl and Ammann (2006), after their fig. 5d). Pretty much the same variability and the same ‘hockey stick’. We’d be very surprised if anyone thought that this would have made any difference to either the conclusions or the subsequent use of the MBH results.
In fact, it’s even simpler: throw out the PC analysis step completely, and what do you get?
Again, red is the original MBH98 multiproxy+PC analysis, green is if the raw data are used directly (with no PC analysis at all). (This comes from Rutherford et al (2005) and uses a different methodology – RegEM – to calibrate paleoclimate proxy data against the modern instrumental record, but that doesn’t make any difference for this point).
Why doesn’t it make any difference? It’s because the PC analysis was used to encapsulate all of the statistically relevant information in the N. American tree ring network, and so whatever patterns are in there will always influence the final reconstruction.
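To see the point in miniature, here is a toy numpy sketch (entirely our own illustration – the synthetic proxy network, the signal, and the retained-PC count are invented, and this is not the MBH code): once enough PCs are retained to span the network, calibrating against centered or uncentered PCs yields essentially the same reconstruction.

```python
import numpy as np

rng = np.random.default_rng(42)
n_t, n_p = 300, 40
signal = np.cumsum(rng.normal(size=n_t))            # a shared "climate" signal
# Synthetic proxy network: each proxy records the signal plus noise
X = np.outer(signal, rng.normal(size=n_p)) + rng.normal(size=(n_t, n_p))

def pc_scores(data, k, center=True):
    """Leading k principal component time series of a proxy network."""
    d = data - data.mean(axis=0) if center else data
    u, s, _ = np.linalg.svd(d, full_matrices=False)
    return u[:, :k] * s[:k]

def recon(scores, target):
    """Least-squares 'calibration' of the target against the retained PCs."""
    coef, *_ = np.linalg.lstsq(scores, target, rcond=None)
    return scores @ coef

r_centered = recon(pc_scores(X, 5, center=True), signal)
r_uncentered = recon(pc_scores(X, 5, center=False), signal)
corr = np.corrcoef(r_centered, r_uncentered)[0, 1]
print(round(corr, 3))  # essentially 1: the two reconstructions coincide
```

The individual PCs do differ between the two conventions (that is MM05's arithmetic point), but the subspace they span, and hence the calibrated reconstruction, barely changes.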
So what would have happened to the MBH results if Wegman and his colleagues had been consulted on PC centering conventions at the time? Absolutely nothing.
Can we all get on with something more interesting now?
Mark Shapiro says
Once again, Lynn V. has a wonderful insight (# 47, # 48). Yes, RC is a science site, but the “A” in “AGW” does indeed stand for anthropogenic. So while RC focuses appropriately on the physical science of climate, the sciences of people’s attitudes, motivations, and behavior are not only important, but necessary if we want to change course.
I will never tire of Lynn reminding us that rather than imposing cost or hardship, conservation and efficiency actually make you wealthier. But I must admit that my favorite post from Lynn, in replying to Crichton’s book, was that we should move to clean energy out of a “STATE OF LOVE”.
shargash says
Re: 13 & 39
The old Soviet Union was an ideological state that filtered and twisted science to support its ideology (q.v. Lysenko). In that regard it was remarkably similar to the present US.
Eric says
James (#49): have you considered the increase in OH in your scenario? http://www.igac.noaa.gov/newsletter/21/oh_modeling.php
Also, can you provide links on the amount of methane, rate of release, and change in rate of release? Thanks.
Chuck Booth says
Re#44 Here are some references (abstracts are, I think, available for free at the Science web site (www.sciencemag.org):
Tropical Climate at the Last Glacial Maximum Inferred from Glacier Mass-Balance Modeling
Steven W. Hostetler and Peter U. Clark
Science 1 December 2000 290: 1747-1750
Early Local Last Glacial Maximum in the Tropical Andes
Jacqueline A. Smith, Geoffrey O. Seltzer, Daniel L. Farber, Donald T. Rodbell, and Robert C. Finkel
Science 29 April 2005 308: 678-681
Near-Synchronous Interhemispheric Termination of the Last Glacial Maximum in Mid-Latitudes
Joerg M. Schaefer, George H. Denton, David J. A. Barrell, Susan Ivy-Ochs, Peter W. Kubik, Bjorn G. Andersen, Fred M. Phillips, Thomas V. Lowell, and Christian Schlüchter
Science 9 June 2006 312: 1510-1513
Early Warming of Tropical South America at the Last Glacial-Interglacial Transition
G. O. Seltzer, D. T. Rodbell, P. A. Baker, S. C. Fritz, P. M. Tapia, H. D. Rowe, and R. B. Dunbar
Science 31 May 2002 296: 1685-1686
The Role of Ocean-Atmosphere Interactions in Tropical Cooling During the Last Glacial Maximum
Andrew B. G. Bush and S. George H. Philander
Science 27 February 1998 279: 1341-1344
Rapid Changes in the Hydrologic Cycle of the Tropical Atlantic During the Last Glacial
Larry C. Peterson, Gerald H. Haug, Konrad A. Hughen, and Ursula Röhl
Science 8 December 2000 290: 1947-1951
Stormy says
From 2002 until this year, NASA’s mission statement, prominently featured in its budget and planning documents, read: “To understand and protect our home planet; to explore the universe and search for life; to inspire the next generation of explorers … as only NASA can.”
In early February, the statement was quietly altered, with the phrase “to understand and protect our home planet” deleted. In this year’s budget and planning documents, the agency’s mission is “to pioneer the future in space exploration, scientific discovery and aeronautics research.”
http://www.nytimes.com/2006/07/22/science/22nasa.html?ei=5094&en=7a71420a9103fea3&hp=&ex=1153627200&adxnnl=1&partner=homepage&adxnnlx=1153543120-I5g0T4aFitiKrXZazUNXdw
I have a hard time having any hope. Sorry that this is off topic.
Eric says
This is somewhat interesting regarding Sen. Inhofe basically refuting all of global warming and saying that Al Gore is “full of crap”.
http://thinkprogress.org/2006/07/21/inhofe-gore/
This guy needs to stick to politics and let the scientists sort it out. His whole idea of the medieval warm period being warmer than now – hasn’t that been totally refuted?
Jeffrey Davis says
No he didn’t. I suggest you read Hansen’s actual remarks, rather than someone else’s partisan pseudo-interpretation.
Hansen’s measured remarks are actually encouraging reading. Thanks. I expected his words — given his notorious remarks about sea level change — to have had a Chicken Little tone. Instead, he made the prospect of curtailing disaster seem not only possible but likely.
cp says
I was thinking that people focus too much on what will happen to the climate, plants and wildlife, and that this is not the right approach. No politician or CEO who takes themselves seriously will pay attention to that. Maybe someone should focus instead on the financial impacts of climate change. That might make a difference.
Dadoo says
So, I have to ask: when we have two seemingly reputable scientists who disagree, how do we know who’s right?
http://www.opinionjournal.com/extra/?id=110008220
This guy has been writing for WSJ, once a month or so, trying to discredit Global Warming. Since he’s a Professor of Atmospheric Sciences at MIT, people tend not to believe me, when I tell them he doesn’t know what he’s talking about.
[Response: Ah, but he does know what he’s talking about. It’s just that he uses his knowledge to confuse rather than enlighten. You have to parse what he writes very carefully to see how he gives the impression of saying something without actually saying it. If it was just one scientist versus another, then your friends would be right – you wouldn’t be able to work out who was right. However, it’s more like all the other scientists against one, and then, it’s much clearer where ‘truth’ is likely to be. So point to statements by the National Academies of all G8 countries, point to the NAS 2001 report – on which Lindzen was an author, point to the IPCC – possibly the most well reviewed scientific document in history, and point to critiques of Lindzen’s statements here and elsewhere to see if they really stand up to scrutiny. -gavin]
Juola (Joe) A. Haga says
In respect to cp’s comment in #58, it might help to clarify thinking a little if we withdrew ourselves a bit and looked upon our ilk as a natural process which has been continuing for 200 thousand years. Within that stream, knowing and planning by talking are natural processes needed for that essential natural process, eating. To take one more step in the natural process of walking into the next instant, the natural process of believing or trusting contributes. Some will believe that 535 rule-makers can make plans for the planet. Some will not and will seek out ways of rule-making in which they can believe more. There is no use, I believe, in not being very cautious in thinking, talking and doing.
Martin Lewitt says
Re: 45, John, there were lots of hyperbolic and fearmongering moments in the Discovery Channel show. Hansen had his share of hyperbole: “99.9% of scientists say that we basically understand what is going on – the science is overwhelming.” Right after that, the show discusses the Hadley model, with animated natural and anthropogenic lines progressing across a time chart, the anthro line rising with the temperature and the natural line not as much. IPCC diagnostic studies document lots of errors in the models; one of them, Roesch (2006), shows that model to have a globally averaged, annual positive surface albedo bias – note this is a bias against natural solar forcing.
Eyeballing the chart (at 800% zoom), it looks like the UKMO-HadCM3 has a surface albedo for the period studied of 0.132. This is better than the all-models average of 0.14, but still significantly higher than the satellite observations of 0.124 and 0.121. Applying surface solar fluxes to the errors of 0.008–0.011, we get globally averaged flux errors of 1.33–1.8 W/m^2. This is larger than the current globally averaged flux into the ocean for even the warmest recent years (less than 1 W/m^2 per Hansen).
Do you really think 99.9% of scientists, or Hansen for that matter, think that in the nonlinear climate system we can accurately attribute a net flux into the oceans of less than 1 W/m^2 with an error in a key relevant component larger than that? Presumably, in order to attribute this figure between natural (most likely solar) and anthropogenic, even qualitatively, we would like the accuracy to be at least within 0.1 to 0.15 W/m^2. We are an order of magnitude away from that, even with a better-than-average model.
Note, both figures are fluxes at the surface. The Roesch paper was also discussed in this other thread. https://www.realclimate.org/index.php/archives/2006/07/the-discovery-of-global-warming-update/#comment-15709
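[As a sanity check on the arithmetic above: the quoted flux errors follow from multiplying the albedo biases by a globally averaged surface solar flux of roughly 166 W/m^2 – that flux value is our assumption, back-derived from the numbers quoted, not stated in the comment.]

```python
# Back-of-the-envelope check of the albedo-bias -> flux-error numbers.
surface_solar_flux = 166.0   # W/m^2, assumed global annual mean at the surface
albedo_model = 0.132         # UKMO-HadCM3, as eyeballed from the chart
albedo_obs = (0.124, 0.121)  # the two satellite estimates cited

flux_errors = []
for obs in albedo_obs:
    bias = albedo_model - obs
    flux_errors.append(bias * surface_solar_flux)
    print(f"albedo bias {bias:.3f} -> flux error {bias * surface_solar_flux:.2f} W/m^2")
# prints roughly 1.33 and 1.83 W/m^2, matching the quoted 1.33-1.8 range
```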
Without premature use of models not yet ready for meaningful attribution studies, the hockey stick is just as supportive of natural warming due to historically high levels of solar activity as it is of anthropogenic warming. With solar attribution, the likelihood, based on both the solar conveyor theory and Solanki’s study of the paleo records, is that solar activity will lessen in the first half of this century.
Roesch (2006) http://www-pcmdi.llnl.gov/ipcc/abstract.php?ipcc_publication_id=36
George Landis says
I have always found Dr. Lindzen’s critiques and analysis to be very objective and thought-provoking, unlike Dr. Hansen’s or Dr. Pielke’s etc. I hope some of you actually read the Wall Street Journal; Ms. Oreskes had a letter to the editor today railing about Lindzen in a “how dare he question my work” type attitude. After all, she says, my work was peer reviewed in “Science”, the leading peer reviewed science journal in the U.S. (and, as we all know, it selects articles for sales value like Newsweek and brought us the famous Korean stem cell fraud scandal, which was also peer reviewed).
She also identified herself as an “historian”, I think some thought her to be a scientist due to her background, but she has given that up it seems. She violates one of the cardinal rules of the scientific method (objectivity towards the outcome of an hypothesis), by stating that she never said the science was settled in her study, but then goes on at the end of the letter to challenge the business community to take action immediately to stop it. Very odd for a scientist, but understandable for an historian, who needs very little other than their own opinions to determine proof and fact.
And BTW, I did read that Hansen said that, it is here:
http://www.sciam.com/media/pdf/hansen.pdf
Look near the top of page 30, I can’t understand why people try to deny he said it, or maybe I can, given the politics of this issue.
[Response: Just a problem of your reading comprehension then… And your characterisation of the Oreskes study is way off base. She asked the question whether there was evidence in the published literature for the existence of a scientific consensus, and in taking a reasonably large sample and not being able to find a single contrary peer reviewed article, she correctly deduced that there can’t be much of a ‘contrary’ literature. No surprise there. – gavin]
John L. McCormick says
RE# 51
I too enjoy, at times, the wisdom of Lynn. And, I too struggle to conserve my energy use and practice other eco-habits out of a spirit of love.
But, the modern American lifestyle and a warmer climate are not helping me save energy. My home has been invaded by the blinking green lights of electric stuff my teenage son and I have purchased. Maybe 7 percent of our electric bill goes into standby power, and some of that power demand (air conditioning) is inelastic, at least by the rules my family has set down.
Cutting energy consumption, in much of America, is not an end in itself and we all must understand that fact.
Electric current is not the problem. It is how that current is generated that we have to keep clearly in focus.
I live in Northern Virginia and my electricity is delivered by a Virginia power company but only through its wires. The actual kilowatts come from any plant in what is called the PJM or the Pennsylvania-Jersey-Maryland Interconnect, the power control area for 51 million people in 13 states and the District of Columbia.
The electricity essentially is dispatched from PJM regardless of the state in which it was generated. PJM ensures that there is enough power to meet expected customer electricity demand at all times, and that an additional reserve margin above the peak demand is ready and deliverable in the control area.
PJM determines the power demand for the next day and invites power stations to bid electricity at a price to serve that demand. The lowest bids get invited into the power grid first then more expensive power is committed until the projected demand is covered by power generators now contractually obligated to feed electricity into the grid.
How does PJM dispatch electricity over the day?
Conventional nuclear or fossil fuel power plants are called on first because of their relatively low cost to operate and their ability to deliver power into the grid at all times; these are called base load plants. Other plants operate as “spinning” reserves waiting to be called on by PJM as the load increases during the day. They are backed off as the load decreases at the end of the day. Most natural gas combined cycle plants operate in this manner because they have higher operating costs and can deliver energy more quickly when called on by PJM. PJM ensures the lowest cost electricity is dispatched first, ensures the reliability of the electric grid, and monitors the market to prevent market power/manipulation.
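[The dispatch logic described above is just a merit-order stack, which can be sketched in a few lines of Python. The plant names, capacities and bid prices below are invented for illustration; they are not actual PJM bids.]

```python
def dispatch(bids, demand_mw):
    """Commit the cheapest bids first until projected demand is covered.

    bids: list of (name, capacity_mw, price_per_mwh) tuples.
    Returns the committed plants with the MW actually taken from each.
    """
    committed, served = [], 0
    for name, cap, price in sorted(bids, key=lambda b: b[2]):
        if served >= demand_mw:
            break
        take = min(cap, demand_mw - served)
        committed.append((name, take, price))
        served += take
    return committed

# Hypothetical bid stack: base load bids low, gas and peakers bid high.
bids = [("nuclear", 40_000, 12), ("coal", 60_000, 25),
        ("gas_cc", 45_000, 55), ("peaker", 20_000, 110)]
plan = dispatch(bids, demand_mw=139_746)
print(plan)  # nuclear and coal fully committed; gas covers the remainder
```

The last (marginal) bid taken is what sets that day’s floor price – which is the point made later in the thread about gas pushing up prices whenever it is needed.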
On July 17, PJM customers set an official record for peak electricity use of 139,746 megawatts (one megawatt of electricity is enough to power 800 to 1,000 homes). Since PJM territory partners have about 165,000 megawatts of power generating capacity on hand, PJM had no difficulty meeting the demand. So, PJM used 85 percent of its generating capacity in a market territory expanding in population and experiencing longer, hotter summers.
If every PJM customer used 10 or 20 percent less energy, the same mix of power plants would still be put into service and those are primarily nuclear and coal plants and some hydro. Natural gas plants would be used if needed but their higher fuel costs would push up the floor price of electricity for that day. Wind would have the chance to bid its services as would solar (at least when the sun shines).
The PJM Interconnection is not unique. Electricity deregulation caused the formation of other independent system operators (ISO) that pool generation and dispatch lowest cost power to their territory customers. New England, New York, California and the Midwest also operate under ISO control.
Yes to electricity and petroleum conservation. But, there is a lot more behind that golden door than the public is aware of or policy makers are willing to sort out.
Nuclear and coal plants are the machines that drive the American economy — like it or not. As coal and nuclear plants age, — and believe it, a huge number are already past normal retirement age – they will not easily be replaced by cleaner, lower CO2-emitting plants because that fuel is becoming scarce (unless you like the idea of importing liquid natural gas from Algerian ports). New coal plant plans and construction are now the first option of the power industry.
Wow, we are failing our children; all of whom we profess to LOVE.
John Donohue says
RE: 50 Ron Taylor,
I am not anywhere so forgiving. If it was a glitch, then it is utter incompetence. And if the point itself was “irrelevant” as you claim, why have that person tromping through the ice melt to make that comment as the lead ‘fact’ of the show?
I doubt it was a glitch. It is a piece of misinformation scripted and left in the final cut, and it gives Brokaw plausible deniability; after all it is TRUE that things were a lot colder 20,000 years ago! But the impression left behind is that ‘there is something VERY wrong, things should not be melting like this.’
Remember, this is a “consumer level” show on Discovery; it moves very fast, and all the “average person” gets from that moment, right at the top of the show, is that ‘the ice is melting all around me and the water is raising the sea and it was a lot colder up here just a little while ago.’
I have at least 10 other points in that show that made me snarl, I just don’t have time to address them now. I taped the show and will parse later.
Sorry to be off-topic a little, but all back and forth on AGW, including the Wegman hearing and the upcoming IPCC report are ‘of a piece’, no?
Alan - RE #33 says
If you pull an ostriches head out of the sand, will it attack you?
John Donohue says
RE: 61 Martin Lewitt
“the Hadley model, with animated natural and anthropogenic lines progressing across a time chart, and the anthro rising with the temperature and the natural not as much.”
You’ve pointed to the one item that got my attention as potentially significant. If that thing is actually fair, then it is persuasive for AGW. As part of the ‘loyal opposition’ to the AGW consensus, but not as knowledgeable as you obviously are, I need to pursue the validity of that “divergence” graph. I have a tape of the show; I have not been to the Discovery Channel website to see if the chart and its animation and data/basis are there.
You’ve provided an excellent critical contention against it with your incisive objection and supporting links…so thank you. I’ll go down that street.
Hank Roberts says
>43, 62, George Landis
Read this, from Hansen. I agree you haven’t understood it. He wrote there:
“Summary opinion re scenarios. Emphasis on extreme scenarios may have been appropriate at one time, when the public and decision-makers were relatively unaware of the global warming issue, and energy sources such as ‘synfuels’, shale oil and tar sands were receiving strong consideration.”
Now, read this page I’ve excerpted below (or at least read the snippet I copied here for you) — please. The scenarios were from the energy industry and government — assuming using all the coal, synfuels, shale oil and tar sands. Hansen calls them ‘extreme’ — he didn’t originate the assumptions, he started working out what could happen if that much fossil fuel was burned up in the last few decades. That was when he and other people first started looking at the consequences. The consequences were, yes, extreme.
Gavin means, I think, that comprehension requires context. You need to know, or find out, what Hansen’s talking about to know what he meant.
I used Google, looked for information about, in Hansen’s words, a “time, when … energy sources such as ‘synfuels’, shale oil and tar sands were receiving strong consideration” and I found: Results …about 295 for energy projection synfuel tar
This all may be from years before you were born; we can’t assume you know the context. We can try to help you look it up.
Here’s just one example from 1982 — back when energy use was predicted to go up, and up, and up, as Hansen describes:
http://sedac.ciesin.columbia.edu/mva/ERMUM/ERMINTRO.html
“One of the major issues in long-term energy analysis is the magnitude of energy supply. The model has two classes of energy supply …. The second class, which includes unconventional oil, unconventional gas, coal, … is considered resource unconstrained. That is, the amount of the resource, relative to potential demand is sufficiently large that for practical purposes, resource size alone does not constrain the rate of production.
“….
“Shale oil and tar sands.
“… unconventional sources of oil. There is far less controversy surrounding resource estimates for shale oil and tar sands. These sources are known to be in massive global supply. … in a full accounting of energy demand.”
I hope this is helpful. You can ask your school librarian or public library reference desk for help locating more of the relevant energy predictions from the 1950s, 1960s and 1970s — before 300 baud modems — mostly only available in paper form. There are reference indexes for books and magazines at the library.
Don’t give up. You can understand this stuff, in context, by looking for it.
Mark A. York says
“scientific community has come to a consensus, you just don’t want to believe it”
I should have put the clip in quotes. That was Noonan speaking after my intro. She can’t handle the truth and clearly doesn’t know what the story is.
Mark Shapiro says
RE # 63 –
John McCormick gives a good quick overview of how electricity gets to all our computers (and AC, lights, etc). It’s also a great place to show how well efficiency and renewables can help. Start with the 7 percent of his bill going to standby. Cutting that in half across all of PJM’s 51 million customers cuts baseload by more than 4,000 megawatts (MW). That’s about 4 large (1,000 MW) power plants. Energy efficiency technology has improved in all the big use categories: lighting, refrigeration, and air conditioning, but implementation lags. Simple insulation alone reduces energy demand in summer and winter, plus it makes us more comfortable.
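[The 4,000 MW figure is easy to reproduce, taking PJM’s cited 139,746 MW peak as the load and the 7 percent standby share at face value – both are assumptions, scaled up from the single household quoted upthread.]

```python
pjm_load_mw = 139_746      # the July 17 peak cited in comment #63
standby_fraction = 0.07    # assumed share of load idling in standby
savings_mw = pjm_load_mw * standby_fraction / 2   # cut standby in half
plants_equiv = savings_mw / 1_000                 # in 1,000 MW plant units
print(round(savings_mw), round(plants_equiv, 1))  # ~4891 MW, ~4.9 plants
```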
Solar is intermittent, but it supplies power when demand is highest, a perfect fit. This reduces the need for those expensive peaking plants, and the gas that they burn.
The choices are ours to make. They are individual choices, but good policy can steer us to better choices, through efficiency R&D, information, standards, and (dare I say it) tax policy.
And if you think it’s just me, (and Lynn, and Amory Lovins at RMI, etc) talking efficiency, read this testimony by a Walmart executive in those other House hearings on Thursday:
http://reform.house.gov/UploadedFiles/Wal-Mart%20-%20Rubens%20Testimony.pdf
Walmart’s goals include:
To be supplied 100 percent by renewable energy.
To create zero waste.
Walmart!
isaac held says
Re 61: Martin. A common misconception is that a climate model must confidently generate the energy balance of the earth to within 1 W/m2 before we can even begin to use it to discuss the sensitivity of the climate to a perturbation of 1 W/m2 in the forcing. Suppose that we are interested in the response to a perturbation of 0.1 W/m2; do we need a model with this level of accuracy in its energy fluxes before we can trust the model? Or 0.01 W/m2? Where does this end? It might have occurred to you that scientists who use models in discussions of climate sensitivity must have thought about and rejected precisely the idea upon which your comment is based. Why do we reject this argument?
Sometimes an equation is worth a thousand words:
S(1-a) = A + BT.
The left hand side is the absorbed solar flux and the right hand side the outgoing infrared flux. S is the incident solar flux averaged over the Earth, a is the planetary albedo, T is the temperature, while A and B are constants obtained by linearizing the infrared flux’s dependence on temperature. Increasing carbon dioxide increases A; water vapor feedback decreases B, etc. The sensitivity of T to a perturbation dS in the solar flux S is dT = (dS)(1-a)/B. If the albedo is actually a + da, then the sensitivity is instead dT =(dS)(1- a – da)/B. This is a small difference in sensitivity if the albedo error is relatively small, even though the sensitivity of T itself to the error da, which is -(da)S/B, might be comparable to the temperature change (dS)(1-a)/B that one is interested in! I hope this makes sense to you. This linear equation may seem naive, but it is not a good idea to worry about nonlinearity until one has a good grasp of this simplest linear energy balance perspective.
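[Plugging rough standard numbers into Held’s linear balance makes the point vivid – S, a and B below are our assumed illustrative values, not taken from the comment: an albedo error of 0.01 shifts the model’s temperature T by more than a degree, yet shifts the sensitivity dT to a 1 W/m^2 forcing by only about one percent.]

```python
# Linear energy balance S(1-a) = A + BT, per the comment above.
S = 342.0    # W/m^2, incident solar flux averaged over the Earth (assumed)
a = 0.30     # planetary albedo (assumed)
B = 2.0      # W/m^2 per K, linearized infrared response (assumed)
da, dS = 0.01, 1.0                   # an albedo error and a 1 W/m^2 forcing

sens_true = dS * (1 - a) / B         # dT with the correct albedo
sens_biased = dS * (1 - a - da) / B  # dT with the biased albedo
temp_shift = da * S / B              # how far the error moves T itself

print(round(sens_true, 3), round(sens_biased, 3))  # 0.35 vs 0.345 K: ~1% change
print(round(temp_shift, 2))                        # 1.71 K: as big as the signal
```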
As a secondary comment on your remarks, you quote numbers that evidently refer to surface albedos, but it is the planetary albedo, not the surface albedo, that is most closely related to climate sensitivity (see Ray’s Nov 2005 piece: A busy week for water vapor). A model that has too large a planetary albedo will have a surface that is too cold on average (holding everything else fixed) even if it has too small a surface albedo.
Martin Lewitt says
Re: 71, Isaac, I wasn’t talking about sensitivity, other than to point out that the positive albedo biases of the AR4 models reduce their sensitivity to solar.
A model, however, does need to balance the energy budget to less than 1 W/m^2 to get the climate commitment right, in a climate such as ours where the net heat flux into the oceans is also less than 1 W/m^2. A model that is throwing away over 1 W/m^2 of solar via a globally averaged annual albedo bias at the surface, and yet still “correctly” balances the energy budget, reproduces the temperatures well, and demonstrates a realistic level of unrealized climate commitment, must be making compensating errors elsewhere. You are right that the compensating error may be in the planetary albedo. But the compensating errors may be in other feedback mechanisms or forcings, or, given the variance in model sensitivities to CO2, it is likely to be in increased sensitivity to CO2 in at least some of the models.
In a non-linear system, it would be surprising if the compensating errors, left the model in a state useful for attribution, climate commitment or credible predictions.
Hank Roberts says
>43, 62, 68
Oh. You don’t have a problem with reading comprehension, you have bunk sources.
I notice belatedly that that’s one of Inhofe’s attack PR pieces, debunked at Scientific American:
http://blog.sciam.com/index.php?title=half_baked_smears_against_climatologists&more=1&c=1&tb=1&pb=1
“Incorporating a strategy hatched by the committee’s new communications director, Marc Morano” – formerly with Rush Limbaugh, an early Swift Boat propagandizer, who sent them the press release personally. Proud of himself:
http://blog.sciam.com/index.php?title=senator_inhofe_s_pet_weasel&more=1&c=1&tb=1&pb=1
John L. McCormick says
RE #70
Mark, I am heartened by your optimism.
I want to reiterate the central point in #63. Granted, not all US electric customers are dependent upon a conglomerate of electric power generators to supply power to blinking-green-lighted electronic stuff. But, Northern Virginians are locked into receiving the cheapest power available to our centrally dispatched power control system, the PJM (see #63 for explanation of PJM).
When I am forced to go deeper in debt (and sooner than I can really afford) to replace my leaking electric water heater it will be the most efficient available. Good news. It will also be powered by the cheapest kilowatt-hour of electricity the PJM can make available to my local power distributor. That means my efficient water heater will very likely be using a majority of electricity produced in coal-burning power plants. Bad news.
That is the dark side of electricity deregulation.
The electric utility commission, in states that deregulated electricity, has little to no power to command what plants will be built where. Those non-utility plants are electron factories owned by investors, i.e., stock holders.
Electric utility deregulation is the deep pot hole Amory Lovins and others have yet to map a detour around. Some electric distributors offer packages of renewable energy to eco-conscious customers. The reality of that green option is that the power came from whatever sent that electron into the wires.
If I wanted only solar power 24/7, my night time reading would be done under an oil lamp.
Please excuse the details. Nothing in life is perfect. Why did we expect utility deregulation would open the gates for renewable energy?
Dereg put a premium on the lowest cost kilowatt-hour of electric power.
And, that is making electric power conservation a difficult investment for homeowners and apartment managers who would do it for climate change reasons. Their capital investment would not shut down that low cost coal plant.
Ron Taylor says
Re #64
John, I agree that Brokaw’s statement about Patagonia being much warmer than 20,000 years ago was misinformation. It does not even come close to describing the seriousness of the situation. Try this Science Daily report from NASA-JPL for a much fuller picture.
http://www.sciencedaily.com/releases/2003/10/031017074133.htm
Halldor Bjornsson says
The statistical arguments against MBH98 have never been convincing. What matters is the amount of variance explained (or how many PCs are significant), not how the shape of the first PC will change by centering. This is discussed elsewhere on this site, and the data are freely available…
It is therefore unbelievable how much airtime the “controversy” has got.
In my opinion what is needed is not more discussion of statistics, but more proxy records. And records of differing type. Especially for the first half of the last millennium.
And also, better understanding is needed of the uncertainties associated with climate reconstructions for each type of proxy (see #17 above).
Perhaps these hearings will lead to Congress increasing funding for paleoclimatic research. Or maybe they will just be happy with this theatre of the absurd.
Mark A. York says
I lived in Alaska for two years with no electricity or running water. I read by Coleman lantern and rejected avgas from helicopter operations in the area. It’s not for the faint of heart.
http://www.amazon.com/gp/product/0595219977/ref=lpr_g_1/102-6157309-8846513?s=books&v=glance&n=283155
John Donohue says
RE: 64 Ron Taylor,
No, the misinformation is not, as you stated, that the situation is worse than implied. Specifically, and pointedly, it was a dishonest comparison. I guess I’d call it disinformation.
The show was emphatically about “Global Warming” which absolutely has the connotation of AGW, rapid recent warming over the last 150 years.
Meanwhile, the comparison, fired off casually and rapidly, was between now, at an up-swing fluctuation of an interglacial, and 20,000 years ago, when the earth was in the grip of a strong glaciation. The difference in conditions, obvious to us but not to the “consumer” of pop-science on Discovery, is necessarily extreme. Therefore you get both plausible deniability and the ability to enthuse the speaker with passion, plus the clear implication that the passionate statement referred to scary, rapid change from AGW over the last few decades.
Why didn’t Brokaw have his speaker say instead “Wow, it’s a lot colder here now than it was 8000 years ago, when the hottest point of this interglacial struck. You should have seen the water running off this glacier back then.”?
My contention, not proven, is that the honest statement “Wow, it’s warmer here than it was 80 years ago, but that’s just the normal ebb and flow of this glacier during an interglacial” would not have supported the advocacy of the program.
Meanwhile the link you sent me to has no science by itself; it is a synopsis of the well known Rignot study. That study is not linked, although there is a link to NASA at the bottom.
Have you vetted the Rignot study to get satisfaction that a comparison of surface topographical information could actually be joined with space shuttle data in a useful way? I’d be more interested in a new set of shuttle data taken under intensely controlled identical conditions to the 2000 mission and a comparison of the two.
Jeffrey Davis says
Look near the top of page 30, I can’t understand why people try to deny he said it, or maybe I can, given the politics of this issue.
Hansen said that the former emphasis on extreme outcomes may have helped bring about attention etc. You cannot make that mean that Hansen is currently encouraging emphasis on extreme scenarios. Particularly when so much of the work you’re quoting from is encouraging a realistic and measured approach to the problem.
Technically, you have taken a sentence in the subjunctive mood about an action in the past and turned it into a statement in the declarative mood about a present action. If he’d written, “Those tracks may have been made by a lion”, that wouldn’t mean that there was a lion outside now. And it certainly wouldn’t mean that he’s set a lion loose to make the tracks.
caerbannog says
The statistical arguments against MBH98 have never been convincing. What matters is the amount of variance explained (or how many PCs are significant), not how the shape of the first PC will change by centering. This is discussed elsewhere on this site, and the data are freely available…
Indeed.
Download http://www.climate2003.com/pdfs/2004GL012750.pdf and take a gander at figure 1. The upper plot in figure 1 is one of the hockey sticks that M&M “mined” from red noise. The lower plot is MBH’s hockey stick. They look remarkably similar, right? But look at the Y-axis scales — there’s nearly an order of magnitude of difference in the Y-axis ranges for the two plots. ‘Nuff said.
Armand MacMurray says
Re: #76
“…more proxy records. And records of differing type. Especially for the first half of the last millennium.
And also, better understanding is needed of the uncertainties associated with climate reconstructions for each type of proxy (see #17 above).”
Exactly right!
Ron Taylor says
Re 78
No, I have not vetted the Rignot study. That would be beyond my level of knowledge. I depend on the experts at RC for that sort of thing. However, I am not aware of any published work that has discredited Rignot’s results.
I could draw on my small-town roots in the rural Midwest to tell you that I believe you misunderstand the likely reaction of the average viewer to the 20,000 year reference. I can easily picture those people asking, “So what in the bleep does that have to do with us?” They would have no frame of reference to give meaning to such a time span.
Then I went back and reviewed the Patagonia part at the beginning of the Discovery Channel program. I realized that you had really misrepresented what was said. Here is the section in question.
Tom Brokaw: In the last seven years, these glaciers (in Patagonia) have lost 10% of their mass.
Switch to (scientist) Stephan Harrison: 20,000 years ago, this valley was much more covered in ice than it is now. The reason why we think climate change is so significant now, though, is that it is happening at a historically fast rate.
Assuming Brokaw’s factual statement is accurate, it seems quite meaningful. Harrison somewhat clumsily tosses in the 20,000 year reference, but then neutralizes it by explaining that it is the rapid change of recent years that is of concern. That is also relevant to the subject at hand.
I will have no further posts on this topic.
Eric Swanson says
Re: #80
For what it’s worth, one should note that M&M’s paper is actually doi:10.1029/2004GL021750, not the number given in the link. Another goof from the careful analysts, M&M??
Hank Roberts says
I think the House Energy Committee chairman on his website gave away the real purpose of this push — taking control of the actual research done by agencies away from the scientists and giving it to political managers.
Wegman, as quoted by Barton, on what statisticans do:
http://energycommerce.house.gov/108/News/07182006_1995.htm
“With clinical trials for drugs and devices to be approved for human use by the FDA, review and consultation with statisticians is expected. Indeed, it is standard practice to include statisticians in the application-for-approval process. …”
New England Journal of Medicine:
http://content.nejm.org/cgi/content/full/353/10/969
FDA Standards — Good Enough for Government Work?
Jerry Avorn, M.D.
“…there is one area of biomedicine in which the government allows — even defends — a minimal standard that would be unacceptable anywhere else in research. It is the set of evidentiary requirements maintained by the Food and Drug Administration (FDA) for the approval of new drugs.”
C. W. Magee says
re 74:
1. If you buy a solar/electric hot water system, it will only use off-peak, cheaper electricity, since all the daytime water heating will come from the sun.
2. The reason that coal is the cheapest power source is that its price does not include the cost of global warming. If coal had to pay into a compensation scheme, then nuclear, hydro, and renewables might be cheaper.
caerbannog says
For what it’s worth, one should note that M&M’s paper is actually doi:10.1029/2004GL021750, not the number given in the link. Another goof from the careful analysts, M&M??
I’m not a professional climatologist or statistician (not by a long-shot), but I’ve used SVD/eigenvector techniques fairly extensively in acoustic signal processing work. I find it hard to believe that anyone would put a singular vector with a small associated singular value on the same “numerical footing” as a singular vector with a large associated singular value.
How did a blooper like that get past the reviewer? (If I tried a stunt like that at a project review meeting, I’d leave the meeting sporting a brand-new posterior orifice!)
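The scaling point can be made concrete with a small toy decomposition (illustrative only; this is synthetic data, not the MBH or M&M matrices): each singular vector’s contribution to the data is weighted by its singular value, so a component with a tiny singular value is numerically negligible no matter what its shape looks like when plotted alone.

```python
# Toy SVD example: in X = U @ diag(s) @ Vt, each component's contribution
# to X is scaled by its singular value, so a component with a tiny s_k
# explains almost nothing, whatever its isolated shape suggests.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 200)
signal = np.outer(np.ones(30), t)            # 30 "records" sharing one trend
noise = 0.05 * rng.standard_normal((30, 200))
X = signal + noise
X -= X.mean(axis=1, keepdims=True)           # center each record

U, s, Vt = np.linalg.svd(X, full_matrices=False)
explained = s**2 / np.sum(s**2)
print("variance explained by PC1:", round(explained[0], 3))
print("variance explained by PC2:", round(explained[1], 3))
# PC1 carries nearly all the variance; PC2's shape is numerically
# irrelevant until multiplied by its (tiny) singular value.
```

This is the "numerical footing" point: ranking components by shape alone, without their singular values, treats the two very differently weighted vectors as if they were comparable.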
Chuck Booth says
Re: #84″With clinical trials for drugs and devices to be approved for human use by the FDA, review and consultation with statisticians is expected. Indeed, it is standard practice to include statisticians in the application-for-approval process. …”
The required (or expected) involvement of a statistician in biomedical research (i.e., in preparing grant proposals and papers for submission) is relatively new (past decade or two) and is a consequence of the fact that, historically, most biomedical scientists had little or no training in statistics. As a result, the quality of statistical analysis in journal articles and preliminary data for grant proposals was pretty bad. So, funding agencies and journal editorial boards had to beef up the quality of submitted work by requiring better statistical analysis (some biomedical journals include a statistician in peer review).
It is my impression that rigorous statistical analysis has long been the rule in the physical and natural sciences (at least in some fields of biology, though not all).
Leonard Evens says
Re #84 and #87:
It is important to keep in mind that not all statistics is the same. In biomedical applications, it is possible to visualize a drug trial or something similar in terms of simple models involving sampling from an urn with different colored balls. That means it is relatively easy to isolate the statistical aspects of the subject from those aspects which require an understanding of content. So a statistician aiding such a study need not be a biologist to contribute.
As far as I can tell, statistics in observational climatology arises in the context of time series. That subject is a bit murky, and it is not so clear what type of statistical model one should use. For example, consider the discussions in RC of white noise vs. red noise. It seems to me that a thorough understanding of the subject matter would be much more important.
It is also important to note that not all statisticians are the same. In a complex situation in climatology I would be most comfortable with someone who had both a deep understanding of the science and also was capable of going back to first principles to analyze just how to model the problem statistically, rather than someone who would just choose from a standard set of statistical tests. I don’t mean to downplay the difficulty of statistics, but from my personal experience, at least, I think it is considerably easier for a good climatologist to master statistics than for a good statistician to master climatology. I doubt that a statistician who doesn’t know the basics of climatology can contribute much to that subject.
Even in biostatistics, disputes can arise between scientists and (some) statisticians. This was highlighted in reports in the NY Times on the effect of supplements on bone fractures in women. In a large double-blind study, the effect on the total population was not statistically significant, meaning that any differences could have happened just by chance. That means that, subject to various assumptions which themselves can’t be established by statistics, there was a chance greater than a certain level, often 5 percent, that there was in fact no effect and the differences were due just to random variation in selecting the samples. But it happened that women who adhered strictly to the regimen, and also older women, did show a significant (in the same sense) effect. The hard-nosed statisticians objected, as they almost reflexively will in such a case, that you can’t look at subgroups after the test. There are two objections. First, you can’t be sure the subgroups were randomized even if the whole group was. Second, if you look at enough subgroups, you are very likely to find at least one that shows an effect. (ESP studies typically suffer from this type of problem.) Both these objections are purely statistical and not based on substance. But it seems to me the second objection is not so strong that it should override all other considerations. At the very least, if you find the effect in a relevant subgroup, it should suggest further studies. Also, the study population was not divided into a large number of subgroups. In particular, one of the subgroups, women over 60, would be a natural group to consider on substantive grounds.
Note by the way that about one in twenty studies found significant by the 5 percent significance level will in fact be wrong. Just how do you wrap your head around something like that? Perhaps I am wrong, but it has always seemed to me that practical use of statistics requires faith in some unprovable assumptions and the advantage a good statistician might have is that, understanding the theory behind it all, he/she may be more aware of that fact.
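The two purely statistical objections above can be illustrated with a quick simulation (hypothetical numbers, not the bone-fracture study): at the 5 percent level, roughly one test of a true null hypothesis in twenty comes out "significant", and peeking at many subgroups after the fact inflates that rate dramatically.

```python
# Sketch of two points: (1) at a 5% significance level, ~5% of experiments
# with no real effect come out "significant"; (2) testing many subgroups
# after the fact inflates that false-positive rate.
import random

random.seed(42)

def null_experiment(n=50):
    """Simulate a study whose true effect is zero: compare two samples
    drawn from the same distribution with a crude two-sample z-test."""
    a = [random.gauss(0, 1) for _ in range(n)]
    b = [random.gauss(0, 1) for _ in range(n)]
    mean_a = sum(a) / n
    mean_b = sum(b) / n
    se = (2 / n) ** 0.5          # known sigma=1, so SE of the difference
    z = (mean_a - mean_b) / se
    return abs(z) > 1.96         # "significant" at the 5% level

trials = 2000
false_positives = sum(null_experiment() for _ in range(trials)) / trials
print(f"single test:  {false_positives:.1%} false positives (expect ~5%)")

# Now peek at 10 subgroups per study, declaring success if ANY is significant.
subgroup_hits = sum(
    any(null_experiment(n=20) for _ in range(10)) for _ in range(trials)
) / trials
print(f"10 subgroups: {subgroup_hits:.1%} false positives")
```

With ten independent after-the-fact subgroups the chance of at least one spurious hit is 1 − 0.95¹⁰, roughly 40 percent, which is why "look at enough subgroups and you will find an effect" is a real objection even if it should not always override substance.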
Karen Street says
Off topic, but on a topic many have been discussing:
How do we fight the proposed coal power plants built without carbon capture and storage?
It makes sense for us to write our newspapers, and contact our legislators, pointing out the advantage of a system like California’s, where utilities are required, in making their plans, to assume an ever-increasing carbon tax. It makes sense to point out that the increased costs of mitigating greenhouse gases and adapting to (or just plain losing out to) climate change will swamp the small savings in electricity costs.
Other arguments? Other people to argue with, besides newspapers and legislators?
Is there any format here or elsewhere to discuss these kinds of questions?
I will post this question on my blog and perhaps people can respond there.
Roger Smith says
Re: John in #74, restructuring doesn’t necessarily put a premium on the cheapest generation (it would if markets were truly competitive, but they’re not), but rather incents generators to maximize the difference between the cheapest baseload generators (coal, nuclear) and the most expensive generators (natural gas), as the price is set by the marginal, or most expensive, generator. Their profit is the difference between the two.
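The marginal-pricing point can be sketched with a toy merit-order dispatch (all names and numbers made up for illustration): the clearing price is set by the most expensive generator needed to meet demand, and cheaper baseload pockets the gap.

```python
# Toy merit-order dispatch: generators are dispatched cheapest-first, and
# the clearing price is the marginal (most expensive dispatched) unit's cost.
generators = [  # (name, marginal cost $/MWh, capacity MW) -- made-up numbers
    ("coal",    20, 500),
    ("nuclear", 25, 400),
    ("gas",     60, 300),
]
demand = 1000  # MW

dispatched, remaining = [], demand
for name, cost, cap in sorted(generators, key=lambda g: g[1]):
    if remaining <= 0:
        break
    take = min(cap, remaining)
    dispatched.append((name, cost, take))
    remaining -= take

clearing_price = max(cost for _, cost, _ in dispatched)
print(f"clearing price: ${clearing_price}/MWh")
for name, cost, mw in dispatched:
    print(f"  {name}: {mw} MW dispatched, margin ${clearing_price - cost}/MWh")
```

In this sketch the gas unit sets the price, earns no margin, and the coal and nuclear plants collect the difference, which is the profit structure described above.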
Even within this framework, states like NJ are able to mandate that large amounts of renewables be part of the mix (20% by 2020 in NJ case), with further mandates for increasing amounts of solar. NJ and MD are also part of the Regional Greenhouse Gas Initiative which caps carbon dioxide pollution from power plants (10% cut by 2020).
You can definitely knock fossil fuel plants offline. In CA, they’re doing this through a “loading order” of efficiency first, renewables second, and combined-cycle natural gas third, both to meet daily power needs and for energy planning purposes. Maine and Rhode Island (both restructured) are also moving in this general direction by putting least-cost efficiency before new power plants, and mandating that the utilities do energy planning instead of just paying for more supply. Reducing demand changes the economics of energy and makes it unlikely you’ll get that new coal plant.
Green power options are also real in that you’re not paying more just for the electrons, but the premium is for the social and environmental benefit associated with the type of clean power. These voluntary programs help build the market for clean energy.
Lynn Vincentnathan says
RE #63, there are still lots & lots of things people can do to reduce their GHGs. I know an architect near Chicago who built a passive solar home — shrub-covered berms & small windows & garage on the north, large window/sliding doors on the south, letting sun shine on the brown tile floor, absorbing heat. He also did lots of other things: insulation, foam-insulated window shutters, deciduous shade trees on the south to cool the place in summer, and a small shed for a highly efficient gas house heater/water heater combo (when it’s not heating the house, it can heat water). The extra 5% he paid for his house, which uses a tiny fraction of the energy of other houses, paid for itself in about 15 years, & is going on to save money.
I have my TV & VCR on a strip, which I turn off, unless I’m recording something. Same with my computer — the strip goes off when not in use. I just feel blessed I’m on 100% wind power.
In the south, there are other ideas to reduce cooling costs (I think U of AZ school of architecture is into that).
The big problem is that most ideas for reducing GHGs are really small (but add up), & perhaps require more work & attention to ferret out & implement. But that shouldn’t stop such a great nation as ours (unless we go barton & throw up our hands in hopelessness).
Lynn Vincentnathan says
Anyone care to comment on a new study re methane from ocean hydrates perhaps going almost entirely into the atmosphere if they warm (rather than a large part dissolving into other stuff)?
See: http://www.climateark.org/articles/reader.asp?linkid=58603
Perhaps “hockey” is a bit too tame nowadays; maybe a “J” or shepherd’s-crook shape will prove more accurate. But by the time the science is in on it, we probably won’t be able to stop, much less reverse, it.
Barton should be investigating the runaway tipping point of no return. He might do better getting a few scientists on his side…& alerting the public about this real possibility of “hysteresis,” rather than generating all this negative publicity.
caerbannog says
Just a wee bit off-topic… but the LA Times has just published an op-ed written by Naomi Oreskes, author of the Science journal article BEYOND THE IVORY TOWER: The Scientific Consensus on Climate Change.
You can read a “fair use” (no hassles with registration) copy of Naomi’s op-ed at: http://www.commondreams.org/views06/0724-28.htm
Richard Simons says
Lynn’s comment about a passive solar home reminded me that in the 1960s a high school in England (St. Helens? Eccles? – somewhere between Liverpool and Manchester) was built to rely on heat produced by the lights and all the bodies. Initially it had a heating system but after a few years it was taken out as it had never been used. On cold mornings, the lights were turned on to warm the place up. Admittedly that area does not get anything like as cold as many places in the US but materials such as glass have improved a lot since then.
John Donohue says
RE: 78
No, you misrepresented my interpretation of the 20,000 year comment. It serves no honest purpose. On the other hand, it serves as plausible deniability for the claimant (that’s what the 20,000 is for, if they are ever challenged, not for the public, who, yes, will not notice the number), but enables dramatic emphasis for the speaker in making the contrast between solid ice and the current receding glacier. As I said before, I’d love to see that very spot where Harrison was standing 8000 years ago, a fairer contrast date, and see how alarmed he’d be. My objection stands.
It’s not too easy finding facts about the glaciers’ history. I did find this link:
http://pubs.usgs.gov/pp/p1386i/chile-arg/wet/past.html
which may indicate glaciers HAVE died recently, not (as Tom intimated) ‘having survived since the last ice age’. I am not sure this study refers to the same glaciers as Harrison has been studying.
From that link:
“The chronology of Holocene moraines in front of Glaciares [sic] Upsala and Tyndall has been established by Aniya and Sato (1995a, b). The moraines were deposited around 3.6 ka, 2.3 ka, 1.4 ka, and 250 years before present. (In the Lakes District, Mercer’s (1976) dating of the oldest moraine is different, 4.6-4.2 ka, and the third one is not found.)”
What is meant by “moraine being laid down”? Doesn’t that mean a glacier that did not survive, and its melting occurred during the Holocene, leaving a moraine?
Also, in another paragraph on the work of Caldenius, a picture seems to be created of massive ice caps during the big glaciations, certainly, but also empty moraines during the Holocene.
Anyone better than I am at interpreting this study, please explain it more deeply.
I don’t see how this study supports Brokaw’s claim of ‘glaciers surviving’. Also, why choose that particular thought, which is then contrasted with his alarm-voice claim that they have recently lost 10% mass? Is the intimation that they are now, for the first time since the last ice age, in danger of being lost?
Also, the claim about the speed of retreat of the Patagonian glaciers? I wonder where the science is to support that it is significantly swifter than normal.
Glaciers so typically wax and wane in cycles; how was it determined that this one particular retreat is extraordinarily fast?
Jim Redden says
RE #93 reference to Naomi Oreskes’ Global Warming– Signed, Sealed, and Delivered
Seems pretty on topic to me. However, I’ve been critically mulling over some of Lindzen’s theories, and taking a cue from his pondering, perhaps this season we have traded the potent display of kinetic hurricanes for heat waves.
Indeed, Oreskes does a fine job of making her point… Thanks for calling out the article.
Also, in the same section, the LA times has another article in the opinion section about California’s right to regulate CO2 emissions, “Global Warming on Trial: The Supreme Court is right to weigh in on the globe’s hottest issue”. (I think you can get to it without registering in the opinion section).
After reading some international law, I can see how important it is to some interests that CO2 is not classified as a pollutant, both domestically and abroad.
Barton Paul Levenson says
Okay, I’ve neatened up the discussion of planet temperatures on my website and added a link to a paper which discusses what “optical thickness” means, with mathematical definitions and a worked example using Beer’s Law. This paper is much smaller than the planet temperatures one, about two pages, so I’d appreciate if someone knowledgeable could take a look at it and tell me if I screwed up anywhere. The URL is:
http://members.aol.com/bpl1960/Optical.html
Thanks in advance.
-BPL
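For readers who want the gist of Beer’s law before clicking through (a generic sketch with arbitrary numbers, not taken from BPL’s page): transmitted intensity through a purely absorbing layer decays exponentially with optical thickness.

```python
# Minimal Beer's-law illustration: I/I0 = exp(-tau) for an absorbing layer
# of optical thickness tau (generic numbers, purely for orientation).
import math

def transmitted_fraction(tau):
    """Fraction of incident intensity transmitted through optical depth tau."""
    return math.exp(-tau)

for tau in (0.1, 1.0, 3.0):
    print(f"tau = {tau}: {transmitted_fraction(tau):.3f} transmitted")
```

A layer of optical thickness 1 transmits about a third of the incident beam; by tau of a few, the layer is effectively opaque at that wavelength.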
James Annan says
Since I’ve just written about it at tedious length (here and subsequently), I can’t resist the temptation to point out that Leonard Evens’ characterisation (in post #87) of what NHST tells us is incorrect:
At least, I don’t know about the particular case in question, but it is not generally the case that a non-significant result justifies the belief (at any specific level of probability) that there is no effect. In fact NHST doesn’t directly tell us anything much about whether an effect exists, and if so, what size it is. It’s not what it is designed to do.
Ron Taylor says
Re 98
Darn, and I thought I was finished with this.
Actually, John, I said nothing at all about your interpretation of the 20,000 year reference, which involves assigning ulterior motives to Brokaw and his colleagues. But since you brought it up, I do think your logic is convoluted to the point of meaninglessness. What they said (Post 82) stands on its own and requires no deniability of any kind. That is simply my own opinion and I accept that yours is different. Anyone who wants to know where this rabbit trail has led can simply read my post 82 and your post 98 and decide for themselves.
Okay, I really am finished with this.
Hank Roberts says
An amazing (to me as a mere reader) comment was recently posted in the Scientific American blog — this link:
http://blog.sciam.com/index.php?title=half_baked_smears_against_climatologists&more=1&c=1&tb=1&pb=1&template=popup#comments
Find this one:
Comment from: cearbannog [Visitor]
dated
July 22, 2006 @ 20:23
There’s no direct link to the comment so I’ll quote a chunk of it.
I _think_ this point was already made here and elsewhere and I just never understood it before. Is it correct as stated here?
Quote:
=-=-=-=-=-=-=
M&M generated a bunch of band-limited (“red”) noise data sets and performed eigenvector (aka principal component) decompositions. They demonstrated that “hockey-stick” shapes can often appear in the leading principal components even if the input is just random noise.
However, there is a major problem with their claims. And it has to do with the fact that the “hockey-stick” principal components that M&M generated from random noise had associated eigenvalues that were *much* smaller than the eigenvalue associated with the leading principal component of MBH’s “hockey-stick” data. To determine how important a particular “principal component” is, one must first scale it appropriately with its associated eigenvalue. You can see this problem in M&M’s own work by looking at figure 1 [p. 19 in the PDF file] in http://www.climate2003.com/pdfs/2004GL012750.pdf
Figure 1 shows two time-series. The upper time-series is a “hockey-stick” shaped principal component that M&M generated from random noise. The lower time-series is the leading principal component that MBH generated from their “hockey-stick” data. A cursory look at the figure shows that the two “hockey-sticks” are remarkably similar, and that MBH’s hockey-stick could very well have just been fished out of data with no real temperature trends.
But take a closer look — look at the Y-axis scales of the two time-series! The MBH “hockey-stick” Y-axis scale (lower plot in figure 1) has a range of roughly -0.5 to +0.3. The Y-axis range of M&M’s “random-noise” hockey stick (upper plot) has a range of about -0.08 to +0.025 or so. There is nearly an order of magnitude of difference in the two plot scales! That is, M&M exaggerated the “significance” of their random-noise “hockey-stick” by nearly a factor of 10! If both plots were properly scaled, it would become painfully clear that M&M’s random-noise “hockey-stick” would be completely insignificant in comparison with MBH’s hockey-stick data plot.
=-=-=-=-=
As I said, I think this was hinted at elsewhere and maybe explained here in the thread and I just didn’t get it til now. Is that right?
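The quoted argument is easy to sanity-check with synthetic data (a rough sketch only: it uses conventional full centering and arbitrary parameters, not M&M’s short-centered procedure or their actual proxy network): a leading PC pulled out of pure red noise explains only a small share of the total variance, while a genuine common signal dominates.

```python
# Sketch: generate AR(1) "red noise" proxies, take the leading principal
# component, and compare the variance it explains with the PC1 of data
# containing a genuine shared trend.
import numpy as np

rng = np.random.default_rng(1)

def red_noise(n, rho=0.9):
    """AR(1) series: x[t] = rho * x[t-1] + white noise."""
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = rho * x[t - 1] + rng.standard_normal()
    return x

def pc1_explained(X):
    """Fraction of total variance carried by the leading PC (full centering)."""
    X = X - X.mean(axis=1, keepdims=True)
    s = np.linalg.svd(X, compute_uv=False)
    return s[0]**2 / np.sum(s**2)

n_proxies, n_years = 50, 400
noise_only = np.vstack([red_noise(n_years) for _ in range(n_proxies)])

trend = np.linspace(0, 10, n_years)          # arbitrary shared "signal"
with_signal = noise_only + trend             # same noise plus a common trend

print("PC1 share, red noise only: ", round(pc1_explained(noise_only), 3))
print("PC1 share, noise + signal: ", round(pc1_explained(with_signal), 3))
```

A PC1 "found" in pure red noise explains only a small slice of the total variance, while a real shared signal dominates, which is the quoted point about judging a PC by its eigenvalue rather than by its shape.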
Ron Taylor says
Sorry, I meant people could compare your post 95 (not 98) with my post 82 and decide.