The RC open thread.
With a reminder that this is not a dumping ground for anything under the sun, but is rather for discussing climate science topics that don’t fit neatly into ongoing discussions.
366 Responses to "Unforced variations, July 2011"
R. Gates says
Okay, here’s a general question, especially for those who are professional climate scientists:
What, in your opinion, are the 2 or 3 biggest unknowns currently out there in the study of the climate? Things that if they were known, could have some impact on global climate models. As you list these, maybe you could give some idea what you think their maximum impact could be in terms of climate forcing.
[Reply: (1) Clouds (2) How much coal is really down there and recoverable? (3) Will we have the fortitude to leave any of it unburned? If I’m allowed four, I’d add: (4) Could the present terrestrial carbon sink turn around and become a source, leading to a PETM style carbon release adding to the fossil fuel carbon we add to the atmosphere directly? With regard to consequences, there is a very slight chance that clouds might make climate sensitivity low, but an almost unbounded prospect for clouds to make things very, very bad, perhaps even to the point of making much of the world uninhabitable outdoors for mammals — especially if there really is 5000 gigatonnes of coal and we wind up burning it all. –raypierre]
Edward Greisch says
James Hansen on conditions inside the government:
“TreeHugger: Under the Bush administration you experienced pressure and threats at times when you were talking about climate change. How has the Obama administration been different?
James Hansen: Well, it’s different in the sense that there’s no one trying to check on what I’m talking about and censor it. But basic problems remain. Namely that the public affairs offices are headed by political appointees who tend to feel that their job is to make the administration look good. I just think those offices should be headed by career civil servants who cannot be pressured to try to present what amounts to propaganda.
Also, government scientists, when they testify to Congress, must have their testimony approved by the White House, by the Office of Management and Budget. And again, I don’t think that makes sense. Government scientists are paid by the public, by taxes, and they should be allowed to give their best opinion without having it reviewed prior by the administration.
So those basic processes have not been corrected. But personally, I feel that, in this administration, that I’m allowed to say what I think is right.”
Source:
http://www.treehugger.com/files/2011/06/treehugger-radio-james-hansen-climate-change-and-intergenerational-justice.php?page=3
I don’t know if treehugger is any good, but I have heard some bad things about how scientists are being treated these days. I don’t remember the reference, but somebody said words to the effect that if scientists are receiving threats, the opposition must be desperate. I am retired now and when I was working, my job did not include communication to the outside world, so I would like to hear comments. My impression of politically appointed agency heads has been negative.
Office of Management and Budget [OMB] is the vice president’s domain.
Vincent Kosik says
Read Mark Bowen’s account in his book, “Censoring Science: Inside the Political Attack on Dr. James Hansen and the Truth of Global Warming” (Plume, published by Penguin Group, 2008) for a tale about the Bush administration’s successful attempt to halt any action on this crisis.
Not sure about the current administration. Also, Dr. James Hansen writes a bit about it in his own book, “Storms of My Grandchildren.”
Really makes one angry.
Tony lynch says
R. Gates: Known Unknowns?
Prokaryotes says
Looking for blog post dedicated to “Geomorphological Response” and “Methane Status Updates – Projections and Estimates”
Edward Greisch says
http://thinkprogress.org/romm/2011/07/02/257869/when-scientists-take-to-the-streets-it’s-time-to-listen-up/
leads to:
Universities “seriously concerned” by death threats against climate scientists
http://theconversation.edu.au/universities-seriously-concerned-by-death-threats-against-climate-scientists-1686
Scientists hit back amid fresh death threats
http://www.abc.net.au/news/stories/2011/06/20/3248032.htm
Study says majority believe in climate change
http://www.abc.net.au/news/stories/2011/06/03/3234342.htm
“The research found less than 6 per cent of Australians are true climate change sceptics.”
There we have the reason for desperation on the part of denialists. Rich coal company stockholders stand to lose their shirts. Can a wealthy 6% control Australia’s government? Are the police willing to arrest the rich?
Barton Paul Levenson says
For anyone in or near Allegheny County, PA: I will give a lecture on global warming to the Parsec science fiction club on 9 July 2011 at about 2 PM. Location: Squirrel Hill Carnegie Library, corner of Murray and Forbes Avenues.
Pete Dunkelberg says
R. Gates, imho the big questions are:
1. How long will we keep burning carbon like there’s no next generation?
The answer is blowing in the wind. The political wind.
2. How harmful will climate disruption become, and how soon?
Those two are the really serious issues.
3. The size and speed of various feedbacks –
a) Arctic amplification (Arctic warms fastest) was predicted [calculated] over a century ago, but it keeps running ahead of model projections.
b) Cloud feedback: “the great white hope” of the subject. Alas there is no sign that a negative feedback from clouds is going to save us, and paleoclimate also says don’t bet on it (or on any other unknown savior). There is slight evidence of a positive feedback from clouds.
c) Methane: this one is still blowing in the wind. But it can only hurt, not help.
4. How to make good regional projections.
Sub question: keep making global models with smaller grid size, or make specific regional models? The US Navy uses an Arctic regional model and supercomputers. No one has the resources to do that for every region of the earth though.
Pete Dunkelberg says
What Eli said. Quoting a recent essay by Alan Betts in EOS:
Earth scientists face a profound ethical challenge. Humanity is an integral part of the Earth’s ecosystem, but the waste from our industrial society is now driving rapid global climate change. What is our responsibility as a community of scientists? Is it simply to follow tradition and explore and discuss in our own world, largely isolated from the broader community, the many interesting facets and complexities of the transformation of the Earth’s climate system and then to publish our results in our private jargon in copyrighted journals that are not freely available to the public that is funding us? Surely this is for us just “business as usual,” an integral part of the problem, not the solution.
I suggest it is time to reconsider our responsibilities to society and to the Earth. Humanity will be unable to deal with climate change, in terms of both mitigation and adaptation, until a broad spectrum of society is fluent in discussing the issues and the choices we face. Changing the direction of our global society from its present unsustainable path is a moral and ethical challenge as well as a scientific one.
However, broad understanding of the limits imposed by the Earth system is essential. Clear, open communication and discussion are needed at all levels of society, along with research directed at clarifying the limits for decision makers in local communities. The contribution of science, honest communication of the state of knowledge, is needed to inform and counter the simplistic ideologies that are common in politics. I conclude that scientists need to become more deeply embedded in society.
We all face the essential task of reducing human impacts on the Earth system ….
…
Nick says
I was perusing web articles when I stumbled onto this one, called “An Inconvenient Fallacy.” With a title like that I really couldn’t resist so I opened it up and I found these assertions in the article…
“Fact 1. A mild warming of about 0.5 degrees Celsius (well within previous natural temperature variations) occurred between 1979 and 1998, and has been followed by slight global cooling over the past 10 years. Ergo, dangerous global warming is not occurring.
“Fact 2. Between 2001 and 2010 global average temperature decreased by 0.05 degrees, over the same time that atmospheric carbon dioxide levels increased by 5 per cent. Ergo, carbon dioxide emissions are not driving dangerous warming.”
Source: http://www.smh.com.au/opinion/politics/an-inconvenient-fallacy-20110626-1glmu.html#ixzz1R3hkoDqU
Does anyone even know where they come up with these figures?
vukcevic says
R. Gates says:
2 Jul 2011 at 10:26 PM
Okay, here’s a general question, especially for those who are professional climate scientists:
What, in your opinion, are the 2 or 3 biggest unknowns currently out there in the study of the climate?
Once science is prepared to accept non-professional climate scientists, the ‘known unknowns’ may become ‘known knowns’, as is the case here: http://www.vukcevic.talktalk.net/A&P.htm
[Response: Those who have something meaningful to contribute can do so at any time. The problem is that doing so is generally a lot more demanding and not nearly so easy or simplistic as those non-professionals think.–Jim]
David Kidd says
To Pete Dunkelberg at comment 11.
This is an “opinion piece” in a major Australian newspaper, “The Sydney Morning Herald,” written by Mr. Bob Carter, a well known Australian denialist who has just lately come back into prominence after a spell out to pasture, so to speak. The reason is that the Australian Government is in the process of introducing a diluted form of carbon tax into parliament. This has aroused all the usual suspects into a frenzy of activity. Mr Carter has form. Check the list to be found on the “Smog Blog” rogues gallery.
Regards, David Kidd.
Paul from VA says
@Nick @11
Bob Carter’s piece has been completely demolished. It’s based on standard denialist tropes. The “cooling for ten years” trope comes from the fact that 1998 was a monster el nino year, and thus was much warmer than average and depending on how you count, the warmest year on record (2010 is very close or higher depending on which global temperature series is used). Of course, if you compare over a longer time period, you see that the 00’s were on average significantly warmer than the 90’s. More thorough demolitions of the Bob Carter piece can be found at the two links below: http://scienceblogs.com/deltoid/2011/06/bob_carter_not_entitled_to_his.php http://www.theage.com.au/opinion/society-and-culture/half-the-truth-on-emissions-20110627-1gne1.html
vukcevic says
There is an atmospheric pressure see-saw operating in the North Atlantic (NAO) controlling the winter temperature oscillations in the wider area. http://www.vukcevic.talktalk.net/WPd.htm
Both sides engaged in the ‘climate trench warfare’ may find it of some interest.
> Nick says:
> 3 Jul 2011 at 11:08 AM
>
> I was perusing web articles when I stumbled onto this …
> …
> Does anyone even know where they come up with these figures?
Google found 519 different blogs with those fake numbers copypasted.
Now 520 since they’re reposted here. Try searching and reading some of the copies Google finds and you’ll see how they make this stuff up wholesale. Repetition works because people come to believe stuff they see repeated. It’s a tool. Don’t let them use you by repeating their fiction.
No such numbers found by Google Scholar.
Chris R says
Hello again Vukcevic,
Over at Tamino’s blog you did a similar drive-by. http://tamino.wordpress.com/2011/06/16/open-thread-4/#comment-51746
I asked you how you came up with your NAP index, you offered your email address, and I declined a private chat, favouring the open forum and the benefit of others’ opinions and observations.
Now in addition to your NAP (North Atlantic Precursor) Index, you offer a PDO (Pacific Decadal Oscillation?) Driver index…
It strikes me that if your ideas have merit, history shows they will win through. However it’s damn near impossible to work out the basics, like your NAP index from the pages you offer. Without understanding the basics it’s impossible to evaluate the merits of whatever evidence you may have.
Wheels says
@11:
The opinion piece is written by Bob Carter, which may tell you all that you need to know about the general direction and reliability of it. Here are specifics about the points you mentioned.
Carter is cherry-picking endpoints to arrive at his declines; use 2000-2009 instead of 2001-2010 and 3 out of 4 datasets show a positive trend. Speaking of time periods, 10 years is such a short time in climate that year-to-year variations and short-term cyclical events can swamp the signal with noise. Over 15, 20, or (best yet) 30 years, real long-term trends are more visible in the data. That is why the last sentence of his Fact 2 is totally ludicrous.
He’s also cherry-picking datasets: if he had used NASA’s GISTEMP record, which attempts to describe the Arctic, there would have been a positive trend over 2001-2010 rather than negative. He’s likely using HADCRUT, which shows the .05 degree “decline” from 2001-2010 but doesn’t cover the Arctic region at all, which is where much of the warming is taking place. It’s amusing because he had previously tried to paint the Hadley researchers as duplicitous and fraudulent, so his reliance on their data NOW is pretty telling. John Cook concludes that this is because the last two years have been the hottest on record, so it’s much harder for Carter to claim that global warming “stopped in 1998” as he so loves to do.
So even if we take his 2 statements as barely factual (and his claim about CO2 is clearly not), they are not exactly compelling arguments because they rely entirely on cherry-picking small points in isolation rather than viewing the bigger picture.
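One way to see for yourself how fragile these short trends are is to fit least-squares slopes over shifted ten-year windows. A minimal Python sketch follows; the anomaly values are invented placeholders chosen only to show how the sign of a short trend can flip with the choice of window, so substitute real annual means from GISTEMP or HadCRUT before drawing any conclusion:

import numpy as np

# Invented placeholder anomalies for 2000-2010 (deg C); NOT real observations.
years = np.arange(2000, 2011)
anom = np.array([0.40, 0.62, 0.63, 0.62, 0.58, 0.68,
                 0.63, 0.66, 0.52, 0.63, 0.58])

def trend(start, end):
    # Ordinary least-squares slope over the inclusive window, in deg C per decade.
    mask = (years >= start) & (years <= end)
    return 10.0 * np.polyfit(years[mask], anom[mask], 1)[0]

for window in [(2001, 2010), (2000, 2009), (2000, 2010)]:
    print(window, round(trend(*window), 3))
# Shifting the window by a single year is enough to change the sign of the
# fitted slope in this toy series, which is exactly the cherry-picking problem
# described above.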
Mike Roddy says
Scientists need to follow the late Stephen Schneider’s example, and work hard to explain to the public what the climate data means. Schneider and Hansen are about the only ones who understood that communicating the evidence about climate change to the public is far more important than perfecting it.
Instead, this critical task has been outsourced to floundering reporters and the occasional political leader such as Gore, who are taking a lot of heat due to scientists’ retreat from the public dialogue.
Key knowledge includes responsibility. About 30% of Americans have been persuaded by the media that global warming is some kind of hoax, which is all the cover our oil-owned politicians need.
Time to wake up and attack, climate scientists. Set up booths on the Capitol Lawn. Organize teach-ins, including in places like Dallas and Gillette, Wyoming. Show little patience for liars and prostitutes. Do the right thing for all of the earth, and commit yourselves to it.
[Response: Scientists haven’t retreated from the public dialogue. Notwithstanding the loss of Steve, there are more scientists trying to educate the public on the issues than ever, in various venues, such as this one. However, I agree that there aren’t enough doing it, and we need a lot more to do their part. But then, that’s a general problem with society as a whole, not scientists specifically–it’s always a small minority that stands up for principle, at least initially. As for the science, it’s not about “perfecting” it, whatever that means. There are still all kinds of important uncertainties that need to be understood much better than they are if there is going to be a rational, effective response to the problem. Climate change effects chief among them.–Jim.]
Mike McClory says
@Nick (post 11),
I don’t know where they’ve got those exact figures from but ‘Fact 1’ is just the usual ‘no warming in x years’ lie (meme is too forgiving a word). We know that there has been warming over that period – the timescale is just too short to state that it is statistically significant at the 95% level. Extend the start point back and we can confidently state that it was statistically significant.
‘Fact 2’ appears to be just plain wrong given that 2010 was a joint record high anomaly. I would suggest that the figure may come from one of the satellite records… Again, it’s a cherry-picked duration as well.
vukcevic says
Chris R
I did post an answer twice but it was blocked; the email address was posted in order to respond to your question, since I thought it would be impolite to ignore your request.
The answer is simple: I have collected a lot of historical data dispersed through various institutions, all publicly available, some online and some in various printed publications, put it all together, and developed two data sets, one referring to the N. Atlantic and one to the N. Pacific.
This is a purely personal effort, at personal expense, so I am under no obligation to release details for the time being, since I am preparing a more extensive publication.
I think it may require some assistance for presentation from an academic institution or some other establishment, but that is not a priority at the moment.
[Response:The better course of action, by far, is to publish first and blog second. Once you start making public statements, you’re obligated to defend them. Nobody will take them seriously until you do.–Jim]
Urban Leprechaun says
Can someone tell me how sea levels are determined?
Such as: what do people do when it has just rained heavily on the sea, which, surely, must give a higher level until this rain has either evaporated or flowed away (?) to a lower-level sea?
Brent Hargreaves says
Pete Dunkelberg, 3 July, 06:38.
So our knowledge of the sun’s effect on climate is so well understood that it doesn’t make your list in response to R. Gates?
I do hope you’re buying shares in that Canadian port being built to exploit the NW Passage. When Maunder II strikes, and it’s buried under the ice, it’ll stand as a monument to the hubris of the Global Warming industry.
Susanne says
Reading various responses to Bob Carter’s latest outpourings, I’ve realised that I have two ideas in my head (don’t laugh, it’s more than some people have) that seem to be in conflict:
1. El Nino / La Nina and other oscillations don’t actually change the temperature of the globe; they just move the heat around.
2. Global temperature was exceptionally high in 1998 because of an extreme El Nino.
I know that’s a grossly oversimplified way to talk about it, but it’s the level I’m at and I’m a long way ahead of many of the people I talk to. Is there an equally simple way to express how these things are both true – if they are?
Thanks
Paul Pentony says
Susanne
Here is my attempt at an answer – others may do better.
During a La Nina event the thermocline which divides warm surface water from colder deep water is depressed on the western side of the Pacific and rises elsewhere. As a result you get a relatively deep warm pool on the west of the Pacific and relatively cooler surface water elsewhere. The net result is a reduction in average surface temperature.
Icarus says
We have raised atmospheric CO2 by a large amount (about 40%) but my understanding is that the amount by which we’ve raised the total amount of carbon in the entire climate system is still relatively small (I calculated it to be about 2% but could be well off).
Conversely though, atmospheric CO2 could decline quite quickly if we drastically cut emissions, because of absorption by natural sinks, whereas the total amount of carbon in the climate system only changes very, very slowly – perhaps 10,000 or 20,000 times slower than the rate at which we’re increasing it today.
I think that means that although we’re not stuck with 390ppm or more of atmospheric CO2, we *are* stuck with the raised total amount of carbon in the climate system for any conceivable future (hundreds of thousands of years at least).
What are the implications of this? I think it means that we have permanently and irrevocably changed the climate system because there is no way we’re ever going to remove the trillion tons or more of carbon that we’ve put into the climate system in the last 250 years… but just how big a change does that imply? If we miraculously dropped emissions to zero today, what kind of climate would we end up with, by the time everything equilibrated? Have we already made it impossible for the planet to continue with the ice age / interglacial cycles of the last million years or so? Have we committed the climate to going back at least to, say, an Eocene climate regardless of what we do now?
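For a rough sense of scale on that ‘about 2%’ estimate, here is a back-of-envelope sketch in Python. The reservoir sizes and the cumulative emissions figure are round, order-of-magnitude values of the kind quoted in carbon-cycle summaries, not authoritative numbers:

# Approximate sizes of the fast-exchanging carbon reservoirs, in GtC
# (round illustrative values only).
atmosphere = 800        # roughly 390 ppm at ~2.1 GtC per ppm
ocean      = 38000      # dissolved inorganic carbon, mostly in the deep ocean
land       = 2300       # vegetation plus soils

cumulative_emissions = 530   # rough cumulative fossil-fuel + land-use carbon, GtC

active_pool = atmosphere + ocean + land
print(round(100.0 * cumulative_emissions / active_pool, 1), "% added so far")
# About 1.3% with these round numbers, i.e. the same ballpark as the guess above.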
Icarus says
Susanne (no. 23): I would say that ENSO does change the temperature of the globe but only on short timescales of several years, and the change is positive for a while and then negative for a while or neutral for a while, so that the *net* change over a much longer period is about zero. Also, while an El Niño may be warming the atmosphere because of upwelling and spreading warm water in the Pacific, it’s actually cooling the oceans (because the heat from the oceans is escaping through the atmosphere to space), so in reality the total heat content of the climate system declines even while the atmosphere is warmer. I think if we could measure the heat content of the entire climate we would find that it was actually getting cooler in 1998 because of the El Niño, not warmer – it’s just that we are biased towards taking the temperature of the atmosphere rather than the whole planet, of which the oceans are by far the largest repository of heat.
Patrick 027 says
Re 23 Susanne –
The key is vertical heat transport in the ocean. Generally, the upper ocean is warmed and cooled by radiation and sensible and latent heat transfers between it and the atmosphere, just like the land, but also, the upper ocean is supplied with a heat sink by upwelling (or upward mixing?) deep cold water, and also acts as a heat source in some regions where water becomes cold enough to sink back down into the deep ocean (salinity is also quite important, and in some previous geologic time(s), from what I remember reading, the depths of the ocean were filled more with water that was warm but sufficiently salty to sink).
Upwelling cold water can be pulled up to the surface in spite of its density by the wind-driven currents causing surface water to move out of the way – once at the surface, it may be out of equilibrium with local radiative and atmospheric conditions and tend to gain heat. Without upwelling cold water reaching the surface, I would guess it could also return to the upper ocean by getting mixed into the upper ocean from below (the upper ocean is mixed by winds, and also by the vertical distribution of solar heating relative to evaporative cooling and evaporative salinity increases at the surface – although precipitation or melting of ice (or runoff from land) at the surface would/do tend to stabilize the water by freshening surface water (temperature effects aside)). (Tides and plankton also play a role in mixing the ocean – tides also have an effect on mixing the deep ocean, I think, but I don’t know a lot about that.)
(If the upper ocean were losing mass from continued formation of deep water and not gaining mass from upwelling or mixing, the layer would get thinner, which would bring deeper water closer to the surface and perhaps make it easier to mix it into the upper ocean from below or otherwise make it easier for winds to spread the warmer water apart to bring cold water to the surface. But different conditions in different places help determine where water is sinking, mixing, or upwelling.)
Anyway, it is possible for changes in oceanic circulation to ‘hide’ cold water from the surface or ‘reveal’ more of it without any uptake or loss of heat from the ocean. This will lead to warmer or cooler surface conditions and thus warmer or cooler atmospheric conditions – which will, interestingly, tend to lead to global heat loss or gain. In an El Nino, more cold water is kept hidden from the surface (in a particular region of the Pacific where it usually upwells), and so the global average surface temperature, and tropospheric temperature, rises, and this increases outgoing radiation to space, causing a heat loss. In a La Nina, the reverse happens. (Together this mode of variability is called ENSO, and it also includes changes in atmospheric circulation. There is some positive feedback between changes in the wind and changes in the sea surface temperature, which, I would guess, would help explain ENSO’s prominence as an important mode of variability.)
(If either condition were held for a sufficient time period, the globe would lose or gain heat to return the surface and atmospheric climate toward equilibrium (assuming the equilibrium climate, in terms of average global surface temperature, is not too sensitive to whether there is an El Nino or La Nina), while the ocean would now have a net loss or gain of heat.)
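The ‘warmer surface radiates more to space’ step can be made concrete with a zero-dimensional energy balance. The effective emissivity and the size of the temperature excursion below are generic illustrative values, chosen only to show the sign and rough size of the effect, not a model of ENSO:

SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
EPS   = 0.61      # effective planetary emissivity (illustrative value)
T0    = 288.0     # global-mean surface temperature, K

def olr(T):
    # Outgoing longwave radiation for an effective grey emissivity EPS.
    return EPS * SIGMA * T**4

dT = 0.1          # an El Nino-like global-mean surface warming, K
print(round(olr(T0 + dT) - olr(T0), 2), "W/m^2 of extra loss to space")
# About 0.33 W/m^2: a temporarily warmer surface sheds heat faster, so the
# climate system as a whole loses energy during an El Nino and gains it
# during a La Nina, as described above.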
Patrick 027 says
and also acts as a heat source in some regions where water becomes cold enough to sink back down into the deep ocean
– it’s a heat source because the water has to lose heat in order to become cold enough to sink. Evaporative cooling also increases salinity which is also important in making water dense enough to sink.
Patrick 027 says
Re 21 Urban Leprechaun – that’s actually an interesting point. To a first approximation, sea level forms a sphere, as it is a geopotential surface (constant gravitational potential energy, perpendicular to the gravitational acceleration) and large objects tend to form spheres as gravitational potential energy is minimized, thus the gravitational field has spherical symmetry.
However, the rotation of the Earth makes geopotential surfaces oblate – they bulge at the equator. This affects the crust and ocean (and mantle, and core, to varying extents depending on centrifugal force and gravity).
Tides also distort the geopotential surface, introducing an oscillating component to gravity that makes the Earth oscillate – the response of the ocean is different from the crust (there are natural frequencies involved – Kelvin waves, etc.), so we can have water level rising and falling relative to the land (we wouldn’t notice tides (without sensitive scientific instruments, etc.) if they rose and fell together). Sea level isn’t generally in equilibrium with the tides, so the water level won’t actually conform to the geopotential surface at any moment. But it should in a time-average – except, see below.
Also, the time-averaged Earth deviates from even the simple oblate spheroid, as there are density variations in the mantle and crust, pressure variations in the atmosphere, and mountain ranges and oceanic trenches. Sea level dips over a trench, for example.
Aside from all that, sea level can deviate from the time average geopotential surface because:
1. as you point out, there are uneven sources and sinks of water (evaporation, precipitation, runoff), requiring flow for balance, and flow requires a pressure gradient (either to drive it against friction or else balance the coriolis acceleration that occurs when there is flow, or else to drive acceleration of the flow when conditions are changing), which would either come from atmospheric pressure variations at the surface or density variations in the water, or sea level variations.
2. the wind itself applies force on the water and drives motion, which can result in piling up of water or thinning of water in various places (which causes pressure gradients in the water which drive other motions, etc.)
3. water can be fresher or saltier, warmer or colder, and so density can vary. In order to have a particular horizontal pressure gradient at one level, a different horizontal pressure gradient must exist at another level if there is a horizontal density gradient in between. If warm or fresh water were dumped on the surface, it would tend to spread out; it will also add weight to the column of water and cause water below to spread out, which thins the layer of denser water beneath, which lowers the pressure. Aside from atmospheric pressure, the pressure at depth is uniform when the total mass of water above is the same, but a layer of fresh water requires greater thickness to achieve this, so at the surface it would form a dome. Why wouldn’t it spread out evenly to cover the globe? As it spreads out, the coriolis force acts on it to turn the flow; geostrophic balance can be achieved when a pressure gradient remains with a proportionate flow perpendicular to it. So you can end up with a high pressure at the surface (associated with a dome in sea level) and a low pressure at depth (associated with the greater water column thickness not balancing the decreased density). (PS in order to achieve low pressure at depth in geostrophic balance, the water has to actually have flowed inward rather than outward, in consideration of potential vorticity (the conservation of angular momentum).)
So with climate change, there are a number of factors that can lead to geographically and seasonally-varying trends in average local sea level (as well as short-period variations – waves and storms – but I don’t think tidal ranges should be affected much, unless sea level rise actually opens a new channel and … well, I don’t know about that one … )
But in addition to that, there can be a global average change in sea level as
1. the mass of the ocean changes (global net melting of land ice or ice in the water that is not entirely supported by buoyancy; over geologic time, geologic emission and sequestration of H2O, chemical reactions, H escape to space, etc.), and
2. the density of the ocean as a whole changes (thermal expansion, for which a rough numerical sketch follows below, also some freshening from meltwater (PS: over geologic time this may also be affected by geological processes, such as in hydrothermal systems), minus the (I think much, much smaller) effect of loss of fresh water to the increase in atmospheric water vapor), and
3. isostatic adjustment of the crust (global tendency for oceanic crust to sink under added weight, local tendencies from loss of ice loading, also added water loading on continental shelves (???), the effects of crustal rigidity (???)). I think this has a fast and a slow component, by the way, but I don’t know a lot of details there.
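To put a number on the thermal expansion term in item 2, a minimal thermosteric sketch in Python; the expansion coefficient, the layer depth and the warming are illustrative assumptions rather than measured values:

alpha = 2.0e-4   # thermal expansion coefficient of seawater, 1/K (illustrative)
H     = 700.0    # depth over which the warming is assumed to be spread, m
dT    = 0.1      # assumed average warming of that layer, K

dh = alpha * dT * H   # thermosteric sea level change, m
print(round(1000.0 * dh, 1), "mm of rise from thermal expansion alone")
# About 14 mm for these assumed numbers; real estimates integrate a depth- and
# temperature-dependent expansion coefficient over the observed ocean warming.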
Patrick 027 says
Re ‘pressure variations in the atmosphere’ – not the biggest effect on the gravitational field :), but would have a direct effect on deviations of sea level from geopotential surfaces.
Patrick 027 says
PS James Kasting had an interesting paper awhile back on how geologic processes (hydrothermal activity) may act with a tendency to maintain some sea level over geologic time (negative feedback involved in the geologic branch of the water cycle), I think relative to mid-ocean ridges, so not necessarily relative to continents. Of course there have been geologic times when geologic changes pushed water up over vast areas of continental crust that are now exposed. From what I remember reading, this involves rates of sea floor spreading (cooling of the lithosphere (or just crust?) when leaving mid-oceanic ridges involves sinking via isostatic adjustment; rapid spreading widens the ridges, increasing their volume). Rifting apart of continents tends to spread out continental crust, displacing water by removing the volume of elevated land (erosion with sediments transported to the sea does the same thing); continental collisions raise up a volume of rock above sea level, making space for ocean water to spread out.
john byatt says
#10: The source of the disinformation was Australia’s Bob Carter.
John Mashey says
I am tied up with other things, but people with time, and especially academics, might want to visit a blog at the Chronicle of Higher Education and read “Bottling Up Global Warming Skepticism” by Peter Ward, President of the National Association of Scholars, which might be labeled as NAS* to disambiguate vs another NAS.
If you happen to comment, *please* be polite and perhaps seek more information from Peter.
Susanne, you already have some good if not always brief answers. But to go directly to your question: “Is there an equally simple way to express how these things are both true – if they are?”
The clarification you need is to distinguish the temperature of the whole globe (your idea 1, based on conservation of energy) and the globe’s surface temperature.
And as mentioned by others there is the wrinkle that during a “cool” La Niña year (now as warm as El Niño used to be) the earth retains more of the incoming energy and vice versa, so that ENSO adds an oscillation to the rate of warming.
Pete Dunkelberg says
For those above discussing picking ten-year periods for special effects, it is even more fun with eight years, although recent GISTEMP downs are scarce.
Pete Dunkelberg says
Brent Hargreaves @ 22, a new Grand Solar Minimum has been discussed by RC here and by Skeptical Science here. Those are the places to take your concerns for Canada. In view of the poor track record of predictions of global cooling you might also consider this.
Pete Dunkelberg says
John Mashey @33: “If you happen to comment, *please* be polite ….”
That’s going to be tricky.
Craig Nazor says
I read this blog a lot and love it, but I rarely post, as climate science is not my major field, and I feel I can learn more at this point by just reading. I now have an issue that some on this blog might be able to help me with. (I hope that this post is not considered under-the-sun dump material!).
On August 10, the Lower Colorado River Authority (LCRA) is holding a public hearing about selling 25,000 acre-feet of water per year to a proposed coal-fired power plant (romantically called “White Stallion”) on Matagorda Bay. The LCRA controls the water in the entire lower Colorado River, which is the major water supply for central Texas, and particularly the city of Austin (my home town). Governor Rick Perry, a noted global warming denier, has appointed all of the members of the board of the LCRA and none are climate scientists, which is very scary. In the LCRA assessment (supposedly written by scientists) of whether there is enough water to actually sell to this project, the LCRA goes back no further than the middle of the 20th century for the “drought of record,” and does not even mention global warming in their future projections:
The LCRA held a public hearing in June about this, at which I spoke and told them that it was hard to believe that they would make a water decision affecting millions of people while ignoring a major scientific consensus about future climate. The Texas Sierra Club commissioned what appears to me to be an excellent study by Dr. D. Lauren Ross using the LCRA’s own information that does mention global warming, and shows why they really do not have enough water:
But the LCRA managed to pretty much ignore her, as she was the only scientist speaking, and her study was commissioned by an environmental organization. Many others spoke also (a few politicians and many concerned citizens), all against the sale except for one person. Since central Texas (actually, almost all of Texas) is now in the middle of an extreme drought, experiencing the second hottest June on record, the LCRA punted the decision down the road until August 10 (Hoping for some rain? Or hoping that “White Stallion” doesn’t get a TCEQ permit to release lots of black smoke?).
Approving this sale would be very bad for Austin as far as water goes, and that’s not even considering all the CO2 and other pollution that will be pumped into the atmosphere by the “White Stallion.” What I believe we need on August 10 is more good scientific testimony from climate scientists (I have a doctorate, but not in climate science) about global warming, to force the board to at least address this reality. First, I would appreciate any helpful comments about the above links. And anyone who would like to come and give ‘em hell, it would be great if you would contact me in Austin (my contact information is easily available on the web). And keep posting all of this great information!
Craig Nazor
Jim Eaton says
It has been global weirding here in the Sacramento area the past few weeks. It was over 100 F on June 21st. A week later the high was 68 with a half inch of rain (a record, although it was cooler and there was over an inch of rain in Davis). Today it was back over 100. For several years now, we have had a prolonged wet and cooler spring running into late June. But never a rainfall event such as this (records for last Tuesday were set from Los Angeles (.02 inches) to the Oregon border).
ccpo says
@#25 I think that means that although we’re not stuck with 390ppm or more of atmospheric CO2, we *are* stuck with the raised total amount of carbon in the climate system for any conceivable future (hundreds of thousands of years at least).
I think this is incorrect. We have a number of ways of reducing carbon in the active climate system. The fastest two that are “natural,” have multiple functions, and help us with other problems we face – and discussed here in the past – are forests and terra preta. Terra preta is a very long-term soil improvement technique (creates soils as rich as you will find) and carbon sink. It will be broken down and released only very slowly, and the rate is such that the creation of terra preta is a carbon sink, so as long as you are making it, you are effectively reducing carbon in the active climate system.
Additionally, this improves food production, both in quantity and resilience of the system. The soil is far less likely to wash away with rains and floods, e.g., and food production remains high for far longer without having to add additional nutrients.
Forests are obviously massive carbon sinks. Hansen has suggested we can equal CO2 emissions with forest growth, thus ending the rise in CO2 even if we keep producing it. If we add in edible forest gardens, we can add far more forest and make the food supply stable, to boot. Of course, the gain in CO2 sequestration with forests levels out with the growth in forests, but whatever the number is, it is still essentially a permanent sequestration. If we actively manage the edible forest gardens, then we can continue the slow sequestration of CO2 as we increase the amount of organic matter in the forest floor, which also makes it more nutrient rich, thus making the forest essentially self-sustaining.
Even if we max out forest growth and farmland production, we can continue to draw down carbon by using fast-growing trees for continued terra preta production which would either A. allow us to maintain more of the industrial base we now have (assuming energy supplies of whatever kind) and B. would give us a control knob for climate: we can choose to keep global CO2 within a given “ideal” range with nothing more than forests and food production.
We were discussing the use of the Tiljander proxies and possible problems trying to calibrate to the instrumental record because of contamination during the period of the instrumental record. Rather than not use the proxies at all, is there any value in trying to ‘indirectly calibrate’ to a period where there is not believed to be any contamination? (I know this is all pretty marginal anyway, but it seems a shame to throw out stuff if there is some meaningful signal there.)
That sort of brought us (via a roundabout route) to the Zorita opinion piece (which seemed to be suggesting an alternative approach to paleoclimatic reconstructions) :
“Expert assessment to evaluate the signal of a particular record from a particular proxy archive (e.g., the low-frequency skill of a new speleothem record) will be invaluable in trying to minimize ‘wrong figures’ being put into a large-scale reconstruction. It seems advisable at this point to use fewer, but expert assessed proxy records, rather than hundreds of proxy series, and hope that reconstruction algorithms will overcome the often huge noise components typical for many of the available time series”
Any comments ?
[Response: Good paper (lead author is David Frank, not E. Zorita) and good discussion over there at Bart’s. More in a bit from my viewpoint.–Jim]
[Response: I’ll give mine, with specific reference to tree rings. The biggest advances are going to come from a better mechanistic understanding of the causes of growth rate changes in those trees typically used in dendro-climatic reconstructions (i.e. climatically stressed trees of various types). Statistics will only take you so far. On that point I unequivocally agree with the linked Frank et al. paper. This improvement must be accompanied by changes in field sampling techniques that have traditionally been used, which are not optimized to evaluate long term, low frequency climatic changes, but which were used extensively by workers in the 1980s (who collected much of the data used in recent reconstructions), before this issue was recognized. See for example this short white paper by Briffa and Cook. The cumulative, interacting, and complex effects of temperature, soil moisture, (both of which have seasonal considerations) carbon dioxide, and tree age/size, all need to be resolved, because they are all actually or potentially important to radial growth rate. Doing so will still leave unresolved some process-generated noise relating to things like the uncertainty in relating measured precip levels–always at some distance away–to the soil characteristics at the tree sampling site, atmospheric effects (N deposition, ozone, acid rain, etc), and potentially changing inter-tree competition levels over time, for example. We can’t hope to remove all sources of uncertainty, but we can significantly reduce several.
To answer your other question, yes it is possible to “indirectly calibrate”–if you have some independent climate measure that is trustworthy and you have the requisite understanding of the driving processes of both that will allow you to rule out certain wrong or artifactual relationships. It may also be possible to improve some proxy series, strictly via mathematical improvements, before you relate them to any climatic time series. Then when other proxies, or improved instrumental data, becomes available, you have a better response variable series to use. This can be important as well.–Jim]
dcomerf says
I’d like to ask for help with finding references for an economics paper I want to write. In a lot of climate change economics work the environment is represented by a damage function (so that utility flow is a strictly decreasing function of CO2), and a CO2 equation of motion that is of the form: dCO2/dt = Emissions(t) + DecayConst * CO2(t)
where DecayConst is negative and this decay of CO2 levels is motivated by the observed absorption of CO2 by carbon sinks, especially the oceans.
I think that this is nonsense and, to a first order, the DecayConst should be zero or positive. My basis for believing this is that my understanding is that warmer oceans will hold less CO2, so any anthropogenic CO2 release, if the system is given the time to respond, will (to 1st order) cause an additional CO2 release from the ocean rather than CO2 being absorbed.
Are there any good references that I can cite to make this case very simply (this is just the introduction to my paper) to an economist audience?
Thanks in advance for any help!
[Response: There is no single ‘DecayConstant’ that makes much sense. Instead, use the multi-exponential approximation to the Bern Carbon Cycle model. This does not have an explicit carbon cycle feedback term, and so you need to add that in addition (i.e. add a term to the CO2_atm equation like gamma*Tsfc where gamma ~ 8 ppm/ºC and Tsfc is the anomaly over pre-industrial (perhaps with some delay)). Note too that your equation needs to have an equilibrium point at the pre-industrial levels (not at CO2=0!). – gavin]
Re #21’s question on how the global sea level is measured. I’ve also wondered about this. The claimed sea level rise is about 2-3mm per year. With shifting land masses (there are villages under water off the English coast), how can any measurement to such a degree of precision be made? As remarked above, the ocean envelope is not even spherical.
[Response: All sea level measurements are made with respect to the geoid (which is not spherical, or even oblately spheroid, but has many bumps and dips associated with topography etc.). Indeed, the accurate measurement of sea level change has been greatly improved because of greater accuracy in the geoid. The best global numbers currently come from satellite altimetry – using radar to assess the distance between the satellite and the ground, and the accuracy is a function of a great deal of averaging over time and space. The satellite numbers are ground-truthed to a set of tide gauges around the world so that the satellite estimate of the changes at those specific points are the same as the direct observations. The altimeters obviously see anything that affects the distance (including residual tectonic motions/isostatic rebound etc.) and so that needs to be corrected for in assessing how much sea level is rising because of thermal expansion or ice melt etc. The http://sealevel.colorado.edu site has a lot more info and references. – gavin]
dcomerf says
Thanks Gavin, but embedding sophisticated climate models within economic models is what I want to argue against.
My point is that uncertainty around climate projections (especially on the downside) probably dominates in terms of economic decision making. Putting a deterministic climate model within the economic model means that benefits of climate policy are equated with costs of climate policy exactly – which is absolute nonsense given the uncertainties/threshold effects involved. Instead I want to advocate that specific recommendations are embedded in the economic model – like e.g. no coal post 2030 – and the implied prices/policy-instruments conditional on this are the focus of interest.
In order to make this case I want to argue against the CO2 equation of motion as currently embedded in economic models (see Nordhaus for the most well developed example) (note also that CO2 is usually expressed as deviations from pre-industrial, so it is pre-industrial that is an equilibrium rather than zero concentrations). The major flaw that I think is baked into this formulation is the structural negative feedback. What I want (if it exists??) is a good reference that says that the ocean carbon sink is actually a carbon source on the millennial timescale (I don’t even know if this is true – but it seems to follow from reduced CO2 solubility in warmer water?)
I will investigate the Bern Carbon Cycle model with adjustments as you suggest – but I was hoping for more of a simplistic solution!
[Response: The approximation to the BCC is the most simplistic solution that makes any sense. It is not in any sense a ‘sophisticated’ climate model (just a series of 4 exponential functions). This captures the first order behaviour that an increase in atm CO2 increases the flux at the air-sea interface (you call this the structural negative feedback). What you want to be adding is an additional function (as I outlined above) that increases atm CO2 as a function of temperature (the amplifying carbon cycle feedback). This gives a long-term equilibrium impact on atm. CO2 as a function of global temperature and is roughly 8 ppm/ºC (i.e. for a 2ºC rise over pre-industrial, you would expect a 16 ppm rise in CO2 over pre-I values). There is a lot of uncertainty in this number (see here) but this is the right ballpark. – gavin]
How about looking at the claims that the so-called Pacific Decadal Oscillation (like the AMO, of dubious physical reality) has entered a ‘cooling mode’?
“The switch of PDO cool mode to warm mode in 1977 initiated several decades of global warming. The PDO has now switched from its warm mode (where it had been since 1977) into its cool mode.” – Don Easterbrook, 2008.
Other topics: drought in Texas, flooding on the Mississippi, wildfires in Arizona – all in line with various climate model projections, such as polewards expansion of the Hadley circulation, increased water vapor in the atmosphere leading to high precipitation in some regions, and the general trend of declining soil moisture in the U.S. southwest and what that means for forest fires and ecosystem changes (northward expansion of Sonoran desert conditions appearing highly likely?)
Or, more importantly – the refusal of the corporate media and the corporate academic system to pay any attention to these issues. Is the Global Climate and Energy Program at Stanford University still raking in all those ExxonMobil & Schlumberger Oil dollars? Couldn’t be influencing the direction of academic research, could it? And the DOE’s Science Chief – yes, looks like BP’s Chief Scientist is still in that position – and they’re still trying to push their fraudulent carbon capture and sequestration programs.
Pete Dunkelberg says
Stefan-Boltzmann in the stratosphere:
The Stefan-Boltzmann law says that the amount of energy radiated by a mass per unit time is proportional to its area times its temperature to the fourth power. Yet the cool upper troposphere radiates much more energy into space than does the hot stratosphere. Area seems to stand in for mass in the S-B law in a way. A small mass, even if spread over a large area, can not radiate as much as a larger mass. Evidently the mass must not be spread so thin that it is not well approximated as a continuous area for purposes of radiation. btw how would you even assign an area to the stratosphere? How can Stefan-Boltzmann be applied there?
Clarification needed.
dcomerf says
Thanks again Gavin. Just one final thought: I think the implication of how you describe the BCC (I still haven’t looked it up) is that the oceans are a sink and that the increased partial pressure from higher atmospheric CO2 trumps the reduced CO2 solubility?
My simple calc to get to this is: assume 3-degree sensitivity to doubled CO2 and 50% initial absorption by oceans due to the higher partial pressure of CO2. Period 1: emissions sufficient to raise CO2 from 280ppm to 840ppm, i.e. emissions ~560ppm, but 50% absorbed, so the atmospheric CO2 concentration at the end of the ’emissions period’ = 560ppm.
By the next period, temperatures have equilibrated with the new CO2 concentration, so temperatures are 3 degrees warmer and we get another 24ppm of ’emissions’ from the ocean. The total effect of the oceans is then to absorb 280 – 24 = 256ppm worth of CO2 emissions.
Is this calc, ceteris paribus, along the right lines?
[Response: BCC is for the non-climate change situation – i.e. there is no reduction of solubility (or change in respiration, or stratification, or all the dozens of other factors that impact the CC feedback). The climate feedbacks need to be explicitly stated via the ‘gamma’ term. Your calculation is ok except that the transient emissions required to maintain your scenario are varying implicitly – better to calculate the atm CO2 explicitly as a function of emissions and BCC + feedback. – gavin]
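A minimal Python sketch of that last suggestion (sum-of-exponentials impulse response plus a gamma*T carbon-cycle feedback term). The response coefficients below are the commonly quoted exponential fit to the Bern carbon-cycle model, and the emissions and temperature paths are placeholders, so treat everything here as illustrative rather than definitive:

import numpy as np

# Commonly quoted sum-of-exponentials fit to the Bern carbon-cycle impulse
# response (fraction of a CO2 pulse still airborne t years later).
A   = [0.217, 0.259, 0.338, 0.186]
TAU = [np.inf, 172.9, 18.51, 1.186]   # years

def airborne_fraction(t):
    return sum(a * np.exp(-t / tau) for a, tau in zip(A, TAU))

def co2_anomaly(emissions_ppm, temp_anom, gamma=8.0):
    # Atmospheric CO2 (ppm above pre-industrial) from yearly emissions already
    # converted to ppm (~2.1 GtC per ppm), plus a gamma*T feedback in ppm per K.
    n = len(emissions_ppm)
    co2 = np.zeros(n)
    for i in range(n):
        co2[i] = sum(emissions_ppm[j] * airborne_fraction(i - j)
                     for j in range(i + 1))      # convolution with the response
        co2[i] += gamma * temp_anom[i]           # carbon-cycle feedback term
    return co2

# Placeholder scenario: constant 4 ppm/yr of emissions, 2 K of warming over 100 yr.
emis = np.full(100, 4.0)
temp = np.linspace(0.0, 2.0, 100)
print(round(co2_anomaly(emis, temp)[-1], 1), "ppm above pre-industrial after 100 yr")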
Fred Magyar says
Recently I was particularly struck by these incredible sculptures by Janet Echelman. http://www.ted.com/talks/janet_echelman.html
In particular, there is one that is based on NOAA data obtained from a tsunami, and is titled 1.26.
I was wondering if there was a particular set of climate data that might be transformed into such a sculpture that could serve as a medium to get across in a visceral way to the general public what dynamic climate change looks like.
Perhaps this might be a way to cross pollinate art and scientific concepts to the benefit of a wider audience.
Patrick 027 says
Re 47 Pete Dunkelberg –
The Stefan-Boltzmann law applies to either a blackbody, which is a material that is non-reflecting, completely opaque (optical thickness = infinity), and at LTE, or to something which can simulate such conditions (a small hole cut into a box; as long as the box’s inside surface has less than 100 % albedo, a sufficiently small hole relative to the box’s dimensions will require that most photons entering such a hole would have to be reflected so many times that most end up getting absorbed before exiting, so the hole can act almost like a blackbody surface).
A real material layer typically has finite optical thickness; for a given direction, the transmitted fraction of radiation incident on one side that comes out the other is equal to exp(-optical thickness) and, if reflection and scattering are zero (and if at LTE), the emissivity (emitted radiation / blackbody value) and absorptivity (absorbed radiation / incident radiation) are both equal to 1 – exp(-optical thickness).
Except – that is only strictly true for one direction, and for a given frequency and polarization. If there is scattering or reflection, emissivity must still equal absorptivity for incident radiation being absorbed from a direction and emission back into that direction (although if optical properties have enough symmetry, or are isotropic, the same values apply to a direction and its opposite).
When emissivity (and absorptivity) vary over the spectrum, it is necessary to use the Planck function, which describes blackbody radiant intensity at a given frequency. Radiant intensity is the flux per unit area per ‘unit direction’ (that is, per unit solid angle). Radiant intensity must be integrated over directions (over solid angle) to find a flux per unit area, and before integration it must be weighted by the projection of the unit area of interest onto the unit area that faces each direction (this is the cosine of the angle between the direction of the intensity and the direction the unit area of interest is facing).
But qualitatively, holding optical properties constant, the emitted flux always increases with temperature, but the proportionality depends on how optical properties vary over the spectrum. Increasing the thickness of a layer, or increasing the amount of a substance (with isotropic optical properties) that absorbs or emits radiation, increases the optical thickness of that layer (in all directions, by the same percentage). If the layer is isothermal and there is no scattering/reflection and it is at LTE, then at each direction the emitted intensity is equal to the Planck function times the emissivity. If it is not isothermal then it is necessary to integrate along a line to find the radiant flux (see ‘Schwarzschild’s equation’, which simplifies to Beer’s Law when the temperature is too cold for significant emission).
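Pete’s question up-thread (#47) about why a tenuous layer radiates so little even when it is warm can be made concrete with the 1 – exp(-tau) emissivity rule above. A minimal Python sketch, treating each layer as isothermal, grey and non-scattering (real gases are strongly wavelength-dependent, so the numbers are only a cartoon):

import numpy as np

SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W m^-2 K^-4

def layer_emission(T, tau):
    # Flux emitted by an isothermal, grey, non-scattering layer of optical
    # thickness tau, using the 1 - exp(-tau) emissivity described above.
    return (1.0 - np.exp(-tau)) * SIGMA * T**4

# Illustrative numbers only: an optically thicker upper-tropospheric layer
# versus a warmer but far more tenuous (optically thin) stratospheric layer.
print(round(layer_emission(T=220.0, tau=2.0), 1), "W/m^2")    # ~115 W/m^2
print(round(layer_emission(T=250.0, tau=0.05), 1), "W/m^2")   # ~11 W/m^2
# The thin layer emits little despite being warmer: what matters is emissivity
# (set by the amount of absorbing mass along the path) times sigma*T^4, which
# is the sense in which ‘mass stands in for area’ in the question above.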
R. Gates says
Okay, here’s a general question, especially for those who are professional climate scientists:
What, in your opinion, are the 2 or 3 biggest unknowns currently out there in the study of the climate? Things that if they were known, could have some impact on global climate models. As you list these, maybe you could give some idea what you think their maximum impact could be in terms of climate forcing.
[Reply: (1) Clouds (2) How much coal is really down there and recoverable? (3) Will we have the fortitude to leave any of it unburned? If I’m allowed four, I’d add: (4) Could the present terrestrial carbon sink turn around and become a source, leading to a PETM style carbon release adding to the fossil fuel carbon we add to the atmosphere directly? With regard to consequences, there is a very slight chance that clouds might make climate sensitivity low, but an almost unbounded prospect for clouds to make things very, very bad, perhaps even to the point of making much of the world uninhabitable outdoors for mammals — especially if there really is 5000 gigatonnes of coal and we wind up burning it all. –raypierre]
Edward Greisch says
James Hansen on conditions inside the government:
“TreeHugger: Under the Bush administration you experienced pressure and threats at times when you were talking about climate change. How has the Obama administration been different?
James Hansen: Well, it’s different in the sense that there’s no one trying to check on what I’m talking about and censor it. But basic problems remain. Namely that the public affairs offices are headed by political appointees who tend to feel that their job is to make the administration look good. I just think those offices should be headed by career civil servants who cannot be pressured to try to present what amounts to propaganda.
Also, government scientists, when they testify to Congress, must have their testimony approved by the White House, by the Office of Management and Budget. And again, I don’t think that makes sense. Government scientists are paid by the public, by taxes, and they should be allowed to give their best opinion without having it reviewed prior by the administration.
So those basic processes have not been corrected. But personally, I feel that, in this administration, that I’m allowed to say what I think is right.”
Source:
http://www.treehugger.com/files/2011/06/treehugger-radio-james-hansen-climate-change-and-intergenerational-justice.php?page=3
I don’t know if treehugger is any good, but I have heard some bad things about how scientists are being treated these days. I don’t remember the reference, but somebody said words to the effect that if scientists are receiving threats, the opposition must be desperate. I am retired now and when I was working, my job did not include communication to the outside world, so I would like to hear comments. My impression of politically appointed agency heads has been negative.
Office of Management and Budget [OMB] is the vice president’s domain.
Vincent Kosik says
Read Mark Bowen’s account in his book, “Censoring Science: Inside the Political Attack on Dr. James hansen and the Truth of Global Warming” Plume book published by Penquin Group 2008 for a tale about the Bush administration successful attempt to halt any action on this crisis.
Not sure of the current administration. Also Dr. james hansen writes a bit about it in his own book, “Storms of my Grandchildren”
Really makes one angry.
Tony lynch says
R. Gates: Known Unknowns?
Prokaryotes says
Looking for blog post dedicated to “Geomorphological Response” and “Methane Status Updates – Projections and Estimates”
Edward Greisch says
http://thinkprogress.org/romm/2011/07/02/257869/when-scientists-take-to-the-streets-it’s-time-to-listen-up/
leads to:
Universities “seriously concerned” by death threats against climate scientists
http://theconversation.edu.au/universities-seriously-concerned-by-death-threats-against-climate-scientists-1686
Scientists hit back amid fresh death threats
http://www.abc.net.au/news/stories/2011/06/20/3248032.htm
Study says majority believe in climate change
http://www.abc.net.au/news/stories/2011/06/03/3234342.htm
“The research found less than 6 per cent of Australians are true climate change sceptics.”
There we have the reason for desperation on the part of denialists. Rich coal company stockholders stand to lose their shirts. Can a wealthy 6% control Australia’s government? Are the police willing to arrest the rich?
Barton Paul Levenson says
For anyone in or near Allegheny County, PA. I will give a lecture on global warming to the Parsec science fiction club on 9 July 2011 at about 2 PM. Location: Squirrel Hill Carnegie Library, corner of Murray and Forbes Avenues.
Pete Dunkelberg says
R. Gates, imho the big questions are:
1. How long will we keep burning carbon like there’s no next generation?
The answer is blowing in the wind. The political wind.
2. How harmful will climate disruption become, and how soon?
Those two are the really serious issues.
3. The size and speed of various feedbacks –
a) Arctic amplification (Arctic warms fastest) was predicted [calculated] over a century ago, but it keeps running ahead of model projections.
b) Cloud feedback: “the great white hope” of the subject. Alas there is no sign that a negative feedback from clouds is going to save us, and paleoclimate also says don’t bet on it (or on any other unknown savior). There is slight evidence of a positive feedback from clouds.
c) Methane: this one is still blowing in the wind. But it can only hurt, not help.
4. How to make good regional projections.
Sub question: keep making global models with smaller grid size, or make specific regional models? The US Navy uses an Arctic regional model and supercomputers. No one has the resources to do that for every region of the earth though.
Pete Dunkelberg says
What Eli said.
Quoting a recent essay by Alan Betts in EOS:
…
Nick says
I was perusing web articles when I stumbled onto this one, called “An Inconvenient Fallacy.” With a title like that I really couldn’t resist so I opened it up and I found these assertions in the article…
“Fact 1. A mild warming of about 0.5 degrees Celsius (well within previous natural temperature variations) occurred between 1979 and 1998, and has been followed by slight global cooling over the past 10 years. Ergo, dangerous global warming is not occurring.
“Fact 2. Between 2001 and 2010 global average temperature decreased by 0.05 degrees, over the same time that atmospheric carbon dioxide levels increased by 5 per cent. Ergo, carbon dioxide emissions are not driving dangerous warming.”
Source: http://www.smh.com.au/opinion/politics/an-inconvenient-fallacy-20110626-1glmu.html#ixzz1R3hkoDqU
Does anyone even know where they come up with these figures?
vukcevic says
R. Gates says:
2 Jul 2011 at 10:26 PM
Okay, here’s a general question, especially for those who are professional climate scientists:
What, in your opinion, are the 2 or 3 biggest unknowns currently out there in the study of the climate?
Once science is prepared to accept non-professional climate scientists, the ‘known unknowns’ may become ‘known knowns’, as is the case here:
http://www.vukcevic.talktalk.net/A&P.htm
[Response:Those who have something meaningful to contribute can do so at any time. The problem is that doing so is generally a lot more demanding and not nearly so easy or simplistic as those non-professionals think.–Jim]
David Kidd says
To Pete Dunkelberg at comment 11.
This is an opinion piece in a major Australian newspaper, The Sydney Morning Herald, written by Mr. Bob Carter, a well-known Australian denialist who has just lately come back into prominence after a spell out to pasture, so to speak. The reason is that the Australian Government is in the process of introducing a diluted form of carbon tax into parliament, which has aroused all the usual suspects into a frenzy of activity. Mr Carter has form: check the list to be found in the “Smog Blog” rogues’ gallery.
Regards, David Kidd.
Paul from VA says
@Nick @11
Bob Carter’s piece has been completely demolished. It’s based on standard denialist tropes. The “cooling for ten years” trope comes from the fact that 1998 was a monster el nino year, and thus was much warmer than average and depending on how you count, the warmest year on record (2010 is very close or higher depending on which global temperature series is used). Of course, if you compare over a longer time period, you see that the 00’s were on average significantly warmer than the 90’s. More thorough demolitions of the Bob Carter piece can be found at the two links below:
http://scienceblogs.com/deltoid/2011/06/bob_carter_not_entitled_to_his.php
http://www.theage.com.au/opinion/society-and-culture/half-the-truth-on-emissions-20110627-1gne1.html
vukcevic says
There is an atmospheric pressure see-saw operating in the North Atlantic (the NAO) controlling the winter temperature oscillations in the wider area.
http://www.vukcevic.talktalk.net/WPd.htm
Both sides engaged in the ‘climate trench warfare’ may find it of some interest.
Hank Roberts says
> Nick says:
> 3 Jul 2011 at 11:08 AM
>
> I was perusing web articles when I stumbled onto this …
> …
> Does anyone even know where they come up with these figures?
Google found 519 different blogs with those fake numbers copypasted.
Now 520, since they’re reposted here. Try searching and reading some of the copies Google finds and you’ll see how they make this stuff up wholesale. Repetition works because people come to believe stuff they see repeated. It’s a tool. Don’t let them use you by repeating their fiction.
No such numbers found by Google Scholar.
Chris R says
Hello again Vukcevic,
Over at Tamino’s blog you did a similar drive-by.
http://tamino.wordpress.com/2011/06/16/open-thread-4/#comment-51746
I asked you how you came up with your NAP index; you offered your email address, and I declined a private chat, favouring the open forum and the benefit of others’ opinions and observations.
Now in addition to your NAP (North Atlantic Precursor) Index, you offer a PDO (Pacific Decadal Oscillation?) Driver index…
It strikes me that if your ideas have merit, history shows they will win through. However, it’s damn near impossible to work out the basics, like your NAP index, from the pages you offer. Without understanding the basics it’s impossible to evaluate the merits of whatever evidence you may have.
Wheels says
@11:
The opinion piece is written by Bob Carter, which may tell you all that you need to know about the general direction and reliability of it. Here are specifics about the points you mentioned.
Carter is cherry-picking endpoints to arrive at his declines; use 2000-2009 instead of 2001-2010 and 3 out of 4 datasets show a positive trend. Speaking of time periods, 10 years is such a short span in climate that year-to-year variations and short-term cyclical events can swamp the signal with noise. Over 15, 20, or (best yet) 30 years, real long-term trends are much more visible in the data, which is why his last sentence in “Fact 2” is totally ludicrous.
He’s also cherry-picking datasets: if he had used NASA’s GISTEMP record, which attempts to describe the Arctic, there would have been a positive trend over 2001-2010 rather than negative. He’s likely using HADCRUT, which shows the .05 degree “decline” from 2001-2010 but doesn’t cover the Arctic region at all, which is where much of the warming is taking place. It’s amusing because he had previously tried to paint the Hadley researchers as duplicitous and fraudulent, so his reliance on their data NOW is pretty telling. John Cook concludes that this is because the last two years have been the hottest on record, so it’s much harder for Carter to claim that global warming “stopped in 1998” as he so loves to do.
So even if we take his 2 statements as barely factual (and his claim about CO2 is clearly not), they are not exactly compelling arguments because they rely entirely on cherry-picking small points in isolation rather than viewing the bigger picture.
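To make the window-length point concrete, here is a quick Python sketch on purely synthetic data: an assumed trend of 0.017 °C per year plus random noise and an artificial 1998-style spike, none of it real temperature data. Depending on the noise realization, the decade-long windows can come out flat or even negative while the 30-year trend stays positive by construction.

import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1981, 2011)                       # 30 years of synthetic data
temps = 0.017 * (years - years[0]) + rng.normal(0.0, 0.1, years.size)
temps[years == 1998] += 0.2                         # artificial El Nino-style spike

def slope(y0, y1):
    """Least-squares trend (deg C per year) over the closed window [y0, y1]."""
    m = (years >= y0) & (years <= y1)
    return np.polyfit(years[m], temps[m], 1)[0]

for window in [(2001, 2010), (1998, 2008), (1981, 2010)]:
    print(window, round(slope(*window), 4), "deg C per year")

Re-run it with different seeds and the ten-year slopes bounce around far more than the 30-year slope does, which is the whole point about cherry-picked decades.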
Mike Roddy says
Scientists need to follow the late Stephen Schneider’s example, and work hard to explain to the public what the climate data means. Schneider and Hansen are about the only ones who understood that communicating the evidence about climate change to the public is far more important than perfecting it.
Instead, this critical task has been outsourced to floundering reporters and the occasional political leader such as Gore, who are taking a lot of heat due to scientists’ retreat from the public dialogue.
Key knowledge includes responsibility. About 30% of Americans have been persuaded by the media that global warming is some kind of hoax, which is all the cover our oil owned politicians need.
Time to wake up and attack, climate scientists. Set up booths on the Capitol Lawn. Organize teach ins, including in places like Dallas and Gillette, Wyoming. Show little patience for liars and prostitutes. Do the right thing for all of the earth, and commit yourselves to it.
[Response:Scientists haven’t retreated from the public dialogue. Notwithstanding the loss of Steve, there are more scientists trying to educate the public on the issues than ever, in various venues, such as this one. However, I agree that there aren’t enough doing it, and we need a lot more to do their part. But then, that’s a general problem with society as a whole, not scientists specifically–it’s always a small minority that stands up for principle, at least initially. As for the science, it’s not about “perfecting” it, whatever that means. There are still all kinds of important uncertainties that need to be understood much better than they are if there is going to be a rational, effective response to the problem. Climate change effects chief among them.–Jim]
Mike McClory says
@Nick (post 11),
I don’t know where they’ve got those exact figures from, but ‘Fact 1’ is just the usual ‘no warming in x years’ lie (meme is too forgiving a word). We know that there has been warming over that period – the timescale is just too short to state that it is statistically significant at the 95% level. Extend the start point back and we can confidently state that it was statistically significant.
‘Fact 2’ appears to be just plain wrong given that 2010 was a joint record high anomaly. I would suggest that the figure may come from one of the satellite records… Again, it’s a cherry-picked duration as well.
vukcevic says
Chris R
I did post an answer two times but it was blocked; my email address was posted in order to respond to your question, since I thought it would be impolite to ignore your request.
The answer is simple: I have collected a lot of historical data dispersed through various institutions, all publicly available, some online and some in various printed publications, put it all together and developed two data sets, one referring to the N. Atlantic and one to the N. Pacific.
This is a purely personal effort, at personal expense, so I am under no obligation to release details for the time being, since I am preparing a more extensive publication.
I think it may require some assistance with presentation from an academic institution or some other establishment, but that is not a priority at the moment.
[Response:The better course of action, by far, is to publish first and blog second. Once you start making public statements, you’re obligated to defend them. Nobody will take them seriously until you do.–Jim]
Urban Leprechaun says
Can someone tell me how sea levels are determined?
Such as: what do people do when it has just rained heavily on the sea, which, surely, must give a higher level until this rain has either evaporated or flowed away (?) to a lower-level sea?
Brent Hargreaves says
Pete Dunkelberg, 3 July, 06:38.
So the sun’s effect on climate is so well understood that it doesn’t make your list in response to R. Gates?
I do hope you’re buying shares in that Canadian port being built to exploit the NW Passage. When Maunder II strikes, and it’s buried under the ice, it’ll stand as a monument to the hubris of the Global Warming industry.
Susanne says
Reading various responses to Bob Carter’s latest outpourings, I’ve realised that I have two ideas in my head (don’t laugh, it’s more than some people have) that seem to be in conflict:
1. El Nino / La Nina and other oscillations don’t actually change the temperature of the globe; they just move the heat around.
2. Global temperature was exceptionally high in 1998 because of an extreme El Nino.
I know that’s a grossly oversimplified way to talk about it, but it’s the level I’m at and I’m a long way ahead of many of the people I talk to. Is there an equally simple way to express how these things are both true – if they are?
Thanks
Paul Pentony says
Susanne
Here is my attempt at an answer – others may do better.
During a La Niña event the thermocline, which divides warm surface water from colder deep water, is depressed on the western side of the Pacific and rises elsewhere. As a result you get a relatively deep warm pool in the west of the Pacific and relatively cooler surface water elsewhere. The net result is a reduction in average surface temperature.
Icarus says
We have raised atmospheric CO2 by a large amount (about 40%) but my understanding is that the amount by which we’ve raised the total amount of carbon in the entire climate system is still relatively small (I calculated it to be about 2% but could be well off).
Conversely though, atmospheric CO2 could decline quite quickly if we drastically cut emissions, because of absorption by natural sinks, whereas the total amount of carbon in the climate system only changes very very slowly – perhaps 10,000 or 20,000 times slower than the rate at which we’re increasing it today.
I think that means that although we’re not stuck with 390ppm or more of atmospheric CO2, we *are* stuck with the raised total amount of carbon in the climate system for any conceivable future (hundreds of thousands of years at least).
What are the implications of this? I think it means that we have permanently and irrevocably changed the climate system, because there is no way we’re ever going to remove the trillion tons or more of carbon that we’ve put into the climate system in the last 250 years… but just how big a change does that imply? If we miraculously dropped emissions to zero today, what kind of climate would we end up with by the time everything equilibrated? Have we already made it impossible for the planet to continue with the ice age / interglacial cycles of the last million years or so? Have we committed the climate to going back at least to, say, an Eocene climate regardless of what we do now?
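Icarus’s rough percentages look about right against round-number carbon stocks. A minimal Python check, using approximate textbook-scale reservoir sizes and the standard ~2.13 GtC-per-ppm conversion (the stock values below are assumptions for illustration, not a careful budget):

atmosphere = 600.0     # GtC, pre-industrial atmosphere (~280 ppm)
ocean = 38000.0        # GtC, surface plus deep ocean (round number)
land = 2300.0          # GtC, vegetation plus soils (round number)
added = 500.0          # GtC, rough cumulative fossil-fuel plus land-use emissions

airborne = (390 - 280) * 2.13    # GtC still in the air (~2.13 GtC per ppm of CO2)
print(f"Atmospheric rise: {100 * airborne / atmosphere:.0f}% of the pre-industrial stock")
print(f"Addition to the whole active reservoir: {100 * added / (atmosphere + ocean + land):.1f}%")

That gives roughly 40% for the atmosphere alone and on the order of 1-2% for the whole active system, consistent with the figures quoted above.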
Icarus says
Susanne (no. 23): I would say that ENSO does change the temperature of the globe but only on short timescales of several years, and the change is positive for a while and then negative for a while or neutral for a while, so that the *net* change over a much longer period is about zero. Also, while an El Niño may be warming the atmosphere because of upwelling and spreading warm water in the Pacific, it’s actually cooling the oceans (because the heat from the oceans is escaping through the atmosphere to space), so in reality the total heat content of the climate system declines even while the atmosphere is warmer. I think if we could measure the heat content of the entire climate we would find that it was actually getting cooler in 1998 because of the El Niño, not warmer – it’s just that we are biased towards taking the temperature of the atmosphere rather than the whole planet, of which the oceans are by far the largest repository of heat.
Patrick 027 says
Re 23 Susanne –
The key is vertical heat transport in the ocean. Generally, the upper ocean is warmed and cooled by radiation and sensible and latent heat transfers between it and the atmosphere, just like the land, but also, the upper ocean is supplied with a heat sink by upwelling (or upward mixing?) deep cold water, and also acts as a heat source in some regions where water becomes cold enough to sink back down into the deep ocean (salinity is also quite important, and in some previous geologic time(s), from what I remember reading, the depths of the ocean were filled more with water that was warm but sufficiently salty to sink).
Upwelling cold water can be pulled up to the surface in spite of its density by the wind-driven currents causing surface water to move out of the way – once at the surface, it may be out of equilibrium with local radiative and atmospheric conditions and tend to gain heat. Without upwelling cold water reaching the surface, I would guess it could also return to the upper ocean by getting mixed into the upper ocean from below (the upper ocean is mixed by winds, and also by the vertical distribution of solar heating relative to evaporative cooling and evaporative salinity increases at the surface – although precipitation or melting of ice (or runoff from land) at the surface would/do tend to stabilize the water by freshening surface water (temperature effects aside)). (Tides and plankton also play a role in mixing the ocean – tides also have an effect on mixing the deep ocean, I think, but I don’t know a lot about that.)
(If the upper ocean were losing mass from continued formation of deep water and not gaining mass from upwelling or mixing, the layer would get thinner, which would bring deeper water closer to the surface and perhaps make it easier to mix it into the upper ocean from below or otherwise make it easier for winds to spread the warmer water apart to bring cold water to the surface. But different conditions in different places help determine where water is sinking, mixing, or upwelling.)
Anyway, it is possible for changes in oceanic circulation to ‘hide’ cold water from the surface or ‘reveal’ more of it without any uptake or loss of heat from the ocean. This will lead to warmer or cooler surface conditions and thus warmer or cooler atmospheric conditions – which will, interestingly, tend to lead to global heat loss or gain. In an El Niño, more cold water is kept hidden from the surface (in a particular region of the Pacific where it usually upwells), and so the global average surface temperature, and tropospheric temperature, rises, and this increases outgoing radiation to space, causing a heat loss. In a La Niña, the reverse happens. (Together this mode of variability is called ENSO, and it also includes changes in atmospheric circulation. There is some positive feedback between changes in the wind and changes in the sea surface temperature, which, I would guess, would help explain ENSO’s prominence as an important mode of variability.)
(If either condition were held for a sufficient time period, the globe would lose or gain heat to return the surface and atmospheric climate toward equilibrium (assuming the equilibrium climate, in terms of average global surface temperature, is not too sensitive to whether there is an El Niño or La Niña), while the ocean would now have a net loss or gain of heat.)
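A rough back-of-envelope version of that last point, with an assumed net radiative response of about 2 W/m^2 per K of surface warming and an assumed 0.2 K global-mean anomaly during an El Niño (both just round numbers for illustration):

lam = 2.0                 # W/m^2 per K, assumed extra outgoing radiation per K of surface warming
dT = 0.2                  # K, assumed global-mean surface anomaly during the event
area = 5.1e14             # m^2, surface area of the Earth
year = 3.15e7             # seconds in a year

extra_flux = lam * dT                    # W/m^2 of additional radiation to space
heat_shed = extra_flux * area * year     # J lost by the climate system over one year
print(f"{extra_flux:.1f} W/m^2 extra outgoing flux, roughly {heat_shed:.1e} J shed in a year")

That is a few times 10^21 J per year: small compared with the total heat stored in the ocean, but the same order as observed year-to-year wiggles in upper-ocean heat content, consistent with Icarus’s point above that the system as a whole tends to lose heat during an El Niño.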
Patrick 027 says
and also acts as a heat source in some regions where water becomes cold enough to sink back down into the deep ocean
– it’s a heat source because the water has to lose heat in order to become cold enough to sink. Evaporative cooling also increases salinity which is also important in making water dense enough to sink.
Patrick 027 says
Re 21 Urban Leprechaun – that’s actually an interesting point. To a first approximation, sea level forms a sphere, as it is a geopotential surface (constant gravitational potential energy, perpendicular to the gravitational acceleration), and large objects tend to form spheres as gravitational potential energy is minimized; thus the gravitational field has spherical symmetry.
However, the rotation of the Earth makes geopotential surfaces oblate – they bulge at the equator. This affects the crust and ocean (and mantle, and core, to varying extents depending on centrifugal force and gravity).
Tides also distort the geopotential surface, introducing an oscillating component to gravity that makes the Earth oscillate – the response of the ocean is different from the crust (there are natural frequencies involved – Kelvin waves, etc.), so we can have water level rising and falling relative to the land (we wouldn’t notice tides (without sensitive scientific instruments, etc.) if they rose and fell together). Sea level isn’t generally in equilibrium with the tides, so the water level won’t actually conform to the geopotential surface at any moment. But it should in a time-average – except, see below.
Also, the time-averaged Earth deviates from even the simple oblate spheroid, as there are density variations in the mantle and crust, pressure variations in the atmosphere, and mountain ranges and oceanic trenches. Sea level dips over a trench, for example.
Aside from all that, sea level can deviate from the time average geopotential surface because:
1. as you point out, there are uneven sources and sinks of water (evaporation, precipitation, runoff), requiring flow for balance, and flow requires a pressure gradient (either to drive it against friction or else balance the coriolis acceleration that occurs when there is flow, or else to drive acceleration of the flow when conditions are changing), which would either come from atmospheric pressure variations at the surface or density variations in the water, or sea level variations.
2. the wind itself applies force on the water and drives motion, which can result in piling up of water or thinning of water in various places (which causes pressure gradients in the water which drive other motions, etc.)
3. water can be fresher or saltier, warmer or colder, and so density can vary. In order to have a particular horizontal pressure gradient at one level, a different horizontal pressure gradient must exist at another level if there is a horizontal density gradient in between. If warm or fresh water were dumped on the surface, it would tend to spread out; it will also add weight to the column of water and cause water below to spread out, which thins the layer of denser water beneath, which lowers the pressure. Aside from atmospheric pressure, the pressure at depth is uniform when the total mass of water above is the same, but a layer of fresh water requires greater thickness to achieve this, so at the surface it would form a dome. Why wouldn’t it spread out evenly to cover the globe? As it spreads out, the coriolis force acts on it to turn the flow; geostrophic balance can be achieved when a pressure gradient remains with a proportionate flow perpendicular to it. So you can end up with a high pressure at the surface (associated with a dome in sea level) and a low pressure at depth (associated with the greater water column thickness not balancing the decreased density). (PS in order to achieve low pressure at depth in geostrophic balance, the water has to actually have flowed inward rather than outward, in consideration of potential vorticity (the conservation of angular momentum).)
So with climate change, there are a number of factors that can lead to geographically and seasonally-varying trends in average local sea level (as well as short-period variations – waves and storms – but I don’t think tidal ranges should be affected much, unless sea level rise actually opens a new channel and … well, I don’t know about that one … )
But in addition to that, there can be a global average change in sea level as
1. the mass of the ocean changes (global net melting of land ice or ice in the water that is not entirely supported by buoyancy; over geologic time, geologic emission and sequestration of H2O, chemical reactions, H escape to space, etc.), and
2. the density of the ocean as a whole changes (thermal expansion, also some freshening from meltwater (PS over geologic time this may also be affected by geological processes, such as in hydrothermal systems), minus the (I think much, much smaller) effect of the loss of fresh water to the increase in atmospheric water vapor) – a rough number for the thermal-expansion term is sketched just after this list – and
3. isostatic adjustment of the crust (global tendency for oceanic crust to sink under added weight, local tendencies from loss of ice loading, also adding water loading on continental shelves (???), the effects of crustal rigidity… (???)) – I think this has a fast and a slow component, by the way, but I don’t know a lot of details there.
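For the thermal-expansion part of point 2, a minimal sketch with assumed round numbers (a representative expansion coefficient for seawater, an assumed 700 m layer taking up the heat, and an assumed 0.1 K of warming of that layer; this illustrates the mechanism, not the observed sea-level budget):

alpha = 2.0e-4     # 1/K, representative thermal expansion coefficient of seawater (assumed)
H = 700.0          # m, thickness of the layer assumed to take up the heat
dT = 0.1           # K, assumed average warming of that layer

dh = alpha * dT * H                     # m, steric thickening of the column
print(f"Steric rise for this scenario: {1000 * dh:.0f} mm")

About 14 mm for those particular numbers; the point is only that a global-mean rise of millimetres can come out of quite modest warming spread through a thick layer, quite apart from any added meltwater mass.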
Patrick 027 says
Re ‘pressure variations in the atmosphere’ – not the biggest effect on the gravitational field :), but they would have a direct effect on deviations of sea level from geopotential surfaces.
Patrick 027 says
PS James Kasting had an interesting paper a while back on how geologic processes (hydrothermal activity) may act with a tendency to maintain sea level over geologic time (a negative feedback involved in the geologic branch of the water cycle) – I think relative to mid-ocean ridges, so not necessarily relative to continents. Of course there have been geologic times when geologic changes pushed water up over vast areas of continental crust that are now exposed. From what I remember reading, this involves rates of sea-floor spreading (cooling of the lithosphere (or just crust?) when leaving mid-ocean ridges involves sinking via isostatic adjustment; rapid spreading widens the ridges, increasing their volume). Rifting apart of continents tends to spread out continental crust, displacing water by removing the volume of elevated land (erosion, with sediments transported to the sea, does the same thing); continental collisions raise up a volume of rock above sea level, making space for ocean water to spread out.
john byatt says
#10, the source of the disinformation was Australia’s Bob Carter.
John Mashey says
I am tied up with other things, but people with time, and especially academics, might want to visit a blog at the Chronicle of Higher Education and read “Bottling Up Global Warming Skepticism” by Peter Wood, President of the National Association of Scholars, which might be labeled NAS* to disambiguate it from the other NAS.
If you happen to comment, *please* be polite and perhaps seek more information from Peter.
Apparently, he disliked this (sorry, paywall).
Pete Dunkelberg says
Susanne, you already have some good if not always brief answers. But to go directly to your question: “Is there an equally simple way to express how these things are both true – if they are?”
The clarification you need is to distinguish the temperature of the whole globe (your idea 1, based on conservation of energy) and the globe’s surface temperature.
And as mentioned by others there is the wrinkle that during a “cool” La Niña year (now as warm as El Niño used to be) the earth retains more of the incoming energy and vice versa, so that ENSO adds an oscillation to the rate of warming.
Pete Dunkelberg says
For those above discussing picking ten-year periods for special effects, it is even more fun with eight years, although recent GISTEMP downs are scarce.
Pete Dunkelberg says
Brent Hargreaves @ 22, a new Grand Solar Minimum has been discussed by RC here and by Skeptical Science here. Those are the places to take your concerns for Canada. In view of the poor track record of predictions of global cooling you might also consider this.
Pete Dunkelberg says
John Mashey @33: “If you happen to comment, *please* be polite ….”
That’s going to be tricky.
Craig Nazor says
I read this blog a lot and love it, but I rarely post, as climate science is not my major field, and I feel I can learn more at this point by just reading. I now have an issue that some on this blog might be able to help me with. (I hope that this post is not considered under-the-sun dump material!).
On August 10, the Lower Colorado River Authority (LCRA) is holding a public hearing about selling 25,000 acre-feet of water per year to a proposed coal-fired power plant (romantically called “White Stallion”) on Matagorda Bay. The LCRA controls the water in the entire lower Colorado River, which is the major water supply for central Texas, and particularly the city of Austin (my home town). Governor Rick Perry, a noted global warming denier, has appointed all of the members of the board of the LCRA and none are climate scientists, which is very scary. In the LCRA assessment (supposedly written by scientists) of whether there is enough water to actually sell to this project, the LCRA goes back no further than the middle of the 20th century for the “drought of record,” and does not even mention global warming in their future projections:
http://www.lcra.org/library/media/public/docs/water/WhiteStallionContractFactSheetJune20_2011.pdf
The LCRA held a public hearing in June about this, at which I spoke and told them that it was hard to believe that they would make a water decision affecting millions of people while ignoring a major scientific consensus about future climate. The Texas Sierra Club commissioned what appears to me to be an excellent study by Dr. D. Lauren Ross, using the LCRA’s own information, that does mention global warming and shows why they really do not have enough water:
http://www.lcra.org/library/media/public/docs/water/WhiteStallionContractFactSheetJune20_2011.pdf
But the LCRA managed to pretty much ignore her, as she was the only scientist speaking, and her study was commissioned by an environmental organization. Many others spoke also (a few politicians and many concerned citizens), all against the sale except for one person. Since central Texas (actually, almost all of Texas) is now in the middle of an extreme drought, experiencing the second hottest June on record, the LCRA punted the decision down the road until August 10 (Hoping for some rain? Or hoping that “White Stallion” doesn’t get a TCEQ permit to release lots of black smoke?).
Approving this sale would be very bad for Austin as far as water goes, and that’s not even considering all the CO2 and other pollution that will be pumped into the atmosphere by the “White Stallion.” What I believe we need on August 10 is more good scientific testimony from climate scientists (I have a doctorate, but not in climate science) about global warming, to force the board to at least address this reality. First, I would appreciate any helpful comments about the above links. And anyone who would like to come and give ‘em hell, it would be great if you would contact me in Austin (my contact information is easily available on the web). And keep posting all of this great information!
Craig Nazor
Jim Eaton says
It has been global weirding here in the Sacramento area the past few weeks. It was over 100 F on June 21st. A week later the high was 68 with a half inch of rain (a record, although in Davis it was cooler and over an inch of rain fell). Today it was back over 100. For several years now, we have had a prolonged wet and cooler spring running into late June. But never a rainfall event such as this (records for last Tuesday were set from Los Angeles (.02 inches) to the Oregon border).
ccpo says
@#25 I think that means that although we’re not stuck with 390ppm or more of atmospheric CO2, we *are* stuck with the raised total amount of carbon in the climate system for any conceivable future (hundreds of thousands of years at least).
I think this is incorrect. We have a number of ways of reducing carbon in the active climate system. The fastest two that are “natural,” have multiple functions, and help us with other problems we face – both discussed here in the past – are forests and terra preta. Terra preta is a very long-term soil-improvement technique (it creates soils as rich as you will find) and a carbon sink. It will be broken down and released only very slowly, and the rate is such that the creation of terra preta is a carbon sink, so as long as you are making it, you are effectively reducing carbon in the active climate system.
Additionally, this improves food production, both in quantity and resilience of the system. The soil is far less likely to wash away with rains and floods, e.g., and food production remains high for far longer without having to add additional nutrients.
Forests are obviously massive carbon sinks. Hansen has suggested we can equal CO2 emissions with forest growth, thus ending the rise in CO2 even if we keep producing it. If we add in edible forest gardens, we can add far more forest and make the food supply stable, to boot. Of course, the gain in CO2 sequestration with forests levels out with the growth in forests, but whatever the number is, it is still essentially a permanent sequestration. If we actively manage the edible forest gardens, then we can continue the slow sequestration of CO2 as we increase the amount of organic matter in the forest floor, which also makes it more nutrient rich, thus making the forest essentially self-sustaining.
Even if we max out forest growth and farmland production, we can continue to draw down carbon by using fast-growing trees for continued terra preta production, which would both A. allow us to maintain more of the industrial base we now have (assuming energy supplies of whatever kind) and B. give us a control knob for climate: we can choose to keep global CO2 within a given “ideal” range with nothing more than forests and food production.
Occam’s razor says do it, do it now.
PeteB says
At Bart’s http://ourchangingclimate.wordpress.com/2011/06/22/climate-science-scientific-method-skeptics-not/#comments
We were discussing the use of the Tiljander proxies and possible problems trying to calibrate to the instrumental record because of contamination during the period of the instrumental record. Rather than not using the proxies at all, is there any value in trying to ‘indirectly calibrate’ to a period where there is not believed to be any contamination? (I know this is all pretty marginal anyway, but it seems a shame to throw out stuff if there is some meaningful signal there.)
That sort of brought us (via a roundabout route) to the Zorita opinion piece (which seemed to be suggesting an alternative approach to paleoclimatic reconstructions) :
http://coast.gkss.de/staff/zorita/Frank_etal_WIRESCllmChange_2010.pdf
which suggested
“Expert assessment to evaluate the signal of a particular record from a particular proxy archive (e.g., the lowfrequency skill of a new speleothem record) will be invaluable in trying to minimize ‘wrong figures’ being put into a large-scale reconstruction. It seems advisable at this point to use fewer, but expert assessed proxy records, rather than hundreds of proxy series, and hope that reconstruction algorithms will overcome the often huge noise components typical for many of the available time series”
Any comments ?
[Response: Good paper (lead author is David Frank, not E. Zorita) and good discussion over there at Bart’s. More in a bit from my viewpoint.–Jim]
[Response: I’ll give mine, with specific reference to tree rings. The biggest advances are going to come from a better mechanistic understanding of the causes of growth rate changes in those trees typically used in dendro-climatic reconstructions (i.e. climatically stressed trees of various types). Statistics will only take you so far. On that point I unequivocally agree with the linked Frank et al. paper. This improvement must be accompanied by changes in field sampling techniques that have traditionally been used, which are not optimized to evaluate long term, low frequency climatic changes, but which were used extensively by workers in the 1980s (who collected much of the data used in recent reconstructions), before this issue was recognized. See for example this short white paper by Briffa and Cook. The cumulative, interacting, and complex effects of temperature, soil moisture (both of which have seasonal considerations), carbon dioxide, and tree age/size all need to be resolved, because they are all actually or potentially important to radial growth rate. Doing so will still leave unresolved some process-generated noise relating to things like the uncertainty in relating measured precip levels–always at some distance away–to the soil characteristics at the tree sampling site, atmospheric effects (N deposition, ozone, acid rain, etc), and potentially changing inter-tree competition levels over time, for example. We can’t hope to remove all sources of uncertainty, but we can significantly reduce several.
To answer your other question, yes it is possible to “indirectly calibrate”–if you have some independent climate measure that is trustworthy and you have the requisite understanding of the driving processes of both that will allow you to rule out certain wrong or artifactual relationships. It may also be possible to improve some proxy series, strictly via mathematical improvements, before you relate them to any climatic time series. Then when other proxies, or improved instrumental data, becomes available, you have a better response variable series to use. This can be important as well.–Jim]
dcomerf says
I’d like to ask for help with finding references for an economics paper I want to write. In a lot of climate change economics work the environment is represented by a damage function (so that utility flow is a strictly decreasing function of CO2), and a CO2 equation of motion that is of the form: dCO2/dt = Emissions(t) + DecayConst * CO2(t)
where DecayConst is negative, and this decay of CO2 levels is motivated by the observed absorption of CO2 by carbon sinks, especially the oceans.
I think that this is nonsense and that, to first order, the DecayConst should be zero or positive. My basis for believing this is my understanding that warmer oceans will hold less CO2, so any anthropogenic CO2 release, if the system is given the time to respond, will (to 1st order) cause an additional CO2 release from the ocean rather than CO2 being absorbed.
Are there any good references that I can cite to make this case very simply (this is just the introduction to my paper) to an economist audience?
Thanks in advance for any help!
[Response: There is no single ‘DecayConstant’ that makes much sense. Instead, use the multi-exponential approximation to the Bern Carbon Cycle model. This does not have an explicit carbon cycle feedback term, and so you need to add that in addition (i.e. add a term to the CO2_atm equation like gamma*Tsfc, where gamma ~ 8 ppm/ºC and Tsfc is the anomaly over pre-industrial (perhaps with some delay)). Note too that your equation needs to have an equilibrium point at the pre-industrial levels (not at CO2=0!). – gavin]
John E. Pearson says
40: on drawing down CO2.
Wally Broecker advocates “scrubbing” the CO2 out of the air: http://www.amazon.com/Fixing-Climate-Changes-Current-Threat–/dp/0809045028/ref=sr_1_1?s=books&ie=UTF8&qid=1309786614&sr=1-1
Jonathan Bagley says
Re #21’s question on how the global sea level is measured. I’ve also wondered about this. The claimed sea level rise is about 2-3mm per year. With shifting land masses (there are villages under water off the English coast), how can any measurement to such a degree of precision be made? As remarked above, the ocean envelope is not even spherical.
[Response: All sea level measurements are made with respect to the geoid (which is not spherical, or even an oblate spheroid, but has many bumps and dips associated with topography etc.). Indeed, the accurate measurement of sea level change has been greatly improved because of greater accuracy in the geoid. The best global numbers currently come from satellite altimetry – using radar altimeters to measure the distance between the satellite and the sea surface – and the accuracy is a function of a great deal of averaging over time and space. The satellite numbers are ground-truthed to a set of tide gauges around the world so that the satellite estimates of the changes at those specific points are the same as the direct observations. The altimeters obviously see anything that affects the distance (including residual tectonic motions/isostatic rebound etc.) and so that needs to be corrected for in assessing how much sea level is rising because of thermal expansion or ice melt etc. The http://sealevel.colorado.edu site has a lot more info and references. – gavin]
dcomerf says
Thanks Gavin, but embedding sophisticated climate models within economic models is what I want to argue against.
My point is that uncertainty around climate projections (especially on the downside) probably dominates in terms of economic decision making. Putting a deterministic climate model within the economic model means that the benefits of climate policy are equated with the costs of climate policy exactly – which is an absolute nonsense given the uncertainties/threshold effects involved. Instead I want to advocate that specific recommendations are embedded in the economic model – e.g. no coal post 2030 – and that the implied prices/policy-instruments conditional on this are the focus of interest.
In order to make this case I want to argue against the CO2 equation of motion as currently embedded in economic models (see Nordhaus for the most well developed example) (note also that CO2 is usually expressed as deviations from pre-industrial, so it is pre-industrial that is an equilibrium rather than zero concentrations). The major flaw that I think is baked into this formulation is the structural negative feedback. What I want (if it exists??) is a good reference that says that the ocean carbon sink is actually a carbon source on the millennial timescale (I don’t even know if this is true – but it seems to follow from reduced CO2 solubility in warmer water?)
I will investigate the Bern Carbon Cycle model with adjustments as you suggest – but I was hoping for more of a simplistic solution!
[Response: The approximation to the BCC is the most simplistic solution that makes any sense. It is not in any sense a ‘sophisticated’ climate model (just a series of 4 exponential functions). This captures the first order behaviour that an increase in atm CO2 increases the flux at the air-sea interface (you call this the structural negative feedback). What you want to be adding is an additional function (as I outlined above) that increases atm CO2 as a function of temperature (the amplifying carbon cycle feedback). This gives a long-term equilibrium impact on atm. CO2 as a function of global temperature and is roughly 8 ppm/ºC (i.e. for a 2ºC rise over pre-industrial, you would expect a 16 ppm rise in CO2 over pre-I values). There is a lot of uncertainty in this number (see here) but this is the right ballpark. – gavin]
ike solem says
How about looking at the claims that the so-called Pacific Decadal Oscillation (like the AMO, of dubious physical reality) has entered a ‘cooling mode’?
“The switch of PDO cool mode to warm mode in 1977 initiated several decades of global warming. The PDO has now switched from its warm mode (where it had been since 1977) into its cool mode.” – Don Easterbrook, 2008.
How does that match the actual temperature trend in the N. Pacific?
http://www.nhc.noaa.gov/tafb/pac_anom.gif
Other topics: drought in Texas, flooding on the Mississippi, wildfires in Arizona – all in line with various climate model projections, such as polewards expansion of the Hadley circulation, increased water vapor in the atmosphere leading to high precipitation in some regions, and the general trend of declining soil moisture in the U.S. southwest and what that means for forest fires and ecosystem changes (northward expansion of Sonoran desert conditions appearing highly likely?)
Or, more importantly – the refusal of the corporate media and the corporate academic system to pay any attention to these issues. Is the Global Climate and Energy Program at Stanford University still raking in all those ExxonMobil & Schlumberger oil dollars? Couldn’t be influencing the direction of academic research, could it? And the DOE’s Science Chief – yes, looks like BP’s Chief Scientist is still in that position – and they’re still trying to push their fraudulent carbon capture and sequestration programs.
Pete Dunkelberg says
Stefan-Boltzmann in the stratosphere:
The Stefan-Boltzmann law says that the amount of energy radiated by a mass per unit time is proportional to its area times its temperature to the fourth power. Yet the cool upper troposphere radiates much more energy into space than does the hot stratosphere. Area seems to stand in for mass in the S-B law, in a way. A small mass, even if spread over a large area, cannot radiate as much as a larger mass. Evidently the mass must not be spread so thin that it is not well approximated as a continuous area for purposes of radiation. By the way, how would you even assign an area to the stratosphere? How can Stefan-Boltzmann be applied there?
Clarification needed.
dcomerf says
Thanks again Gavin. Just one final thought: I think the implication of how you describe the BCC (I still haven’t looked it up) is that the oceans are a sink and that the increased partial pressure from higher atmospheric CO2 trumps the reduced CO2 solubility?
My simple calc to get to this is: assume 3-degree sensitivity to doubled CO2 and 50% initial absorption by the oceans due to the higher partial pressure of CO2. Period 1: emissions sufficient to raise CO2 from 280 ppm to 840 ppm, i.e. emissions ~560 ppm, but 50% absorbed, so the atmospheric CO2 concentration at the end of the ‘emissions period’ = 560 ppm.
By the next period, temperatures have equilibrated with the new CO2 concentration, so temperatures are 3 degrees warmer and we get another 24 ppm of ‘emissions’ from the ocean. The total effect of the oceans is then to absorb 280 – 24 = 256 ppm worth of CO2 emissions.
Is this calc, ceteris paribus, on the right lines?
[Response: BCC is for the non-climate change situation – i.e. there is no reduction of solubility (or change in respiration, or stratification or all the dozens of other factors that impact the CC feedback). The climate feedbacks need to be explicitly stated via the ‘gamma ‘ term. Your calculation is ok except that the transient emissions required to maintain your scenario are varying implicitly – better to calculate the atm CO2 explicitly as a function of emissions and BCC + feedback. – gavin]
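For readers following this exchange, here is a minimal Python sketch of what Gavin is suggesting: convolve an emissions series with a multi-exponential impulse-response approximation to the Bern carbon cycle model, then add a gamma*T term for the carbon-cycle feedback. The response coefficients below are the widely quoted AR4-era fit, and the emissions and temperature paths are toy inputs, so treat the output purely as an illustration of the structure rather than a projection.

import numpy as np

# Fraction of an emitted CO2 pulse still airborne after a given number of years,
# using the commonly quoted multi-exponential fit to the Bern carbon cycle model
# (AR4-era coefficients; treat them as illustrative).
A   = [0.217, 0.259, 0.338, 0.186]
TAU = [np.inf, 172.9, 18.51, 1.186]      # years; the first term never decays

def airborne_fraction(age):
    return sum(a * np.exp(-age / tau) for a, tau in zip(A, TAU))

years = np.arange(301)                          # years since pre-industrial
emissions = np.full(years.size, 2.0)            # toy scenario: 2 ppm worth of emissions per year
T_anom = np.minimum(0.01 * years, 2.5)          # toy warming path, K over pre-industrial
gamma = 8.0                                     # ppm per K, carbon-cycle feedback (rough central value)

co2_anom = np.zeros(years.size)
for i in range(years.size):
    ages = years[i] - years[: i + 1]
    # convolve past emissions with the pulse decay, then add the feedback term
    # (applied instantaneously here; a lag would be more realistic)
    co2_anom[i] = np.sum(emissions[: i + 1] * airborne_fraction(ages)) + gamma * T_anom[i]

print(f"Toy scenario, year 100: about {co2_anom[100]:.0f} ppm above pre-industrial")

The point is the structure rather than the numbers: an equilibrium at pre-industrial, a multi-timescale decay of each emitted pulse instead of a single decay constant, and a temperature-dependent source term added on top.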
Fred Magyar says
Recently I was particularly struck by these incredible sculptures by Janet Echelman.
http://www.ted.com/talks/janet_echelman.html
In particular, there is one that is based on NOAA data obtained from a tsunami, and is titled 1.26.
I was wondering if there was a particular set of climate data that might be transformed into such a sculpture that could serve as a medium to get across in a visceral way to the general public what dynamic climate change looks like.
Perhaps this might be a way to cross pollinate art and scientific concepts to the benefit of a wider audience.
Patrick 027 says
Re 47 Pete Dunkelberg –
The Stefan-Boltzmann law applies to either a blackbody, which is a material that is non-reflecting, completely opaque (optical thickness = infinity), and at LTE, or to something which can simulate such conditions (a small hole cut into a box; as long as the box’s inside surface has less than 100 % albedo, a sufficiently small hole relative to the box’s dimensions will require that most photons entering such a hole would have to be reflected so many times that most end up getting absorbed before exiting, so the hole can act almost like a blackbody surface).
A real material layer typically has finite optical thickness; for a given direction, the transmitted fraction of radiation incident on one side that comes out the other is equal to exp(-optical thickness) and, if reflection and scattering are zero (and if at LTE), the emissivity (emitted radiation / blackbody value) and absorptivity (absorbed radiation / incident radiation) are both equal to 1 – exp(-optical thickness).
Except – that is only strictly true for one direction, and for a given frequency and polarization. If there is scattering or reflection, emissivity must still equal absorptivity for incident radiation being absorbed from a direction and emission back into that direction (although if the optical properties have enough symmetry, or are isotropic, the same values apply to a direction and its opposite).
When emissivity (and absorptivity) vary over the spectrum, it is necessary to use the Planck function, which describes blackbody radiant intensity at a given frequency. Radiant intensity is the flux per unit area per ‘unit direction’ (that is, per unit solid angle). Radiant intensity must be integrated over directions (over solid angle) to find a flux per unit area, and before integration it must be weighted by the projection of the unit area of interest onto the unit area that faces each direction (this is the cosine of the angle between the direction of the intensity and the direction the unit area of interest is facing).
Qualitatively, holding optical properties constant, the emitted flux always increases with temperature, but the proportionality depends on how the optical properties vary over the spectrum. Increasing the thickness of a layer, or increasing the amount of a substance (with isotropic optical properties) that absorbs or emits radiation, increases the optical thickness of that layer (in all directions, by the same percentage). If the layer is isothermal, there is no scattering/reflection, and it is at LTE, then in each direction the emitted intensity is equal to the Planck function times the emissivity. If it is not isothermal then it is necessary to integrate along a line to find the radiant intensity (see ‘Schwarzschild’s equation’, which simplifies to Beer’s Law when the temperature is too cold for significant emission).
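To put some numbers on the last two comments, here is a minimal Python sketch of an isothermal, non-scattering slab at LTE with a grey (frequency-independent) optical depth: the emissivity is 1 - exp(-tau), the emitted intensity is that emissivity times the Planck function, and the flux follows from integrating over frequency and, crudely, multiplying by pi for the angular integral (treating the emissivity as the same in every direction, which slightly underestimates the flux at intermediate tau since slant paths are longer). The temperature and the chosen optical depths are arbitrary illustrations, not a model of the real stratosphere.

import numpy as np

h, c, kB, sigma = 6.626e-34, 3.0e8, 1.381e-23, 5.67e-8

def planck(nu, T):
    """Planck spectral radiance B_nu in W m^-2 sr^-1 Hz^-1."""
    return (2.0 * h * nu**3 / c**2) / np.expm1(h * nu / (kB * T))

def slab_flux(T, tau):
    """Flux from an isothermal, non-scattering slab at LTE with grey optical depth tau,
    treating the emissivity (1 - exp(-tau)) as direction-independent."""
    nu = np.linspace(1e11, 1e14, 20000)          # Hz; covers the thermal infrared near 255 K
    return np.pi * (1.0 - np.exp(-tau)) * np.trapz(planck(nu, T), nu)

T = 255.0
for tau in (0.1, 1.0, 10.0):
    print(f"tau = {tau:5.1f}: emitted flux ~ {slab_flux(T, tau):6.1f} W/m^2")
print(f"blackbody limit sigma*T^4 = {sigma * T**4:.1f} W/m^2")

As tau grows the flux saturates at sigma*T^4, and for small tau it scales roughly linearly with the amount of emitting material, which is why a thin, rarefied layer like the stratosphere emits far less than the Stefan-Boltzmann value for its temperature, the point behind Pete Dunkelberg’s question above.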