I happen to believe that AGW is occurring, but I’m hardly blind to these longer-term ocean cycles, nor to what it could mean if they happen to align to both be warm or cold at the same time. Witness the well-known “Great Climate Shift” of 1976, a point in time when both the AMO and PDO began strong uptrends toward warm.
I think a more interesting question, and perhaps one we’ll never know, is how AGW could be affecting these longer-term multi-decadal cycles.
The statement “it seems charts like this show clear periodicity” is a very poor indicator of genuine periodicity. In fact it’s usually indicative of misinterpretation.
This is my specialty, and I often suspect that the false identification of periodic behavior is one of the most common mistakes in the peer-reviewed scientific literature, let alone in blog posts and idle speculation. Some posts on the topic:
I haven’t yet seen any solid evidence that the PDO (or the AMO for that matter) is periodic. Period.
R. Gates says
Tamino @ 309
I’ve only recently begun looking at both of these in any detail, and I’ll trust your expertise. I still can’t help but think that the fact that both the PDO and AMO started going positive around 1976 was part of the “climate shift” so noted in that year, but of course this is confusing cause and effect, I suppose, as that is exactly what they were measuring, as opposed to causing. Also, the current cold cycle of the PDO looks rather like La Niña in pattern, and I’m wondering how one might distinguish them, other than by length of duration?
Erikthefish says
Interesting posts recently, thank you for the clarifications.
I have a question, please.
Whilst viewing the Cryosphere Today images and the NOAA/METOP infra-red passes together, it strikes me that there may be a significant error in the display of the northern polar ice cover as represented. The raw satellite infra-red images show that there is massive variation in the homogeneity of the sea ice cover; many areas of ice are very cracked, with fissures and leads that in some cases travel completely across the polar seas.
This cracking varies and produces areas that reveal large expanses of the warmer sea beneath, and these cracks do allow the warmer sea to warm the areas along them; in several instances this area can be more than 10 times the width of the crack (several thousands of square km). The result of this effect on the infra-red images is warm areas within the main ice mass (to a level that would normally be expected only along the sea/ice boundary).
The question is: why, in most visual displays of ice cover, is the quality of the ice not given more prominence?
And is the ratio of crack to ice factored into calculations of a sheet’s total coverage? If so, how?
(I know estimated ice thickness is)
The warmth along the cracks from the Infra-red images seems to show that when low/high pressure storms “impact” the top of the ice sheets in the arctic the cracks increase, and the area gets “warmer” in the infra-red images. In some cases like the recent breakup of the eastern Greenland coastal sea ice shelf, the cracks precipitate much faster changes in the quality of the ice, which is of significance.
The Cryosphere Today images don’t appear to reflect the quality of the ice properly; in some cases, from some images, I estimate the area of cracked ice well exceeds that of the un-cracked component within an area, yet on sites showing ice coverage these areas are shown as ICE. Some areas I estimate to have only approx. 20% ice cover, yet they are shown as ice.
From my own archive of data, it seems to me that the cracks in the Arctic sea ice have increased over the last few years but this is not illustrated in images or information I have seen.
Not picking on the Cryosphere today site, as it is very good, just using it as an example, the NASA/NOAA sites show the same.
Imagine a world in which these opinion pieces were published anonymously. How much attention would they receive? As far as I can remember, his bubble-economics appeared to be almost as over-optimistic as his technological speculation.
By the way, would the term “inactivist” be more appropriate than “contrarian”? Except for his equation:
climate change = warming of the cold regions.
deconvoluter says
Apologies for clumsy aim:
My previous comment was intended for the thread on FD.
David B. Benson says
R. Gates @310 — There is evidence that both the AMO and PDO have existed for a long time as quasi-periodic oscillations (QPOs); try searching for tree ring studies in the North Pacific and in Northern Europe. However, along with the SOI for ENSO, these are but indices of internal variability and make but little difference in explaining global temperatures over the instrumental period.
Here is an ultra-simple illustration to make the point for the AMO: https://www.realclimate.org/index.php/archives/2010/10/unforced-variations-3-2/comment-page-5/#comment-189329
I’ve recently completed a somewhat more realistic two reservoir model using annualized data. In that, the net of all forcings explains most of GISTemp with SOI a substantial aid in explaining more of it. Then a modest dose of AMO explains some more and a much smaller dose of the PDO explains a tiny bit more. The gist of the matter is that the forcings explain what has been happening to global temperatures since 1880 quite well and the indices of internal variability (SOI,AMO,PDO) explain very little.
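[For readers who want to see the shape of such a calculation, here is a minimal sketch of this kind of attribution regression, not Benson’s actual two-reservoir model; every array below is a synthetic placeholder that you would swap for real annual GISTemp, net-forcing, SOI, AMO and PDO series. The expected result on real data, if his description holds, is a large forcing coefficient and small index coefficients.]

```python
# Minimal sketch: regress a temperature series on net forcing plus the
# SOI/AMO/PDO indices. All inputs are synthetic placeholders, NOT real data.
import numpy as np

rng = np.random.default_rng(0)
n = 131                                        # e.g. years 1880-2010
forcing = np.linspace(0.0, 2.5, n)             # stand-in net forcing (W/m^2)
soi, amo, pdo = rng.standard_normal((3, n))    # stand-in indices
temp = 0.4 * forcing + 0.1 * soi + rng.normal(0.0, 0.1, n)  # stand-in "GISTemp"

# Design matrix: intercept, forcing, and the three internal-variability indices.
X = np.column_stack([np.ones(n), forcing, soi, amo, pdo])
coef = np.linalg.lstsq(X, temp, rcond=None)[0]

fitted = X @ coef
r2 = 1.0 - np.sum((temp - fitted) ** 2) / np.sum((temp - temp.mean()) ** 2)
print("coefficients:", np.round(coef, 3), " R^2:", round(r2, 3))
```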
Vendicar Decarian says
Proposals in Texas and Iowa single out public education for steep cuts.
In Texas, one proposal would reduce total education spending by 15 percent, or $7-$10 billion, depending on K-12 enrollment. The plan also calls for shutting down four community colleges in a bid to close the state’s $15 billion deficit while honoring Republican Gov. Rick Perry’s pledge not to use rainy-day funds or raise taxes.
Iowa GOP Gov. Terry Branstad wants to cut preschool funding from $71 million to $43 million, which would eliminate the universal free pre-school for four year-olds the state established in 2007.
[moved]
Weather and the Earth’s magnetic field
JW King – 1974 – nature.com “A comparison of meteorological pressures and the strength of the geomagnetic field suggests a possible controlling influence of the field on the longitudinal variation of the average pressure in the troposphere at high latitudes….”
And that’s been conflated with recent paleo work on major storm events that may happen again in California, and with the recent paper about movement of the north magnetic pole.
Mush together, stir wildly, rush to print. Egad.
Briefly: Yes, there are a variety of magnetic poles, not just one straight through the middle, yes, the surface locations change, no, recent changes do not mean the poles are going to flip and cause giant storms, and no, NASA said nothing of the sort.
I agree with Ray: there is a large tendency among most contrarians to minimize AGW with one oscillation or another. La Niña’s usual effects are floundering because planetary waves are placing themselves according to NH surface conditions stemming not strictly from the temperature of the equatorial Pacific, but from the combination as a whole. Check out what these oscillation guys predict or project: since they admit to a basic misunderstanding of climate and utterly reject AGW, they flunk more often than not in any prediction, or are utterly incapable of daring one.
john byatt says
A new take on the Australian poem “My Country” by Dorothea Mackellar
I love a sunburnt country
A land of sweeping plains
Of climate change deniers
Of droughts and flooding rains,
Of bleaching coral reefs
In oceans that are warming –
making cyclones rage through towns
and huge fires go a-storming.
I love her coal industry lobbyists,
I love her rightwing jocks –
either choosing wilful ignorance
or lying through their socks.
I love her weak politicians,
heads firmly in the sand –
‘the greatest moral challenge’
has pretty much been canned.
I weep for what will happen,
And wonder where it ends
Watching scenes of great destruction,
untold damage, death of friends.
I love her far horizons
with their burnt or ripped-up trees,
her vulnerability and our terror –
the wide-brown land for me.
@ john byatt #319, best add the ‘poet’ and source:
Poem by Judy Horacek, who writes: “A poem written more than 100 years ago by a homesick 19 year old versus an ever-increasing body of refereed scientific thought…hmm, hard to know which way to jump really.”
R. Gates, not to put too fine a point on it, but the thing about oscillations is that they repeat…repeatedly. In other words, you really don’t know if it’s periodic until you see several periods. There are all kinds of oscillations–some are actually periodic, and these must be driven by a periodic forcing. More commonly, the system has a sort of bi-stability and the distribution of perturbations makes it look quasi-periodic. Exercise for the reader: Graph the following sequence
Looks periodic, doesn’t it? But it ain’t. The “y”s in the pairs are the digits of the transcendental number e–the base of Napierian logarithms. The human eye is trained to see order. Sometimes it sees it when there really is none there.
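[Since Ray’s sequence itself didn’t survive the formatting here, a sketch of the exercise as he describes it, with the first 50 digits of e hardcoded:]

```python
# Plot (n, y_n) where the y's are successive digits of e. The early
# "1828...1828" repetition tempts the eye into seeing a period that isn't there.
import matplotlib.pyplot as plt

digits = "27182818284590452353602874713526624977572470936999"  # first 50 digits of e
y = [int(d) for d in digits]

plt.plot(range(1, len(y) + 1), y, marker="o")
plt.xlabel("n"); plt.ylabel("digit of e")
plt.title("Digits of e: looks patterned, but is not periodic")
plt.show()
```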
John E. Pearson says
321 Ray:
You don’t need periodic forcing for a system to respond with periodic oscillations. There are two generic ways for a nonlinear dynamical system to bifurcate from steady to periodic behavior: a Hopf bifurcation and a saddle-node bifurcation (on an invariant circle). Periodic forcing isn’t needed in either case.
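[A minimal illustration of John’s point, using the Hopf normal form, a textbook toy system and not a climate model: an autonomous system with no periodic forcing anywhere in it settles onto a periodic limit cycle.]

```python
# Hopf normal form in Cartesian coordinates. For mu > 0 the origin is
# unstable and trajectories converge to a limit cycle of radius sqrt(mu),
# oscillating periodically with angular frequency ~omega, with no forcing.
import numpy as np
from scipy.integrate import solve_ivp

mu, omega = 1.0, 2.0

def hopf(t, z):
    x, y = z
    r2 = x * x + y * y
    return [mu * x - omega * y - x * r2,
            omega * x + mu * y - y * r2]

sol = solve_ivp(hopf, (0.0, 40.0), [0.05, 0.0], t_eval=np.linspace(0.0, 40.0, 2000))
print("late-time amplitude:", round(float(np.max(np.abs(sol.y[0][-500:]))), 3))  # ~ sqrt(mu) = 1
```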
J. Bob says
#299 Ray
Here is a sample of a longer-term outlook on temperature (HadCRUT3gl) using 3 different methods: a MOV (moving average), a 2-pole “filtfilt” Chebyshev filter, and a Fourier convolution. The time frame was a 10-year MOV, with a 0.1 cycles/year filter cut-off frequency.
The three methods were compared to evaluate their strengths and weaknesses. The MOV “cuts out” before the end, which is not helpful, and the “filtfilt”, or forward-and-reverse recursive filter, is subject to “reflection” conditions at the end. The Fourier method does seem to give the best result, particularly at the end point.
The upper figure shows the three filter methods and their respective responses. The bottom figure is a little more interesting, in that it shows the difference between the Fourier convolution filter response and the “raw” signal. More importantly, it provides a metric to evaluate the filtered value’s error from the actual. It shows the difference is pretty much less than +/- 0.2 deg. overall, while the mean is virtually negligible. Assuming a Gaussian distribution, the 3-sigma error would be less than 0.3 deg. Probably better than most of the temperature data taken over the past 150 years.
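[For anyone wanting to reproduce the comparison, a hedged sketch of the three filters as J. Bob describes them: 10-year moving average, 2-pole Chebyshev run forward and backward, and a hard FFT low-pass at 0.1 cycles/year. The input series here is synthetic; substitute annual HadCRUT3 anomalies to replicate his figures.]

```python
import numpy as np
from scipy.signal import cheby1, filtfilt

rng = np.random.default_rng(1)
t = np.arange(1850, 2011)
x = 0.005 * (t - t[0]) + 0.2 * rng.standard_normal(t.size)   # stand-in anomalies

mov = np.convolve(x, np.ones(10) / 10.0, mode="valid")       # MOV "cuts out" at the ends

b, a = cheby1(2, 0.5, 0.1 / 0.5)     # 2-pole Chebyshev, 0.1 cy/yr cutoff (Nyquist = 0.5)
cheb = filtfilt(b, a, x)             # forward-backward pass: end "reflection" effects

X = np.fft.rfft(x)
f = np.fft.rfftfreq(x.size, d=1.0)   # frequencies in cycles/year
X[f > 0.1] = 0.0                     # hard low-pass cutoff
four = np.fft.irfft(X, n=x.size)

print("std of (raw - Fourier-filtered):", round(float(np.std(x - four)), 3))
```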
John E., sure, but the fact that you can get a periodic oscillation that way has no bearing on whether what you’re seeing in some aspect of climate is a periodic oscillation. Could be, it’s not impossible. Nobody says it is.
Ray and Tamino and others persistently explain that we’ve developed tools to tell us whether there’s a pattern because we can’t trust our eyes — we know by now that people are highly biased toward seeing patterns, whether or not a pattern really is there. It’s been a useful bias for a long time.
Fail to detect the leopard spots in the leafy shade? No more offspring.
Nowadays it’s our own actions that may cause patterns we need to know are there in time to avoid their consequences. It’s still a problem to know.
Could be periodic? Could be a trend? Could be multiple periods and multiple trends. What are your tools? How do you know? Why do you trust your tools?
Brian Dodge says
@ Harold Pierce Jr — 7 Feb 2011 @ 3:26 PM (and thanks for the Freudian slip “Brain” compliment &;>)
L.B. KLYASHTORIN and A.A. LYUBUSHIN in CYCLIC CLIMATE CHANGES AND FISH PRODUCTIVITY start out admitting there has been global warming –
“As shown in Fig. 1.1, the increasing secular linear trend (about 0.06 °C per each 10 years; Sonechkin, 1998) at the background of interannual year Global dT variations is observed.”
then do their best to obscure this fact by removing the trends (sorta like McLean, de Freitas and Carter); at least they didn’t use a derivative filter which would have increased short term noise.
“After detrending, the long-period fluctuations of Global dT with the maxima at about 1870s, 1930s and, apparently, at 1990s are clearly observed”
“Fig. 1.6. Comparative dynamics of detrended Global dT (bold line) and zonal ACI…”
“Moreover, similar to Global dT, dynamics of water area free of ice demonstrates increasing secular linear trend (about 15% per century). To compare multiyear fluctuations of the water area free of ice and Global dT, both detrended curves are shown in Fig. 2.15.”
They use the word “detrend(ed,ing)” 41 times.
They say
“It is thus not our goal to discuss particular mechanisms of the climatic processes.”
“Similarity in the shape of Global dT and AT anomalies curves also suggests that there is a relationship between changes in the atmospheric transfer and subsequent fluctuations of the global temperature, although the mechanism of this dependence in not clear yet.”
Basically they are doing curve fitting without attempting to understand the underlying physics, and assuming that the mechanisms are periodic without any explanation why that should be.
“60-year periodicity of the climate fluctuations”
“(50-70 year) temperature periodicity”
“16.8-, 32-, 60- and 108-year periodicity”
“periodicities: 17.5, 23 and 57 years”
“54-year periodicity, the 32-year peak”
“predominance of the 60-year peak, whereas the 32-year peak is lower”
“predominance of the 76-year peak, whereas 26- and 39-year peaks are less intensive.”
“predominance of the 55.4-year peak”
“predominant peaks (54 years)”
“close to the 60-year periodicity” (but no cigar)
Their figure 1.13 shows four spectra, ice cores and tree rings, with peaks at 25.6, 32, 32.5, 38.6, 53.9, 55.3, 60.2, and 75.8 year periods.
They go on to say “Fluctuation spectra of California sardine and anchovy populations during the recent 1700 years (Fig. 1.15) demonstrate well defined predominant peaks: 57 and 76 years for sardine and 57, 72, and 99 year for anchovy. This correlates well with the predominant spectra of climatic fluctuations during the last 1500 years according to ice core samples and tree growth rings.” But they don’t say what the correlation coefficient(s) is(are).
I made a spreadsheet with periodic (sin) functions at the frequencies Klyashtorin et al found in their data, and plotted them over 400 years, along with their average + 20% random noise, and a lowpass filtered (40 year running average) random sequence. I also generated a graph of average of all frequencies, the average of the four frequencies from 50-70 years, (with noise), and a lowpass random sequence.
The chart is here – http://www.imagenerd.com/uploads/fishy_cycles-HjmIs.jpg
In the top graph the dotted curves are the noise and the average + noise. Clearly there are no strong “eyeball” correlations. The bottom plot is the comparison of all frequencies, four frequencies, and random data.
Referencing Kryizhov V.N. 2002. Regional features of climatic changes in the North Europe and the West Siberia in the 20th century, they leap to the conclusion that –
“The absence of secular increasing temperature trends in the Arctic region seems unexplainable from the point of view of the so-called global warming,…
This might explain why they included a chapter on global fossil fuel consumption in a work about fisheries.
PS In my chart, the random data is blue, all frequencies is tan, and four frequencies is green.
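[Since the linked image may not last, here is a sketch rebuilding the same kind of chart Brian describes, not his actual spreadsheet: sinusoids at the Klyashtorin et al. periods, their average plus 20% noise, and a 40-year running mean of pure noise for comparison. The smoothed pure noise produces wiggles on the same timescales, which is the heart of the “eyeball periodicity” trap.]

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(2)
years = np.arange(400)
periods = [16.8, 32.0, 60.0, 108.0]        # years, from the spectra quoted above

sines = np.array([np.sin(2.0 * np.pi * years / p) for p in periods])
noisy_avg = sines.mean(axis=0) + 0.2 * rng.standard_normal(years.size)

noise = rng.standard_normal(years.size + 39)
smoothed_noise = np.convolve(noise, np.ones(40) / 40.0, mode="valid")  # 40-yr running mean

plt.plot(years, noisy_avg, label="average of sines + 20% noise")
plt.plot(years, smoothed_noise, label="lowpass-filtered random noise")
plt.legend(); plt.xlabel("year"); plt.show()
```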
John E. Pearson says
324: Hank.
Perhaps I took Ray’s remark out of context. My reading was that he said that periodicity required periodic forcing as a general statement, not with specific reference to climate. I was talking about mathematics, not climate. I think that people who look at 130 years’ worth of data and insist there is a period of 65 (or whatever) years are talking through an orifice other than their mouth. I know full well how difficult it is to not see patterns in noisy data. These days I devote essentially all my time to trying not to see patterns where there are none. I’m making some headway.
“The United States Geological Survey (USGS) Multi-Hazards Demonstration Project (MHDP) is preparing a new emergency-preparedness scenario, called ARkStorm, to address massive U.S. West Coast storms analogous to those that devastated California in 1861–62. Storms of this magnitude are projected to become more frequent and intense as a result of climate change.
The MHDP has assembled experts from the National Oceanic and Atmospheric Administration (NOAA), USGS, Scripps Institute of Oceanography, the State of California, California Geological Survey, the University of Colorado, the National Center for Atmospheric Research, and other organizations to design the large, but scientifically plausible, hypothetical storm scenario that would provide emergency responders, resource managers, and the public a realistic assessment of what is historically possible.
The ARkStorm is patterned after the 1861–1862 historical events but uses modern modeling methods and data from large storms in 1969 and 1986. The ARkStorm …, with a ferocity equal to hurricanes, slam into the U.S. West Coast for several weeks.”
—————–
That’s a distinction between “alarming” and “alarmist” — realism.
Those posts by Spencer clearly illustrate his inability to distinguish interesting but wrong mathematical analyses from the reality supported by the scientific consensus of the experts. The relationship between global temperatures and atmospheric CO2 due to ocean outgassing is well known: the exact magnitude is not precisely pinpointed, but it has been estimated at around 7 to 10 ppm per degree C (Denman et al., Chapter 7, WGI IPCC AR4). Not 7-10 ppm/degree/year, but 7-10 ppm/degree in equilibrium. Therefore, the past century’s temperature rise should account for less than 10 ppm of the total CO2 rise.
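[The scale argument, worked out under an assumed round figure of about 0.8 C of warming over the past century:]

```python
# Outgassing contribution implied by the AR4 range of 7-10 ppm per degree C.
warming_c = 0.8                  # assumed 20th-century warming, deg C
for sens in (7.0, 10.0):         # ppm per deg C (Denman et al., AR4 WGI ch. 7)
    print(f"{sens:.0f} ppm/C x {warming_c} C = {sens * warming_c:.1f} ppm from outgassing")
# A few ppm at most, against an observed rise of roughly 100 ppm.
```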
Not to mention, this requires ignoring the ice core record of the past 800,000 years, which suggests that CO2 concentrations did not exceed 300 ppm during this period until the last century. Or is Spencer suggesting that the past century was warmer than any time in the past 800,000 years? Not even the most alarmist of mainstream scientists claims that is true (yet, anyway; by the end of the 21st century, if we continue emitting GHGs and we’re unlucky, we may unfortunately reach temperatures unprecedented in the past million years).
So, basically, Spencer confuses the factors leading to year to year noise in the system – namely biological and oceanic response to temperature change – with the factors leading to the long term trend in the system – namely, anthropogenic CO2 emissions. Hmm. This is surprisingly parallel to his arguments about temperature change and his weird scribble diagrams attempting to claim low climate sensitivity but high unforced cloud response…
-M
john byatt says
From Starship vs Starship thread
danny bloom
Rebrand Earth with More Fitting Name — Before All Hell Breaks Loose
A climate activist is seeking to rebrand the name of our planet Earth with a more fitting name — one that would better reflect a better understanding of where we live and our place in the cosmos. He says that his
crusade is a public awareness campaign and has no financial backers or corporate sponsorship.
“It’s my way of seeking to give this planet a more apt name, and I have no agenda, other than to help people think of the planet in a new and improved way, since ‘earth’ really just means the ground, the dirt, soil,” he says.
He says he is campaigning now for a new name for the planet we live on, given that, in his opinion, the word Earth is not a very good one.
“Let’s rename the Earth,” he adds, ” especially in regard to the fact that we need to work hard to stop global warming and climate change from doing a huge number on this third rock from the sun.”
Okay, so what name would you suggest, dear Reader?
One word is best, but this rebranding could also use as many as 2-5 words in a term, such as “Third Rock from the Sun” or “Terra Firma”.
He says he is looking for a word that will help teach younger generations in the future to treat the planet with more respect and gratitude for giving us life.
By the way, the current name “Earth” derives from the lame Anglo-Saxon word “erda”, which means ground or soil, and is related to the lame German word “erde”. Duh. It became “eorthe” later, and then “erthe” in Middle English. But people in the 16th century had no idea what the planet was really all about. Now we do. What new name would you suggest?
How about Fubbel you wubble you cucked ?
john byatt says
# 320, ta Sou, I use the ABC (Aust) climate change forum articles to promote RealClimate and Skeptical Science.
Rod B says
Ray Ladbury (252 et al), Planck function radiation and spectroscopy radiation are not the same thing, but neither are they mutually exclusive. Temperature is the direct cause of Planck radiation; it is only indirect for spectral radiation. The CO2 vibration band at 15um will emit ONLY if the 1st energy level is excited, and that can happen in a number of ways. However, a higher ambient temperature makes it more likely to be excited, so temperature plays a secondary role. An exception to this is that temperature often is the direct cause of electron level excitation and so is directly responsible for spectral radiation in the visible.
There is no reason why a body cannot emit “blackbody” radiation and spectral radiation at the same time. One gets an almost perfect broadband blackbody from the sun while simultaneously seeing He and H, et al, spectral lines (though in the sun’s case they are mostly absorption lines, as emission lines will be mostly drowned out by the blackbody radiation). Still…. they’re not the same thing (other than that, after the fact, all radiation (photons) is exactly alike).
The 2nd part of your comment gets us into the bugaboo of a gas emitting Planck radiation. I’m in the camp that it does (“ALL bodies radiate…”), though by my limited survey I’d say 2/3 or so say it doesn’t. In any case a Planck-radiating gas does pose some oddities. Figuring out the surface and assessing the emissivity are no easy tasks. It’s my view that this radiation is very weak (small emissivity), which might explain the prominence of the spectral line radiation.
I do not agree that the relative strength of the line emission is determined by where it intersects a Planck curve (if I understand your point correctly…). Since the energy absorbed or emitted by CO2 vibration is exactly fixed and the same, the line strength is purely a function of the number of molecules absorbing or emitting. If you draw a blackbody curve at, say, 275K and determine the intensity at the precise frequency where CO2 emits, you cannot say the CO2 emitted intensity is (always) exactly on the blackbody curve for that frequency.
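[To make the comparison concrete, here is the Planck spectral radiance evaluated at the 15 micron CO2 band for a 275 K blackbody; the point in the paragraph above is that a gas line’s emitted intensity need not land on this curve, since it depends on the excited-state population.]

```python
import math

h = 6.62607015e-34    # Planck constant, J s
c = 2.99792458e8      # speed of light, m/s
k = 1.380649e-23      # Boltzmann constant, J/K

def planck(lam, T):
    """Planck spectral radiance B_lambda(T) in W / (m^2 sr m)."""
    return (2.0 * h * c ** 2 / lam ** 5) / math.expm1(h * c / (lam * k * T))

print(planck(15e-6, 275.0))   # ~5e6 W m^-2 sr^-1 m^-1 at 15 microns, 275 K
```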
I disagree just a little bit with just a part of raypierre’s response. At least by my rough math, Doppler broadening is virtually nil and even collisional broadening is quite small (and subject to a fair amount of uncertainty, btw). Where collisional broadening does spread the line frequency, the intensity is very low and still the broadening is not much to write home about. It seems the “banding” is more a function of closely spaced rotational lines – which can be significant. (I don’t know if increased density (collisions) tends to enhance the rotational lines or not; I suppose it might….) ray’s other points are well taken and might also explain the confusion between planck and spectral/line radiation.
Isotopious says
Jim Steele, the biggest problem with discussions about internal variability is time. For example, what is a short term trend with regards to climate?
For the current world, trends exist anywhere between 100kyr and 5 years, so:
100kyr = very very long term trend
10kyr = very long term trend
200-300 years = long term trend
60 years = short term trend
30 years = macro trend
5-15 years = micro trend
In the last 10kyrs, we have had a very long term cooling trend with plenty of long term /short term warming and cooling…
So what does that say about the current global warming?
Has anyone else read the above paper? It gets quoted here and there. I have only rudimentary statistics, but he seems to be saying that the actual theory of how the physical components of climate interact doesn’t matter, since the data can be ‘generated as a random walk process.’ Does anything here make sense?
This quote shows up a lot too, including in my local paper:
Professor Terry Mills, professor of applied statistics and econometrics at Loughborough University in England, looked at the same data as the IPCC and found that the warming trend it reported over the past 30 years or so was just as likely to be due to random fluctuations as to the impacts of greenhouse gases. Mills findings are to be published in Climatic Change, a peer-reviewed environmental journal.
I found the journal, but you have to pay to read anything.
There’s also this:
In a paper published in the Journal of Data Science, Terry Mills concludes:
Indeed, examining much longer records of temperature reconstructions from proxy data reveals a very different picture of climate change than just focusing on the last 150 years or so of temperature observations, with several historical epochs experiencing temperatures at least as warm as those being encountered today: see, for example, Mills (2004, 2007b) for trend modelling of long-run temperature reconstructions. At the very least, proponents of continuing global warming and climate change would perhaps be wise not to make the recent warming trend in recorded temperatures a central plank in their argument.
Any comments? I couldn’t find Mills in the RCWiki. Has he shown up on this thread already? If so, I apologize for not reading all of it first; in that case, will someone direct me? I appreciate anything.
I found no reference to this Terry Mills on RCWiki or RC, and his comments are showing up in my local newspaper in science-denial land. I lack the statistics to critique his arguments, but he seems to discount the physical components of climate theory in favor of pure statistical arguments, even though he SAYS he doesn’t.
In March this year I had a post that reported on some forthcoming research into the econometrics of climate change.
I haven’t been able to track down a paper but I have seen reports on this
Professor Terry Mills, professor of applied statistics and econometrics at Loughborough University in England, looked at the same data as the IPCC and found that the warming trend it reported over the past 30 years or so was just as likely to be due to random fluctuations as to the impacts of greenhouse gases. Mills findings are to be published in Climatic Change, a peer-reviewed environmental journal.
In a paper published in the Journal of Data Science, Terry Mills concludes:
Indeed, examining much longer records of temperature reconstructions from proxy data reveals a very different picture of climate change than just focusing on the last 150 years or so of temperature observations, with several historical epochs experiencing temperatures at least as warm as those being encountered today: see, for example, Mills (2004, 2007b) for trend modelling of long-run temperature reconstructions. At the very least, proponents of continuing global warming and climate change would perhaps be wise not to make the recent warming trend in recorded temperatures a central plank in their argument.
I should point out that Mills is not a climate scientist either; he is an econometrician and the author of a magnificent book on modelling time series. It might be a bit hard claiming that he doesn’t understand first-year stats.
The paper is still forthcoming but is now on the website in press (subscription required). What does Mills conclude? (emphasis added)
While a number of interesting features of the monthly global temperature series have been uncovered by fitting structural models, the trend component is likely to be of most interest, as this naturally features in debates concerning global warming. The trend component is generated as a random walk process with no drift, so that a pronounced warming trend cannot be forecast. Indeed, sensitivity analysis shows that, within this class of model, it is almost impossible to deliver an increase in trend temperatures over the twenty-first century that is consistent with that projected by conventional coupled atmospheric-ocean general circulation models: to do so would require choosing ill-fitting models statistically dominated by simpler specifications and then imposing a value on the slope parameter that, on statistical grounds, is highly unlikely. A similar, if less extreme, conclusion may be arrived at from a sensitivity analysis of the breaking trend model, although here, of course, some degree of global warming is forecast. In contrast, cointegration/error correction models, when supplemented with the assumption of a 1% annual increase in radiative forcing (equivalent to a doubling over 70 years), produce long-range forecasts much more in keeping with those from coupled general circulation models.
Given these alternative models of observed temperature series—and from the references given above, several others could be offered—there is no doubt that, in this area, there are indeed many ways to skin the proverbial cat. Which of the alternatives should be chosen? Do you adopt a carefully specified univariate time series model that, because of its property of adapting quickly to current movements in the series, essentially is unable to deliver any increase in forecasted temperatures; do you choose a simpler trend break model in which the breaks are a consequence of rare and large changes in key external forcing factors, as proposed by Gay-Garcia et al. (2009); or do you explicitly model the long-run, cointegrating relationship between temperatures and radiative forcing that is based on the hypothesis that changes in radiative forcing, influenced in part by human activity, generate changes in temperatures (Kaufmann et al. 2010). Statistical arguments alone are unlikely to settle issues such as these, but neither are appeals to only physical models or the output of computer simulations of coupled general circulation models. In such circumstances it would appear that, to quote another ageless proverb, ‘you pays your money and you takes your choice’. Indeed, it could be argued that such a proverb is particularly apposite given the ongoing debate concerning the potential costs of combating global warming and climate change, the most notable recent protagonists being Stern (2007) and his reviewers, for example, Nordhaus (2007), Tol and Yohe (2006) and Weitzman (2007).
I also found a paper at the Journal of Cosmology – I don’t know anything about this journal, but Mills makes much the same argument.
The most notable implication of these structural models is that concerning the presence of a global warming signal. The absence of a significant drift in the trend component, making this a simple random walk, thus precludes the presence of a warming – or, indeed, a cooling – trend. Long term forecasts of trend temperatures are given by the current value of the trend component and future trend temperatures have as much chance of going down as they have of going up. This may seem surprising given the general upward movement of the trend component over the last thirty years or so but such departures, which would be attributable to natural variation, are entirely possible for random walks and are a consequence of the arcsine law of probability (see Feller, 1971). As a consequence, forecast bounds for the trend component quickly become large as forecast error variances increase linearly with the forecast horizon for a random walk.
——————-
How does one respond to this? Thanks for anything. Andy
also sorry if I accidentally multi-post
Didactylos says
Andy: Tamino has rejected the “random walk” hypothesis very firmly.
But that is beside the point – the whole issue is a neatly dried red herring, since we don’t rely on statistics to conclude that the globe will continue warming. We measure recent trends purely so we can know exactly how fast we are warming.
So, not even wrong.
Didactylos says
Andy: one recollection from the past “random walk” discussions is that it is very easy to show that any data is a random walk simply by making some inappropriate choices when choosing parameters.
It is easy to test this on synthetic data.
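[A sketch of that test on synthetic data, using the standard ADF unit-root test from statsmodels: a series built as deterministic trend plus stationary AR(1) noise, so definitely not a random walk, typically “passes” as a random walk when the test specification omits the trend term.]

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(3)
n = 160
noise = np.zeros(n)
for i in range(1, n):                          # stationary AR(1) noise
    noise[i] = 0.6 * noise[i - 1] + rng.normal(0.0, 0.1)
series = 0.007 * np.arange(n) + noise          # trend + noise, NOT a random walk

p_const = adfuller(series, regression="c")[1]   # mis-specified: no trend term
p_trend = adfuller(series, regression="ct")[1]  # trend-stationary alternative
print(f"p-value, constant only: {p_const:.3f}; constant + trend: {p_trend:.3f}")
# Typically the first fails to reject a unit root while the second rejects it.
```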
From the introduction, we see that he splats a linear trend from 1850 to 2010 and starts from there. That’s clearly not a good start. How on earth does he plan to model all the complex climate interactions over that period? Duh. He doesn’t.
Economists all fall into the same trap. Their economic data doesn’t follow rules, so they think all data is the same – and then they mess with climate and embarrass themselves in the literature. I don’t want to generalise, but I genuinely can’t think of a counter-example.
Loughborough: one of the top sports science universities in the United Kingdom!
Ray Ladbury says
OK, Rod, think about this. We have an atom or molecule in an energy eigenstate E2, a second excited state above its ground state E0. It may be able to radiate to E1 or E0, but it cannot radiate to a state halfway between E1 and E2. No energy states exist there. The wave function goes to zero. So just how–by what physical process–do you propose to have this molecule radiate over a continuum? There is no separate “blackbody radiation mechanism”.
Now think of a hydrogen gas lamp. Pass a current through it to heat it to a high temperature and look at the spectrum. It is a line spectrum. Is there a continuum in addition to those lines? No. Between the lines is black. So, why would there not be a continuum if it were possible for the hydrogen to radiate in the continuum as well as between its energy states?
We get a blackbody from the sun because 1) the plasma is quite dense and quite hot, so there is lots of spectral broadening, and 2) it takes years for a photon radiated in the interior to make its way to the surface and escape.
And no, the spectral strength is not equal to the number of molecules absorbing and emitting, but to the difference between the number of molecules being excited and the number of molecules that relax by processes other than radiation.
Rod, find just one textbook that says that matter emits thermal radiation other than between its (broadened) energy eigenstates. I don’t know why you persist in this fantasy when you have found bupkes to support it coming from actual scientists.
Rod B says
Ray Ladbury, here’s a simple (but too long) copy from a course put out by JPL/NASA, I think for advanced secondary schools. http://www2.jpl.nasa.gov/radioastronomy/radioastronomy_all.pdf
There are bupkes like this up the yin-yang. I’ll respond to some of your specific ideas later.
“Examples of thermal radiation include:
1) Continuous spectrum emissions related to the temperature of the object or material.
2) Specific frequency emissions from neutral hydrogen and other atoms and molecules.
….. any object that contains any heat energy at all emits radiation…. ….and then emits the energy at all frequencies… All the matter in the known universe behaves this way.
Electromagnetic radiation is produced whenever electric charges accelerate, that is, when they
change …speed or direction of their movement. In a hot object, the molecules are continuously vibrating (if a solid) or bumping into each other (if a liquid or gas), sending each other off in different directions and at different speeds. Each of these collisions produces electromagnetic radiation at frequencies all across the electromagnetic spectrum. However, the amount of radiation emitted at each frequency (or frequency band) depends on the temperature of the material producing the radiation.
…… A blackbody with a temperature higher than absolute zero emits some energy at all wavelengths.”
Look at the introduction, 1.1 up through 1.3, available from the Google Books images. I’m nowhere near a physicist but I can follow this much. In a solid those vibrations aren’t single pure notes. A musician can ‘bend’ a note on a vibrating string by stretching it. A solid composed of atoms/molecules is full of bonds that are bent and stretched. Those are the charges accelerating/changing that cause the infrared photons to come off a solid, while heat’s moving through those same bonds.
Then by 1.3 he gets to “thermal” radiation — discrete specific frequency emissions.
If y’all could just pick a book like that one, and start a blog to discuss it, maybe attract a teacher you could agree on, this wouldn’t have to keep coming up over and over here, and might be a good education for the rest of us bystanders instead of a recurring argument that never gets anywhere.
“It Is Impossible For A 100 ppm Increase In Atmospheric CO2 Concentration To Cause Global Warming”
john byatt says
At ABC Australia unleashed
No, it’s not the joke of the day.
john byatt :
10 Feb 2011 1:26:10pm
” You ain’t seen nothing yet”
By 2200, the PCF strength in terms of cumulative permafrost carbon flux to the atmosphere is 190 ± 64 Gt C. This estimate may be low because it does not account for amplified surface warming due to the PCF itself and excludes some discontinuous permafrost regions where SiBCASA did not simulate permafrost. We predict that the PCF will change the Arctic from a carbon sink to a source after the mid 2020s and is strong enough to cancel 40–88% of the total global land sink. The thaw and decay of permafrost carbon is irreversible and accounting for the PCF will require larger reductions in fossil fuel emissions to reach a target atmospheric CO2 concentration.” Kevin Schaefer, Tingjun Zhang, Lori Bruhwiler, Andrew P. Barrett, 2011, Tellus B, DOI: 10.1111/j.1600-0889.2011.00527.x.
scotty :
10 Feb 2011 2:59:38pm
“We predict that the PCF will change the Arctic from a carbon sink to a source after the mid 2020s”
And correspondingly, the Antarctic will become a sink, as Antartic ice is at record levels on the majority of its landmass, and Antartic ice grows when Arctic ice recedes (and vice versa).
Selective analysis to prove your pre-determined point isn’t scientific, in fact that’s the precise problem with much of the climate ‘science’ that is out there.
recaptcha made more sense
Ray Ladbury says
Rod, first, that is not a physics text. I’ll take Landau over a tutorial for amateur astronomers. Second, we are not talking about
1) a solid, in which allowed energies form continuous bands
2) a plasma, in which the electrons are free to interact with the electromagnetic fields of the photons and scatter them
In a gas, energy levels are discrete lines–that is precisely why you get spectral lines when starlight passes through a gas cloud. Blackbody radiation is the spectrum you get when photons are in thermal equilibrium. Photons do not interact with each other, so the only way they come into equilibrium is by interacting with surrounding matter. That means absorption and emission. Those absorptions and emissions must occur between allowed energy states–in a gas, lines; in a solid, bands. Just how is a neutral material going to absorb and emit at energies that are forbidden?
In addition, there are no true blackbodies. The closest you can get is a cavity with a small hole. Why? Because it allows for maximum time for the gas to truly reach equilibrium and maximum time for extremely improbable interactions to fill in a bit the gaps between lines. And even then you don’t get a true blackbody spectrum–just close.
Look at the emissions from a vapor lamp–they’re line emissions, not blackbody. How is that consistent with your thesis?
Rod, I am harping on this for a reason. You cannot understand the greenhouse effect if you think CO2 molecules absorb and emit at all energies. If they did, we’d be a whole helluva lot warmer than we are, since some of the excited states are long-lived and would turn light into kinetic energy until we no longer had anywhere near the lapse rate we do.
John E. Pearson says
raypierre in 252 said “A more productive discourse would be on the origins of Kirchhoff’s law. I mean the microscopic origins; the macroscopic origins as a consequence of the Second Law are clear.”
I’m not sure what you meant by microscopic origins, a mechanism? Is a mechanism required? At equilibrium all fluxes balance. If the body is at equilibrium with a radiation field, by definition of equilibrium, the photon flux absorbed by the body must equal the photon flux emitted by the body. If the fluxes were not balanced the radiation spectrum of the body would be changing with time. The flux balance must hold for all wavelengths. It seems to me that it is as interesting to ask how bodies reach (radiative) thermodynamic equilibrium. My discussion is obviously naive; I think there are issues involving the long time and large body limits that I’ve given short shrift to.
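[For reference, the macroscopic balance John is describing compresses to one line, a sketch of the standard detailed-balance statement that leaves raypierre’s microscopic question open:]

```latex
% Body in equilibrium with blackbody radiation at temperature T:
% absorbed flux = emitted flux at every wavelength \lambda.
\[
  \alpha_\lambda \, B_\lambda(T) \;=\; \epsilon_\lambda \, B_\lambda(T)
  \quad\Longrightarrow\quad
  \epsilon_\lambda = \alpha_\lambda \qquad \text{(Kirchhoff's law)}
\]
```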
Rod B says
Hank, I really didn’t intend to get into the bulk and detail of the question, for the reason you stated. But raypierre asked about my assertion and I thought it best to reply.
Rod B says
Hank, there is also broadband kinetic “vibrations” from the continual collisions and interactions among the molecules of liquids and gases.
Daniel says
Hi!
RealClimate’s links below “Other Opinions” are kind of outdated. I just checked them all so I can give you some help tidying up:
“Andrew Dessler” hasn’t been updated in a long while.
“Nexus 6” also hasn’t been updated in quite some time.
“The Intersection” has moved and can be found here now: http://blogs.discovermagazine.com/intersection/
And then there’s Stephen Schneider’s blog (which I realize feels sad to remove).
Best regards,
Daniel
[Response: Thanks. I’ll tidy. – gavin]
Ray Ladbury says
Rod, vibrations? Harmonic oscillator energies are also quantized. Unless the electrons are free, you get line, or at least band, emission, not continuum. Try again.
John, I think that is what Raypierre means. It’s not a trivial problem. Depending on how much the material departs from a perfect absorber, reaching true equilibrium would have to depend on very low-probability events to fill in the gaps between lines and bands.
No, Rod.
As Ray L. says: these are different:
solids and plasmas
vs.
liquids and gases
You’re not getting the difference.
Seriously, look at least at 1.1 through 1.3 of the book at Google.
Rod B says
Ray L, translational energy is not quantized a la vibrational and rotational energy.
Are you saying the nice continuous solid camel-hump curve of insolation is made up of a helluva spreading of a few H and He line spectra?? Or that the 100-year old Planck equation is nonsense??
I didn’t say spectral strength is equal to the number of molecules. I said it’s dependent on the number of excited molecules — which I think is what you said…
I’ll let you go tell NASA/JPL that they’re full of crap.
If gas molecules radiated at all energy levels, we would be a helluva lot warmer only if they radiated a helluva lot. I don’t know if this is accurate in the least, but as a hypothetical example: a body at 250K with an emissivity of 1.0 radiates 220 watts per square meter; at 0.1 emissivity, 22 watts, which is less than 7% of the earth’s downwelling.
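[Checking Rod’s numbers with the Stefan-Boltzmann law:]

```python
# j = emissivity * sigma * T^4 for a 250 K emitter.
sigma = 5.670374419e-8                    # Stefan-Boltzmann constant, W m^-2 K^-4
for eps in (1.0, 0.1):
    print(f"emissivity {eps}: {eps * sigma * 250.0 ** 4:.1f} W/m^2")
# 1.0 -> ~221.5 W/m^2 and 0.1 -> ~22.2 W/m^2, matching the figures above.
```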
john byatt says
Re Brisbane flood , Wivenhoe dam at 197% capacity, came close
http://www.moggill.nnub.net/files/2011/02/wivenhoe.jpg
Theo Kurtén says
Bob, related to your question in 259: in general, when chemical bonds form (or the bonding type changes from weaker to stronger, as in your phase change example), the energy is usually released as vibrational motion, though there are probably exceptions to this. The extra energy is then removed by collisional energy transfer (in the atmosphere that means collisions with N2 or O2 molecules, in a liquid it means collisions with whatever solvent molecules are present). So when part of a water droplet starts to freeze (typically around some seed particle), I would guess that the latent heat is initially released as increased vibrational motion of adjacent water molecules, which is then removed by thermalizing collisions with the surrounding gas. Does this help?
Bob (Sphaerica) says
301, Theo,
Yes, thank you very, very much! That’s exactly what I was looking for.
R. Gates says
Jim Steele @ #291:
Interesting points. I am wondering as well about the AMO cycle. It seems the potential is there for the AMO and PDO to both align in a negative phase, and this, too, could provide for a longer period in which we see no new record warm years. On the flip side, should we see a few record warm years during a cool PDO/AMO cycle, that would be quite telling as well. Regardless, the flattening of the global temp rise during such a period will give plenty of ammo to AGW skeptics and make it that much harder to press forward with any meaningful GHG reductions. If you’re a believer that this is a crucial period, the timing couldn’t be worse…
[Response: The PDO (at least) is not periodic, so this sort of prediction is meaningless. Yes, things might happen to align that way, by chance, but even if they did, these have negligible impact on global temperatures anyway. Statistics may conspire against a continuing trend for a few years here and there, but I’m with Stefan on this: it’s not a strong bet.–eric]
JCH says
R. Gates – while it’s simplistic and probably meaningless, you can go to wood for trees and crudely replicate the Tsonis-Swanson graph using GISS, UAH and RSS, and get a very different picture of the “shift”. Which is something Swanson hinted at with respect to GISS in his article here.
jmho from the cheap seats, but I think some people need to sell off some natural variation and buy some Asian brown cloud.
John E. Pearson says
So I agreed to discuss climate science with an old high-school friend and denialist. I told him I wouldn’t discuss anything that was on “the wrong side of crazy”, which included an 8-page manifesto from the Heartland Institute that started stupid and quickly devolved into demonstrable lies. Today I received this, which was supposed to be demonstrating a big inconsistency.
================================================
James Overland 2004 :
When scientists trained their analytical tools on the North Pole and its environs, they quantified the local knowledge: The polar ice cap is 40 percent thinner and millions of acres smaller than it was in the 1970s.
What happens at the North Pole can affect the rest of the planet, potentially altering the course of the Gulf Stream, which moderates climate from the East Coast of the United States to the British Isles. Closer to home, the jet stream that dictates much of Seattle’s weather can be diverted when the polar vortex speeds up.
“It’s probably contributing to the fact that it’s warmer and we’ve been getting less snow,” Overland said.
http://seattletimes.nwsource.com/html/localnews/2001910590_northpole23m.html
James Overland 2011 :
Our region experienced record snowfall last winter, topping the charts dating at least as far back as the late 1800s. In all, more than six feet of snow fell at sites such as Baltimore-Washington International Marshall Airport. Extreme weather nailed other U.S. cities last winter, too, and swaths of Europe saw unprecedented snowfalls and record cold temperatures. This year, the nation’s capital has suffered one unusually severe storm. Parts of the East Coast from Atlanta to Boston have been experiencing blizzard conditions. Last week, a vast swath of the country’s midsection and East Coast got deluged with sleet and snow, paralyzing travel. What gives?
================================================
The WP article wasn’t quoting Overland. Here is the correct link to the WP article: http://www.washingtonpost.com/wp-dyn/content/article/2011/02/07/AR2011020703936.html
They weren’t quoting Overland. Bloviators wouldn’t deliberately obfuscate, would they?
Chris Dudley says
Chris (#248),
Looks like I left the ‘k’ off the carbon mass estimates in #271. Multiply by 1000.
Looking into what is worrying Hansen a little more deeply we have (in addition to the 5 Tt of carbon from burning all coal, oil and gas) ‘Canadian and Venezuelan deposits contain about 3.6 trillion barrels … of recoverable oil’ Here recoverable means about 10% of the carbon in place. But, the mining method exposes all of that carbon to oxidation over time so at 7 barrels per tonne that comes to another 5 Tt of carbon. http://en.wikipedia.org/wiki/Oil_sands
There are also large reserves of oil shale. We might expect all of the estimated oil in place to be oxidized if mining is used. Less if in situ methods are used. This looks like about 0.5 Tt of carbon known so far.
http://en.wikipedia.org/wiki/Oil_shale_reserves
Hansen then invokes a warm ocean once all ice sheets are gone which would destabilize clathrates. That comes to 10 Tt of carbon. http://ethomas.web.wesleyan.edu/ees123/clathrate.htm
Those are the sources he mentions. About 20 Tt in all based on these estimates.
Additionally there is organic carbon in permafrost regions. For areas without ice sheets this comes to about 1.8 Tt of carbon. http://www.aibs.org/bioscience-press-releases/resources/Schuur.pdf
For areas under ice sheets I don’t think much is known but we might guess a total of 4 Tt of land based frozen soil carbon available for carbon feedback over a several thousand year timescale.
If equatorial deserts expand a great deal, we might get another 2 Tt from soil carbon, loss of forests and burned peat. So, all together, that is another 6 Tt that Hansen does not mention.
So there are presently about 0.75 Tt of carbon in the atmosphere. If we assume that the oceans soon saturate in their carbon uptake, then the atmospheric content might go to 27 Tt, or 36 times the current amount. That is just over 5 doublings. Even with these very generous assumptions, using Hansen’s 4 W/m^2 of forcing per doubling, that is only about 21 W/m^2 and not the 25 which is what he required (10% over current solar forcing).
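Spelled out as a short calculation (the inputs are the estimates above):

```python
# The doubling arithmetic from the paragraph above, spelled out.
from math import log2

current = 0.75      # Tt C in the atmosphere today
final = 27.0        # Tt C if all listed sources vent and ocean uptake saturates

doublings = log2(final / current)   # a 36-fold increase, ~5.2 doublings
forcing = 4.0 * doublings           # W/m^2 at Hansen's 4 W/m^2 per doubling
print(f"{doublings:.2f} doublings -> {forcing:.1f} W/m^2")   # ~20.7, short of 25
```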
But, when discussing fig. 30 he says that he only needs 10 to 20 W/m^2 change to get either a snowball or a ‘runaway greenhouse’ (p. 227) so if that is the case then the generous assumptions get him there. But, in the way I read that figure (and I’m not sure I’m doing it right), the 20 W/m^2 would be the number for the runaway greenhouse, not 10 and we don’t get there without more carbon sources than he mentions (plus ocean uptake saturation). So I’m still not too sure how he has convinced himself without these unstated aspects.
Other possible assumptions within the carbon sources he mentions are more clathrates than in the estimate I linked to, or undiscovered tar sands or oil shale. Unless he has decided that 10 to 20 W/m^2 means 15 W/m^2, it does not seem to add up to a dead certainty.
But, if we grant him that, then I think it probable that he thinks that getting to 1.1 times the solar forcing leads to inevitable loss of the oceans. This is strange because he seems sanguine about walking back from 450 ppm to 350 ppm, something that Solomon might not sanction. And, while one dimensional models probably can’t inform about silicate weathering rates, at the level that he is certain of a runaway, we might be nearly as certain that the process would be halted by formation of calcium carbonate and like minerals and removal of carbon from the atmosphere.
I think this may be the most important problem. He seems to think that a runaway must run away, but his runaway involves losing the oceans, and that is not inevitable. Ocean loss does not lead to more ocean loss in this regime. I think raypierre would say that it never gets started (never gets to a damp stratosphere), but it is pretty likely that if it does, it will not continue. So, it could be that semantics are the conceptual issue. When the Sun gets brighter, things will ‘runaway’ but only because the Sun is not going to get any fainter going forward. Once the oceans are gone, weathering will cease and we should get an atmosphere similar to that of Venus. But carbon dioxide can decrease, so it is not the backstop that the Sun is. The word ‘runaway’ sounds inevitable, but it needs something to run from.
Anyway, that is what I think at present.
SecularAnimist says
Calling BPL …
Drought in the Amazon:
http://climateprogress.org/2011/02/08/science-amazon-drought-co2-emissions-source-sink-simon-lewis/
Drought in China:
http://www.nytimes.com/2011/02/09/business/global/09food.html?_r=1&hp
R. Gates says
Re: Eric reply in #303:
No periodicity in the PDO? I find this odd; though it may be nonlinear, charts like this seem to show clear periodicity:
http://jisao.washington.edu/pdo/
And certainly the AMO does as well, with even some mechanisms given:
https://www.cfa.harvard.edu/~wsoon/BinWang07-d/DimaLohmann07-AMO-final.pdf
I happen to believe that AGW is occurring, but I’m hardly blind to these longer-term ocean cycles, nor to what it could mean if they happen to align, both warm or both cold at the same time. Witness the well-known “Great Climate Shift” of 1976, a point in time when both the AMO and PDO began strong uptrends toward warm.
I think a more interesting question, and perhaps one we’ll never know, is how AGW could be affecting these longer-term multi-decadal cycles.
tamino says
Re: #308 (R. Gates)
The statement “it seems charts like this show clear periodicity” is a very poor indicator of genuine periodicity. In fact it’s usually indicative of misinterpretation.
This is my specialty, and I often suspect that the false identification of periodic behavior is one of the most common mistakes in the peer-reviewed scientific literature, let alone in blog posts and idle speculation. Some posts on the topic:
http://replay.waybackmachine.org/20100104072004/http://tamino.wordpress.com/2009/12/22/cyclical-not/
http://replay.waybackmachine.org/20100104073057/http://tamino.wordpress.com/2009/12/31/cyclical-probably-not/
I haven’t yet seen any solid evidence that the PDO (or the AMO for that matter) is periodic. Period.
R. Gates says
Tamino @ 309
I’ve only recently begun looking at both of these in any detail, and I’ll trust your expertise. I still can’t help but think that the fact that both the PDO and AMO started going positive around 1976 was part of the “climate shift” so noted in that year; but of course this is confusing cause and effect, I suppose, as that is exactly what they were measuring, as opposed to causing. Also, the current cold cycle of the PDO looks rather like La Niña in pattern, and I’m wondering how one might distinguish them, other than by length of duration?
Erikthefish says
Interesting posts recently, thank you for the clarifications.
I have a question, please.
Whilst viewing the Cryosphere Today images and the NOAA/METOP infrared passes together, it strikes me that there may be a significant error in the display of the northern polar ice cover as represented. The raw satellite infrared images show massive variation in the homogeneity of the sea-ice cover; many areas of ice are very cracked, with fissures and leads that in some cases travel completely across the polar seas.
This cracking varies and produces areas that reveal large expanses of the warmer sea beneath, and these cracks allow the warmer sea to warm the areas along them; in several instances this area can be more than 10 times the width of the crack (several thousands of square km). The result of this effect on the infrared images is warm areas within the main ice mass (to a level that would normally be expected only along the sea/ice boundary).
The question is: why, in most visible displays of ice cover, is the quality of the ice not given more prominence?
And is the ratio of crack to ice factored into calculations of the sheet’s total coverage? If so, how?
(I know estimated ice thickness is)
The warmth along the cracks in the infrared images seems to show that when low/high-pressure storms “impact” the top of the ice sheets in the Arctic, the cracks increase and the area gets “warmer” in the infrared images. In some cases, like the recent breakup of the eastern Greenland coastal sea ice, the cracks precipitate much faster changes in the quality of the ice, which is significant.
The Cryosphere Today images don’t appear to reflect the quality of the ice properly. In some cases, from some images, I estimate the area of cracked ice well exceeds that of the uncracked component within an area, yet on sites showing ice coverage these areas are shown as ice. Some areas I estimate have only approx 20% ice cover, yet are shown as ice.
From my own archive of data, it seems to me that the cracks in the Arctic sea ice have increased over the last few years, but this is not illustrated in the images or information I have seen.
Not picking on the Cryosphere Today site, as it is very good, just using it as an example; the NASA/NOAA sites show the same.
Thanks.
David B. Benson says
Mongol Invasion in 1200s Altered Carbon Dioxide Levels
http://www.livescience.com/environment/wars-plagues-carbon-climate-110208.html
deconvoluter says
Thought experiment.
Imagine a world in which these opinion pieces were published anonymously. How much attention would they receive? As far as I can remember his bubble-economics appeared to be almost as over-optimistic as his technological speculation.
By the way, would the term ‘inactivist’ be more appropriate than ‘contrarian’? Except for his equation:
climate change = warming of the cold regions.
deconvoluter says
Apologies for clumsy aim:
My previous comment was intended for the thread on FD.
David B. Benson says
R. Gates @310 — There is evidence that both the AMO and PDO have existed for a long time as quasi-periodic oscillations (QPOs); try searching for tree ring studies in the North Pacific and in Northern Europe. However, along with the SOI for ENSO, these are but indices of internal variability and make but little difference in explaining the global temperatures over the instrumental period.
Here is an ultra-simple illustration to make the point for the AMO:
https://www.realclimate.org/index.php/archives/2010/10/unforced-variations-3-2/comment-page-5/#comment-189329
I’ve recently completed a somewhat more realistic two-reservoir model using annualized data. In that, the net of all forcings explains most of GISTemp, with the SOI a substantial aid in explaining more of it. Then a modest dose of the AMO explains some more, and a much smaller dose of the PDO explains a tiny bit more. The gist of the matter is that the forcings explain what has been happening to global temperatures since 1880 quite well, and the indices of internal variability (SOI, AMO, PDO) explain very little.
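For readers who want to play with the idea, here is a much-simplified sketch of this kind of variance-explained comparison. It uses synthetic data and plain least squares rather than a two-reservoir model, and every name and number in it is invented for illustration:

```python
# A toy version of the attribution exercise described above: build a synthetic
# "temperature" from a forcing term plus small index contributions, then see
# how much explained variance each regressor adds under ordinary least squares.
import numpy as np

rng = np.random.default_rng(7)
n = 131                                    # ~1880-2010, annual
forcing = np.linspace(0.0, 2.5, n)         # toy net forcing, W/m^2
soi = rng.standard_normal(n)               # stand-ins for SOI, AMO, PDO indices
amo = rng.standard_normal(n)
pdo = rng.standard_normal(n)
temp = (0.4 * forcing + 0.08 * soi + 0.03 * amo + 0.01 * pdo
        + 0.1 * rng.standard_normal(n))

def r_squared(y, *regressors):
    """Fraction of variance explained by an OLS fit with intercept."""
    X = np.column_stack((np.ones(len(y)),) + regressors)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return 1.0 - np.var(y - X @ beta) / np.var(y)

for label, regs in [("forcing", (forcing,)),
                    ("forcing+SOI", (forcing, soi)),
                    ("forcing+SOI+AMO", (forcing, soi, amo)),
                    ("forcing+SOI+AMO+PDO", (forcing, soi, amo, pdo))]:
    print(f"{label:22s} R^2 = {r_squared(temp, *regs):.3f}")
```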
Vendicar Decarian says
Proposals in Texas and Iowa single out public education for steep cuts.
In Texas, one proposal would reduce total education spending by 15 percent, or $7-$10 billion, depending on K-12 enrollment. The plan also calls for shutting down four community colleges in a bid to close the state’s $15 billion deficit while honoring Republican Gov. Rick Perry’s pledge not to use rainy-day funds or raise taxes.
Iowa GOP Gov. Terry Branstad wants to cut preschool funding from $71 million to $43 million, which would eliminate the universal free preschool for four-year-olds the state established in 2007.
[moved]
Hank Roberts says
Oh, lordy, someone’s gone bonkers and Google News is featuring this:
http://www.salem-news.com/articles/february042011/global-superstorms-ta.php
It’s a mishmash of some real science items, not new, and not related, to make it sound like there’s been an announcement by NASA.
The actual paper mentioned (at least by name) is a speculative one from 1974!
http://www.nature.com/nature/journal/v247/n5437/abs/247131a0.html
Weather and the Earth’s magnetic field
JW King – 1974 – nature.com “A comparison of meteorological pressures and the strength of the geomagnetic field suggests a possible controlling influence of the field on the longitudinal variation of the average pressure in the troposphere at high latitudes….”
And that’s been conflated with recent paleo work on major storm events that may happen again in California, and with the recent paper about movement of the north magnetic pole.
Mush together, stir wildly, rush to print. Egad.
Briefly: Yes, there are a variety of magnetic poles, not just one straight through the middle, yes, the surface locations change, no, recent changes do not mean the poles are going to flip and cause giant storms, and no, NASA said nothing of the sort.
wayne davidson says
I agree with Rai: there is a large tendency amongst most contrarians to minimize AGW with one oscillation or another. La Niña’s usual effects are floundering because planetary waves are placing themselves according to NH surface conditions as a whole, not strictly according to the temperature of the equatorial Pacific. Check out what these oscillation guys predict or project: since they admit to a basic misunderstanding of climate, utterly rejecting AGW, they flunk more often than not in any prediction, or are utterly incapable of daring one.
john byatt says
A new take on the Australian poem My Country by D Mackellar
I love a sunburnt country
A land of sweeping plains
Of climate change deniers
Of droughts and flooding rains,
Of bleaching coral reefs
In oceans that are warming –
making cyclones rage through towns
and huge fires go a-storming.
I love her coal industry lobbyists,
I love her rightwing jocks –
either choosing wilful ignorance
or lying through their socks.
I love her weak politicians,
heads firmly in the sand –
‘the greatest moral challenge’
has pretty much been canned.
I weep for what will happen,
And wonder where it ends
Watching scenes of great destruction,
untold damage, death of friends.
I love her far horizons
with their burnt or ripped-up trees,
her vulnerability and our terror –
the wide-brown land for me.
Sou says
@ john byatt #319, best add the ‘poet’ and source:
Poem by Judy Horacek, who writes: “A poem written more than 100 years ago by a homesick 19 year old versus an ever-increasing body of refereed scientific thought…hmm, hard to know which way to jump really.”
As published here:
http://www.abc.net.au/unleashed/43762.html
Ray Ladbury says
R. Gates, not to put too fine a point on it, but the thing about oscillations is that they repeat…repeatedly. In other words, you really don’t know if it’s periodic until you see several periods. There are all kinds of oscillations–some are actually periodic, and these must be driven by a periodic forcing. More commonly, the system has a sort of bi-stability and the distribution of perturbations makes it look quasi-periodic. Exercise for the reader: Graph the following sequence
1 – 2
2 – 7
3 – 1
4 – 8
5 – 2
6 – 8
7 – 1
8 – 8
9 – 2
10 – 8
Looks periodic, doesn’t it? But it ain’t. The “y”s in the pairs are the digits of the transcendental number e–the base of Napierian logarithms. The human eye is trained to see order. Sometimes it sees it when there really is none there.
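For anyone who wants to try the exercise in code, a minimal sketch (the digit string is just the decimal expansion of e, hard-coded so the snippet is self-contained):

```python
# Ray's exercise: print the pairs, then check the apparent lag-2 "period"
# with a plain sample autocorrelation.
E_DIGITS = "27182818284590452353602874713526624977572470936999"
digits = [int(d) for d in E_DIGITS]

for n, d in enumerate(digits[:10], start=1):
    print(n, "-", d)            # reproduces the table above

def autocorr(x, lag):
    """Sample autocorrelation of x at the given lag."""
    m = sum(x) / len(x)
    num = sum((x[i] - m) * (x[i + lag] - m) for i in range(len(x) - lag))
    den = sum((v - m) ** 2 for v in x)
    return num / den

# The first ten digits alternate low/high, so a two-step "cycle" looks
# compelling there; over all fifty digits the effect is far weaker.
print("lag-2, first 10 digits:", round(autocorr(digits[:10], 2), 2))
print("lag-2, all 50 digits:  ", round(autocorr(digits, 2), 2))
```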
John E. Pearson says
321 Ray:
You don’t need periodic forcing for a system to respond with periodic oscillations. There are two generic ways for a nonlinear dynamical system to bifurcate from steady to periodic behavior: a Hopf bifurcation and a saddle-node bifurcation (on an invariant circle). Periodic forcing isn’t needed in either case.
J. Bob says
#299 Ray
Here is a sample of a longer-term outlook on temperature (HadCRUT3gl) using 3 different methods: a moving average (MOV), a 2-pole Chebyshev “filtfilt”, and a Fourier convolution. The time frame was a 10-year MOV, with a 0.1 cycles/year filter cutoff frequency.
http://www.imagenerd.com/uploads/filter_er_10yr-g6l8y.jpg
The three methods were compared to evaluate their strengths and weaknesses. The MOV “cuts out” before the end, which is not helpful, and the “filtfilt” (forward and reverse recursive) method is subject to “reflection” conditions at the end. The Fourier method does seem to give the best result, particularly at the end point.
The upper figure shows the three filter methods and their respective responses. The bottom figure is a little more interesting, in that it shows the difference between the Fourier convolution filter response and the “raw” signal. More importantly, it provides a metric for evaluating the filtered value’s error from the actual. It shows the difference is pretty much less than +/- 0.2 deg. overall, while the mean is virtually negligible. Assuming a Gaussian distribution, the 3-sigma error would be less than 0.3 deg. Probably better than most of the temperature data taken over the past 150 years.
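For anyone wanting to reproduce the comparison, here is a hedged sketch in Python, with synthetic data standing in for HadCRUT3gl (the real series isn’t bundled here); the cutoff and window follow the comment:

```python
# The three smoothers compared above: 10-year moving average, forward-backward
# Chebyshev "filtfilt", and a Fourier low-pass at 0.1 cycles/year.
import numpy as np
from scipy import signal

rng = np.random.default_rng(0)
years = np.arange(1850, 2011)
temps = 0.005 * (years - 1850) + 0.2 * rng.standard_normal(years.size)  # toy anomalies

# 1) 10-year moving average ("MOV"): loses points at both ends.
mov = np.convolve(temps, np.ones(10) / 10, mode="valid")

# 2) 2-pole Chebyshev type-I low-pass, run forward and backward (filtfilt).
#    Annual data: Nyquist = 0.5 cycles/yr, so 0.1 cycles/yr -> Wn = 0.2.
b, a = signal.cheby1(2, 0.5, 0.1 / 0.5)
cheb = signal.filtfilt(b, a, temps)

# 3) "Fourier convolution": zero all spectral components above the cutoff.
spec = np.fft.rfft(temps)
freqs = np.fft.rfftfreq(temps.size, d=1.0)   # cycles per year
spec[freqs > 0.1] = 0.0
fourier = np.fft.irfft(spec, n=temps.size)

# The residual examined in the bottom figure is then temps - fourier.
print(np.std(temps - fourier))
```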
Hank Roberts says
John E., sure, but the fact that you can get a periodic oscillation that way has no bearing on whether what you’re seeing in some aspect of climate is a periodic oscillation. Could be, it’s not impossible. Nobody says it is.
Ray and Tamino and others persistently explain that we’ve developed tools to tell us whether there’s a pattern because we can’t trust our eyes — we know by now that people are highly biased toward seeing patterns, whether or not a pattern really is there. It’s been a useful bias for a long time.
Fail to detect the leopard spots in the leafy shade? No more offspring.
Nowadays it’s our own actions that may cause patterns we need to know are there in time to avoid their consequences. It’s still a problem to know.
Could be periodic? Could be a trend? Could be multiple periods and multiple trends. What are your tools? How do you know? Why do you trust your tools?
Brian Dodge says
@ Harold Pierce Jr — 7 Feb 2011 @ 3:26 PM (and thanks for the Freudian slip “Brain” compliment &;>)
L.B. KLYASHTORIN and A.A. LYUBUSHIN in CYCLIC CLIMATE CHANGES AND FISH PRODUCTIVITY start out admitting there has been global warming –
“As shown in Fig. 1.1, the increasing secular linear trend (about 0.06 °C per each 10 years; Sonechkin, 1998) at the background of interannual year Global dT variations is observed.”
then do their best to obscure this fact by removing the trends (sorta like McLean, de Freitas and Carter); at least they didn’t use a derivative filter, which would have increased short-term noise.
“After detrending, the long-period fluctuations of Global dT with the maxima at about 1870s, 1930s and, apparently, at 1990s are clearly observed”
“Fig. 1.6. Comparative dynamics of detrended Global dT (bold line) and zonal ACI…”
“Moreover, similar to Global dT, dynamics of water area free of ice demonstrates increasing secular linear trend (about 15% per century). To compare multiyear fluctuations of the water area free of ice and Global dT. both detrended curves are shown in Fig. 2.15.”
They use the word “detrend(ed,ing)” 41 times.
They say
“It is thus not our goal to discuss particular mechanisms of the climatic processes.”
“Similarity in the shape of Global dT and AT anomalies curves also suggests that there is a relationship between changes in the atmospheric transfer and subsequent fluctuations of the global temperature, although the mechanism of this dependence in not clear yet.”
Basically they are doing curve fitting without attempting to understand the underlying physics, and assuming that the mechanisms are periodic without any explanation of why that should be.
“60-year periodicity of the climate fluctuations”
“(50-70 year) temperature periodicity”
“16.8-, 32-, 60- and 108-year periodicity”
“periodicities: 17.5, 23 and 57 years”
“54-year periodicity, the 32-year peak”
“predominance of the 60-year peak, whereas the 32-year peak is lower”
“predominance of the 76-year peak, whereas 26- and 39-year peaks are less intensive.”
“predominance of the 55.4-year peak”
“predominant peaks (54 years)”
“close to the 60-year periodicity” (but no cigar)
Their figure 1.13 shows four spectra, ice cores and tree rings, with peaks at 25.6, 32, 32.5, 38.6, 53.9, 55.3, 60.2, and 75.8 year periods.
They go on to say “Fluctuation spectra of California sardine and anchovy populations during the recent 1700 years (Fig. 1.15) demonstrate well defined predominant peaks: 57 and 76 years for sardine and 57, 72, and 99 year for anchovy. This correlates well with the predominant spectra of climatic fluctuations during the last 1500 years according to ice core samples and tree growth rings.” But they don’t say what the correlation coefficient(s) is(are).
I made a spreadsheet with periodic (sine) functions at the frequencies Klyashtorin et al. found in their data, and plotted them over 400 years, along with their average plus 20% random noise, and a lowpass-filtered (40-year running average) random sequence. I also generated a graph of the average of all frequencies, the average of the four frequencies from 50-70 years (with noise), and a lowpass random sequence.
The chart is here – http://www.imagenerd.com/uploads/fishy_cycles-HjmIs.jpg
In the top graph the dotted curves are the noise and the average + noise. Clearly there are no strong “eyeball” correlations. The bottom plot is the comparison of all frequencies, four frequencies, and random data.
Referencing Kryizhov V.N. 2002. Regional features of climatic changes in the North Europe and the West Siberia in the 20th century, they leap to the conclusion that –
“The absence of secular increasing temperature trends in the Arctic region seems unexplainable from the point of view of the so-called global warming,…
This might explain why they included a chapter on global fossil fuel consumption in a work about fisheries.
PS In my chart, the random data is blue, all frequencies is tan, and four frequencies is green.
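That spreadsheet is easy to re-create; a rough sketch (the periods are Klyashtorin et al.’s as listed above; everything else, seeds and noise levels included, is arbitrary):

```python
# Sinusoids at the reported periods, their noisy average, and a
# low-pass-filtered random sequence for eyeball comparison.
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(400)                                   # 400 years, annual steps
periods = np.array([25.6, 32.0, 32.5, 38.6, 53.9, 55.3, 60.2, 75.8])

waves = np.sin(2 * np.pi * t[None, :] / periods[:, None])   # one row per period
avg_all = waves.mean(axis=0) + 0.2 * rng.standard_normal(t.size)

# The "50-70 year" group (band edges loosened here to pick up four periods,
# matching the comment's four-frequency average).
band = (periods >= 50) & (periods <= 80)
avg_band = waves[band].mean(axis=0) + 0.2 * rng.standard_normal(t.size)

# Low-pass random sequence: a 40-year running average of white noise.
noise = rng.standard_normal(t.size + 39)
lowpass_random = np.convolve(noise, np.ones(40) / 40, mode="valid")
```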
John E. Pearson says
324: Hank.
Perhaps I took Ray’s remark out of context. My reading was that he said periodicity required periodic forcing as a general statement, not with specific reference to climate. I was talking about mathematics, not climate. I think that people who look at 130 years’ worth of data and insist there is a period of 65 (or whatever) years are talking through an orifice other than their mouth. I know full well how difficult it is to not see patterns in noisy data. These days I devote essentially all my time to trying not to see patterns where there are none. I’m making some headway.
Hank Roberts says
This subject is picking up interest
http://www.google.com/trends?q=superstorm
probably because Google News regrettably promoted a confused, wacko grab-bag of a story (featuring magnetic field change) a few days ago.
The original has already been dealt with here: http://www.fark.com/comments/5931465/Magnetic-Polar-Shifts-Causing-Massive-Global-Superstorm-Looking-glass-Were-through-it
The original nonsense threw together mention of sources that are real and related to climate.
So, perhaps the real stuff is worth addressing?
http://www.arwi.us/precip/Sympro2010/Cox_notesfmt.pdf
http://74.125.155.132/scholar?q=cache:D7ld6_P4s_YJ:scholar.google.com/+ARkStorm++landslides&hl=en&as_sdt=0,5
“The United States Geological Survey (USGS) Multi-Hazards Demonstration Project (MHDP) is preparing a new emergency-preparedness scenario, called ARkStorm, to address massive U.S. West Coast storms analogous to those that devastated California in 1861–62. Storms of this magnitude are projected to become more frequent and intense as a result of climate change.
The MHDP has assembled experts from the National Oceanic and Atmospheric Administration (NOAA), USGS, Scripps Institute of Oceanography, the State of California, California Geological Survey, the University of Colorado, the National Center for Atmospheric Research, and other organizations to design the large, but scientifically plausible, hypothetical storm scenario that would provide emergency responders, resource managers, and the public a realistic assessment of what is historically possible.
The ARkStorm is patterned after the 1861–1862 historical events but uses modern modeling methods and data from large storms in 1969 and 1986. The ARkStorm …, with a ferocity equal to hurricanes, slam into the U.S. West Coast for several weeks.”
—————–
That’s the distinction between “alarming” and “alarmist”: realism.
M says
From another thread:
“Nope. Dr Spencer’s models show that warmer temperatures cause oceans to outgas CO2, so much so that 80-90% of the rise since ~ 1930 has been caused by warming, not anthropogenic emissions.
http://www.drroyspencer.com/2009/01/increasing-atmospheric-co2-manmade%E2%80%A6or-natural/
http://www.drroyspencer.com/2009/05/global-warming-causing-carbon-dioxide-increases-a-simple-model/
http://wattsupwiththat.com/2008/01/25/double-whammy-friday-roy-spencer-on-how-oceans-are-driving-co2/
http://wattsupwiththat.com/2008/01/28/spencer-pt2-more-co2-peculiarities-the-c13c12-isotope-ratio/”
Those posts by Spencer clearly illustrate his inability to distinguish between interesting-but-wrong mathematical analyses and the reality supported by the scientific consensus of the experts. The relationship between global temperatures and atmospheric CO2 due to ocean outgassing is well known: the exact magnitude is not precisely pinpointed, but has been estimated at around 7 to 10 ppm per degree C (Denman et al., Chapter 7, WGI IPCC AR4). Not 7-10 ppm/degree/year, but 7-10 ppm/degree in equilibrium. Therefore, the past century’s temperature rise should account for less than 10 ppm of the total CO2 rise.
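The back-of-envelope version of that last sentence (the warming figure is approximate; the sensitivity range is the one quoted above):

```python
# Outgassing can explain only a small slice of the observed CO2 rise.
warming = 0.8                      # deg C over the past century, roughly
low, high = 7, 10                  # ppm CO2 per deg C at equilibrium (quoted range)
observed_rise = 390 - 280          # ppm, preindustrial to ~2011

print(f"outgassing explains ~{warming*low:.0f}-{warming*high:.0f} ppm "
      f"of the ~{observed_rise} ppm rise")
```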
Not to mention, this requires ignoring the ice core record of the past 800,000 years, which suggests that CO2 concentrations did not exceed 300 ppm during this time period, until the last century. Or, maybe, is Spencer suggesting that the past century was warmer than any time in the past 800,000 years? Not even the most alarmist of mainstream scientists claims that is true. (Yet, anyway. By the end of the 21st century, if we continue emitting GHGs, and we’re unlucky, we may unfortunately reach temperatures unprecedented in the past million years.)
So, basically, Spencer confuses the factors behind year-to-year noise in the system (biological and oceanic responses to temperature change) with the factors behind the long-term trend (anthropogenic CO2 emissions). Hmm. This is surprisingly parallel to his arguments about temperature change and his weird scribble diagrams attempting to claim low climate sensitivity but high unforced cloud response…
-M
john byatt says
From Starship vs Starship thread
danny bloom
Rebrand Earth with More Fitting Name — Before All Hell Breaks Loose
A climate activist is seeking to rebrand the name of our planet Earth with a more fitting name — one that would better reflect a better understanding of where we live and our place in the cosmos. He says that his
crusade is a public awareness campaign and has no financial backers or corporate sponsorship.
“It’s my way of seeking to give this planet a more apt name, and I have no agenda, other than to help people think of the planet in a new and improved way, since ‘earth’ really just means the ground, the dirt, soil,” he says.
He says he is campaigning now for a new name for the planet we live on, given that, in his opinion, the word Earth is not a very good one.
“Let’s rename the Earth,” he adds, ” especially in regard to the fact that we need to work hard to stop global warming and climate change from doing a huge number on this third rock from the sun.”
Okay, so what name would you suggest, dear Reader?
One word is best, but this rebranding could also use as many as 2-5 words in a term, such as “Third Rock from the Sun” or “Terra Firma”.
He says he is looking for a word that will help teach younger generations in the future to treat the planet with more respect and gratitude for giving us life.
By the way, the current name “Earth” derives from the lame Anglo-Saxon word “erda”, which means ground or soil, and is related to the lame German word “erde”. Duh. It became “eorthe” later, and then “erthe” in Middle English. But people in the 16th century had no idea what the planet was really all about. Now we do. What new name would you suggest?
How about Fubbel you wubble you cucked ?
john byatt says
#320, ta Sou. I use the ABC (Aust) climate change forum articles to promote RealClimate and Skeptical Science.
Rod B says
Ray Ladbury (252 et al.), Planck-function radiation and spectroscopic radiation are not the same thing, but neither are they mutually exclusive. Temperature is the direct cause of Planck radiation; it is only indirect for spectral radiation. The CO2 vibration band at 15um will emit ONLY if the 1st energy level is excited, and that can happen in a number of ways. However, a higher ambient temperature makes it more likely to be excited, so temperature plays a secondary role. An exception to this is that temperature often is the direct cause of electron-level excitation and so is directly responsible for spectral radiation in the visible.
There is no reason why a body cannot emit “blackbody” radiation and spectral radiation at the same time. One gets an almost perfect broadband blackbody from the sun while simultaneously seeing He and H, et al., spectral lines (though in the sun’s case they are mostly absorption lines, as emission lines would be mostly drowned out by the blackbody radiation). Still… they’re not the same thing (other than that, after the fact, all radiation (photons) is exactly alike).
The 2nd part of your comment gets us into the bugaboo of a gas emitting Planck radiation. I’m in the camp that it does (“ALL bodies radiate…”), though by my limited survey I’d say 2/3 or so say it doesn’t. In any case a Planck-radiating gas does pose some oddities. Figuring out the surface and assessing the emissivity are no easy tasks. It’s my view that this radiation is very weak (small emissivity), which might explain the prominence of the spectral line radiation.
I do not agree that the relative strength of the line emission is determined by where it intersects a Planck curve (if I understand your point correctly). Since the energy absorbed or emitted by CO2 vibration is exactly fixed and the same, the line strength is purely a function of the number of molecules absorbing or emitting. If you draw a blackbody curve at, say, 275K and determine the intensity at the precise frequency where CO2 emits, you cannot say the CO2-emitted intensity is (always) exactly on the blackbody curve for that frequency.
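For concreteness, evaluating the Planck curve at that point is a one-liner; a sketch (standard SI constants; this gives only the reference curve, and says nothing about whether actual CO2 line emission sits on it):

```python
# Planck spectral radiance at 15 um for a 275 K blackbody.
from math import exp

h, c, k = 6.626e-34, 2.998e8, 1.381e-23   # Planck, speed of light, Boltzmann
wl, T = 15e-6, 275.0                      # wavelength (m), temperature (K)

B = (2 * h * c**2 / wl**5) / (exp(h * c / (wl * k * T)) - 1.0)
print(f"B(15 um, 275 K) = {B:.2e} W m^-2 sr^-1 m^-1")
```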
I disagree just a little bit with part of raypierre’s response. At least by my rough math, Doppler broadening is virtually nil and even collisional broadening is quite small (and subject to a fair amount of uncertainty, btw). Where collisional broadening does spread the line frequency, the intensity is very low, and still the broadening is not much to write home about. It seems the “banding” is more a function of closely spaced rotational lines, which can be significant. (I don’t know whether increased density (collisions) tends to enhance the rotational lines or not; I suppose it might.) Ray’s other points are well taken and might also explain the confusion between Planck and spectral/line radiation.
Isotopious says
Jim Steele, the biggest problem with discussions about internal variability is time. For example, what is a short term trend with regards to climate?
For the current world, trends exist anywhere between 100kyr and 5 years, so:
100kyr = very very long term trend
10kyr = very long term trend
200-300 years = long term trend
60 years = short term trend
30 years = macro trend
5-15 years = micro trend
In the last 10kyrs, we have had a very long term cooling trend with plenty of long term /short term warming and cooling…
So what does that say about the current global warming?
Andy says
Has anyone else read the above paper? It gets quoted here and there. I lack more than rudimentary statistics, but he seems to be saying that the actual theory of how the physical components of climate interact doesn’t matter, since the data can be ‘generated as a random walk process.’ Does anything here make sense?
This quote shows up a lot too, including in my local paper:
Professor Terry Mills, professor of applied statistics and econometrics at Loughborough University in England, looked at the same data as the IPCC and found that the warming trend it reported over the past 30 years or so was just as likely to be due to random fluctuations as to the impacts of greenhouse gases. Mills findings are to be published in Climatic Change, a peer-reviewed environmental journal.
I found the journal, but you have to pay to read anything.
There’s also this:
In a paper published in the Journal of Data Science Terry Mills concludes
Indeed, examining much longer records of temperature reconstructions from proxy data reveals a very different picture of climate change than just focusing on the last 150 years or so of temperature observations, with several historical epochs experiencing temperatures at least as warm as those being encountered today: see, for example, Mills (2004, 2007b) for trend modelling of long-run temperature reconstructions. At the very least, proponents of continuing global warming and climate change would perhaps be wise not to make the recent warming trend in recorded temperatures a central plank in their argument.
Any comments? I couldn’t find Mills in the RCWiki. Has he shown up on this thread already? If so, I apologize for not reading all of it first; in that case, will someone direct me? I appreciate anything.
Thanks, Andy
Andy says
I found no reference to this Terry Mills on RCWiki or RC, and his comments are showing up in my local newspaper in science-denial land. I lack the statistics to critique his work, but he seems to discount the physical components of climate theory in favor of purely statistical arguments, even though he SAYS he doesn’t.
from here: http://catallaxyfiles.com/2010/06/ and also see paper in above journal.
——————-
In March this year I had a post that reported on some forthcoming research into the econometrics of climate change.
I haven’t been able to track down a paper but I have seen reports on this
Professor Terry Mills, professor of applied statistics and econometrics at Loughborough University in England, looked at the same data as the IPCC and found that the warming trend it reported over the past 30 years or so was just as likely to be due to random fluctuations as to the impacts of greenhouse gases. Mills findings are to be published in Climatic Change, a peer-reviewed environmental journal.
In a paper published in the Journal of Data Science Terry Mills concludes
Indeed, examining much longer records of temperature reconstructions from proxy data reveals a very different picture of climate change than just focusing on the last 150 years or so of temperature observations, with several historical epochs experiencing temperatures at least as warm as those being encountered today: see, for example, Mills (2004, 2007b) for trend modelling of long-run temperature reconstructions. At the very least, proponents of continuing global warming and climate change would perhaps be wise not to make the recent warming trend in recorded temperatures a central plank in their argument.
I should point out that Mills is not a climate scientist either; he is an econometrician and author of a magnificent book on modelling time series. It might be a bit hard claiming that he doesn’t understand first-year stats.
The paper is still forthcoming but is now on the website in press (subscription required). What does Mills conclude? (emphasis added)
While a number of interesting features of the monthly global temperature series have been uncovered by fitting structural models, the trend component is likely to be of most interest, as this naturally features in debates concerning global warming. The trend component is generated as a random walk process with no drift, so that a pronounced warming trend cannot be forecast. Indeed, sensitivity analysis shows that, within this class of model, it is almost impossible to deliver an increase in trend temperatures over the twenty-first century that is consistent with that projected by conventional coupled atmospheric-ocean general circulation models: to do so would require choosing ill-fitting models statistically dominated by simpler specifications and then imposing a value on the slope parameter that, on statistical grounds, is highly unlikely. A similar, if less extreme, conclusion may be arrived at from a sensitivity analysis of the breaking trend model, although here, of course, some degree of global warming is forecast. In contrast, cointegration/error correction models, when supplemented with the assumption of a 1% annual increase in radiative forcing (equivalent to a doubling over 70 years), produce long-range forecasts much more in keeping with those from coupled general circulation models.
Given these alternative models of observed temperature series—and from the references given above, several others could be offered—there is no doubt that, in this area, there are indeed many ways to skin the proverbial cat. Which of the alternatives should be chosen? Do you adopt a carefully specified univariate time series model that, because of its property of adapting quickly to current movements in the series, essentially is unable to deliver any increase in forecasted temperatures; do you choose a simpler trend break model in which the breaks are a consequence of rare and large changes in key external forcing factors, as proposed by Gay-Garcia et al. (2009); or do you explicitly model the long-run, cointegrating relationship between temperatures and radiative forcing that is based on the hypothesis that changes in radiative forcing, influenced in part by human activity, generate changes in temperatures (Kaufmann et al. 2010). Statistical arguments alone are unlikely to settle issues such as these, but neither are appeals to only physical models or the output of computer simulations of coupled general circulation models. In such circumstances it would appear that, to quote another ageless proverb, ‘you pays your money and you takes your choice’. Indeed, it could be argued that such a proverb is particularly apposite given the ongoing debate concerning the potential costs of combating global warming and climate change, the most notable recent protagonists being Stern (2007) and his reviewers, for example, Nordhaus (2007), Tol and Yohe (2006) and Weitzman (2007).
I also found a paper at the Journal of Cosmology – I don’t know anything about this journal, but Mills makes much the same argument.
The most notable implication of these structural models is that concerning the presence of a global warming signal. The absence of a significant drift in the trend component, making this a simple random walk, thus precludes the presence of a warming – or, indeed, a cooling – trend. Long term forecasts of trend temperatures are given by the current value of the trend component and future trend temperatures have as much chance of going down as they have of going up. This may seem surprising given the general upward movement of the trend component over the last thirty years or so but such departures, which would be attributable to natural variation, are entirely possible for random walks and are a consequence of the arcsine law of probability (see Feller, 1971). As a consequence, forecast bounds for the trend component quickly become large as forecast error variances increase linearly with the forecast horizon for a random walk.
——————-
How does one respond to this? Thanks for anything. Andy
also sorry if I accidentally multi-post
Didactylos says
Andy: Tamino has rejected the “random walk” hypothesis very firmly.
But that is beside the point – the whole issue is a neatly dried red herring, since we don’t rely on statistics to conclude that the globe will continue warming. We measure recent trends purely so we can know exactly how fast we are warming.
So, not even wrong.
Didactylos says
Andy: one recollection from the past “random walk” discussions is that it is very easy to show that any data is a random walk simply by making some inappropriate parameter choices.
It is easy to test this on synthetic data.
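For instance, here is a hedged sketch using statsmodels’ augmented Dickey-Fuller test on synthetic trend-plus-noise data; the “inappropriate choice” in this example is leaving the trend term out of the test:

```python
# Trend-stationary data can easily "pass" as a random walk if the unit-root
# test is specified without a trend term.
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(42)
t = np.arange(160)                                        # ~1850-2010, annual
series = 0.005 * t + 0.15 * rng.standard_normal(t.size)  # trend + noise by construction

stat, p_const, *_ = adfuller(series, regression="c")   # constant only: typically
print("ADF (constant only) p =", round(p_const, 3))    # fails to reject a unit root

stat, p_trend, *_ = adfuller(series, regression="ct")  # constant + trend: typically
print("ADF (with trend)    p =", round(p_trend, 3))    # rejects the random-walk null
```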
From the introduction, we see that he splats a linear trend from 1850 to 2010 and starts from there. That’s clearly not a good start. How on earth does he plan to model all the complex climate interactions over that period? Duh. He doesn’t.
Economists all fall into the same trap. Their economic data doesn’t follow rules, so they think all data is the same – and then they mess with climate and embarrass themselves in the literature. I don’t want to generalise, but I genuinely can’t think of a counter-example.
Loughborough: one of the top sports science universities in the United Kingdom!
Ray Ladbury says
OK, Rod, think about this. We have an atom or molecule in an energy eigenstate E2, a second excited state above its ground state E0. It may be able to radiate to E1 or E0, but it cannot radiate to a state halfway between E1 and E2. No energy states exist there. The wave function goes to zero. So just how–by what physical process–do you propose to have this molecule radiate over a continuum? There is no separate “blackbody radiation mechanism”.
Now think of a hydrogen gas lamp. Pass a current through it to heat it to a high temperature and look at the spectrum. It is a line spectrum. Is there a continuum in addition to those lines? No. Between the lines is black. So, why would there not be a continuum if it were possible for the hydrogen to radiate in the continuum as well as between its energy states?
We get a blackbody from the sun because 1)the plasma is quite dense and quite hot so there is lots of spectral broadening, 2)it takes years for a photon radiated in the photosphere to make its way to the surface and escape.
And no, the spectral strength is not equal to the number of molecules absorbing and emitting, but to the difference between the number of molecules being excited and the number of molecules that relax by processes other than radiation.
Rod, find just one textbook that says that matter emits thermal radiation in other than between its (broadened) energy eigenstates. I don’t know why you persist in this fantasy when you have found bupkes to support it coming from actual scientists.
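Ray’s hydrogen-lamp point can be made quantitative with the Rydberg formula; a minimal sketch:

```python
# Balmer-series emission wavelengths for hydrogen: a discrete set of lines
# with nothing in between, as the vapor-lamp argument says.
RYDBERG = 1.0968e7   # m^-1, hydrogen

for n_hi in range(3, 8):
    inv_wavelength = RYDBERG * (1.0 / 2**2 - 1.0 / n_hi**2)  # n_hi -> 2
    print(f"n = {n_hi} -> {1e9 / inv_wavelength:.1f} nm")
# n = 3 gives ~656 nm (H-alpha); the gaps between the lines stay dark.
```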
Rod B says
Ray Ladbury, here’s a simple (but too long) excerpt from a course put out by JPL/NASA, I think for advanced secondary schools. http://www2.jpl.nasa.gov/radioastronomy/radioastronomy_all.pdf
There are bupkes like this up the yin-yang. I’ll respond to some of your specific ideas later.
“Examples of thermal radiation include:
1) Continuous spectrum emissions related to the temperature of the object or material.
2) Specific frequency emissions from neutral hydrogen and other atoms and molecules.
….. any object that contains any heat energy at all emits radiation…. ….and then emits the energy at all frequencies… All the matter in the known universe behaves this way.
Electromagnetic radiation is produced whenever electric charges accelerate, that is, when they
change …speed or direction of their movement. In a hot object, the molecules are continuously vibrating (if a solid) or bumping into each other (if a liquid or gas), sending each other off in different directions and at different speeds. Each of these collisions produces electromagnetic radiation at frequencies all across the electromagnetic spectrum. However, the amount of radiation emitted at each frequency (or frequency band) depends on the temperature of the material producing the radiation.
…… A blackbody with a temperature higher than absolute zero emits some energy at all wavelengths.”
Hank Roberts says
Rod:
http://scholar.google.com/scholar?cluster=8344315542838581899&hl=en&as_sdt=0,5
Look at the introduction, 1.1 up through 1.3, available from the Google Books images. I’m nowhere near a physicist but I can follow this much. In a solid those vibrations aren’t single pure notes. A musician can ‘bend’ a note on a vibrating string by stretching it. A solid composed of atoms/molecules is full of bonds that are bent and stretched. Those are the charges accelerating/changing that cause the infrared photons to come off a solid, while heat’s moving through those same bonds.
Then by 1.3 he gets to “thermal” radiation — discrete specific frequency emissions.
If y’all could just pick a book like that one, and start a blog to discuss it, maybe attract a teacher you could agree on, this wouldn’t have to keep coming up over and over here, and might be a good education for the rest of us bystanders instead of a recurring argument that never gets anywhere.
AIC says
Has anybody seen this one before?
http://www.venturaphotonics.com/GlobalWarming.html
“It Is Impossible For A 100 ppm Increase In Atmospheric CO2 Concentration To Cause Global Warming”
john byatt says
At ABC Australia unleashed
No, it’s not the joke of the day.
john byatt :
10 Feb 2011 1:26:10pm
” You ain’t seen nothing yet”
By 2200, the PCF strength in terms of cumulative permafrost carbon flux to the atmosphere is 190 ± 64 Gt C. This estimate may be low because it does not account for amplified surface warming due to the PCF itself and excludes some discontinuous permafrost regions where SiBCASA did not simulate permafrost. We predict that the PCF will change the Arctic from a carbon sink to a source after the mid 2020s and is strong enough to cancel 40–88% of the total global land sink. The thaw and decay of permafrost carbon is irreversible and accounting for the PCF will require larger reductions in fossil fuel emissions to reach a target atmospheric CO2 concentration.” Kevin Schaefer, Tingjun Zhang, Lori Bruhwiler, Andrew P. Barrett, 2011, Tellus B, DOI: 10.1111/j.1600-0889.2011.00527.x.
scotty :
10 Feb 2011 2:59:38pm
“We predict that the PCF will change the Arctic from a carbon sink to a source after the mid 2020s”
And correspondingly, the Antarctic will become a sink, as Antartic ice is at record levels on the majority of its landmass, and Antartic ice grows when Arctic ice recedes (and vice versa).
Selective analysis to prove your pre-determined point isn’t scientific, in fact that’s the precise problem with much of the climate ‘science’ that is out there.
recaptcha made more sense
Ray Ladbury says
Rod, first, that is not a physics text. I’ll take Landau over a tutorial for amateur astronomers. Second, we are not talking about
1)a solid, in which allowed energies form continuous bands
2)a plasma, in which the electrons are free to interact with the electromagnetic fields of the photons and scatter them
In a gas, energy levels are discrete lines–that is precisely why you get spectral lines when starlight passes through a gas cloud. Blackbody radiation is the spectrum you get when photons are in thermal equilibrium. Photons do not interact with each other, so the only way they come into equilibrium is by interacting with surrounding matter. That means absorption and emission. Those absorptions and emissions must occur between allowed energy states–in a gas, lines; in a solid, bands. Just how is a neutral material going to absorb and emit at energies that are forbidden?
In addition, there are no true blackbodies. The closest you can get is a cavity with a small hole. Why? Because it allows for maximum time for the gas to truly reach equilibrium and maximum time for extremely improbable interactions to fill in a bit the gaps between lines. And even then you don’t get a true blackbody spectrum–just close.
Look at the emissions from a vapor lamp–they’re line emissions, not blackbody. How is that consistent with your thesis?
Rod, I am harping on this for a reason. You cannot understand the greenhouse effect if you think CO2 molecules absorb and emit at all energies. If they did, we’d be a whole helluva lot warmer than we are, since some of the excited states are long-lived and would turn light into kinetic energy until we no longer had anything near the lapse rate we do.
John E. Pearson says
raypierre in 252 said “A more productive discourse would be on the origins of Kirchoff’s law. I mean the microscopic origins; the macroscopic origins as a consequence of the Second Law are clear.”
I’m not sure what you meant by microscopic origins, a mechanism? Is a mechanism required? At equilibrium all fluxes balance. If the body is at equilibrium with a radiation field, by definition of equilibrium, the photon flux absorbed by the body must equal the photon flux emitted by the body. If the fluxes were not balanced the radiation spectrum of the body would be changing with time. The flux balance must hold for all wavelengths. It seems to me that it is as interesting to ask how bodies reach (radiative) thermodynamic equilibrium. My discussion is obviously naive; I think there are issues involving the long time and large body limits that I’ve given short shrift to.
Rod B says
Hank, I really didn’t intend to get into the bulk and detail of the question, for the reason you stated. But raypierre asked about my assertion and I thought it best to reply.
Rod B says
Hank, there is also broadband kinetic “vibrations” from the continual collisions and interactions among the molecules of liquids and gases.
Daniel says
Hi!
RealClimate’s links below “Other Opinions” are kind of outdated. I just checked them all so I can give you some help tidying up:
“Andrew Dessler” hasn’t been updated in a long while.
“Nexus 6” also hasn’t been updated in quite some time.
“The Intersection” has moved and can be found here now: http://blogs.discovermagazine.com/intersection/
And then there’s Stephen Schneider’s blog (which I realize feels sad to remove).
Best regards,
Daniel
[Response: Thanks. I’ll tidy. – gavin]
Ray Ladbury says
Rod, vibrations? Harmonic-oscillator energies are also quantized. Unless the electrons are free, you get line, or at least band, emission, not a continuum. Try again.
John, I think that is what Raypierre means. It’s not a trivial problem. Depending on how much the material departs from a perfect absorber, reaching true equilibrium would have to depend on very low-probability events to fill in the gaps between lines and bands.
Hank Roberts says
No, Rod.
As Ray L. says: these are different:
solids and plasmas
vs.
liquids and gases
You’re not getting the difference.
Seriously, look at least at 1.1 through 1.3 of the book at Google.
Rod B says
Ray L., translational energy is not quantized the way vibrational and rotational energy are.
Are you saying the nice continuous camel-hump curve of insolation is made up of a helluva spreading of a few H and He spectral lines?? Or that the 100-year-old Planck equation is nonsense??
I didn’t say spectral strength is equal to the number of molecules. I said it’s dependent on the number of excited molecules — which I think is what you said…
I’ll let you go tell NASA/JPL that they’re full of crap.
If gas molecules radiated at all energy levels, we would be a helluva lot warmer only if they radiated a helluva lot. I don’t know if this is accurate in the least, but as a hypothetical example: a body at 250K with an emissivity of 1.0 radiates about 220 W/m^2; at an emissivity of 0.1, about 22 W/m^2, which is less than 7% of the earth’s downwelling.
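Rod’s two numbers do follow from the Stefan-Boltzmann law; a quick check (fluxes per square metre):

```python
# Checking the hypothetical above with the Stefan-Boltzmann law.
SIGMA = 5.670e-8    # W m^-2 K^-4
T = 250.0           # K

for emissivity in (1.0, 0.1):
    flux = emissivity * SIGMA * T**4
    print(f"emissivity {emissivity}: {flux:.1f} W/m^2")
# -> ~221.5 and ~22.1 W/m^2, matching the 220 and 22 watts quoted.
```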
If gas molecules radiate at all energy levels we would be a helluva lot warmer only if they radiated a helluva lot. I don’t know if this is accurate in the least, but with a hypothetical example, radiation of a body at 250K and an emissivity of 1.0 radiates 220 watts; at 0.1 emissivity, 22 watts, which is less than 7% of the earth’s downwelling.