However, the paper is still published, so now what? How would you handle this in an IPCC report? How reliable is the reconstruction? The new one has not received any comments yet, but are the comments on the method enough to dismiss it?
(personally I think some of the flaws are pretty severe)
CMsays
Patrick 027,
Reading from the sidelines, I appreciate your patient, civil and substantive explanations to Michele.
Adam R.says
@Edward Greisch:
We have a very difficult idea to sell until the “Pearl Harbor” event happens.
Since the startling events of the last 12 months or so and the 5-figure death count of the 2003 European heat wave didn’t qualify, I am having difficulty imagining just what plausible event would qualify as a climate “Pearl Harbor” in the public mind.
Michelesays
@ Patrick
“Once photons are emitted, the energy can be considered to be seperate from the energy of the air. Photons can potentially travel long distances.”
Not at all. The photons behave like the molecules, which locally (at the microscopic scale) move with a mean speed of several hundred m/s, but move like hornets in a jar. At the macroscopic scale, they move with a group speed that can also be very small, even zero.
Referring to a volume of 1 m³ containing some neutral gas in addition to M molecules of CO2 in LTE, we have:
The partial molecular pressure of the CO2 is Pm’ = MkT (that is, it is affected only by the number density), which coincides with the elasto-molecular energy density, since N/m² = J/m³. The molecules move with the mean speed √(3kT/m) around the center of mass of the parcel containing them (m is the molecular mass), and their vector sum is zero. If at distance Δx there is another m³ containing CO2 at the molecular pressure Pm’’, the molecules of CO2 move between the two volumes at the rate Rm = −Dm·(Pm’’ − Pm’)/Δx, where Dm is the diffusion coefficient of CO2 with respect to the other gas. All this without any bulk macroscopic flow.
The CO2 emits/absorbs, and so we also have a photon density F and a pressure Pph’ = Fhν (again affected only by the number density), which coincides with the elasto-EM energy density. The photons move (locally) with the light speed c around the center of mass of the parcel which contains them, but their vector sum is also zero. The second volume has the pressure Pph’’ and, by analogy with the molecules above, the photons move at the rate Rph = −Dph·(Pph’’ − Pph’)/Δx, where Dph is the diffusion coefficient of the photons. Again, no bulk flow.
The parallel between molecules and photons holds perfectly.
Edward Greischsays
403 Adam R.: “I am having difficulty imagining just what plausible event would qualify as a climate “Pearl Harbor” in the public mind.”
Me too.
SecularAnimistsays
Adam R. wrote: “I am having difficulty imagining just what plausible event would qualify as a climate ‘Pearl Harbor’ in the public mind.”
I have difficulty imagining Rush Limbaugh going on the radio in December 1941 and proclaiming that the very existence of “Japan” was a hoax, while millions of people who called themselves “skeptics” unquestioningly believed his every word.
Patrick 027says
Re 404 Michele –
A population of molecules within a gas of sufficient density will diffuse more slowly than their speeds would suggest because they are so often turned back by collisions (collisions, however, can transport energy, so heat can be conducted both by collisions and by diffusion). This is only true for photons when scattering is occurring. What is the mean free path of a molecule? It’s quite short. From Wallace and Hobbs (complete reference pending): about 1 E-7 m (a tenth of a micron) at the surface, about a micron near 20 km, about 1 mm somewhere around 70 km height. You have to go up to between 110 and 120 km height (this is near the turbopause) before it gets to 1 m. (The turbopause is where molecular diffusion starts to dominate over eddy diffusion; above the turbopause, each constituent (exceptions?) decreases in absolute concentration with its own scale height, as if the atmosphere were actually several atmospheres sharing space – except for the effects of sources and sinks (chemical reactions).) See also the CRC Handbook of Chemistry and Physics.
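For what it’s worth, those Wallace-and-Hobbs numbers can be roughly reproduced from hard-sphere kinetic theory; a sketch in Python, where the molecular diameter and the (T, p) values at each height are assumed round numbers, not quoted data:

```python
import math

# Hard-sphere estimate of the molecular mean free path:
#   lambda = k*T / (sqrt(2) * pi * d^2 * p)
# The effective molecular diameter d ~ 3.7e-10 m for air is an assumed round
# number, and the (T, p) pairs below are rough standard-atmosphere values.
K_B = 1.380649e-23  # Boltzmann constant, J/K

def mean_free_path(T_kelvin, p_pascal, d_meters=3.7e-10):
    """Mean free path of a hard-sphere gas molecule, in meters."""
    return K_B * T_kelvin / (math.sqrt(2) * math.pi * d_meters**2 * p_pascal)

for label, T, p in [("surface", 288.0, 101325.0),
                    ("~20 km", 217.0, 5529.0),
                    ("~70 km", 220.0, 5.2)]:
    print(f"{label}: {mean_free_path(T, p):.2e} m")
```

The three printed values come out near 1E-7 m, a micron, and a millimeter respectively, consistent with the figures quoted above.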
And photons? Well, of course it varies with wavelength; some will have mean free paths on the order of several km or more, except in clouds or high humidity. From memory, the optical thickness near the center of the CO2 band may peak around 10^4 for the whole atmosphere, which would imply a mean free path on the order of a meter near the surface (setting aside variations with height in optical thickness per unit mass path). That’s only the center of the band. And this is between emission and absorption; the photons are mainly not being scattered around – they are mainly making one-way trips. Not that photons are never scattered, but the scattering optical thickness doesn’t dominate at LW frequencies, and that should be especially true near peaks in absorption spectra. So it is mainly the case that photons are not bouncing around as if in a jar, or if and when that is the case, it is a larger jar. Photons carry gross and net fluxes of energy between locations, which may span variations in temperature; or else, in the limit of large optical thickness including some sufficient contribution from emissivity and absorptivity, photons allow the diffusion of heat down local thermal gradients. But even in this case, photons are still not appreciably carried by convection of non-photons under familiar conditions.
(Consider also the lifetime of a photon with a mean free path of even 10 km between emission and absorption – it would be 1E4 m / 3E8 m/s ≈ 3.3E-5 s. How much does air usually move in that time?)
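The same back-of-envelope arithmetic, spelled out (the ~8 km scale height and the τ ≈ 1E4 band-center optical thickness are assumed inputs here, not measurements):

```python
# Rough photon numbers for the two estimates above. The 8 km scale height and
# the band-center optical thickness tau ~ 1e4 are assumed round numbers.
C_LIGHT = 3.0e8          # m/s
SCALE_HEIGHT = 8.0e3     # m, rough density scale height of the atmosphere
TAU_BAND_CENTER = 1.0e4  # whole-atmosphere optical thickness, CO2 band center

# If the column's optical thickness tau accumulates over roughly one scale
# height of absorber, the near-surface mean free path is on the order of H/tau.
mfp_band_center = SCALE_HEIGHT / TAU_BAND_CENTER
print(f"band-center mean free path ~ {mfp_band_center:.1f} m")

# Free-flight time of a photon with a 10 km mean free path:
flight_time = 1.0e4 / C_LIGHT
print(f"flight time for a 10 km path ~ {flight_time:.1e} s")
```

That gives a mean free path under a meter at the band center, and a free-flight time of a few tens of microseconds even for a 10 km path – during which the air barely moves.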
Radge Haverssays
SA @ 406
“I have difficulty imagining Rush Limbaugh going on the radio in December 1941 and proclaiming that the very existence of “Japan” was a hoax, while millions of people who called themselves “skeptics” unquestioningly believed his every word.”
Rush Limbaugh is a man of this time and part and parcel with his audience. If he said it today and phrased it in a way that insulted “liberals”, then almost certainly the dittoheads who worship him would be more than happy to accept it into their absurd catechism, arguing the definition of nation so as to exclude some aspect of Japanese culture and generally clogging up discourse with all sorts of infectious glop.
Transport the lot of them back to 1941 — same deal. After Dec. 7 you would hear all sorts of rhetorical hocus pocus from them justifying their faith in the Cult of Rush. (“It was really Russian pinkos disguised as Japanese…blah, blah, blah.”) They never admit they’re wrong about anything.
Re my 374, on LTE – It is only photons that make the difference between that and complete thermodynamic equilibrium … with photons. Generally we don’t have complete, complete thermodynamic equilibrium. Having as much CH4 in the atmosphere as we do when there is also so much O2 – not thermodynamic equilibrium. Really, as I understand it, thermodynamic equilibrium would have most protons and neutrons combined into nuclei of masses at or similar to some isotope(s) of Fe – then again, maybe it would all be electrons and positrons, because of proton decay, over trillions (of trillions of trillions of ….?) years. And photons, of course.
But when certain processes don’t happen, or happen sufficiently slowly, we can have an incomplete thermodynamic equilibrium among the other aspects of the system, which isn’t much perturbed by slow equilibration of other aspects or by the disequilibrium that is frozen in. And we can have, or approximately have, LTE in the sense that molecules, atoms, ions, electrons, and crystal lattice vibrations have energy distributions that are (approximately) in equilibrium for the sensible heat content, for a given temperature.
You can also have a partial LTE. An interesting case is excited electron-hole populations in semiconductors, such as in a solar cell; after being excited across a band gap, the electrons and remaining holes can settle – as I recall/think, via interactions with the crystal lattice – into distributions within the two separate energy bands, each one fitting the equilibrium distribution for the temperature (the temperature you would measure if you put a thermometer on the material), but with two separate distributions not in equilibrium with each other. Because emission via recombination across the band gap involves that disequilibrium, the emission behaves according to a modified Planck function, which can be rewritten as the Planck function for an effective temperature, which will vary as a function of photon energy and the chemical potential of the electron and hole populations as well as the ‘actual’ temperature.
Convection of an ideal gas to different pressures, as I understand it, would tend to preserve LTE even without ongoing molecular collisions, in the absence of other processes. There are ways, besides diffusion/mixing and radiation, to disrupt LTE. If, in order to maintain (some level of) thermodynamic equilibrium (within a parcel, among non-photons), physical phase changes or (electro-)chemical (or nuclear) reactions had to occur, kinetic barriers to reactions may delay the achievement of equilibrium. For example, with physical phase changes, some diffusion of constituents may be necessary, requiring compositional gradients, and the down-gradient diffusion increases entropy. The nucleation of new phases can itself be delayed (see the Kohler curve; homogeneous nucleation of ice crystals). With latent heating or cooling coming from phase changes or phase growth that is not uniform (in a heterogeneous material), there must also be temperature gradients to support diffusion of heat, and this diffusion increases entropy. If a process is fully adiabatic, (some level of) thermodynamic equilibrium is maintained, so that entropy is also conserved (the process is isentropic, and reversible). If there is thermodynamic disequilibrium in some aspect and a process occurs tending toward equilibrium of that aspect, but it doesn’t happen with infinite speed, then the process is no longer adiabatic, reversible, and isentropic; attempting to reverse the process will involve hysteresis, and some net consumption of work (Gibbs free energy). Note that removing condensed water from a cloud makes the moist convection irreversible – you can’t trace the same adiabat all the way back to the higher pressure of the original cloud base, because evaporation will be complete before then.
In general, removing or adding heat, but also substance, as in mixing or something more selective (like precipitation out of – or into – a parcel), will tend to make the process irreversible – unless things are balanced in some way (i.e., photons are exchanged but the net radiant heating is zero, or is balanced by conduction – or mixing is between identical air parcels, etc.).
The deviation from adiabatic processes due to the kinetics of reactions is not generally important in Earth’s atmosphere, at least so far as causing convection to veer off from an adiabat; the formation of new cloud droplets (as distinct from haze particles – see the Kohler curve) tends to be delayed (maybe it isn’t always, but I’m not sure) from the moment the relative humidity (for a flat surface of pure liquid water) reaches 100% – but it isn’t generally delayed very long, so far as I know
–
except maybe (?) when the formation of ice is involved, but fortunately the latent heat of freezing is an order of magnitude less than that of condensation. However, it is important to other characteristics of the material – significant delay of ice formation occurs in clouds; it is common to have a population of ‘supercooled’ droplets – liquid water below freezing. Supercooled droplets can freeze onto ice crystals they run into (riming, if they stick); if they freeze from the outside in, freezing of the middle may cause them to break into pieces. And because equilibrium vapor pressure is a function of phase, a small number of ice crystals surrounded by many supercooled droplets will tend to result in a reduction in water particles, as evaporation from smaller droplets feeds the growth of larger ice crystals, which may then precipitate. In the Earth’s mantle, relatively cold descending slabs have slower reaction rates (relative to warmer material), and the resulting delays in phase changes matter for rheology, but I don’t know of a significant effect on the lapse rate.
Growing cloud droplets (via condensation – there is also collision/coalescence, which is very important in producing precipitation in the absence of ice crystals) or ice crystals (from freezing or deposition) will release latent heat, and thus these cloud particles will tend to be warmer than the air around them, allowing the diffusion of that latent heat so that the parcel as a whole ultimately warms. If such a parcel is taken as the smallest unit being dealt with, then it would effectively be perturbed from LTE – it would not radiate with a single temperature – the emissivity and absorptivity would appear to be different if the cp-weighted average temperature were used. Given variations in optical properties between the gas and other phases, interaction with radiation could also cause some temperature variation. However, you could work around this by dealing with the range of temperatures in the parcel. Alternatively, you could approximate the situation as being at LTE if the temperature variations are sufficiently small, which they will tend to be if the diffusion of heat (as with the rate of collisions among molecules) is sufficient; my understanding is that this is usually if not always the case. The same goes for evaporation and melting of water particles.
Patrick 027says
PS
Mixing of cloud air with dry air can be important at least in some ways.
Patrick 027says
Re my 407 – the last part, I think, should also help in seeing that even a purely scattering situation would not tend to involve much convection of radiation. For example, consider what would happen if the surface were a blackbody and the atmosphere had only scattering. How many scattering events on average would be required to return a photon to the surface, or else allow it to escape to space (the latter is reduced by reducing total optical thickness; I think the former may depend less on that)? It depends on the kind of scattering, of course, but roughly, a large optical thickness of mainly forward scattering may be similar to a smaller optical thickness of Rayleigh scattering or isotropic scattering…
(for either of those, a photon has an equal chance of being backscattered or forward-scattered at each scattering; a Rayleigh scattering distribution has mirror symmetry between the forward and reverse directions; from memory, I think isotropic scattering would be achieved by a spherical mirror ball with geometric optics, no diffraction)
… Well, I’m not as familiar with scattering (though I know where to go to find out more), but for now, just to get a sense of things, consider if all photons are scattered at path length x, with half forward-scattered and half backscattered, and assume the scattered radiation is isotropic or otherwise such that the same equality holds for scattering relative to the vertical direction (upward and downward). Set aside the variation in directions, so x is always vertical. Then the average total path for return would be x*(2*0.5 + 4*0.5^3 + 6*0.5^5 + 8*0.5^7 + …) = …
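Just to check the series as written (it still ignores escape to space): the partial sums converge quickly, and a closed form gives 16/9 ≈ 1.78 mean free paths:

```python
# Numerical check of the return-path series x*(2*0.5 + 4*0.5^3 + 6*0.5^5 + ...),
# i.e. x * sum over n >= 1 of 2n * t^(2n-1) with t = 0.5.
# (As written, the series does not account for escape to space.)
def partial_sum(n_terms, t=0.5):
    return sum(2 * n * t ** (2 * n - 1) for n in range(1, n_terms + 1))

# Closed form: sum 2n*t^(2n-1) = d/dt [t^2/(1-t^2)] = 2t/(1-t^2)^2,
# which equals 16/9 at t = 0.5.
closed_form = 2 * 0.5 / (1 - 0.5 ** 2) ** 2

for n in (1, 2, 4, 8, 16):
    print(n, partial_sum(n))
print("closed form:", closed_form)
```

So in this toy setup the average return path is under two mean free paths, not a long random walk.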
Alternatively, the low energy density of photons suggests that convection of radiation would tend to be rather small.
Patrick 027says
above the turbopause, each constituent (exceptions?) decreases in absolute concentration with its own scale height, as if the atmosphere were actually several atmospheres sharing space – except for the effects of sources and sinks (chemical reactions)…
And escape to space, of course. And charged particles are affected by the magnetic field, too – this has a greater effect when the radius of gyration is small relative to the mean free path. Magnetic field effects start to become important for electrons at a different height than for ions – hence the E-region dynamo. (PS: I’m not sure how the electrons behave above the turbopause regarding particle vs. eddy diffusion, though. I anticipate that having electrical charge should reduce the mean free path (stronger interactions over distance), but … well, the E-region dynamo exists and I can look up its height; that should be a clue.)
Also, I think it should be possible for the different constituents to have their own temperatures sufficiently far above the turbopause, although even for a given constituent, LTE may no longer be a good approximation there, if not before (but LTE does approximately hold up through the stratosphere at least).
Patrick 027says
Then the average total path for return would be x*(2*0.5 + 4*0.5^3 + 6*0.5^5 + 8*0.5^7 + …) = … – expression doesn’t account for escape, but it’s late and anyway I don’t see a need to pursue this one for now.
Michelesays
@ Patrick
We can say all that we want only if the fundamentals are preserved.
Assuming that the M molecules of CO2 occupy the first two vibrational energy levels, we would have M0 molecules at level 0 and M1 at level 1. Omitting the constants, and with F the photon density, we know well that photons are absorbed at the rate F·M0, spontaneously emitted at the rate M1, and emitted by stimulation at the rate F·M1. Radiative equilibrium requires F·M0 = M1 + F·M1 and hence, at equilibrium, the photon density is F = 1/(M0/M1 − 1). Furthermore, if the gas is in thermodynamic equilibrium, in agreement with the Maxwell–Boltzmann distribution, setting β = hν/kT, we have for the frequency ν the photon density F = 1/(exp β − 1) and the energy density ρ(ν) = hν/(exp β − 1), that is, the Planck formula.
In other words, the photons are emitted/absorbed (scattered) locally and, absent sources/sinks, there must be DF/Dt = ∂F/∂t + u·∇F = 0, i.e., F constant over time and space.
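For what it’s worth, the algebra of the two-level balance described above checks out numerically (a sketch only, with the Einstein coefficients set to 1 as in the comment; whether the equilibrium assumption applies to the real atmosphere is a separate question):

```python
import math

# Two-level balance as stated above: absorption F*M0 equals spontaneous
# emission M1 plus stimulated emission F*M1, with the Boltzmann population
# ratio M1/M0 = exp(-beta), beta = h*nu/(k*T). Constants (Einstein
# coefficients) are omitted, i.e. set to 1, exactly as in the comment.
def equilibrium_photon_density(beta):
    m0 = 1.0
    m1 = math.exp(-beta)            # Boltzmann population of the upper level
    F = 1.0 / (m0 / m1 - 1.0)       # F = 1/(M0/M1 - 1)
    # Detailed balance check: absorption rate equals total emission rate.
    assert abs(F * m0 - (m1 + F * m1)) < 1e-12
    return F

beta = 2.0  # e.g. h*nu ~ 2 kT
print(equilibrium_photon_density(beta), 1.0 / (math.exp(beta) - 1.0))
```

The two printed numbers agree: with Boltzmann populations, F = 1/(M0/M1 − 1) is exactly the Planck occupancy 1/(exp β − 1).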
Ray Ladburysays
Michelle–Unfortunately, Einstein’s derivation–which you are using–presumes equilibrium, which does not hold in the atmosphere. There is an excess flux of 15 micron radiation compared to equilibrium, so you will have relaxation processes other than spontaneous and stimulated emission.
Patrick 027says
In general, removing or adding heat, but also substance, as in mixing or something more selective (like prepitation out of – or into – a parcel), will tend to make the process irreversable – unless things are balanced in some way (ie photons are exchanged but the net radiant heating is zero, or is balanced by conduction – or mixing is between identical air parcels, etc.).
… the latter (mixing between identical air parcels) is isentropic, while the former (balance between radiant heating/cooling and conduction of heat) will allow the air to be isentropic, but there will be an increase in entropy somewhere. Another example that maintains the entropy of an air parcel is when precipitation of a phase of water at some temperature into the air balances the precipitation out of the parcel of water at the same phase and temperature.
Precipitation of liquid water out of an updraft both removes heat capacity and the latent heat of freezing – fortunately, in familiar conditions the heat capacity of liquid water content is relatively small compared to the gas, and the latent heat of freezing is an order of magnitude less than that of condensation.
I’m thinking the first ice-free summer of the Arctic Ocean may come close. It lacks the catastrophic aspect (in its direct manifestation at least) but will, I think, be a rather stubborn fact.
There may be denialists about that; it could be played rather like the moon landing conspiracy theory. But that will tend to demolish whatever credibility they may have managed to retain for most observers.
#418 Kevin, Ice-free? Not soon, but likely to come; the next Earth mega-event is no ice at the Pole. The Russian NE passage is poised to open, quite possibly the earliest in history; it is now just a question of cloud coverage. Present Arctic temps are very warm. Now I am trying to guess how contrarians will explain the next great melt.
Brian Dodgesays
“‘Climatic Pearl Harbor’ – I’m thinking the first ice-free summer of the Arctic Ocean may come close.”
The same denialists who calculated that “a Vesuvius size eruption under the Arctic Ice (area June 22, 2011 : 9,673,281 sq km) would melt an area the size of Massachusetts (21456 sq km), so that must be the cause of record low ice” will be nattering on about how “the Arctic isn’t truly ice free yet – it still has an area of ice equal to the state of Rhode Island (4002 sq km)”.
Pete Dunkelbergsays
# 409 Ed G mentions the article by Gore in Rolling Stone. The comments there show a lot of GDS (Gore Derangement Syndrome).
It is worth remembering that the big boost in US arms production after the Pearl Harbor attack was only necessary because Japan and Germany had already hit the big time in arms production ahead of the US and under very economically difficult conditions. I’m not sure a climate Pearl Harbor has a lot of meaning. Might as well say a climate assassination of Archduke Franz Ferdinand of Austria for an example of a reactive frenzy.
Edward Greischsays
New subject: http://ilovemountains.org/dirty-water-act-2011
“THE DIRTY WATER ACT OF 2011 (HR 2018)
The Clean Water Cooperative Federalism Act would gut the Clean Water Act by giving the states, rather than the EPA, the ultimate decision-making authority over our nation’s water quality standards. This would spell disaster in states where mountaintop removal coal mining is practiced, as seen by the states’ abysmal record on permitting and enforcement.”
New subject: resolution HR 1391 would prevent the EPA from regulating coal ash, cinders and slag.
As always, the largest problem in determining summer sea-ice melt is the lack of cloud-extent analysis; aside from our own eyes looking at polar-orbiting HRPT satellite shots, clouds are the predominant factor in a great Arctic Ocean ice melt. Has anyone found a good live site showing cloud-extent anomalies? I elaborate further on my website blog; there is a chance that a great melt may be averted despite warmer weather.
Michelesays
@ Ray Ladbury
You are right, I omitted to quote Einstein. Thanks.
As far as I know, the density F=1/(M0/M1-1) is an instantaneous value that tends to F=1/(expβ-1) when the LTE is reached, i.e., after several times the relaxation time.
I agree with you: the equilibrium does not hold in the atmosphere, so the CO2 molecules don’t emit – or better, emit with great difficulty, and in any case far less than the Planck formula tells us.
PS. My name is Michele, an Italian male given name.
Patrick 027says
Re 425, 426 Michele –
I’ve wondered about stimulated emission’s role in blackbody radiation – it would seem to imply that LTE among non-photons is an insufficient condition for emissivity to equal absorptivity. However, stimulated emission is a rather minor issue at the temperatures we are dealing with, is it not? (How hot does it have to get before about 5 % of photons are emitted by stimulation? Of course when the temperature goes past infinity to negative values, it’s a different story, as I understand it (lasers), but that’s not something I know a lot about.)
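A rough answer to the parenthetical question, using the two-level picture in which the stimulated fraction of emission is F/(1+F) = exp(−β) with β = hν/kT (a sketch only; the 15 μm choice is for the CO2 band):

```python
import math

# Fraction of emitted photons that are stimulated rather than spontaneous,
# for a two-level system in equilibrium:
#   stimulated/(spontaneous + stimulated) = F/(1+F) = exp(-beta),
# with beta = h*c/(lambda*k*T).
H = 6.62607015e-34   # Planck constant, J s
C = 2.99792458e8     # speed of light, m/s
K_B = 1.380649e-23   # Boltzmann constant, J/K

def stimulated_fraction(wavelength_m, T_kelvin):
    beta = H * C / (wavelength_m * K_B * T_kelvin)
    return math.exp(-beta)

# Temperature at which 5% of 15-micron emission is stimulated:
wavelength = 15e-6
T_5pct = H * C / (wavelength * K_B * math.log(20.0))  # solve exp(-beta) = 0.05
print(f"{T_5pct:.0f} K")
print(f"{stimulated_fraction(wavelength, 288.0):.3f}")
```

By this estimate the 5% threshold at 15 μm sits near 320 K, and at 288 K the stimulated fraction is already a few percent – so in this particular band it is small but perhaps not entirely negligible.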
Anyway, CO2, and H2O and clouds, and to some extent CH4 and ozone and some other gases, clearly do emit and absorb photons at least approximately the way one could predict based on emission cross sections (related to absorption coefficients) and the Planck function. The radiation can be measured. There is no problem.
Why wouldn’t a population of CO2 molecules, colliding with other molecules (including N2 and the rest, not just themselves) at some nonzero temperature, or within air at some nonzero temperature, be emitting radiation?
Patrick 027says
And re 426/425 Michele, remember that even if photons are being absorbed or scattered within short distances, this doesn’t mean they aren’t being emitted. (Aside from scattering, photons will be absorbed within shorter distances if they are being emitted more often per unit volume – this is the effect of absorptivity = emissivity.)
Another fault of us nice guys who understand AGW is lamenting the contrarian garbage out there without pouncing on them when they were wrong. So they, the experts at erring, rage at every little detail, spreading the confusion they love to use to brainwash the lay public; despite having been utterly wrong often, they appear again on contrarian-prone anti-AGW programs. There should be no relenting against those who are paraded as real climate experts when they can’t predict anything about the climate, or refuse to do so because they don’t believe anybody else can. I encourage RC to evaluate any anti-AGW spokesperson on the merit of their ability with climate, especially their failing predictions. In mind lately is the famous PDO-enhanced “ice age” gang, some at AccuWeather and the shows that support them. What is clear, and really above the politics of climate discussions, is the ability to predict accurately. Why would anyone listen to a contrarian whose projections fail again and again? We must keep them wallowing in their failures rather than let them disseminate a new delay tactic.
RickAsays
GIA – sea-level rise adjustment question.
If the volume of the ocean should be adjusted upward to counteract the sinking basins and rising coasts (in some places), shouldn’t the volume of the ocean be adjusted downward to take out the thermal expansion due to the 0.8 °C temperature rise?
Or is thermal expansion already taken out?
Does anybody know?
[Response: It depends on what you want to do. Adding in the GIA is necessary if you want to use the satellite data to constrain the ocean volume changes (including steric and eustatic effects). If you just want to understand the eustatic effects (i.e. how much more water is entering the ocean than is leaving), you would need to correct for density changes (mostly thermal expansion) as well. – gavin]
SecularAnimistsays
Those pondering what might constitute a “climatic Pearl Harbor” might wish to read this analysis of the extreme weather events of 2010 and 2011, by meteorologist Jeff Masters:
He discusses “the top twenty most remarkable weather events of 2010” which include:
Earth’s hottest year on record
Most extreme winter Arctic atmospheric circulation on record
Arctic sea ice: lowest volume on record, 3rd lowest extent
Record melting in Greenland, and a massive calving event
Second most extreme shift from El Niño to La Niña
Second worst coral bleaching year
Wettest year over land
Amazon rainforest experiences its 2nd 100-year drought in 5 years
Global tropical cyclone activity lowest on record
A hyperactive Atlantic hurricane season: 3rd busiest on record
A rare tropical storm in the South Atlantic
Strongest storm in Southwestern U.S. history
Strongest non-coastal storm in U.S. history
Weakest and latest-ending East Asian monsoon on record
No monsoon depressions in India’s Southwest Monsoon for 2nd time in 134 years
The Pakistani flood: most expensive natural disaster in Pakistan’s history
The Russian heat wave and drought: deadliest heat wave in human history
Record rains trigger Australia’s most expensive natural disaster in history
Heaviest rains on record trigger Colombia’s worst flooding disaster in history
Tennessee’s 1-in-1000 year flood kills 30, does $2.4 billion in damage
Sounds like a multitude of “climatic Pearl Harbors” to me.
Excerpt:
Any one of the extreme weather events of 2010 or 2011 could have occurred naturally sometime during the past 1,000 years. But it is highly improbable that the remarkable extreme weather events of 2010 and 2011 could have all happened in such a short period of time without some powerful climate-altering force at work. The best science we have right now maintains that human-caused emissions of heat-trapping gases like CO2 are the most likely cause of such a climate-altering force …
… the ever-increasing amounts of heat-trapping gases humans are emitting into the air puts tremendous pressure on the climate system to shift to a new, radically different, warmer state, and the extreme weather of 2010 – 2011 suggests that the transition is already well underway. A warmer planet has more energy to power stronger storms, hotter heat waves, more intense droughts, heavier flooding rains, and record glacier melt that will drive accelerating sea level rise. I expect that by 20 – 30 years from now, extreme weather years like we witnessed in 2010 will become the new normal.
RickAsays
Gavin in-line comment to RickA #430:
Thank you Gavin.
I am interested in sea level as it relates to climate change. For example how much of Florida will be underwater by 2100.
So based on your answer and Wikipedia – I am guessing I would be interested in both eustatic (change of sea level relative to a fixed point) and isostatic (change of land level relative to a fixed point) measurements. The two together should allow me to describe eustatic sea level relative to isostatic Florida coast at 2100 (assuming I had the proper data)(for example).
Depends on the amount of SLR by 2100. I put together SLR maps (with 1 and 6-meter projected rises) for various places around the world (including Florida) at Skeptical Science here.
OT Summer beach reading: How the Hippies Saved Physics by MIT physicist and science historian David Kaiser
Michelesays
@ 427-428 Patrick
I was not talking about the BB, but about what occurs inside a parcel of air that also contains an absorbing/emitting gas.
OK, the relationship above is obtained taking into account only the scattered EM photons that directly excite the molecule and are then suddenly re-emitted. It seems that we are omitting the thermal photons, which become thermal energy of all the molecules (neutral and active) and are created from the thermal energy by means of collisions between the active molecules and the other (active and neutral) molecules.
I think this is implicitly contained in the claimed LTE, otherwise the temperature wouldn’t be constant. Of course, if the thermal photons aren’t contained in the density F, we should think that F = Fem + Fth = constant (really, I have some doubt about it).
In any case we always have DF/Dt = 0, which means ∇•(uF) = F∇•u = 0 in a steady state, i.e., the photon density obeys the continuity law, as does the mass density: Dρ/Dt = 0, or the equivalent ∇•(ρu) = ρ∇•u = 0.
Can you help? I went to this page that I like to refer to sometimes about climate change commitments and the article has disappeared. All that’s left is a link to a pdf file in italian.
Re 435 Michele (I had assumed the thermally-emitted photons were included in the F you were describing) –
DF/Dt = 0. Yes, if nothing is changing. If you add a new source of heat, the temperature will rise until emission–absorption adjusts to balance the new heat source. If a parcel of air is rising and cooling adiabatically, then the temperature declines and the rate at which photons are emitted decreases. If a change in temperature happens somewhere else, then the photon population passing through another location can change; DF/Dt can change.
Perhaps I’m mistaken about what you’re thinking – are you thinking that the photons which are emitted to space are somehow present in the air during ascent and when the air expands enough the photons are released? Because that’s not what happens. I’m having trouble understanding your point.
The continuity equation regarding photon density – yes, of course, if photons are redistributed over a larger volume then their density decreases – this is just a form of the conservation of energy, as the continuity equation in fluid mechanics is a form of the conservation of mass. But when photons are absorbed or emitted, energy is leaving one form and entering another – such as between the enthalpy of a population of non-photons and the radiant heat energy of a population of photons.
Perhaps you should look up – first, ‘Beer’s Law’, then, ‘Schwarzschild’s equation’ (Beer’s Law is a special case of Schwarzschild’s equation).
Patrick 027 says
DF/Dt can change.
– Actually I meant it can be nonzero, but it can change too; and dF/dt can also change. (D/Dt is used in fluid dynamics for the derivative while following the fluid motion; I think this can be referred to as the material derivative or the Lagrangian derivative. In this case I assume we’re following the macroscopic motion of non-photons. The partial derivative symbol (the curly d, which looks like a backward 6) is then used instead for the Eulerian derivative, the rate of change at a given location in some frame of reference; which is what I meant by dF/dt, but I didn’t take the time to use the best symbol.)
Of course, once equilibrium climate is achieved, then averaged globally and annually and over internal variation, etc., the Eulerian derivative dF/dt = 0, but DF/Dt can still be nonzero.
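Patrick’s Eulerian-vs-Lagrangian distinction can be made concrete with a small numerical sketch (plain Python; the field shape, flow speed, and step sizes are all arbitrary illustrative choices, not anything from the thread). For a field advected unchanged by a uniform flow, the rate of change at a fixed point is nonzero while the material derivative vanishes:

```python
# Illustrative sketch: for a field F(x, t) = f(x - u*t) advected
# unchanged by a uniform flow u, the Eulerian derivative dF/dt at a
# fixed point is nonzero, while the material (Lagrangian) derivative
# DF/Dt = dF/dt + u*dF/dx is (approximately) zero.
import math

u = 10.0             # flow speed, m/s (arbitrary)
dt, dx = 1e-4, 1e-3  # finite-difference steps

def f(x):
    return math.exp(-x ** 2)   # arbitrary smooth profile

def F(x, t):
    return f(x - u * t)        # the advected field

x0, t0 = 0.3, 0.0
dF_dt = (F(x0, t0 + dt) - F(x0, t0 - dt)) / (2 * dt)  # Eulerian, fixed x
dF_dx = (F(x0 + dx, t0) - F(x0 - dx, t0)) / (2 * dx)
DF_Dt = dF_dt + u * dF_dx                             # following the parcel

print(dF_dt)  # nonzero: the pattern blows past the fixed point
print(DF_Dt)  # approximately zero: nothing changes following the parcel
```

The fixed-point (Eulerian) rate of change is large only because the pattern is being carried past; following the parcel, nothing changes.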
I can’t tell for sure whether all costs are included, but there is no mention of subsidies.
733 MW of power at a cost of $2.6 billion gives (assuming at least 80% of rated power for at least 30 years, for at least 250 days per year, for at least 8 hours per day): 3.52E10 kWh at a cost of $2.6E9, or $0.0739/kWh. That’s less than the cost of the unit from Home Depot because it is a mass purchase and mass installation.
caveat emptor in all cases.
As concentrated PV power is mass produced, the cost per kWh should be cut in half in a few years. I can hardly wait to update these calculations in the years ahead.
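The arithmetic above can be reproduced in a few lines (the capacity-factor assumptions are the commenter’s own, not audited figures):

```python
# Reproduce the $/kWh estimate above for a 733 MW plant costing
# $2.6 billion, with the stated assumptions: at least 80% of rated
# power, for 30 years, 250 days/year, 8 hours/day.
rated_kw = 733_000
cost_usd = 2.6e9

energy_kwh = rated_kw * 0.80 * 30 * 250 * 8  # ~3.52e10 kWh over plant life
usd_per_kwh = cost_usd / energy_kwh

print(round(usd_per_kwh, 4))  # 0.0739
```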
Septic Matthew says
Quick correction to my previous post: there is a federal loan guarantee, but not a direct subsidy to reduce the purchase price. The loan guarantee is worth some amount on the interest.
Still worth looking at next year to assess price changes, I think.
He presents his “pdf file of a power point presentation of the basic science that results from the application of elementary basic physics and chemistry to the published spectroscopic data for vibration-rotation transitions for the asymmetric stretch of carbon dioxide, the known Planck radiation curves for the earth and the sun, and the solution of Beer’s Law modified for broad-band diffuse transmission by carbon dioxide in the air using partial pressures given by the widely accepted Keeling curve”
May I ask someone here, with more recent physical chemistry education than mine, to look over Franzen’s presentation to see whether there are any obvious wrong turns?
If we want to depict what happens to a water particle flowing in a river, we don’t remain on the bridge, because we would have only a partial description (the Eulerian point of view, over time alone); instead we get into the boat and go downstream together with the water particle (the Lagrangian point of view, over time and space). So ‘Beer’s Law’ and ‘Schwarzschild’s equation’, referring to a fluid at rest, don’t hold in this case.
The Lagrangian derivative tells us more than the Eulerian one. DF/Dt = 0 tells us that the process is uniform, i.e., invariable over time and space.
The photons which contribute to the LTE within a volume V exist because V contains the emitting/absorbing molecules together with the neutral ones, but they all remain confined within V.
Michele says
Of course, unless the volume V becomes a source, if photons are created therein using thermal energy yielded by the surroundings. In this case the CO2 molecules behave as heat engines.
Economic cost of weather may total $485 billion in U.S.
Everything has its price, even the weather. New research indicates that routine weather events such as rain and cooler-than-average days can add up to an annual economic impact of as much as $485 billion in the United States.
The study, led by the National Center for Atmospheric Research (NCAR), found that finance, manufacturing, agriculture, and every other sector of the economy is sensitive to changes in the weather. The impacts can be felt in every state.
The article is by Jeffrey Lazo, Megan Lawson, Peter Larsen, and Donald Waldman. It is titled, “U.S. Economic Sensitivity to Weather Variability.” A preprint of the article is up on the Bulletin of the American Meteorological Society web site. Article. 36 pages, including charts and graphs.
Patrick 027 says
Re 445 Michele If we want to depict what happens to a water particle flowing in a river, we don’t remain on the bridge, because we would have only a partial description (the Eulerian point of view, over time alone); instead we get into the boat and go downstream together with the water particle (the Lagrangian point of view, over time and space). So ‘Beer’s Law’ and ‘Schwarzschild’s equation’, referring to a fluid at rest, don’t hold in this case.
1.
But sometimes you can use the Eulerian point of view with an awareness of motion (Eulerian derivative + advection term = Lagrangian derivative).
2.
Beer’s Law and Schwarzschild’s equation don’t have much to do with fluid motion. Whatever optical properties exist at a location at the time a photon passes through, or is emitted or absorbed or scattered there, that’s what applies and should be used in the equation. If conditions are changing rapidly, you could at least consider a population of photons moving along a path at one time, a pulse, which is only in one location at one time, and you could use such equations (add scattering terms to Schwarzschild’s) to describe how the intensity of that pulse changes as photons are added to or removed from it.
3.
But fluid motions are so slow compared to photons that really, who cares? Photon travel from emission (or entrance into the system) to absorption (or escape) is approximately instantaneous relative to the time it takes for temperature, pressure, or composition/phase to change at various locations (Eulerian) or in various parcels (Lagrangian), at least in the context of radiation through planetary atmospheres.
The Lagrangian derivative tells us more than the Eulerian one. DF/Dt = 0 tells us that the process is uniform, i.e., invariable over time and space.
Poor application of a conservation law – the system is not closed and isolated, and stuff is happening; it is not generally allowed to reach thermodynamic equilibrium (as distinct from LTE among non-photons in a sufficiently small volume). I can see things changing all the time. When you flip the switch on, let’s say, an incandescent light bulb (remember those?), the temperature goes up and the radiation changes. Turn it off and it changes again. At night the ground cools off (on land, especially in some conditions, not so much in others; it can be overruled by advection) and emits less upward radiation; the air near the ground in that case also cools off and will emit less. Which brings F down. This would happen if the air were perfectly still (actually that tends to enhance nocturnal surface inversions). It’s got nothing to do with Eulerian vs Lagrangian.
The photons which contribute to the LTE within a volume V exist because V contains the emitting/absorbing molecules together with the neutral ones, but they all remain confined within V.
That (confinement) would tend to favor photons being closer to thermodynamic equilibrium with the non-photons. This tends to happen with sufficiently large optical thickness. The atmosphere is not so opaque at all frequencies.
Of course, unless the volume V becomes a source, if photons are created therein using thermal energy yielded by the surroundings. In this case the CO2 molecules behave as heat engines.
No, a heat engine takes a flow of heat from higher to lower temperature and diverts some of that flow and converts it to work. See related discussion at Skeptical Science (or did I cover that here?).
Taking one form of heat and converting to another form of heat, or absorbing heat from one thing by another, or releasing heat from something, is not what a heat engine does.
Michelesays
@ 448 Patrick
Helmholtz’s theorem states that a vector field is wholly defined if we know its divergence and its curl anywhere. The divergence leads us to continuity, which always has to be obeyed.
Heat engine.
The molecules behave as heat engines because they absorb thermal energy, which represents the lowest energetic form, and transform part of it into EM wave energy, wasting the rest still as thermal energy. The heat is transferred by EM waves but, mind you, the EM waves aren’t heat.
Patrick 027 says
Re 449 Michele Helmholtz’s theorem states that a vector field is wholly defined if we know its divergence and its curl anywhere. The divergence leads us to continuity, which always has to be obeyed.
Know anywhere? I know that you can take a vector field and find linearly superimposable components, and you can choose just two to create the whole field, one being irrotational and containing any of the divergence, and one being non-divergent and containing any of the vorticity (curl). I think you can reconstruct in full any non-divergent flow pattern given the distribution of vorticity, and add any necessary non-divergent and irrotational component to the result to fit boundary conditions (I know this is true for flow in two dimensions; I think it should be true in three, maybe 4 and 5, etc., but that’s not necessary here). Of course you can reconstruct any non-divergent flow field given the potential vorticity distribution, a balance relationship (such as geostrophic or gradient-wind balance), and sufficient boundary conditions (such as those required by conservation of momentum), but that’s actually going a step beyond what we’ve been concerned with here, because it brings both the potential density field and the velocity field into play.
But I don’t think you can reconstruct any specified velocity field just knowing the divergence and vorticity at a couple of locations. Also, there is such a thing as non-divergent and irrotational deformation, which actually comes in two flavors (in two-dimensional flow; I haven’t thought about that in three dimensions).
PS Synoptic- and planetary-scale atmospheric dynamics generally has a focus on two-dimensional quasi-horizontal flow patterns; (resolved) vertical motions are much slower on those scales – not that they aren’t important, but they can be treated in a distinct way – the behavior of the atmosphere and ocean is very anisotropic in three dimensions. I suspect it should be a bit less so in the mantle, where there isn’t a general tendency (except perhaps for the effects of the ~660 km depth phase transition) to stable stratification (even though the troposphere is convective, the existence of moist convection tends to stabilize the atmosphere to dry convection, while the existence of large-scale unsaturated conditions makes this stratification important), and also the depth scale is more similar to the horizontal length scale; likewise in the outer core, except for the orientation of flow patterns with respect to the axis of rotation.
Continuity only has to be obeyed when there are no sources and sinks. Or you could include those in your continuity equation; the point is that when you are only looking at one form of energy (radiation), it certainly can be increased or decreased in amount, so a divergent flux can exist even with density being constant, or density can change without divergence of flux – so long as the discrepancy is balanced by a source or sink.
There are sources and sinks.
I suppose EM waves are not heat in the same way that molecular collisions and mass diffusion are not heat? Radiant transfer of energy between two objects at LTE involves two photon fluxes, with the net flux being from higher to lower temperature, which increases entropy: the entropy flux = heat flux / temperature, and so with the same net flux leaving at higher temperature and entering at lower temperature there is a gain in entropy, just as there would be for a flux of sensible heat. Latent heat can flow with entropy gains via a flux from higher to lower concentration of a substance; I don’t know how to calculate the entropy in that case, but there is an entropy gain.
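The entropy bookkeeping here (entropy flux = heat flux / temperature) can be sketched with arbitrary illustrative numbers; none of the values below come from the thread:

```python
# Net radiant heat flow q from a warmer body at Th to a cooler body
# at Tc: the warm side loses entropy at rate q/Th, the cold side
# gains it at rate q/Tc, and since Tc < Th the total entropy
# production is positive. All numbers are illustrative.
q = 100.0    # W/m^2, net heat flux
Th = 300.0   # K, warmer (emitting) side
Tc = 250.0   # K, cooler (absorbing) side

dS = q / Tc - q / Th   # net entropy production rate, W/(m^2 K)
print(dS)              # positive whenever Th > Tc > 0
```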
You can assign a brightness temperature to any particular monochromatic (and if necessary, polarized, etc.) radiant intensity, which indicates the entropy involved.
You can certainly use higher brightness-temperature radiation to run a heat engine if the heat sink is at lower temperature. As when the sun heats the Earth (in this case the heat source temperature is the temperature where the solar radiation is absorbed, so with respect to the sun as a heat source there is substantial inefficiency; but the solar radiation temperature has to be at least as hot as the temperature where it is being absorbed in order for this to work, otherwise there would be more radiation lost than gained and the Earth’s climate system’s heat engine’s heat source would no longer be a source), convection occurs, and radiation is emitted at lower temperature (the temperature of space itself is not the Tc for the heat engine).
But it is just incorrect to say that CO2 molecules are heat engines, at least if they are, as a group, at LTE; and on the individual molecular scale, molecular collisions involve work, so it is a work-work conversion; the disorder renders both the collisional energy transfer and the photons as heat on a macroscopic scale.
Magnus W says
This might be a better place to discuss this:
Reconstruction of the extra-tropical NH mean temperature over the last millennium with a method that preserves low-frequency
http://web.dmi.dk/solar-terrestrial/staff/boc/millennium_reconstr.pdf
The method it uses has some shortcomings highlighted earlier:
http://www.people.fas.harvard.edu/~tingley/Comment_on_Christiansen.pdf
Michele says
@ Patrick
The parallel between molecules and photons holds perfectly.
Edward Greisch says
403 Adam R.: “I am having difficulty imagining just what plausible event would qualify as a climate “Pearl Harbor” in the public mind.”
Me too.
SecularAnimist says
Adam R. wrote: “I am having difficulty imagining just what plausible event would qualify as a climate ‘Pearl Harbor’ in the public mind.”
I have difficulty imagining Rush Limbaugh going on the radio in December 1941 and proclaiming that the very existence of “Japan” was a hoax, while millions of people who called themselves “skeptics” unquestioningly believed his every word.
Patrick 027 says
Re 404 Michele –
A population of molecules within a gas of sufficient density will diffuse more slowly than their speeds because they are so often turned back by collisions (collisions, however, can transport energy, so heat can be conducted by both collisions and diffusion). This is only true for photons when scattering is occurring. What is the mean free path of a molecule? It’s quite short. From Wallace and Hobbs (complete reference pending), about 1E-7 m (a tenth of a micron) at the surface, about a micron near 20 km, about 1 mm somewhere around 70 km height. You have to go up to between 110 and 120 km height (this is near the turbopause) before it gets to 1 m. (The turbopause is where molecular diffusion starts to dominate over eddy diffusion; above the turbopause, each constituent (exceptions?) decreases in absolute concentration with its own scale height, as if the atmosphere is actually several atmospheres sharing space – except for the effects of sources and sinks (chemical reactions).) See also the CRC Handbook of Chemistry and Physics.
And photons? Well, of course it varies over wavelength; some will have mean free paths on the order of several km or more, except in clouds or high humidity. From memory, the optical thickness near the center of the CO2 band may peak around 10^4 for the whole atmosphere, which would imply a mean free path on the order of a meter near the surface (setting aside variations over height in optical thickness per unit mass path). That’s only the center of the band. And this is between emission and absorption; the photons are mainly not being scattered around, they are mainly making one-way trips. Not that photons are never scattered, but the scattering optical thickness doesn’t dominate at LW frequencies, and that should be especially true near peaks in absorption spectra. So it is mainly the case that photons are not bouncing around as if in a jar, or if and when that is the case, it is a larger jar. Photons carry gross and net fluxes of energy between locations, which may span variations in temperature; or else, in the limit of large optical thickness including some sufficient contribution from emissivity and absorptivity, photons allow the diffusion of heat down local thermal gradients. But even in this case, photons are still not appreciably carried by convection of non-photons in familiar conditions.
(Consider also the lifetime of a photon with a mean free path of even 10 km between emission and absorption – it would be 1E4 m / 3E8 m/s ~= 3.3E-5 s. How much does air usually move in that time?)
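The parenthetical can be sketched quickly; the 10 km free path and the 30 m/s wind below are illustrative assumptions, chosen generously so the conclusion (air barely moves while a photon is in flight) is robust:

```python
# How far does air move while a photon is in flight? Illustrative
# numbers: a generously long 10 km photon free path and a strong wind.
c = 3.0e8      # m/s, speed of light
mfp = 1.0e4    # m, photon free path between emission and absorption
wind = 30.0    # m/s, a strong wind

t_flight = mfp / c                  # ~3.3e-5 s in flight
air_displacement = wind * t_flight  # ~1 mm of air motion in that time
print(t_flight)
print(air_displacement)
```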
Radge Havers says
SA @ 406
Rush Limbaugh is a man of this time and part and parcel with his audience. If he said it today and phrased it in a way that insulted “liberals”, then almost certainly the dittoheads who worship him would be more than happy to accept it into their absurd catechism, arguing the definition of nation so as to exclude some aspect of Japanese culture and generally clogging up discourse with all sorts of infectious glop.
Transport the lot of them back to 1941 – same deal. After Dec. 7 you would hear all sorts of rhetorical hocus pocus from them justifying their faith in the Cult of Rush. (“It was really Russian pinkos disguised as Japanese…blah, blah, blah.”) They never admit they’re wrong about anything.
Edward Greisch says
New subject:
http://www.rollingstone.com/politics/news/climate-of-denial-20110622
A long article by Al Gore.
Patrick 027 says
Re my 374, on LTE – It is only photons that make the difference between that and complete thermodynamic equilibrium … with photons. Generally we don’t have complete, complete thermodynamic equilibrium. Having as much CH4 in the atmosphere as we do when there is also so much O2 – not thermodynamic equilibrium. Really, as I understand it, thermodynamic equilibrium would have most protons and neutrons combined into nuclei of masses at or similar to some isotope(s) of Fe – then again, maybe it would all be electrons and positrons, because of proton decay, over trillions (of trillions of trillions of ….?) years. And photons, of course.
But when certain processes don’t happen, or happen sufficiently slowly, we can have an incomplete thermodynamic equilibrium among the other aspects of the system, which isn’t much perturbed by slow equilibration of other aspects or by the disequilibrium that is frozen in. And we can have, or approximately have, LTE in the sense that molecules, atoms, ions, electrons and crystal-lattice vibrations have energy distributions that are (approximately) in equilibrium for the sensible heat content, for a given temperature.
You can also have a partial LTE. An interesting case is excited electron-hole populations in semiconductors, such as in a solar cell; after being excited across a band gap, the electrons and remaining holes can settle – as I recall/think, via interactions with the crystal lattice – into distributions within the two separate energy bands, each one fitting the equilibrium distribution for the temperature (the temperature you would measure if you put a thermometer on the material), but with the two separate distributions not in equilibrium with each other. Because emission via recombination across the band gap involves that disequilibrium, the emission behaves according to a modified Planck function, which can be rewritten as the Planck function for an effective temperature, which will vary as a function of photon energy and the chemical potential of the electron and hole populations as well as the ‘actual’ temperature.
Convection of an ideal gas to different pressures, as I understand it, would tend to preserve LTE even without ongoing molecular collisions, in the absence of other processes. There are ways, besides diffusion/mixing and radiation, to disrupt LTE. If, in order to maintain (some level of) thermodynamic equilibrium (within a parcel, among non-photons), physical phase changes or (electro-)chemical (or nuclear) reactions had to occur, kinetic barriers to reactions may delay the achievement of equilibrium. For example, with physical phase changes, some diffusion of constituents may be necessary, requiring compositional gradients, and the down-gradient diffusion increases entropy. The nucleation of new phases can itself be delayed (see the Kohler curve, and homogeneous nucleation of ice crystals). With latent heating or cooling coming from phase changes or phase growth that is not uniform (in a heterogeneous material), there must also be temperature gradients to support diffusion of heat, and this diffusion increases entropy. If a process is fully adiabatic, (some level of) thermodynamic equilibrium is maintained, so that entropy is also conserved (the process is isentropic, and reversible). If there is thermodynamic disequilibrium in some aspect and a process occurs tending toward equilibrium of that aspect, but it doesn’t happen with infinite speed, then the process is no longer adiabatic, reversible, and isentropic; attempting to reverse the process will involve hysteresis, and some net consumption of work (Gibbs free energy). Note that removing condensed water from a cloud makes the moist convection irreversible – you can’t trace the same adiabat all the way back to the higher pressure of the original cloud base, because evaporation will be complete before then.
In general, removing or adding heat, but also substance, as in mixing or something more selective (like precipitation out of – or into – a parcel), will tend to make the process irreversible – unless things are balanced in some way (i.e. photons are exchanged but the net radiant heating is zero, or is balanced by conduction – or mixing is between identical air parcels, etc.).
The deviation from adiabatic processes due to the kinetics of reactions is not generally important in Earth’s atmosphere, at least so far as causing convection to veer off from an adiabat; the formation of new cloud droplets (as distinct from haze particles – see the Kohler curve) tends to be delayed (maybe it isn’t always, but I’m not sure) from the moment the relative humidity (for a flat surface of pure liquid water) reaches 100% – but it isn’t generally delayed very long, so far as I know
–
except maybe (?) when the formation of ice is involved, but fortunately the latent heating of freezing is an order of magnitude less than that of condensation. However, it is important to the other characteristics of the material – significant delay of ice formation occurs in clouds; it is common to have a population of ‘supercooled’ droplets, liquid water below freezing. Supercooled droplets can freeze onto ice crystals they run into (riming, if they stick); if they freeze from the outside in, then freezing of the middle may cause them to break into pieces. And because equilibrium vapor pressure is a function of phase, a small number of ice crystals surrounded by many supercooled droplets will tend to result in a reduction in water particles, as evaporation from smaller droplets feeds growth of larger ice crystals, which may then precipitate. In the Earth’s mantle, relatively cold descending slabs have slower reaction rates (relative to warmer material), and the resulting delays in phase changes have importance for rheology, but I don’t know of a significant effect on the lapse rate.
Growing cloud droplets (via condensation – there is also collision/coalescence, which is very important in producing precipitation in the absence of ice crystals) or ice crystals (growing from freezing or deposition) will release latent heat, and thus these cloud particles will tend to be warmer than the air around them, allowing the diffusion of that latent heat so that the parcel as a whole ultimately warms. If such a parcel is taken as the smallest unit being dealt with, then it would effectively be perturbed from LTE – it would not radiate with a single temperature; the emissivity and absorptivity would appear to be different if the cp-weighted average temperature were used. Given variations in optical properties between the gas and other phases, interaction with radiation could also cause some temperature variation. However, you could work around this by dealing with the range of temperatures in the parcel. Alternatively, you could approximate the situation as being at LTE if the temperature variations are sufficiently small, which they will tend to be if the diffusion of heat (as with the rate of collisions among molecules) is sufficient; my understanding is that this is usually if not always the case. Etc. for evaporation and melting of water particles.
Patrick 027 says
PS
Mixing of cloud air with dry air can be important at least in some ways.
Patrick 027 says
Re my 407 – the last part, I think, should also help in seeing that even a purely scattering situation would not tend to involve much convection of radiation. For example, consider what if the surface were a blackbody and the atmosphere had only scattering. How many scattering events on average would be required to return a photon to the surface, or else allow it to escape to space (the latter is reduced by reducing total optical thickness; I think the former may depend less on that)? It depends on the kind of scattering, of course, but roughly, a large optical thickness of mainly forward scattering may be similar to a smaller optical thickness of Rayleigh scattering or isotropic scattering…
(for either of those, a photon has an equal chance of being backscattered or forward-scattered for each scattering; a Rayleigh scattering distribution has mirror symmetry between the forward and reverse directions; from memory I think isotropic scattering would be achieved by a spherical mirror ball with geometric optics, no diffraction)
… Well, I’m not as familiar with scattering (though I know where to go to find out more), but for now, just to get a sense of things, consider if all photons are scattered at path length x, and half are forward-scattered and half are backscattered, and assume the scattered radiation is isotropic, or otherwise such that the same equality exists for scattering relative to the vertical direction (upward and downward). Set aside the variation in directions, so x is always vertical. Then the average total path for return would be x*(2*0.5 + 4*0.5^3 + 6*0.5^5 + 8*0.5^7 + …) = …
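The series as written can be summed numerically; under these assumptions it converges to 16/9, i.e. about 1.78 free paths x on average before return (with no account of escape):

```python
# Sum the mean-return-path series above (in units of x), with terms
# (2n) * 0.5**(2n - 1): 2*0.5 + 4*0.5**3 + 6*0.5**5 + ...
# The closed form follows from sum(n * r**n) = r / (1 - r)**2.
total = sum(2 * n * 0.5 ** (2 * n - 1) for n in range(1, 200))
print(total)  # converges to 16/9 = 1.777...
```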
Alternatively, the low energy density of photons suggests that convection of radiation would tend to be rather small.
Patrick 027 says
above the turbopause, each constituent (exceptions?) decreases in absolute concentration with its own scale height, as if the atmosphere is actually several atmospheres sharing space – except for the effects of sources and sinks (chemical reactions)…
And escape to space, of course. And charged particles are affected by the magnetic field, too – this has greater effect when the radius of gyration is small relative to the mean free path. Magnetic field effects start to become important for electrons at a different height than for ions – hence the E-region dynamo. (PS I’m not sure how the electrons behave above the turbopause regarding particle vs eddy diffusion, though. I anticipate that having electrical charge should reduce the mean free path (stronger interactions over distance), but … well, the E-region dynamo exists and I can look up its height; that should be a clue…)
Also, I think it should be possible for the different constituents to have their own temperatures sufficiently far above the turbopause, although even for a given constituent, LTE may no longer be a good approximation there, if not before (but LTE does approximately hold up through the stratosphere at least).
Patrick 027 says
Then the average total path for return would be x*(2*0.5 + 4*0.5^3 + 6*0.5^5 + 8*0.5^7 + …) = … – expression doesn’t account for escape, but it’s late and anyway I don’t see a need to pursue this one for now.
Michele says
@ Patrick
We can say all that we want only if the fundamentals are preserved.
Assuming that the M molecules of CO2 fill the first two vibrational energy levels, we would have M0 molecules at level 0 and M1 at level 1. Omitting the constants, and with F the photonic density, we know well that photons are absorbed at the rate F*M0, spontaneously emitted at the rate M1, and emitted by stimulation at the rate F*M1. For radiative equilibrium it is required that F*M0 = M1 + F*M1, and hence, at equilibrium, the photonic density is F = 1/(M0/M1 - 1). Furthermore, if the gas is in thermodynamic equilibrium, in agreement with the Maxwell-Boltzmann distribution, setting β = hν/kT, we have for the frequency ν the photonic density F = 1/(e^β - 1) and the energy density ρ(ν) = hν/(e^β - 1), that is, the Planck formula.
In other words, the photons are emitted/absorbed (scattered) locally and, absent sources/sinks, there must be DF/Dt = ∂F/∂t + u•∇F = 0, i.e., F constant over time and space.
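The algebra in the preceding comment can be checked numerically (constants omitted as there; the β value below is an arbitrary illustrative choice): combining the balance F·M0 = M1 + F·M1 with the Boltzmann ratio M1/M0 = e^(−β) does reproduce the Planck occupancy 1/(e^β − 1).

```python
import math

# Check (constants omitted): if M1/M0 = exp(-beta) (Boltzmann) and
# F*M0 = M1 + F*M1 (absorption = spontaneous + stimulated emission),
# then F = M1/(M0 - M1) = 1/(exp(beta) - 1), the Planck occupancy.
beta = 1.7                      # hv/kT, arbitrary illustrative value
M0 = 1.0
M1 = M0 * math.exp(-beta)       # Boltzmann population of level 1

F = M1 / (M0 - M1)              # solve F*M0 = M1 + F*M1 for F
planck = 1.0 / (math.exp(beta) - 1.0)

print(abs(F - planck) < 1e-12)  # True: the two expressions agree
```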
Ray Ladbury says
Michele – Unfortunately, Einstein’s derivation – which you are using – presumes equilibrium, which does not hold in the atmosphere. There is an excess flux of 15-micron radiation compared to equilibrium, so you will have relaxation processes other than spontaneous and stimulated emission.
Patrick 027 says
In general, removing or adding heat, but also substance, as in mixing or something more selective (like precipitation out of – or into – a parcel), will tend to make the process irreversible – unless things are balanced in some way (i.e. photons are exchanged but the net radiant heating is zero, or is balanced by conduction – or mixing is between identical air parcels, etc.).
… the later (mixing between identical air parcels) is isentropic while the former (balance between radiant heating/cooling and conduction of heat) will allow the air to be isentropic but there will an increase in entropy somewhere. Another example that maintains the entropy of an air parcel is precipitation of a phase of water with some temperature into the air balances the precipitation out of the parcel of water at the same phase and temperature.
Precipitation of liquid water out of an updraft removes both heat capacity and the latent heat of freezing – fortunately, in familiar conditions the heat capacity of the liquid water content is relatively small compared to that of the gas, and the latent heat of freezing is an order of magnitude less than that of condensation.
Kevin McKinney says
“Climatic Pearl Harbor”–
I’m thinking the first ice-free summer of the Arctic Ocean may come close. It lacks the catastrophic aspect (in its direct manifestation at least) but will, I think, be a rather stubborn fact.
There may be denialists about that; it could be played rather like the moon landing conspiracy theory. But that will tend to demolish whatever credibility they may have managed to retain for most observers.
wayne davidson says
#418 Kevin, ice free? Not soon, but likely to come; the next earth mega-event is no ice at the Pole. The Russian NE passage is poised to open at its earliest date in history; it is now just a question of cloud coverage. Present Arctic temps are very warm.
Now I am trying to guess how contrarians will explain the next great melt.
Brian Dodge says
“‘Climatic Pearl Harbor’ – I’m thinking the first ice-free summer of the Arctic Ocean may come close.”
The same denialists who calculated that “a Vesuvius size eruption under the Arctic Ice (area June 22, 2011 : 9,673,281 sq km) would melt an area the size of Massachusetts (21456 sq km), so that must be the cause of record low ice” will be nattering on about how “the Arctic isn’t truly ice free yet – it still has an area of ice equal to the state of Rhode Island (4002 sq km)”.
Pete Dunkelberg says
# 409 Ed G mentions the article by Gore in Rolling Stone. The comments there show a lot of GDS (Gore Derangement Syndrome).
Chris Dudley says
It is worth remembering that the big boost in US arms production after the Pearl Harbor attack was only necessary because Japan and Germany had already hit the big time in arms production ahead of the US and under very economically difficult conditions. I’m not sure a climate Pearl Harbor has a lot of meaning. Might as well say a climate assassination of Archduke Franz Ferdinand of Austria for an example of a reactive frenzy.
Edward Greisch says
New subject: http://ilovemountains.org/dirty-water-act-2011
“THE DIRTY WATER ACT OF 2011 (HR 2018)
The Clean Water Cooperative Federalism Act would gut the Clean Water Act by giving the states, rather than the EPA, the ultimate decision-making authority over our nation’s water quality standards. This would spell disaster in states where mountaintop removal coal mining is practiced, as seen by the states’ abysmal record on permitting and enforcement.”
New subject: resolution HR 1391 would prevent the EPA from regulating coal ash, cinders and slag.
Find them in http://thomas.loc.gov/home/thomas.php
Call your congressman ASAP and say “Vote no.” We need the EPA to be able to regulate coal somehow.
wayne davidson says
As always, the largest problem in determining summer sea-ice melt is not having a cloud-extent analysis; aside from our own eyes looking at polar-orbiting HRPT satellite shots, clouds are the predominant factor in a great Arctic Ocean ice melt. Has anyone found a good live site showing cloud-extent anomalies? I elaborate further on my website blog; there is a chance that a great melt may be averted despite warmer weather.
Michele says
@ Ray Ladbury
You are right, I omitted to quote Einstein. Thanks.
As far as I know, the density F = 1/(M0/M1 − 1) is an instantaneous value that tends to F = 1/(exp β − 1) when LTE is reached, i.e., after several relaxation times.
I agree with you: the equilibrium does not hold in the atmosphere, so the CO2 molecules don’t emit – or rather, emit only with great difficulty, and in any case much less than the Planck formula tells us.
PS. My name is Michele, an Italian male given name.
Patrick 027 says
Re 425, 426 Michele –
I’ve wondered about stimulated emission’s role in blackbody radiation – it would seem to imply that LTE among non-photons is an insufficient condition for emissivity to equal absorptivity. However, stimulated emission is a rather minor issue at the temperatures we are dealing with, is it not? (How hot does it have to get before about 5 % of photons are emitted by stimulation? Of course when the temperature goes past infinity to negative values, it’s a different story, as I understand it (lasers), but that’s not something I know a lot about.)
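If one takes the equilibrium occupancy F = 1/(exp β − 1) from the exchange above, the parenthetical question has a compact answer: the stimulated fraction of emission is F·M1/(M1 + F·M1) = F/(1 + F) = exp(−β), so a 5% stimulated fraction corresponds to β = ln 20. A sketch of the arithmetic for the 15 µm CO2 band (bearing in mind the caveat above that this assumes equilibrium):

```python
import math

h = 6.62607015e-34   # Planck constant, J s
c = 2.99792458e8     # speed of light, m/s
k = 1.380649e-23     # Boltzmann constant, J/K

wavelength = 15e-6                  # the 15 micron CO2 band
beta_5pct = math.log(1 / 0.05)      # stimulated fraction exp(-beta) = 5% -> beta = ln 20
T = h * c / (wavelength * k * beta_5pct)
print(f"{T:.0f} K")                 # about 320 K

# sanity check: with F = 1/(exp(beta) - 1), the stimulated fraction F/(1+F) is 5% at T
beta = h * c / (wavelength * k * T)
F = 1 / math.expm1(beta)
assert abs(F / (1 + F) - 0.05) < 1e-12
```

So at 15 µm the stimulated fraction stays below roughly 5% for temperatures below about 320 K – minor, as the comment suspects, though not utterly negligible.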
Anyway, CO2, and H2O and clouds, and to some extent CH4 and ozone and some other gases, clearly do emit and absorb photons at least approximately the way one could predict based on emission cross sections (related to absorption coefficients) and the Planck function. The radiation can be measured. There is no problem.
Why wouldn’t a population of CO2 molecules, colliding with other molecules (including N2 and the rest, not just themselves) at some nonzero temperature, or within air at some nonzero temperature, be emitting radiation?
Patrick 027 says
And re 426/425 Michele, remember that even if photons are being absorbed or scattered within short distances, this doesn’t mean they aren’t being emitted. (Aside from scattering, photons will be absorbed within shorter distances if they are being emitted more often per unit volume – this is the effect of absorptivity = emissivity.)
wayne davidson says
Another fault of us nice guys who understand AGW is lamenting the contrarian garbage out there without pouncing on contrarians when they were wrong. So they, the experts at erring, rage at every little detail, spreading the confusion they love to brainwash onto the lay public; despite having been utterly wrong often, they appear again on contrarian-friendly anti-AGW programs. There should be no relenting against those who are paraded as real climate experts when they can’t predict anything about climate, or refuse to do so because they don’t believe anybody else can. I encourage RC to ‘grade’ any anti-AGW spokesperson on the merit of their ability with climate, especially their failing predictions. In mind lately is the famous PDO-will-bring-an-“ice age” gang, some at AccuWeather, and the shows that support them. What is clear, and really above the politics of climate discussions, is the ability to predict accurately: why would anyone listen to a contrarian whose projections fail again and again? We must keep them wallowing in their failures rather than let them disseminate a new delay tactic.
RickA says
GIA – sea-level rise adjustment question.
If the volume of ocean should be adjusted upward to counteract the sinking basins and rising coast (in some places), shouldn’t the volume of the ocean be adjusted downward to take out the thermal expansion due to the .8C temperature rise?
Or is thermal expansion already taken out?
Does anybody know?
[Response: It depends on what you want to do. Adding in the GIA is necessary if you want to use the satellite data to constrain the ocean volume changes (including stearic and eustatic effects). If you just want to understand the eustatic effects (i.e. how much more water is entering the ocean than is leaving), you would need to correct for density changes (mostly thermal expansion) as well. – gavin]
SecularAnimist says
Those pondering what might constitute a “climatic Pearl Harbor” might wish to read this analysis of the extreme weather events of 2010 and 2011, by meteorologist Jeff Masters:
2010 – 2011: Earth’s most extreme weather since 1816?
By Dr. Jeff Masters
June 24, 2011
Weather Underground
http://www.wunderground.com
He discusses “the top twenty most remarkable weather events of 2010” which include:
Sounds like a multitude of “climatic Pearl Harbors” to me.
Excerpt:
RickA says
Gavin in-line comment to RickA #430:
Thank you Gavin.
I am interested in sea level as it relates to climate change – for example, how much of Florida will be underwater by 2100.
So based on your answer and Wikipedia – I am guessing I would be interested in both eustatic (change of sea level relative to a fixed point) and isostatic (change of land level relative to a fixed point) measurements. The two together should allow me to describe eustatic sea level relative to isostatic Florida coast at 2100 (assuming I had the proper data)(for example).
Again – thank you.
Daniel Bailey says
@ RickA
Depends on the amount of SLR by 2100. I put together SLR maps (with 1 and 6-meter projected rises) for various places around the world (including Florida) at Skeptical Science here.
Note: The mapping visualization tools used were developed by Jeremy Weiss, lead author of Implications of Recent Sea Level Rise Science for Low-elevation Areas in Coastal Cities of the Conterminous U.S.A..
ReCaptcha: objec tiontype (that one was good)
Pete Dunkelberg says
OT Summer beach reading: How the Hippies Saved Physics by MIT physicist and science historian David Kaiser
Michele says
@ 427-428 Patrick
I was not talking about the BB but about what occurs inside a particle of air that also contains an absorbing/emitting gas.
OK, the relationship above is obtained taking into account only the scattered EM photons that directly excite the molecule and then are suddenly re-emitted. It seems that we are omitting the thermal photons, which become thermal energy of all the molecules (neutral and active) and are created from the thermal energy by means of collisions of the active molecules with the other (active and neutral) molecules.
I think this is implicitly contained in the claimed LTE, otherwise the temperature wouldn’t be constant. Of course, if the thermal photons aren’t included in the density F, we should think that F = Fem + Fth = constant (really, I have some doubt about it).
In any case we always have DF/Dt = 0, which means ∇•(uF) = F∇•u = 0 in a steady state, i.e., the photon density obeys a continuity law just as the mass density does: Dρ/Dt = 0, or the equivalent ∇•(ρu) = ρ∇•u = 0.
Sou says
Can you help? I went to this page that I like to refer to sometimes about climate change commitments and the article has disappeared. All that’s left is a link to a pdf file in italian.
https://www.realclimate.org/index.php/archives/2010/03/climate-change-commitments/
[Response: Click on the US/UK flag next to the title. (PS. Spanish, not Italian). – gavin]
Sou says
Thanks Gavin, that worked. My mouse skills aren’t bad but my language skills are lacking. :D
Ron R. says
OT. What a fitting visual epithet. A gas station surrounded by the polluted flood waters it helped to create.
Man’s Last Stand.
http://images.ctv.ca/archives/CTVNews/img2/20110625/800_minot_north_dakota_gas_station_flood_ap_110625.jpg
Patrick 027 says
Re 435 Michele (I had assumed the thermally-emitted photons were included in the F you were describing) –
DF/Dt = 0. Yes, if nothing is changing. If you add a new source of heat, the temperature will rise until emission-absorption adjusts to balance the new heat source. If a parcel of air is rising and cooling adiabatically, then the temperature declines and the rate at which photons are emitted decreases. If a change in temperature happens somewhere, then the photon population passing through another location can change; DF/Dt can change.
Perhaps I’m mistaken about what you’re thinking – are you thinking that the photons which are emitted to space are somehow present in the air during ascent, and when the air expands enough the photons are released? Because that’s not what happens. I’m having trouble understanding your point.
The continuity equation regarding photon density – yes, of course, if photons are redistributed over a larger volume then their density decreases – this is just a form of the conservation of energy, as the continuity equation in fluid mechanics is a form of the conservation of mass. But when photons are absorbed or emitted, energy is leaving one form and entering another – such as between the enthalpy of a population of non-photons and the radiant heat energy of a population of photons.
Perhaps you should look up – first, ‘Beer’s Law’, then, ‘Schwarzschild’s equation’ (Beer’s Law is a special case of Schwarzschild’s equation).
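For readers following along, a minimal numerical sketch of the relation between the two (the absorption coefficient and the Planck source function B are made-up illustrative values, held constant along the path for simplicity):

```python
import math

def schwarzschild(I0, k_abs, B, s_total, n=200000):
    """Euler-integrate Schwarzschild's equation dI/ds = k_abs*(B - I)
    along a path of length s_total, with absorption coefficient k_abs
    and Planck source function B held constant for simplicity."""
    I, ds = I0, s_total / n
    for _ in range(n):
        I += k_abs * (B - I) * ds
    return I

I0, k_abs, s = 10.0, 0.5, 4.0
# With B = 0 (no emission) this reduces to Beer's Law: I = I0 * exp(-k_abs * s)
assert abs(schwarzschild(I0, k_abs, B=0.0, s_total=s) - I0 * math.exp(-k_abs * s)) < 1e-4
# With emission, the intensity relaxes toward the source function B
assert abs(schwarzschild(I0, k_abs, B=25.0, s_total=40.0) - 25.0) < 1e-4
```

The second assertion illustrates the point about emission made earlier in the thread: over a sufficiently opaque path, the intensity forgets its initial value and approaches the local Planck function.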
Patrick 027 says
DF/Dt can change.
– Actually I meant it can be nonzero, but it can change too; and dF/dt can also change. (D/Dt is used in fluid dynamics as the derivative following the fluid motion – I think this can be referred to as the material derivative or the Lagrangian derivative; in this case I assume we’re following the macroscopic motion of non-photons. The partial derivative symbol (the curly d, which looks like a backward 6) is then used instead for the Eulerian derivative, the rate of change at a given location in some frame of reference – which is what I meant by dF/dt, but I didn’t take the time to use the best symbol.)
Of course, once equilibrium climate is achieved, then averaged globally and annually and over internal variation, etc., the Eulerian derivative dF/dt = 0, but DF/Dt can still be nonzero.
Septic Matthew says
More information on solar power costs:
http://www.triplepundit.com/2011/06/announcement-bank-america-putting-billions-solar/
I can’t tell for sure whether all costs are included, but there is no mention of subsidies.
733 MW of power at a cost of $2.6 billion gives (assuming at least 80% of rated power for at least 30 years, for at least 250 days per year, for at least 8 hours per day): 3.52E10 kWh at a cost of $2.6E9, or $0.0739/kWh. That’s less than the cost of the unit from Home Depot because it is a mass purchase and mass installation.
caveat emptor in all cases.
As concentrated PV power is mass produced, the cost per kwh should be cut in half in a few years. I can hardly wait to update these calculations in the years ahead.
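The arithmetic above can be reproduced directly (the capacity factor and operating profile are the commenter's stated assumptions, not known project figures):

```python
rated_kw = 733e3                    # 733 MW plant
cost_usd = 2.6e9                    # $2.6 billion
fraction = 0.80                     # assumed fraction of rated power actually delivered
years, days_per_year, hours_per_day = 30, 250, 8  # assumed operating profile

lifetime_kwh = rated_kw * fraction * years * days_per_year * hours_per_day
rate = cost_usd / lifetime_kwh
print(f"{lifetime_kwh:.3g} kWh at ${rate:.4f}/kWh")  # 3.52e+10 kWh at $0.0739/kWh
```

Note this is an undiscounted cost per kWh; a levelized-cost calculation would also fold in financing, operations, and a discount rate.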
Septic Matthew says
Quick correction to my previous post: there is a federal loan guarantee, but not a direct subsidy to reduce the purchase price. The loan guarantee is worth some amount on the interest.
Still worth looking at next year to assess price changes, I think.
Richard Woods says
I’ve just come across Fritz Franzen’s site http://hfranzen.org/
He presents his “pdf file of a power point presentation of the basic science that results from the application of elementary basic physics and chemistry to the published spectroscopic data for vibration-rotation transitions for the asymmetric stretch of carbon dioxide, the known Planck radiation curves for the earth and the sun, and the solution of Beer’s Law modified for broad-band diffuse transmission by carbon dioxide in the air using partial pressures given by the widely accepted Keeling curve”
May I ask someone here, with more recent physical chemistry education than mine, to look over Franzen’s presentation to see whether there are any obvious wrong turns?
Richard Woods says
Oops. I left out the direct link to Franzen’s PDF
http://www.hfranzen.org/GWPPT6.pdf
and a secondary one at
http://www.hfranzen.org/CB%20with%20buffering.pdf
Michele says
@ 439-440 Patrick
If we want to depict what occurs to a water particle flowing in a river, we don’t remain on a bridge, because we would have only a partial description (the Eulerian point of view, over time alone); rather, we get into the boat and go downstream together with the water particle (the Lagrangian point of view, over time and space). So ‘Beer’s Law’ and ‘Schwarzschild’s equation’, referring to a fluid at rest, don’t hold in this case.
The Lagrangian derivative tells us more than the Eulerian one. DF/Dt = 0 tells us that the process is uniform, i.e., invariable over time and space.
The photons which contribute to the LTE within a volume V exist because V contains the emitting/absorbing molecules together with the neutral ones, but all remain confined within V.
Michele says
Of course, unless the volume V becomes a source, because photons are created therein using thermal energy yielded by the surroundings. In this case the CO2 molecules behave as heat engines.
The Raven says
Another sort of climate response.
From NCAR:
UCAR/NCAR Press release
The article is by Jeffrey Lazo, Megan Lawson, Peter Larsen, and Donald Waldman. It is titled, “U.S. Economic Sensitivity to Weather Variability.” A preprint of the article is up on the Bulletin of the American Meteorological Society web site. Article. 36 pages, including charts and graphs.
Patrick 027 says
Re 445 Michele
If we want to depict what occurs to a water particle flowing in a river, we don’t remain on a bridge, because we would have only a partial description (the Eulerian point of view, over time alone); rather, we get into the boat and go downstream together with the water particle (the Lagrangian point of view, over time and space). So ‘Beer’s Law’ and ‘Schwarzschild’s equation’, referring to a fluid at rest, don’t hold in this case.
1.
But sometimes you can use the Eulerian point of view with an awareness of motion (Eulerian derivative + advection term = Lagrangian derivative).
2.
Beer’s Law and Schwarzschild’s equation don’t have much to do with fluid motion. Whatever optical properties exist at a location at the time a photon passes through, or is emitted or absorbed or scattered there, that’s what applies and should be used in the equation – if conditions are changing rapidly, you could at least consider a population of photons moving along a path at one time – a pulse, which is in only one location at one time – and use such equations (adding scattering terms to Schwarzschild’s) to describe how the intensity of that pulse changes as photons are added to or removed from it.
3.
But fluid motions are so slow compared to photons that really, who cares? Photon travel from emission (or entrance into the system) to absorption (or escape) is an approximately instantaneous process relative to the time it takes for temperature, pressure, or composition/phase to change at various locations (Eulerian) or in various parcels (Lagrangian) – at least in the context of radiation through planetary atmospheres.
The Lagrangian derivative tells us more than the Eulerian one. DF/Dt = 0 tells us that the process is uniform, i.e., invariable over time and space.
Poor application of a conservation law – the system is not closed and isolated, and stuff is happening; it is not generally allowed to reach thermodynamic equilibrium (as distinct from LTE among non-photons in a sufficiently small volume). I can see things changing all the time. When you flip a switch on a – let’s say incandescent – light bulb (remember those?), the temperature goes up and the radiation changes. Turn it off and it changes again. At night the ground cools off (on land, especially in some conditions – not so much in others; it can be overruled by advection) and emits less upward radiation; the air near the ground in that case also cools off and will emit less. Which brings F down. This would happen if the air were perfectly still (actually stillness tends to enhance nocturnal surface inversions). It’s got nothing to do with Eulerian vs Lagrangian.
The photons which contribute to the LTE within a volume V exist because V contains the emitting/absorbing molecules together with the neutral ones, but all remain confined within V.
That (confinement) would tend to favor photons being closer to thermodynamic equilibrium with the non-photons. This tends to happen with sufficiently large optical thickness. The atmosphere is not so opaque at all frequencies.
Of course, unless the volume V becomes a source, because photons are created therein using thermal energy yielded by the surroundings. In this case the CO2 molecules behave as heat engines.
No, a heat engine takes a flow of heat from higher to lower temperature and diverts some of that flow and converts it to work. See related discussion at Skeptical Science (or did I cover that here?).
Taking one form of heat and converting to another form of heat, or absorbing heat from one thing by another, or releasing heat from something, is not what a heat engine does.
Michele says
@ 448 Patrick
Helmholtz’s theorem states that a vectorial field is wholly defined if we know anywhere its divergence and its curl. The divergence brings us to continuity, which has to be obeyed always.
Heat engine.
The molecules behave as heat engines because they absorb thermal energy, which represents the lowest energetic form, and transform part of it into EM-wave energy, wasting the rest still as thermal energy. The heat is transferred by EM waves but, mind you, the EM waves aren’t heat.
Patrick 027 says
Re 449 Michele – Helmholtz’s theorem states that a vectorial field is wholly defined if we know anywhere its divergence and its curl. The divergence brings us to continuity, which has to be obeyed always.
Know anywhere? I know that you can take a vector field and find linearly superimposable components: you can choose just two to create the whole field, with one being irrotational and containing any of the divergence, and one being non-divergent and containing any of the vorticity (curl). I think you can reconstruct in full any non-divergent flow pattern given the distribution of vorticity, and add any necessary non-divergent and irrotational component to the result to fit boundary conditions (I know this is true for flow in two dimensions; I think it should be true in three, … maybe 4 and 5, etc., but that’s not necessary here. … Of course you can reconstruct any non-divergent flow field given the potential vorticity distribution, a balance relationship (such as geostrophic or gradient-wind balance), and sufficient boundary conditions (such as those required by conservation of momentum), but that’s actually going a step beyond what we’ve been concerned with here, because it brings both the potential density field and the velocity field into play).
But I don’t think you can reconstruct any specified velocity field just knowing the divergence and vorticity at a couple of locations. Also, there is such a thing as non-divergent and irrotational deformation, which actually comes in two flavors (in two-dimensional flow; I haven’t thought about that in 3 dimensions).
PS Synoptic- and planetary-scale atmospheric dynamics generally has a focus on two-dimensional quasi-horizontal flow patterns; (resolved) vertical motions are much slower on those scales – not that they aren’t important, but they can be treated in a distinct way – the behavior of the atmosphere and ocean is very anisotropic in three dimensions. I suspect it should be a bit less so in the mantle, where there isn’t a general tendency to stable stratification (except perhaps for the effects of the ~660 km depth phase transition) and where the depth scale is more similar to the horizontal length scale (even though the troposphere is convective, the existence of moist convection tends to stabilize the atmosphere to dry convection, while the existence of large-scale unsaturated conditions makes this stratification important); likewise in the outer core, except for the orientation of flow patterns with respect to the axis of rotation.
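The two-dimensional decomposition described here (an irrotational part carrying all the divergence plus a non-divergent part carrying all the vorticity) can be illustrated on a doubly periodic grid with a spectral Poisson solve; a rough sketch, not tied to any particular dataset:

```python
import numpy as np

def helmholtz_2d(u, v):
    """Split a doubly periodic 2-D velocity field on [0, 2*pi)^2 into an
    irrotational part (carrying all the divergence) plus a non-divergent
    remainder (carrying all the vorticity), via a spectral Poisson solve."""
    n = u.shape[0]
    k = 2 * np.pi * np.fft.fftfreq(n, d=2 * np.pi / n)   # integer wavenumbers
    kx, ky = np.meshgrid(k, k, indexing="ij")
    k2 = kx**2 + ky**2
    k2[0, 0] = 1.0                                       # avoid 0/0 for the mean mode
    uh, vh = np.fft.fft2(u), np.fft.fft2(v)
    phi_h = -(1j * kx * uh + 1j * ky * vh) / k2          # solve  lap(phi) = div(u, v)
    u_irr = np.real(np.fft.ifft2(1j * kx * phi_h))       # irrotational part = grad(phi)
    v_irr = np.real(np.fft.ifft2(1j * ky * phi_h))
    return (u_irr, v_irr), (u - u_irr, v - v_irr)        # remainder is non-divergent
```

Feeding in a field built from a known velocity potential plus a known streamfunction recovers the two parts to machine precision on a band-limited grid, which is the uniqueness point at issue (given periodic boundary conditions and no mean flow).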
Continuity only has to be obeyed when there are no sources and sinks. Or you could include those in your continuity equation; the point is that when you are only looking at one form of energy (radiation), it certainly can be increased or decreased in amount; thus a divergent flux can exist even with density being constant, or density can change without divergence of flux – so long as the discrepancy is balanced by a source or sink.
There are sources and sinks.
I suppose EM waves are not heat in the same way that molecular collisions and mass diffusion are not heat? Radiant transfer of energy between two objects at LTE involves two photon fluxes, with the net flux being from higher to lower temperature, which increases entropy: the entropy flux = heat flux / temperature, so with the same net flux leaving at higher temperature and entering at lower temperature there is a gain in entropy, just as there would be for a flux of sensible heat. Latent heat can flow with entropy gains via a flux from higher to lower concentration of a substance; I don’t know how to calculate the entropy in that case, but there is an entropy gain.
You can assign a brightness temperature to any particular monochromatic (and, if necessary, polarized, etc.) radiant intensity, which indicates the entropy involved.
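The brightness temperature just mentioned is found by inverting the Planck function for a given monochromatic intensity; a minimal sketch in frequency form (the temperatures are illustrative values):

```python
import math

h, c, k = 6.62607015e-34, 2.99792458e8, 1.380649e-23  # SI constants

def planck_intensity(nu, T):
    """Planck spectral radiance B_nu(T) in W m^-2 Hz^-1 sr^-1."""
    return (2 * h * nu**3 / c**2) / math.expm1(h * nu / (k * T))

def brightness_temperature(nu, intensity):
    """Temperature of a blackbody emitting the given monochromatic intensity."""
    return h * nu / (k * math.log1p(2 * h * nu**3 / (c**2 * intensity)))

nu = c / 15e-6   # frequency of the 15 micron band
for T in (220.0, 255.0, 288.0):
    assert abs(brightness_temperature(nu, planck_intensity(nu, T)) - T) < 1e-9
```

The `expm1`/`log1p` pair keeps the round trip numerically stable when hν/kT is small.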
You can certainly use higher-brightness-temperature radiation to run a heat engine if the heat sink is at lower temperature. As when the sun heats the Earth (in this case the heat-source temperature is the temperature where the solar radiation is absorbed, so with respect to the sun as a heat source there is substantial inefficiency; but the solar radiation’s brightness temperature has to be at least as hot as the temperature where it is being absorbed in order for this to work, otherwise more radiation would be lost than gained and the Earth’s climate system’s heat engine’s heat source would no longer be a source), convection occurs, and radiation is emitted at lower temperature (the temperature of space itself is not the Tc for the heat engine).
But it is just incorrect to say that CO2 molecules are heat engines, at least if they are, as a group, at LTE; and on the individual molecular scale, molecular collisions involve work, so it is a work-work conversion – the disorder renders both the collisional energy transfer and the photons heat on a macroscopic scale.