Guest commentary by David Karoly, Professor of Meteorology at the University of Melbourne in Australia
On Saturday 7 February 2009, Australia experienced its worst natural disaster in more than 100 years, when catastrophic bushfires killed more than 200 people and destroyed more than 1800 homes in Victoria, Australia. These fires occurred on a day of unprecedented high temperatures in south-east Australia, part of a heat wave that started 10 days earlier, and a record dry spell.
This has been written from Melbourne, Australia, exactly one week after the fires, just enough time to pause and reflect on this tragedy and the extraordinary weather that led to it. First, I want to express my sincere sympathy to all who have lost family members or friends and all who have suffered through this disaster.
There has been very high global media coverage of this natural disaster and, of course, speculation on the possible role of climate change in these fires. So, did climate change cause these fires? The simple answer is “No!” Climate change did not start the fires. Unfortunately, it appears that one or more of the fires may have been lit by arsonists, others may have started by accident and some may have been started by fallen power lines, lightning or other natural causes.
Maybe there is a different way to phrase that question: In what way, if any, is climate change likely to have affected these bush fires?
To answer that question, we need to look at the history of fires and fire weather over the last hundred years or so. Bushfires are a regular occurrence in south-east Australia, with previous disastrous fires on Ash Wednesday, 16 February 1983, and Black Friday, 13 January 1939, both of which led to significant loss of life and property. Fortunately, a recent report, “Bushfire Weather in Southeast Australia: Recent Trends and Projected Climate Change Impacts” (ref. 1), published in 2007, provides a comprehensive assessment of this topic. In addition, a Special Climate Statement (ref. 2) from the Australian Bureau of Meteorology describes the extraordinary heat wave and drought conditions at the time of the fires.
Following the Black Friday fires, the McArthur Forest Fire Danger Index (FFDI) was developed in the 1960s as an empirical indicator of weather conditions associated with high and extreme fire danger and the difficulty of fire suppression. The FFDI is the product of terms related to exponentials of maximum temperature, relative humidity, wind speed, and dryness of fuel (measured using a drought factor). Each of these terms is related to environmental factors affecting the severity of bushfire conditions. The formula for the FFDI is given in the report on Bushfire Weather in Southeast Australia. The FFDI scale is used for the rating of fire danger and the declaration of total fire ban days in Victoria.
Fire Danger Rating   FFDI range
High                 12 to 25
Very High            25 to 50
Extreme              >50
The FFDI scale was developed so that the disastrous Black Friday fires in 1939 had an FFDI of 100.
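The commentary does not reproduce the FFDI formula itself, but a widely used curve fit to the McArthur Mk 5 meter (due to Noble et al., 1980) gives the flavour of how the four factors combine. The input values below are illustrative assumptions loosely resembling the conditions described for 7 February 2009, not observed station data:

```python
import math

def ffdi(temp_c, rel_humidity, wind_kmh, drought_factor):
    """McArthur Mk 5 Forest Fire Danger Index, Noble et al. (1980)
    approximation. drought_factor is on a 0-10 scale."""
    return 2.0 * math.exp(-0.450
                          + 0.987 * math.log(drought_factor)
                          + 0.0338 * temp_c      # hotter -> higher FFDI
                          - 0.0345 * rel_humidity  # drier air -> higher FFDI
                          + 0.0234 * wind_kmh)   # windier -> higher FFDI

def rating(index):
    """Fire danger rating bands quoted in the table above."""
    if index > 50:
        return "Extreme"
    if index > 25:
        return "Very High"
    if index > 12:
        return "High"
    return "Low-Moderate"

# Assumed conditions: 46 C, 5% relative humidity, 50 km/h winds,
# maximum drought factor of 10.
index = ffdi(46.0, 5.0, 50.0, 10.0)
print(round(index), rating(index))  # about 159, well into "Extreme"
```

Note how the exponential makes the index explode when extreme values of several factors coincide, which is exactly what happened on 7 February.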
To understand the environmental conditions associated with the catastrophic bushfires on 7 February 2009, we need to consider each of the factors and the possible role of climate change in them.
Maximum temperature: This is the easiest factor to consider. Melbourne and much of Victoria had record high maximum temperatures on 7 February (2). Melbourne set a new record maximum of 46.4°C, 0.8°C hotter than the previous all-time record on Black Friday 1939 and 3°C higher than the previous February record set on 8 February 1983 (the day of a dramatic dust storm in Melbourne), based on more than 100 years of observations. But maybe the urban heat island in Melbourne has influenced these new records. That may be true for Melbourne, but many other stations in Victoria set new all-time record maximum temperatures on 7 February, including the high-quality rural site of Laverton, near Melbourne, with a new record maximum temperature of 47.5°C, 2.5°C higher than its previous record in 1983. The extreme heat wave on 7 February came after another record-setting heat wave 10 days earlier, with Melbourne experiencing three days in a row with maximum temperatures higher than 43°C during 28-30 January, unprecedented in 154 years of Melbourne observations. A remarkable image of the surface temperature anomalies associated with this heat wave is available from the NASA Earth Observatory.
Increases of mean temperature and mean maximum temperature in Australia have been attributed to anthropogenic climate change, as reported in the IPCC Fourth Assessment, with a best estimate of the anthropogenic contribution to mean maximum temperature increases of about 0.6°C from 1950 to 1999 (Karoly and Braganza, 2005). A recent analysis of observed and modelled extremes in Australia finds a trend to warming of temperature extremes and a significant increase in the duration of heat waves from 1957 to 1999 (Alexander and Arblaster, 2009). Hence, anthropogenic climate change is likely an important contributing factor in the unprecedented maximum temperatures on 7 February 2009.
Relative humidity: Record low values of relative humidity were set in Melbourne and other sites in Victoria on 7 February, with values as low as 5% in the late afternoon. While very long-term high quality records of humidity are not available for Australia, the very low humidity is likely associated with the unprecedented low rainfall since the start of the year in Melbourne and the protracted heat wave. No specific studies have attributed reduced relative humidity in Australia to anthropogenic climate change, but it is consistent with increased temperatures and reduced rainfall, expected due to climate change in southern Australia.
Wind speed: Extreme fire danger events in south-east Australia are associated with very strong northerly winds bringing hot dry air from central Australia. The weather pattern and northerly winds on 7 February were similar to those on Ash Wednesday and Black Friday, and the very high winds do not appear to be exceptional nor related to climate change.
Drought factor: As mentioned above, Melbourne and much of Victoria had received record low rainfall for the start of the year. Melbourne had 35 days with no measurable rain up to 7 February, the second longest period ever with no rain, and the period up to 8 February, with a total of only 2.2 mm, was the driest start to the year for Melbourne in more than 150 years (2). This was preceded by 12 years of very much below average rainfall over much of south-east Australia, with record low 12-year rainfall over southern Victoria (2). This contributed to extremely low fuel moisture (3-5%) on 7 February 2009. While south-east Australia is expected to have reduced rainfall and more droughts due to anthropogenic climate change, it is difficult to quantify the relative contributions of natural variability and climate change to the low rainfall at the start of 2009.
Although formal attribution studies quantifying the influence of climate change on the increased likelihood of extreme fire danger in south-east Australia have not yet been undertaken, it is very likely that there has been such an influence. Long-term increases in maximum temperature have been attributed to anthropogenic climate change. In addition, reduced rainfall and low relative humidity are expected in southern Australia due to anthropogenic climate change. The FFDI for a number of sites in Victoria on 7 February reached unprecedented levels, ranging from 120 to 190, much higher than the fire weather conditions on Black Friday or Ash Wednesday, and well above the “catastrophic” fire danger rating (1).
Of course, the impacts of anthropogenic climate change on bushfires in southeast Australia or elsewhere in the world are not new or unexpected. In 2007, the IPCC Fourth Assessment Report WGII chapter “Australia and New Zealand” concluded
An increase in fire danger in Australia is likely to be associated with a reduced interval between fires, increased fire intensity, a decrease in fire extinguishments and faster fire spread. In south-east Australia, the frequency of very high and extreme fire danger days is likely to rise 4-25% by 2020 and 15-70% by 2050.
Similarly, observed and expected increases in forest fire activity have been linked to climate change in the western US, in Canada and in Spain (Westerling et al, 2006; Gillett et al, 2004; Pausas, 2004). While it is difficult to separate the influences of climate variability, climate change, and changes in fire management strategies on the observed increases in fire activity, it is clear that climate change is increasing the likelihood of environmental conditions associated with extreme fire danger in south-east Australia and a number of other parts of the world.
References and further reading:
(1) Bushfire Weather in Southeast Australia: Recent Trends and Projected Climate Change Impacts, C. Lucas et al, Consultancy Report prepared for the Climate Institute of Australia by the Bushfire CRC and CSIRO, 2007.
(2) Special Climate Statement from the Australian Bureau of Meteorology “The exceptional January-February 2009 heatwave in south-eastern Australia”
Karoly, D. J., and K. Braganza, 2005: Attribution of recent temperature changes in the Australian region. J. Climate, 18, 457-464.
Alexander, L. V., and J. M. Arblaster, 2009: Assessing trends in observed and modelled climate extremes over Australia in relation to future projections. Int. J. Climatol., available online.
Hennessy, K., et al., 2007: Australia and New Zealand. Climate Change 2007: Impacts, Adaptation and Vulnerability. Contribution of Working Group II to the Fourth Assessment Report of the Intergovernmental Panel on Climate Change, M.L. Parry, et al., Eds., Cambridge University Press, Cambridge, UK, 507-540.
Westerling, A. L., et al., 2006: Warming and Earlier Spring Increase Western U.S. Forest Wildfire Activity. Science, 313, 940.
Gillett, N. P., et al., 2004: Detecting the effect of climate change on Canadian forest fires. Geophys. Res. Lett., 31, L18211, doi:10.1029/2004GL020876.
Pausas, J. G., 2004: Changes In Fire And Climate In The Eastern Iberian Peninsula (Mediterranean Basin). Climatic Change, 63, 337–350.
Ray Ladbury says
Mark, I would disagree that h, c, G, etc. are any less fundamental than pi. As to the term “natural units,” it merely means that the equations simplify if these units are used.
Mark says
Ray, I’m not saying they aren’t fundamental either.
I’m saying that, according to physics today, there is no reason why G takes precisely that value. The value we get for G is entirely measured.
Can you stop telling me that things I haven’t said are false, and acting as if I were unable to grasp the concept of scale sizes or fundamental constants?
Hank Roberts says
Just in case anyone needs it, down this thread a ways you’ll find the calculation for Planck’s constant in the furlong/firkin/fortnight system:
http://www.physicsforums.com/archive/index.php/t-179681.html
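The linked calculation isn’t reproduced in the thread, but the conversion is a one-liner once you fix the unit definitions. The definitions below are assumptions (the firkin in particular varies by convention; it is taken here as 90 lb avoirdupois):

```python
# Planck's constant in furlong/firkin/fortnight (FFF) units.
# Assumed unit definitions:
FURLONG_M = 201.168            # 1 furlong = 220 yards
FIRKIN_KG = 90 * 0.45359237    # 1 firkin taken as 90 lb avoirdupois
FORTNIGHT_S = 14 * 86400       # 14 days, in seconds

H_SI = 6.62607015e-34          # Planck's constant, J*s = kg*m^2/s

# h has dimensions of mass * length^2 / time, so divide by the
# FFF-sized unit of that same dimensionality.
fff_unit = FIRKIN_KG * FURLONG_M**2 / FORTNIGHT_S
h_fff = H_SI / fff_unit
print(h_fff)  # about 4.85e-34 firkin*furlong^2/fortnight
```

Which incidentally illustrates the point argued below: changing units changes the number attached to h, not the physics.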
Timothy Chase says
Mark, I believe the difficulty at this moment was essentialized right here, where you write in 250:
As I have pointed out more than once, c (the speed of light) is not a pure number. It has dimensionality, namely, that of speed, or LT^-1.
So if you attempt to measure c, you must specify the standard of measurement that you are using to measure that speed. Will it be miles per second? Then you are speaking of (roughly) 186,282. Are you speaking of meters per second? Then you are speaking of 299,792,458. Two very different numbers — because they aren’t pure numbers but pure numbers multiplied by a standard of measurement existing along a specific dimension of measurement.
The speed of light isn’t 186,282. The speed of light is 186,282 miles per second. The speed of light isn’t 299,792,458. The speed of light is 299,792,458 meters per second. More or less. But in either case it is a mistake to drop the unit of measurement, for without the unit of measurement the number becomes meaningless as a measure of what it is measuring.
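This unit-dependence is easy to check numerically. The snippet assumes the international statute mile (1609.344 m); the commonly quoted figure is about 186,282 miles per second:

```python
C_M_PER_S = 299_792_458        # speed of light in m/s, exact by definition of the metre
MILE_M = 1609.344              # international statute mile, in metres

# Same physical speed, different number, because the unit changed.
c_miles_per_s = C_M_PER_S / MILE_M
print(round(c_miles_per_s))    # 186282
```

Only the number attached to the unit changes; the physical quantity does not, which is the whole point about dimensional constants.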
*
Now if you see this, can you see why it is in all likelihood impossible for us to ever explain why the speed of light has the value that it has?
We measure the speed of light in terms of a human system of measurement. But the speed of light is something that exists independently of that human system of measurement and is universal. Nothing with rest mass can travel at the speed of light, and nothing without rest mass can travel at any speed other than the speed of light. And as far as we can tell, this has been true since the beginning of our universe. As such it is a truly universal constant. Likewise, Planck’s constant and the gravitational constant are universal. Even pure energy traveling at the speed of light will have an equivalent mass and consequently a gravitational field to which G applies.
But let’s keep things simple and focus on the speed of light.
Now how would you explain the value of the speed of light?
I don’t have a problem with measuring the speed of light in the English system or the Metric system. But you can’t explain the value of the speed of light except by reference to something or some things of the appropriate dimensionality — so that the speed of light could be measured as a ratio when compared against something else. And it would have to be something universal, independent of human history. Planck distance? Planck time? Well, perhaps. But why two different units of different dimensionality when you already have something of the appropriate dimensionality? Namely, the speed of light itself.
However, maybe there is something that has the appropriate dimensionality and is equally universal. Could you explain the speed of light in terms of that? Well, yes. Perhaps the speed of light is twice the speed of… Well, no, there really isn’t anything else, no other speed at least.
So in terms of theory, it would seem that the speed of light represents something truly fundamental, something which can be used in explanations, theories, etc., but cannot itself be explained, not in terms of its numerical value at least. We can measure it using our standards or systems of measurement, but when we do so, that is actually saying more about our standards and systems than it is about the speed of light itself.
*
Now would it make any difference to the world if the speed of light were twice what it currently is?
In all likelihood, it would not. Since the speed of light isn’t simply the speed of a single particle, but is a universal limit which applies to all particles, doubling the speed of light would be equivalent to doubling the speed of all processes in the universe, or alternatively, halving all durations. But as anything that we might measure things in relation to also exists within the universe, it will be similarly affected — and none of the ratios will change. Have you ever wondered what would happen if everything suddenly became twice as large as before? The answer of course is, “Nothing at all.”
As was pointed out in the link that Ray gave us:
Anyway, I hope this helps.
Phil says
Paulo (246), you are right – no fuel without fires. The difficulty is that especially for these conditions, the forest is the fuel. If you want to remove the fuel you have to remove the forest. It’s not valid to compare dry Western Australian forest on flat ground to steep, tall Mountain Ash. There’s just a lot more forest there, and when it dries out due to unprecedented climatic extremes, it’s all available as fuel. These fires ripped through logged, regularly burnt Forestry land. The entire argument that “more burning off would have stopped the fires but greens prevented that happening” is painfully irrelevant and completely misguided.
Paulo says
Phil (255),
Yes, I am aware of the differences between forests in WA and SE (could see them last August). However, understorey fuel structure in the Dandenongs did not seem different from the jarrah forest in WA. Note that the forest is the fuel only when fuel accumulation in the surface, near-surface, elevated and bark fuel layers (to use the oz terminology) exceeds a given threshold. To achieve fuel management on an adequate spatial scale is probably more difficult in steep terrain and moister forest, but this is a different issue …
Mark says
Tim 254. And your essay said what?
Yes, a speed is a velocity and has dimensionality. Do you think I managed to get through a physics with astrophysics degree without learning of this?
And you still have to measure it. There is no reason why it happens to be this figure.
Your later rambling seems to be saying the same thing.
So why the heck did you post all that crud? Why did you include such startling revelations that by changing the units you get simpler equations because your proportionality constants become one ***and mathematical convention*** says you can leave them out if they are multipliers of 1.
Whoo. I never knew.
Just like I never knew the SPEED OF LIGHT was, like, a velocity, man!
Such revelations.
Did you think I didn’t know??? Why?
Now, please, tell me how you could get the velocity of light from first principles?
If not, then as I have said EVERY SINGLE BLEEDING TIME ***you have to measure it***.
Which takes us back to the ORIGINAL POINT that saying “you have to ***measure*** Stefan’s constant to get the radiative transfer rates” is the same bleeding phenomenological result as “you have to ***measure*** Planck’s constant and multiply by some figures you can calculate from first principles to get the radiative transfer rates”.
And which is easier: an experiment that returns Stefan’s constant by measuring intensity values of a black body with a photometer and diffraction system, or finding a way to measure Planck’s constant directly?
You tell me.
Timothy Chase says
Mark, you write in 257:
You don’t get or “derive” the speed of light from first principles — because the speed of light is a first principle. That is what it means for something to be “fundamental.” And when you measure the speed of light, this is actually telling you about how your (largely arbitrary) system of measurement stands in relation to the world, not about the world itself — which simply is what it is. And it makes no sense to speak of how the universe would be different if the speed of light were different from what it is, say twice or half as fast — because the universe that would result would be for all intents and purposes the same.
If you don’t feel like reading as much as I wrote in 254, perhaps you could read just the short paragraph by English cosmologist John Barrow that I quote at the end. However, at this point I am finished.
Mark says
258.
Good.
So you know. I know.
Now did I never say you COULD derive from first principles?
No.
Now why are you telling me you can’t derive it from first principles?
Any reason?
Any at all?
I never said you could. I said you had to measure.
So now it looks like you were going:
Me: you have to measure light speed. You can’t derive from first principles.
Tim: No, Mark, you have it wrong. (You can’t derive light speed from first principles.)
Rod B says
Ray, Stefan’s constant can be derived; even the speed of light can be derived in a sense. In any case the fact that it is log dependent certainly could be based on radiation absorption theory, though it is directly derived only for certain ranges that have been analyzed. Theory does not say exactly where it changes from linear to log to square-root dependency. (In fact different people will say different sequences.) Secondly (though more pertinent to my assertion), the fact that the constant is empirically derived based on strictly lab observations has no credibility (for a “strong” science) just because other physical or chemical properties are empirical. The latter have been derived 10 ways from Sunday over very long periods. More illustrative, that the constant changes significantly over a few years because the observations are different is not in the least characteristic of a “strong” science, let alone “…the most exacting and solid physics available to humanity…” as Timothy Chase hyperbolizes in an otherwise decent post (232). Ray, believe your baby has no blemishes if you wish; perfectly acceptable — but not true.
Hitting logs and exponents once more (though it’s getting tiresome — for me and I’m sure for everybody else), where do the units “bel” in LOG(I/Io) come from? We just made them up — for convenience and clarity; like the watts/meter-squared in today’s 5.35.
(My units comment pales compared to Timothy’s, Mark’s and Ray’s discussion: just trying to keep it short and simple.)
Finally, before it gets totally lost in the shuffle, recall I am not totally refuting this part of the science. There are strong indicators that it might at least be in the ballpark, at least for some concentrations and/or gases. I’m addressing only the so-called “degree of strength”.
FCH, I’m addressing only one part of the science here.
I’d like to explain imply and infer to Hank, but this topic is already way out of control. Everyone is upset… because maybe my karma ran over your dogma???
Phil says
Ah Paulo! (256) A man that knows the Aussie fire science, I think I have you now. I suspect you also saw some Adelaide country when you were last down under.
If you have a copy of the Project Vesta report handy, have a look at the graphs for shrub height (fig 3.8) and ROS (fig. 6.6) against time since fire. They have only fit a curve to the McCorkhill data for shrub height because the DeeVee data has an opposite trend. Whereas McCorkhill shrubs continue to grow up to 15 years or more, Dee Vee shrubs grow up to about 6 years then begin to senesce so that the 20 year shrub height is the same as the 2 year height. Shrub height is the main fuel predictor used for flame height, but the correlation matrix in the appendix shows that it is also the strongest predictor for ROS. Accordingly, the Dee Vee data shows that ROS does not continue to increase with fuel age as it does at the McCorkhill site, but it either does not change or at some wind speeds has a marked decrease after 6 years. Basically, the longer McCorkhill forests go without fire, the greater the flame height and ROS produced – exactly as we would expect. On the other hand, the forest at Dee Vee has a maximum ROS and flame height 6 years after fire. From this point on it becomes less flammable. Both are Jarrah forests, but due to the ecology of the understorey the fuel dynamics are quite opposite. I suggest that both sites require different fuel management.
You mentioned that you thought the “understorey fuel structure in the Dandenongs did not seem different from the jarrah forest in WA”. I suggest that since there can be such variance within the Jarrah forest itself, might not the Mountain Ash be quite different again? Consider these points:
1) Long unburnt Mountain Ash has a rainforest understorey. Burning the forest produces an immediate flush of Bracken ferns which provide a near surface fuel score of 4 in the year after being burnt.
2) The understorey in Ash is typically composed of thicker leaved, high moisture content plants with very little dead material, especially in older growth forests. No matter how dense this understorey, these characteristics mean that it can never be ranked higher than a “moderate” elevated fuel score.
3) The dense canopy and high rainfall of Ash forests mean that surface & NS fuels have a very high moisture content most of the time.
Now, considering these factors, in most years it will be very difficult to get a fire going at all because the fuel is wet. In the heat of summer the low fuels will dry out and allow a fire to spread through the bracken or heavy surface litter, but more often than not it will not ignite the shrubs or midstorey due to the fact that the thick leaves and high moisture of the plants gives them a very long ignition delay time. By contrast, fires in Jarrah forest will spread far more readily due to the drier litter & NS fuels, and the fine leaved shrubs with their high dead component will catch fire quickly.
If we now impose the extreme climatic conditions experienced in the Marysville/Kinglake area, we not only have very dry surface litter & bracken, but the shrubs, midstorey & canopy are very dry and therefore have much shorter ignition delay times. The fuel ladder to the canopy is complete.
Apply this to fuel management. You said “the forest is the fuel only when fuel accumulation in the surface, near-surface, elevated and bark fuel layers (to use the oz terminology) exceeds a given threshold”. That threshold varies widely depending on the conditions. Under most conditions there would not be enough fuel to ignite the higher fuel strata in an Ash forest even at a full surface fuel load of about 30 t/Ha; it’s just too wet. Under the conditions experienced recently however, there needs only to be a very small amount of fuel to produce a flame that will ignite the highly moisture stressed vegetation. The reality is that burning Mountain Ash without killing the trees reduces surface fuels for a very short period while greatly increasing the NS fuels (Bracken) from year 1 onward. It does not affect the midstorey but does produce a (temporarily) shorter and more dense understorey.
Contrasting fire in a recently burnt Mountain Ash forest with fire in a long-unburnt forest: there is a greater flame height/ROS in recently burnt forest under less severe conditions due to the Bracken. The lower shrub layer is closer to the flame and catches fire more readily; being more dense it allows flame spread more easily from plant to plant and increases the ROS yet again. Flames from plants burning in close proximity merge and produce much larger flames, igniting the midstorey and thereby the canopy more easily. In short, recently burnt Ash forest is more flammable than long-unburnt – probably one of the reasons the Aboriginal people never burnt it.
Just to make this clear – I am not suggesting that prescribed fire is a bad idea. I am saying that as illustrated by the differences in the fuel dynamics at Dee Vee & McCorkhill, fire is good management in some environments but not others. I don’t believe that it is good management in Mountain Ash forests (although I’m happy to be corrected by evidence), and I don’t think the current craze in Australia of telling the land managers that they could have saved 200+ lives if they had burnt the Ash more often is justified.
If you’re interested, have a look at http://www.bushfirecrc.com/publications/B_Zylstra.pdf for a rough overview of the effect of plant moisture on forest flammability.
Marco Parigi says
The entire argument that “more burning off would have stopped the fires but greens prevented that happening” is painfully irrelevant and completely misguided.
I do think that that particular sound-bite is overly simplistic, but I think there is some irony and hypocrisy exposed if you look at green policy as a whole.
Two other points:
* Greens support the idea of offsets for planting trees – but if these trees are in a fire prone area the carbon stored, rather than being sequestered will be released in an uncontrolled way in the next bushfire or burnoff.
* Logging of trees for furniture/ building /paper tends to lock away the carbon in houses or landfill more effectively, in a market-friendly way than leaving them where they are, yet both logging and the liberal use of timber are frowned upon by the greens.
Richard Hill says
Prof Karoly refers to the “high quality rural site” of Laverton near Melbourne. Andrew Watkins in comments compares it to Melbourne to discount the UHI effect for Melbourne city.
AFAIK the recording station for Laverton is at the aerodrome. The Laverton aerodrome may have been surrounded by farmland when earlier records were set. Today it is quite different. It has been developed as a training college with several additional buildings on the aerodrome itself. It has medium density housing to the east, an 8-lane freeway to the south. To the north east is an industrial area including a steel mill and a chemical plant. The temperature record was set when a strong North wind was blowing. Directly north of the aerodrome is a sports field but north of that is a large scale quarry and associated concrete products manufacturing. (and a prison). Please comment if any of these facts are wrong.
Jim Eaton says
#262 Marco Parigi Says:
“* Logging of trees for furniture/ building /paper tends to lock away the carbon in houses or landfill more effectively, in a market-friendly way than leaving them where they are, yet both logging and the liberal use of timber are frowned upon by the greens.”
Yes, if the trees that end up as furniture/building/paper are preserved for a very, very long time, this carbon is sequestered. But if these products are allowed to burn or decay, the carbon is released to the atmosphere.
Also, contemporary logging uses fossil fuels to cut, mill, and transport wood products. Slash and other non-commercial parts of the trees are left to rot (and release CO2).
If the trees being logged are part of an old-growth forest, there are further problems. The prime method of logging is clear cutting (or euphemisms for essentially the same thing). The slash is burned (releasing CO2), and the replanted forest is what is called an even-aged stand (often of only one species of tree). These plantations are highly susceptible to fire, which destroys the young forest and releases even more CO2.
Of course, there is the other issue of the loss of biodiversity resulting from the destruction of ancient forests. Old-growth forests tend to be the most fire resistant stands of trees. Logging healthy forests is not the answer.
However, global warming is causing a massive die-off of forests in British Columbia and parts of the Rocky Mountains. Milling some of these trees into products likely to last a very long time, and replanting with trees likely to survive in the changing climate might be a good strategy.
The wizard Captcha says, “argue particular.”
Nathan says
Marco I think you do injustice to Green policies.
Where exactly are they hypocritical or ironic?
Your point 1:
The entire tree doesn’t burn to ash, so the carbon isn’t actually all released. Tree death may occur in very hot fires, but Australian trees are generally adapted to fire and do survive. The quick regrowth by Eucalypts following fires will also start sequestering the carbon again.
Point 2
Green policy is opposing logging in old growth forests. They are in fact promoting the use of plantations and the liberal use of wood. The interesting thing is that most of the initial opposition to logging in old growth wasn’t to stop furniture being made but rather to prevent the wood being used for paper pulp. Most timber is actually cut for woodchips to make paper. It was a very low quality product.
Alan of Oz says
RE #262 “there is some irony and hypocrisy exposed if you look at green policy as a whole.”
The problem with the words “green policy” indeed with the word “greenie” is who exactly are we talking about? The Australian Green Party supports sustainable logging of native species and controlled burning. It supports carbon credits for trees but I doubt they mean pine plantations.
There was a lot of FUD in the news about greenies stopping people from clearing brush, not the least of which was the guy who was fined $100K for “clearing trees” – search for “shehan” here. All this finger pointing misses the point entirely; we here in the SE of Oz have entered a new reality.
Notes: See my post above. I have lived in Vic for 50yrs, I’m an ex-sawmiller, life-long “greenie” and have no connection to the Green Party.
Paulo says
Phil,
Thanks for the detailed explanation. However, I think you were too selective in the VESTA findings you mention. Different fuel layers behave differently with time at the 2 sites, but the overall hazard keeps growing with fuel age at both sites (Fig. 3.12).
I now understand the fire environment in the mountain ash better, but if fire promotes bracken fern, it means that an aerated NS dead layer will be available to burn on dry winter days, which will make prescribed fire easier to carry out. In our Portuguese forests bracken often seems to moderate fire spread in summer (moisture contents >150%, when shrubs have live moistures of 60-90%) and to help prescribed fire in winter when it dries and collapses. What you say about flammability vs. live fuel moisture makes sense, and in Europe many lab studies have been conducted on this since the 70s, but field studies have never, anywhere in the world, succeeded in relating rate of spread to live moisture content. So it might be that you are placing too much emphasis on the role of live-tissue moisture.
Mark says
RodB, 260.
Nope, its form is taken from theory. It is derived from the radiative theory of corpuscular light, the quantum nature of light, and well-grounded statistical mechanics.
Varies as T to the fourth power.
Being mathematics, this is provable as a correct proportionality for any wavelength, from the infinite to the infinitesimal.
It is not restricted to any wavelength range.
Did you do *any* physics at school? They never said there that this proportionality held only over a certain range, so you didn’t get it there. Did you take it any further, so as to know the derivation yourself? If you had, you would know that there is no limit in wavelength to the proportionality.
Stefan’s constant cannot be derived except by virtue of saying it’s proportional to other numbers that cannot themselves be derived (so the derivation merely defers: the value of Stefan’s constant is fixed by your ability to measure rather than your ability to calculate).
Although Ray and Tim seem to disagree with the statement “Stefan’s constant must be measured”, their disagreement isn’t correct.
Mark says
RodB says in 260: FCH, I’m addressing only one part of the science here.
So you’re only PARTLY lying by omission.
Does that make it alright?
Theory DOES say where a proportionality based on infinitesimal changes gives way to a proportionality based on large changes.
SHM is based on that fact: at small angular deflections, the restoring force on a pendulum is linearly proportional to the angular displacement. Theory showing where this is no longer the case is defined by the Taylor Series Expansion of Sin(theta).
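That small-angle point can be sketched numerically (the angles below are just illustrative values): comparing sin(θ) with θ shows how good the linear restoring-force approximation is for small deflections, and how the θ³/6 Taylor term makes it drift at larger ones.

```python
import math

def small_angle_error(theta):
    """Relative error of approximating sin(theta) by theta (theta in radians)."""
    return abs(theta - math.sin(theta)) / math.sin(theta)

# At 5 degrees the linear (SHM) approximation is good to ~0.1%...
print(small_angle_error(math.radians(5)))
# ...at 45 degrees the next Taylor term matters and the error is ~11%.
print(small_angle_error(math.radians(45)))
```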
Theory DOES say where the proportionality fails to hold. It depends on how inaccurate the simplification is allowed to be before the measured system is measurably different from it.
David Horton says
Phil (261) – good summary of the fallacy of prescribed burning ash forests to reduce fire. But the same arguments, though different in detail, apply to prescribed burning in all forest types. That is, the effect of frequent burning perversely (because of changes in soil moisture content and understorey shrubs – often including weed invasion) may result in more fire prone habitats. And in addition, and it needs to keep being said, frequent burning is damaging in the short term to all the components of the ecosystem except perhaps the trees and the large macropods (the two components always trumpeted by fire managers and the media), and in the long term the trees will suffer too. Fire managers are not ecologists, and that fact is going to be the cause of great damage to Australian ecosystems if the fire managers get their way with the help of populist politicians.
Barton Paul Levenson says
Marco, I think what’s frowned on by the greens is clear-cutting huge swaths of forest, no matter how much carbon it sequesters.
Penguin unearthed says
Great summary. Just wanted to say (as others have) that a fairly small (say 1 degree) increase in average temperatures has a disproportionately large impact on the probability of high temperatures – doubling or tripling the probability of greater than 40 degree temperatures with low humidity (according to the Centre for Climate Change research here in Australia).
Also, the flash point of eucalyptus oil is 49 degrees Celsius – so getting close to that matters a lot in the context of whether a fire starts racing through the countryside.
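The disproportionate effect of a small mean shift on extreme-temperature probability can be sketched with a toy Gaussian model. The distribution and its parameters here are illustrative assumptions, not actual Australian climate statistics:

```python
import math

def prob_above(threshold, mean, sd):
    """P(X > threshold) for a normal distribution N(mean, sd^2)."""
    z = (threshold - mean) / sd
    return 0.5 * math.erfc(z / math.sqrt(2))

# Toy numbers: suppose summer daily maxima ~ N(mean, 3^2) degrees C.
p_before = prob_above(40, mean=35, sd=3)
p_after = prob_above(40, mean=36, sd=3)  # after 1 degree of mean warming
print(p_after / p_before)  # the >40 C tail probability roughly doubles
```

The exact multiplier depends on the assumed spread, but the qualitative point holds for any thin-tailed distribution: a small shift of the mean multiplies the far-tail probability.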
Chris O’Neill says
Looks like someone’s main interest in posting here is an anti-greens campaign.
Phil Zylstra says
Marco (262), I have to agree that my sentence taken out of context sounds a little overly enthusiastic, so let’s keep the context eh? I wrote that specifically about Black Saturday, and have backed it up in previous posts. I say that it is irrelevant because the land burnt was predominantly forestry and not conservation land, and I say that it is misguided because if you do the science (see 233 & 261), more burning off could not have stopped those fires. I don’t say burning off doesn’t assist the control of any fires anywhere, just that it wouldn’t have saved the 210 lives on Feb 7. You’re welcome to disagree, but if you do I ask that you go beyond words like “hypocracy” and just show where my science is wrong.
As for the other points, I’m not actually talking about “greens” as a political party but in the broad sense of people that care about the environment. But some very brief thoughts –
1) I don’t agree with planting out trees for carbon or timber in a way that will connect fire paths with villages etc. Burning trees however does not increase CO2 long-term because the regrowth takes it back up again. The amount stored is about the total biomass of forest, which is larger if more area is planted out (for sequestration or for forestry). Fires produce only temporary fluctuations in the balance.
2) I agree that using timber products stores carbon, however the issue is the same. If forests are being cleared faster than they are being planted or regrowing, then there is a loss of carbon from the forests. The amount stored in buildings/furniture etc needs to equal the biomass lost to break even – does it? I don’t know any figures so I don’t argue either way, however I am aware of a lot of places that used to be forest in my area but were logged and not replanted or regenerated. These may be the exceptions, but if they’re not then I’d say the Greens are right. Check the figures and satisfy your own conscience, I honestly don’t know.
Ray Ladbury says
Rod B., at this point you are merely embarrassing yourself. Yes, the S-B constant can be derived. That wasn’t a good example. The speed of light, c, cannot. Nor can h, G, the electromagnetic coupling, and on and on. There is a reason why physics is an empirical science. Hell, Rod, even aspects of mathematics are empirical.
The contribution of an increase in GHG becomes logarithmic when its main effect is to increase absorption in the tails of the distribution. The only ambiguity is where you consider the tails to begin. With CO2, there is an additional ambiguity – the fact that CO2 stays well mixed at high altitudes where the main line absorption is not saturated. This is all well understood by scientists, if not by you. There is zero basis for skepticism of the consensus position based on the radiative physics. Moreover, if the forcing due to CO2 were wrong, you couldn’t explain why Earth is not a snowball. There is only one argument that makes any sense from a physical standpoint – that there is some negative feedback that counters CO2 forcing at temperatures above ~285 K. Unfortunately, there is no mechanism for such a negative feedback, no evidence favoring it, and evidence from paleoclimate against it.
Rod, why should we take your skepticism seriously when it is so clear that you don’t understand the science?
William says
#130 Mike.
First of all, thanks for your response to my comment. It did help narrow my thinking as to what exactly these conversations are really about. It was pointed out to me that NASA’s glossary has its own definition of theory: (http://earthobservatory.nasa.g…../?mode=all )
“An explanation for some phenomenon that is based on observation, experimentation, and reasoning”.
Another blogger also pointed out to me the following line of reasoning: “If we can’t call AGW a theory, can we call it “an explanation for detectable enhanced warming of the earth’s surface as a result of ghg’s added by man? We could name things like “the theory apples suspended in air fall toward the earth as a result of the force of gravity”. But no one is going to because they aren’t important enough to foster much discussion. The proposed “explanation for detectable enhanced warming of the earth’s surface as a result of ghg’s added by man” is discussed. So, it’s been given a name: AGW.”
I agree that there is no dispute in the scientific community that adding CO2 to the atmosphere will increase temperature. What is disputed is how much temperature will increase given differing levels of CO2, and whether we really understand the other processes, such as clouds and water vapor, well enough to know their positive or negative contribution to the puzzle. In addition, it’s arguable how much of the detectable enhanced warming has occurred due to other effects mankind has had on the environment, such as changes in land use over 50% of the surface of the planet, or even the placement of the devices used to measure temperature. I submit that the debate is a bit less settled than some would like to think.
Thanks again for your reply!
William
Hank Roberts says
Marco, did you look for citations for either of the ‘points’ you state there? They’re common enough beliefs. But not truisms:
Did you look at the pictures from Australia? They look much like those from So. California — many seem to show trees spared, amidst brush and buildings burned. Survival of the trees takes a few years to determine, and if enough fuel builds up unmanaged, yes, trees burn. With frequent gentle fires, well timed, burning in cool conditions, downhill into the wind, the fast fuel burns and spares the trees in our area. Prescribed burns do that intentionally. I’ve got a little hobby restoration that’s worked out that way for two lightning fires now, each fire improved the site by removing fast fuel while sparing the trees and native plants we’re trying to encourage. Do you have a source for your belief or observation that differs?
And I know many people have studied what happens to wood used for furniture — on average it’s turned to waste sooner as furniture. Those cites are easy enough to find. Here for example:
“A large fraction of municipal waste is wood, e.g., old furniture …”
http://www.cbmjournal.com/content/3/1/1 I know you can find a good bit more studying how long carbon is tied up in living forests compared to furniture and structure. Time and termites work faster on human-modified wood.
Mark says
RodB, 260 I think I have found out where you get your idea that the black body theory has limited applicability in wavelength.
Gerhard Gerlich’s self-published “paper” uses the same phraseology in its opening statements (and then uses the black body curves without saying whether they are affected by this supposed inapplicability in any of his use of it).
Did you get it from there?
SecularAnimist says
Mark wrote: “Now, please, tell me how you could get the velocity of light from first principles?”
Timothy Chase replied: “You don’t get or ‘derive’ the speed of light from first principles — because the speed of light is a first principle.”
This conversation is off any topic I can think of, but it seems to me that the two of you are talking past each other.
What I understand Mark to be saying is that the value of pi follows logically from the abstract definition of “circle” and “diameter” and “circumference”. There is no need to empirically measure a bunch of actual circles to determine the value of pi. Which is fortunate, since no such thing as a perfect Euclidean circle exists in reality.
In contrast, the value of the speed of light, while it may be a fundamental parameter of physical reality, cannot be determined logically from the abstract definition of “the universe” or “the phenomenal world”. Which is fortunate, since no such abstract definition exists. There is no “Euclidean”, purely abstract “reality” from which we can derive basic physical parameters and laws through pure logic. There is only “whatever we observe”. The speed of light can only be determined by empirical observation.
As for Timothy Chase’s points that the speed of light can be expressed in different units of measurement, or that a universe where the speed of light and all other basic physical parameters were correspondingly different would be indistinguishable from this one, I must admit I fail to see the relevance to Mark’s fairly simple point.
The only relevance that any of this has to climate science, as far as I can see, is perhaps to point out the fallacious “reasoning” of those who reject the empirically observed reality of anthropogenic warming because it conflicts with some abstract definition of “the universe” that they hold. An example would be those who reject AGW because “human activities are too insignificant to cause such huge changes” or some other a priori concept about “how things are”.
Mark says
I don’t want to give too many hits to the site since it seems to like people who claim “90% of politicians are corrupt is a fact” and that the IPCC are therefore corrupt since politicians selected them, but have a look at Jim’s postings.
http://forum.skyatnightmagazine.com/tm.asp?m=86771&mpage=8&key=
#158
#273
#281
and Spartacus’ unsourced “650 out of 700 experts consider AGW to be wrong” in #211.
Laughable, if it weren’t for the people lapping this up.
Mark says
279.
So when I said “You can’t derive Planck’s constant and the speed of light and G and so on”, what did Tim mean when he said “you’re wrong mark”?
That I was wrong to say you can’t derive them because you can’t derive them???
And why do you think that I think you CAN derive them? Read the entire message because later on it says “you can’t”.
Now why does that make you think that I think you CAN derive the speed of light?
Are you running my posts through babelfish????
Mark says
PS to reply to 279.
And the reason why I started this was to point out that it was irrelevant to say that you can derive the Stefan-Boltzmann constant, because it depends on other values. Go riiiight back and read the post I originally responded to. That’s what the blokey said.
And then, because I said you can’t derive the bits you have to put into the S-B constant, it is irrelevant and misleading to say you can derive the S-B constant.
And then Tim pops along and says “you’re wrong mark”.
So I asked how you derived planck’s constant.
Then everyone started saying stuff I knew and telling me I was wrong and then saying what I said (that they had said I was wrong about).
Get that knee out of your eye and read.
SecularAnimist says
Mark @ 281: I understand you to be saying that we can determine the value of pi using pure logic from the abstract definitions of “circle”, “diameter” and “circumference”, whereas we cannot determine the value of c using pure logic, but only through empirical measurement.
Kevin McKinney says
The debate on quantities derived (or not): like Hank’s teleology, “it burns!”
Or at least glazes a lot of eyes.
Phil Zylstra says
Paulo (267) I think we will have to talk in more detail about LFMC & ROS where we can spend some time looking at spreadsheets & equations. If you’re coming to the AFAC/CRC conference this spring I’ll look you up – if all goes well I’ll be presenting a paper there on the subject including a model which does show the role quite explicitly. Failing that there’s a slim chance I’ll be in Spain this spring and may be able to catch you then. I would value your thoughts.
As a very brief answer for the meantime, LFMC & ROS haven’t been correlated yet because it’s a complex system rather than the straightforward linear one we try to model it as. If we conduct experimental burns under a range of conditions and use LFMC of shrubs as a variable, we will almost certainly see no connection, because it is masked by a series of bifurcations. It is, for instance, only one factor determining whether the shrubs will catch fire (the surface flame needs to be tall enough, for instance, yet an increase in wind speed can tilt it), but if the shrubs don’t burn then it doesn’t matter what their moisture is; they don’t affect the fire. On the other hand, one species may have an LFMC of 150% and another of 90%, but both could burn the same way. That’s because LFMC is one aspect of many in the way a plant catches fire (check the reference in my last post to you). Again, even if the shrubs are burning, they will only affect ROS if they are close enough and wind speed/slope/ignition delay time allows the flame to spread from one to the next. Averaged across all species, there’s no trend. Within one species it’s critical.
As for OFH – yes, there is an increase over time at Dee Vee; but this only raises questions for me about the validity of the scoring system (with all due respect of course). If the score is increasing with time, why are the flames getting smaller and spreading slower? As far as I can see this is for 4 reasons:
1) Surface fuels continue to increase for some time after 6 years pushing up the score, even though they have almost no correlation with the fire behaviour
2) Shrub height decreases, reducing flame height and ROS (according to the correlation matrix), but it is not included as a variable in the score
3) Shrubs senesce, producing more dead material which raises the score. But perhaps due to the low height of the shrubs the fresh ones were catching fire without the aid of dead material, so the amount of dead material in the elevated fuels is not relevant to Dee Vee
4) This one’s conjecture as the report doesn’t present stats on it, but I’m prepared to take a punt that shrub height was a stronger predictor of flame height & ROS at Dee Vee than it was at McCorkhill. If it was, I’d say that the NS fuel score would be less relevant there, but as NS fuels continued to increase they also added to the total score.
Overall, the score continued to increase at Dee Vee because it was largely based on factors that weren’t affecting the fire behaviour there. It will be interesting to chat to Jim and see if we can get some stats on point 4 out of him.
Hank Roberts says
William, you wrote:
“… What is disputed is … do we really understand all of the other processes …”
Can you point to your source? Did you read someone who claims there’s some dispute about whether we really understand all of the other processes?
I am guessing you read on some blog someone claiming the scientists don’t really understand everything about all of the possible processes.
That’s true. But who is disputing it? Did you find someone who claims it _is_ possible to understand everything about all of the possible processes, so claims the scientists aren’t doing it right?
This may be part of the problem. Try this:
http://www.google.com/search?q=ipcc+uncertainties
Sorting out the good info from the bad is difficult.
A basic course in statistics is, well, basic to this.
Mark says
Correct, SecularAnimist.
Mark says
William, what other processes are not understood? And are they understood well enough to consider their effect negligible?
After all, the precise definition of the vector “down” in a gravitational field is the resultant of the vector addition of all the other masses in the universe, each of which contributes something to the gravitational pull I feel and gives me my sense of “down”.
So, what is the effect of Barnard’s star (one of the closest stars we have) on the location of “down” for me?
Now, this figure is probably less than the change that occurs when my blood pumps about my body, microscopically shifting the local distribution of mass very close to my centre of gravity. So does it really matter that I don’t know precisely how much gravitational force Barnard’s star exerts on me here on Earth? It may not be of any great import.
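For what it’s worth, the back-of-envelope arithmetic bears this out. The mass and distance below are rough published figures for Barnard’s star, so treat the result as an order-of-magnitude estimate only:

```python
G = 6.674e-11         # gravitational constant, m^3 kg^-1 s^-2
M_barnard = 2.9e29    # ~0.14 solar masses, kg (approximate)
r = 5.96 * 9.461e15   # ~5.96 light-years converted to metres

# Newtonian acceleration toward Barnard's star felt here on Earth
g_barnard = G * M_barnard / r**2
print(g_barnard)      # on the order of 1e-15 m/s^2: utterly negligible
```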
And if you have anything that is completely unknown, then it is as likely to make things worse as better. And there are potentially an infinite number of them. Since there is no reason to suspect that there is any bias to increasing or decreasing the global warming effect on earth, they add up equally.
So half of infinity push it to a lower change of a small value. And half of infinity push it to a higher change of a small value.
But half of infinity is still infinity.
So the total difference is infinity plus negative infinity.
I mean, if they are COMPLETELY unknown, this is the best guess, isn’t it.
Timothy Chase says
Mark wrote in 282:
Mark, if you prove a theorem from a set of axioms (e.g., euclidean geometry), is it a mistake to say that you derived the theorem given the fact that the axioms themselves cannot be proven or “derived”?
The fact is that it is generally regarded as derivable.
Please see for example:
… even though the constants which it is derived from are themselves not derivable.
In fact, if one were to require that everything from which a derivable quantity is derived be itself derivable, one would be faced with an infinite regress (or circularity).
Anyway, a large part of what you sought to argue for was simply that we can’t currently explain the values of dimensionful (as opposed to dimensionless) universal constants — which I agree with — but I sought to take things a step further and say that with regard to the fundamental universal constants, we will never be able to explain their values (otherwise they wouldn’t be fundamental) — and that a world in which the fundamental dimensionful universal constants were different from what they are would be observationally indistinguishable from our own.
In any case, I don’t think the differences between us are really that great — although you could be a tad more polite in presence of honest disagreement.
Captcha Fortune Cookie:
deduce water
… seems oddly appropriate
Rod B says
Ray, you counter my assertion of “less than a real strong science” with two “ambiguities”, one of which is exactly the same as one of my points. Secondly, there is empirical and there is empirical. The speed of light has a very precise and unchanging value (in a vacuum); that empiricism is physicists just trying to measure the exact value ever more closely. The power of the concentration ratio (or the coefficient of the log, if you will) is not the same; it’s more a bunch of physicists looking at some lab results and saying, “let’s call it 5 and 1/2 and see if anyone salutes.” In truth it is not near that flippant; it is serious stuff and they try their best to be accurate with what they see. But that coefficient does not have an exact natural value like c, and it most likely changes with concentration ratios – as it heads for that “ambiguous” transition to linear, for example.
“Snowball Earth” has no relevance to my contention. All that implies is that CO2, et al, absorbs some radiative energy and indirectly can trap heat and can make things warmer. I have never disagreed with that — and this could probably be called strong science as loose as it is. But exactly how much heat is trapped under what exact conditions and to what exact degree, absolutely and marginally, is… not… a real strong science: the best you can do is make learned estimates.
Rod B says
Mark (278), I don’t follow your question. Blackbody radiation is not directly related to what I am asserting.
Timothy Chase says
Correction to 290…
I should have included the word “only” in the following sentence:
dhogaza says
Learned estimates which you claim are far too high, in comparison with your unlearned denialist stubbornness.
If these learned estimates aren’t good enough, your insistence that they are wrong – without evidence, with nothing other than your personal assertion that it must be true – is surely worthless.
Despite your continued disingenuous efforts to tell us that you have an open mind, want to learn, blah blah blah, you are nothing but an ideologically-driven denialist.
And evolutionary biology is trucking along just fine without adopting your rejection of evolution in favor of intelligent design, too.
Phil Zylstra says
David Horton (270), thanks for your thoughts. I have to say that I don’t subscribe to either viewpoint on this; I don’t think we’ve done the sums yet. Your theory is quite valid and needs to be examined properly rather than dismissed as it has been so often; but until that is done it’s still a theory. We have to say that we know very little about the effects of prescribed fire as a whole.
As far as the management of surface litter goes, simple maths says that the value of prescribed burning is limited to only the first few years after fire and that the steeper the country, the shorter this period of advantage. However as you say, the bigger picture is the fire ecology of a site. There are quite a few more forests than Mountain Ash that I am aware of which respond to frequent fire by producing a more dense understorey (not all though – that’s a bit of a generalisation but check my comment at http://realdirt.com.au/2009/02/24/while-the-fires-continue-to-burn/). Does this automatically mean that the bush is more of a fire threat if it has more scrub? Not necessarily. A denser understorey will make a more dangerous scenario if it catches fire, but it may also prevent grasses from growing or maintain a high moisture content in surface litter – both factors that make it harder for fires to get going at all. Where is the tipping point? That’s a question of complex modelling and I think it wise to hold our opinions until the modelling is done – now is too soon to say burning off works or burning off makes things worse. What we should be doing is identifying specific examples where we can say from good, consistent observational evidence that prescribed fire has had a positive, a neutral or a negative effect.
I think there is currently a very unscientific atmosphere around this subject, where someone will say “prescribed burning works, here is evidence”, or “prescribed fire does not work, here’s proof”. This violates the first principles of good science. If you want to prove that prescribed fire always works, you should be trying your hardest to find examples where you might expect it not to; that way you can define the limits of your statement. In the same way, if you want to state that it never works, then it only takes one example of a success to prove you wrong. We need to start getting specific on this and saying “prescribed fire provides this much advantage for this long in this forest type in this terrain.” Because the argument has been so polarised, neither side has examined it adequately and we’re still arguing over something no one has an answer for. When it comes to a tool like prescribed fire, which is presented to so many as the answer to Australia’s woes, I think that’s not acceptable.
Mark says
RodB says
“I don’t follow your question. Blackbody radiation is not directly related to what I am asserting.”
But then what does this mean:
“based on radiation absorption theory though it is directly derived only for certain ranges that have been analyzed. ”
?
What radiation theory for thermal sources does not rely on the theories and equations that define and derive the black body curve? Which, by the way, is how you can derive the S-B equation in which the S-B constant is used: by integrating the black body curve.
Or did you know that people had derived the SB constant but didn’t find out how???
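The integration Mark describes can be checked numerically: summing the Planck curve over all frequencies recovers the Stefan-Boltzmann flux σT⁴. This is a sketch using a plain midpoint Riemann sum; the grid and frequency cutoff are arbitrary choices that happen to be fine enough at 300 K.

```python
import math

h = 6.62607015e-34   # Planck constant, J s
c = 2.99792458e8     # speed of light, m/s
k = 1.380649e-23     # Boltzmann constant, J/K

def planck_nu(nu, T):
    """Planck spectral radiance B_nu(T), W m^-2 sr^-1 Hz^-1."""
    return (2 * h * nu**3 / c**2) / math.expm1(h * nu / (k * T))

def total_flux(T, nu_max=5e14, n=200_000):
    """pi * integral of B_nu over frequency (midpoint sum, avoids nu=0)."""
    dnu = nu_max / n
    return math.pi * sum(planck_nu((i + 0.5) * dnu, T) for i in range(n)) * dnu

T = 300.0
sigma = 2 * math.pi**5 * k**4 / (15 * h**3 * c**2)  # closed form, ~5.67e-8
print(total_flux(T) / (sigma * T**4))  # ~1.0: the integral matches sigma*T^4
```

Note that the closed form for σ depends on h, c and k, which is exactly the point being argued: the constant is “derivable”, but only in terms of constants that are themselves measured.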
Mark says
Tim Chase, #290, stop talking about deriving fundamental constants. It can’t be done. They are fundamental to physics expressed in this universe and cannot be deduced from mathematics or logic like, for example, pi can be. Or the total internal angles of a triangle.
Fundamental constants cannot be derived.
And, unless they are dimensionless constants, the dimensional units used change the values. This may be what is confusing you and making you think they aren’t constants (so cannot be fundamental constants). Take the speed of light. The value in feet per second is higher than it is in meters per second because the foot is shorter and more of them can be covered in the same unit time.
You can even define your unit system so that these constants are unitary themselves! Mathematics then allows you to miss out proportionality constants that are unitary.
However, this is no different from using speed of light as a non-unitary number or an algebraic constant like, for example, “c”.
Mark says
Now how did that feel? Being told what you were already told.
Nice?
Think what it would be like after about a dozen times.
Now on to the new stuff Tim.
What if string theory comes out with the speed of light being dependent on the size of the extra dimensions?
This is no different from Newton’s First Law of motion being derivable from Schrödinger’s equation in quantum mechanics, if you take the Newtonian equation as the average translation of the mass-energy in the force-energy field.
This then makes something ELSE a fundamental constant and, as I’ve said that doesn’t really change the status of a fundamental constant unless and until you find something that solves the derivation of more than one element.
NOTE: an example of how a fundamental constant can be found to fall out of a better understanding of the dimensionality of the problem is the force of gravity itself.
Until then, the exponent of 2 in the power law of gravity with respect to distance was really just a fundamental constant. But this value of 2 could have been, and has been, derived as a consequence of there being an exchange particle (the graviton) operating along three independent directional axes. Until the force exchange particle was used as a model, there was no reason for it to be precisely 2.
And if MOND is correct, this could be as a consequence of multiple dimensions of spacetime with all but four of those dimensions wrapped up VERY SMALL.
Marco Parigi says
Thanks for all the replies on my last comment. I would like to individually reply but I will try to keep the thread simple and make a few follow up points.
a) The “carbon accounting” of forestry/trees doesn’t factor in the “black carbon” given off from bush fires, nor the loss of soil carbon when ground cover burns off. I suspect that the bigger the fire, the more significant this carbon release. I would suspect that in the long term, fires would prevent any Victorian wilderness from having a meaningful long term absorption of carbon without human intervention to more purposefully sequester the carbon where it won’t become fuel.
b) Most policy arguments on “Green” issues seem to use rules of thumb such as trees=good, fire=bad, wilderness=good, farms=bad, pulp mills=bad, landfill=bad – whereas the science is more nuanced, if also a work in progress as far as specific policy goes.
To take pulp mills as an example: when a tree is taken for a pulp mill, what grows in its place absorbs carbon more quickly than the original tree did, and the paper fairly quickly ends up in a landfill, where it is usually buried and unlikely to release its carbon for far longer than bark or leaf litter would. This should really be factored into the carbon accounting. The same with furniture in a landfill. “Green waste recycling” actually encourages the carbon to stay near the surface, where it is likely to make its way back into the atmosphere, sometimes even as CH4, and other green-waste recycling releases nitrous oxide.
I know that I don’t have citations on these things being studied, but I find it hard to believe that scientific conclusions would contradict these assertions.
Ray Ladbury says
Rod B., “Learned estimates”–also known as theories, are what science is about. In any case, it is pointless to have a conversation with someone who doesn’t know the difference between a power and a coefficient. If you don’t want to learn the physics, fine, stay ignorant. However, you shouldn’t expect your opinions to carry any weight.
The ambiguities to which I refer have to do with the line shape – again an empirically measured entity. They are only ambiguous when talking about a general absorption line. It seems that where you have difficulties is with anything having to do with statistical mechanics. I would therefore recommend that you go back and rectify this shortcoming in your education. Look at the derivation of the Maxwell-Boltzmann distribution – it’s no less ambiguous than the logarithmic dependence of CO2 forcing.