“Tell us how the US Forest Service, and the National Park Service should clear the undergrowth in forests that are Congressionally Designated Wilderness where use of mechanized tools is prohibited by law? It isn’t possible: if you try it, lawsuits will be filed, the state and federal air quality folks will fine you if you burn it, etc. Now you know the rest of the story. ;)”
Most wildfires, including forest fires, are human caused.
Wilderness areas, on average, get a lot less human activity than the rest of our National Forests, and are less likely to have human-caused fires within them. No roads means no dragging trailer chains or locking brakes or vehicle fires. Most backpackers nowadays use portable stoves and don’t build fires, and campfires are banned during fire season, a ban enforced by wilderness rangers.
So most large wildfires don’t begin in designated wilderness areas, rather fires outside of wilderness burn into them.
Mechanized treatments, from thinning to dead timber removal to plain old logging, are themselves a source of human-caused fires. And forests managed for timber through clearcutting burn more readily than mature forests with closed canopies, because the latter shade the undergrowth and soil, which aids moisture retention. Clearcuts after replanting take decades to close the canopy; they lose moisture readily, are shrubby, and burn quickly and easily.
Here in California, the Dolan Fire in Big Sur, which has burned a fair chunk of the Ventana Wilderness, was human caused (arson, and, no, not by antifa). Within the wilderness and bordering forest land it’s been largely an understory burn, which is not a bad thing. Most of the fire fighting efforts were, as usual, centered around saving a couple of towns along highway 1.
You want to stop the frequency of catastrophic fires in a world where they are more likely to blow up rapidly due to climate change?
Get rid of the human-caused fires. That would, for now, more than mitigate against the increased odds due to climate change.
Trying to make designated wilderness a bogeyman is just ideological crap.
barrysays
Having participated in the study promoted here about blogosphere perceptions of climate change risk, I received a copy of the study.
My grade D takeaway from this and a decade’s worth of slugging it out in the blogosphere is that it’s hard to predict who will be a climate change ‘denier’ or ‘activist’.
UAH TLT has posted for September 2020 with an anomaly of +0.57ºC, the second highest anomaly of the year so far. The UAH TLT monthly anomalies for 2020 Jan-Sept sit in the range +0.38ºC to +0.75ºC.
Sept 2020 is the second warmest Sept on the UAH TLT record behind 2019 (+0.61ºC) and sitting ahead of Sept 2017 (+0.56ºC), 2016 (+0.47ºC), 1998 (+0.44ºC), 2010 (+0.36ºC) & 2009 (+0.26ºC).
Sept 2020 has the =10th highest anomaly in the all-month UAH TLT record.
The 2020 year-to-date average anomaly sits 3rd in the ranking tabled below. To gain warmest year accolade from 2016 by year’s end would require the Oct-Dec average to top +0.58ºC, to climb to 2nd above 1998* would require the Oct-Dec average to top +0.41ºC, and to slip to 3rd below 2019 would require Oct-Dec to average less than a rather chilly +0.23ºC. [*1998 still appears near the top of TLT rankings because TLT temperatures are much boosted by El Niño, far more than the surface temperatures. Even in the RSS TLT record which is less trend-defying than UAH, the El Niño-boosted 1998 still sits as 6th warmest year.]
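For anyone wanting to check figures like these, the “Oct-Dec must top X” thresholds follow from simple month-count weighting: the annual mean is (9 × Jan-Sep average + 3 × Oct-Dec average) / 12, rearranged for the Oct-Dec average. A quick sketch in Python; the sample inputs below are hypothetical, since the comment does not state the exact Jan-Sep average or the 2016 annual mean:

```python
# Oct-Dec average needed for the year to finish at a target annual mean,
# given the Jan-Sep (year-to-date) average. Simple month-count weighting:
# annual = (9 * jan_sep + 3 * oct_dec) / 12, solved for oct_dec.
def required_oct_dec(target_annual, ytd_jan_sep):
    return (12 * target_annual - 9 * ytd_jan_sep) / 3

# Hypothetical illustration, not the actual UAH values:
print(round(required_oct_dec(0.53, 0.515), 3))  # 0.575
```

Plugging in the published year-to-date mean and a target annual mean reproduces thresholds of the kind quoted above.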
Killian @1 Pretty good news article. My one significant reservation was with the final paragraph. It suggests that avoiding a Pliocene CO2 world “will require big steps now to decrease fossil fuel use and turn down Earth’s thermostat.” A severe understatement, in my opinion.
It seems to me that since we’re already at a Pliocene CO2 level, and busily adding more, we aren’t going to get out of this by merely decreasing fossil fuel use. Granted, if we stopped burning fossil fuel tomorrow, some of the atmospheric CO2 would go into long-term ocean storage. We could pull out maybe another 40 or 50 ppm by recarbonizing the soil reserves that we’ve drawn down, and through massive reforestation. However, that would be the work of generations.
Before that happens, we’re looking at who-knows-how-many more ppm in the atmosphere from fossil fuel burning. Then, we get to massive feedbacks to the warming resulting from permafrost thaw, peat oxidation, methane releases, and whatever other carbon reservoirs that have been stashed away and building up for over 3M years of ice age climates.
One thing that is really clear is that the climate system is permanently out of equilibrium on the time scale of our civilization, so we’ll be dealing with one unanticipated consequence after another. I’m not saying that we shouldn’t be doing anything (although what we should be doing belongs in Solutions.) I’m saying that we shouldn’t pretend that we’ll be dealing with anything like a return to equilibrium until after the ice sheets finish however much melting we commit them to undergo, and the feedbacks resulting from that melting.
InsideClimateNews:
“The intensified layering, called ocean stratification, is happening ______ than scientists expected, an international team of researchers reported in the study, published Sept. 28 in the journal Nature Climate Change. And that means the negative impacts will arrive _____ and also be ____ than expected, said Mann, a co-author of the study.”
So, assuming a competent and unbiased scientific establishment, the odds that each blanked-out word is “slower”, “later”, or “less” should be 50% apiece (12.5% for all three together).
Who here will bet, even when given 10:1 odds, that those words actually complete the quote?
Ken Fabiansays
dhogaza #2 My observation of forested areas in (fire prone) Australia is they predominantly encompass rugged and difficult terrain – they missed being cleared for grazing or cultivation largely for that reason, i.e. by default.
The costs and difficulties of mechanical undergrowth “management” at such scales are staggering – even were such methods found to be compatible with a viable forest ecosystem or even enduring exploitation of timber. Best management method? That hasn’t been demonstrated. Best method, made illegal – presumably by unreasonable and naive extremism? That has not been demonstrated either, but it presses the “Right” buttons. Reality is, as you say, such methods can increase fire risks; tracked bulldozers are known to start fires in dry conditions just by rolling over rocks that make sparks.
We got similar arguments from deniers of climate change in Australia – that “green” regulation prevented reasonable fire hazard reduction activities and that underlying extreme conditions (aka global warming) therefore didn’t have any influence. But it was BS; bushfire authorities, correctly in my view, deemed conditions unsafe. We had less than a few weeks total of “safe” conditions across all of the 2019 winter (when out-of-control fires were started by lightning strikes near here) – and big forests around here, in global-warming-enhanced dry conditions, need at least 6 weeks of “safe” conditions post-burning before the risk of fire breaking out is low.
Management of fire risks at the interfaces of forest and farms/towns is complicated. Development has been encouraged in dangerous places but managing fire risks is often outside the everyday experience of those who move in and there is a natural reluctance to “burn off” – where things are never under perfect control and legal liability for any unplanned spread applies.
I suspect also that in the past cool weather fires in Australia were easier to manage – because it was cooler! Dew forming overnight was a (reasonably) reliable natural fire retardant; farmers would light in the evening and go home, confident that fires would self extinguish. It really needs professional management now.
Western Hikersays
[“We’d go out to a big fire or clear cut,” North said, of his college summers working as a tree planter, “and every ten to twelve feet we’d plant another pine tree. At the end it would look just like a corn crop.”
They called it “pines in lines.”
“Pine plantations are notoriously, incredibly flammable until they’re 60 to 80 years old,” North said. Till they’re big enough for their crowns to be out of reach of flames, “if a fire goes through a 12×12 pine plantation it just gets vaporized. The trees go up like matchsticks.”]
Agree with your comment – I did not mean to denigrate wilderness – my point is that California has a lot of wilderness (35% of federal lands in Cal are wilderness), and that fire prevention and fire fighting are not practical in wilderness due to the no-mechanized-tools rules. Yes, I am aware that most of the fires in Cal are human caused, some being arson. Similar here in the PNW. FYI, I have hiked to the summit of Mt. Manuel (sp?), in the Ventana – great views, but can be a hot climb. Wilderness stats by state:
In other CC news, as we enter into the new ice age, we’ve learned recently to do COVID contact tracing. Now, to fight AGW, we will be doing climate tracing to locate sources of CO2, cow farts, etc. You can’t make this stuff up:
Looking again at the Arctic sea ice, 2020 continues post-minimum to sit right between the record year of 2012 and the third lowest 2019 at a current 4.16 million km2. Presumably, we are as a result seeing more energy than most years flowing from ocean to atmosphere. We can look forward to an accelerated rate of ice regrowth as part of the dynamic, too.
The NSIDC should be out soon with their September analysis, which will rule definitively on their assessment of the minimum, and will include the September mean–a useful benchmark according to many folks.
Thanks to MAR for last month’s comment at #44, about the von Schuckmann et al (2020) paper “Heat stored in the Earth system.” This research recommends that “Earth Energy Imbalance” become the fundamental metric to measure humanity’s response. The paper reports results from years of research, including several thousand Argo floats that acquire data from various ocean depths around the world. It is important to understand that this metric is not radiative forcing; although it results from radiative forcing and is therefore increasing, it is rather the actual current energy flux at the planet’s surface.
Although the value of EEI in the first decade of this century was about 0.6 Watts per square meter, it is now measured during 2010-2018 as 0.87 W/m2. Since this applies to the entire surface, which is 5.1 x 10^14 m2, the total heating is 4.4 x 10^14 W, or 440,000 Gigawatts (GW), an astounding value. With an estimated 3% going into ice melt, this means 13,200 GW, the equivalent of more than thirteen thousand large 1000 MW power plants running 24/7/365, is the cause of some one trillion tons of ice converting to liquid water around the globe each year.
The preferred value of EEI for the continuance of civilization is zero, implying “negative emissions.” Have I missed anything in this “simple calculation?”
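As a back-of-envelope check on the arithmetic above (the latent heat of fusion for ice is my own added assumption, roughly 3.34 × 10^5 J/kg):

```python
# Rough check of the EEI heating figures quoted above.
EEI = 0.87                  # W/m^2, 2010-2018 mean
AREA = 5.1e14               # m^2, Earth's total surface area
SECONDS_PER_YEAR = 3.156e7
L_FUSION = 3.34e5           # J/kg, latent heat of fusion of ice (added assumption)

total_w = EEI * AREA                          # ~4.4e14 W
total_gw = total_w / 1e9                      # ~440,000 GW
ice_gw = 0.03 * total_gw                      # ~3% into ice melt
ice_joules = ice_gw * 1e9 * SECONDS_PER_YEAR  # J per year into melting
ice_tonnes = ice_joules / L_FUSION / 1e3      # kg -> metric tons

print(round(total_gw), round(ice_gw), f"{ice_tonnes:.1e}")
```

This comes out at roughly 1.3 trillion metric tons of ice per year, in line with the “some one trillion tons” quoted above.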
-Declining costs are driving increased deployment;
-Increasing scale and duration feed back with point #1;
-China has an overwhelming advantage in battery manufacture (and hence cost) but this is gradually eroding;
-Still need alternatives for applications over 5-6 hours.
They didn’t discuss the integration of storage with RE generation, unfortunately; I’d have liked to hear what he thought on that score.
Adam Lea @225-Sept UV thread,
I think we do perhaps address different aspects of the Atlantic hurricane record. I am thinking back to pre-satellite days while you are thinking of more recent times.
I was thinking in terms of the problems of comparing recent level of hurricane activity with the early 20th century and even the late 19th century.
The reliability of the annual ACE for that early period is surely very poor. For instance, did 1933 truly better 2005 for annual ACE? Or conversely, was it even more the record-holder than it appears on the record? And if this does yield a level of uncertainty for a single year ACE, the rolling average of annual ACE over longer periods will be also unreliable.
The situation is even more of a problem when Storm Numbers are considered. Short lived storms, or storms briefly becoming hurricane-force or major hurricane force would make a nonsense of such accounting.
My thought was that, while ACE may be inaccurate, the lion’s share of the annual total is always major storms. (Perhaps 2020 with its very large number of storms is becoming the exception here.) Thus annual ACE from early years could be assumed not wildly inaccurate. And if this were so, a 5-year run with ACE>100 is seemingly exceptional.
Of course, El Niño will suppress hurricane activity but as you discuss it is not the only factor and forecasters can be caught by surprise by low hurricane activity, as in 2013.
And likely wrapped up in the same considerations is an explanation for the hurricane activity of the late 1990s to early 2000s not being matched by the activity in the years since.
Certainly there have been more years with low activity, but in terms of years with high ACE, the most recent years with higher SST than 1995-2005 (or was it higher where it mattered?) do not show anything as strongly as 1995-2005 in the ratings. (And of course for the satellite era these ACE values are accurately comparable.) The top ten years for ACE since 1995 run:-
2005, 1995, 2004, 2017, 1998, 1999, 2003, 1996, 2010, 2008.
And in those 2017 was high because of a seriously crazy September.
So I do think there is some learning to be had about what has been the drivers/suppressors of hurricane activity over the last 25 years. But, as explained above, in the SeptUV thread I was addressing earlier times.
Al Bundysays
Ken F: The costs and difficulties of mechanical undergrowth “management” at such scales are staggering – even were such methods found to be compatible with a viable forest ecosystem or even enduring exploitation of timber.
AB: Actually, rational forest management pays for itself and provides plenty of forest products while turbocharging evolution (so forests can evolve to the new climate) and preventing serious forest fires.
Like engines and democracy, if ya insist that old failed techniques are the only possible way, then you are holding back progress.
I’ve developed a forest management system that actually works. Perhaps we can talk about it sometime, Ken.
William B Jacksonsays
#10 You may not be able to make this stuff up but as you have proven “you” can definitely completely misunderstand reality. Plus you have a habit of cherry picking nonsense.
New Zealand bushfire that demolished village leads to climate crisis debate
Lake Ōhau village is located at the foothills of the Ben Ohau mountain range, and is home to just 15 permanent residents but its numbers swell significantly during the holiday season. On Sunday morning, a fire tore through the foothills and into the village, forcing 90 people to evacuate.
Although no one was killed or seriously injured, about 50 of the village’s 70 homes were destroyed, as was 4,600 hectares of land, of which 1,900 are conservation estate.
As firefighters continue to subdue the blaze, and residents pick through the ruins of their homes, attention is turning to how and why the fire began, with emergency crews on the scene saying the ferocity of the blaze was rare for New Zealand, and usually only seen in bushfires in Australia, or California.
Climate scientists have warned longer and hotter summers as a result of the climate crisis are making such scenarios more likely, while local officials and industry groups have blamed land management practices.
MA Roger @15: Regarding 1933, I’d estimate the true ACE could easily be higher than the recorded ACE. There is a notable lack of tropical cyclone genesis in the eastern Atlantic (two long trackers formed near the Cape Verde islands), and almost all the storms seem to have formed on the doorstep of the Caribbean islands and the U.S. Were there really no storms in that part of the Atlantic, or could there have been one or more significant storms which formed and recurved quickly, that were missed because they didn’t hit land or any ships? Hurricane Lorenzo last year is the type of storm that could have easily been missed in the early 20th century.
Copernicus ERA5 re-analysis has been posted for September with a global anomaly of +0.63ºC, up from Aug’s +0.44ºC and also above the previous four months (May-Aug) but sitting below Jan-Apr. The 2020 ERA5 anomalies for the year-so-far sit in the range +0.44ºC to +0.80ºC.
September 2020 is the warmest on the ERA5 record, sitting above September 2019 (+0.58ºC), 2016 (+0.55ºC), 2017 (+0.48ºC), 2015 (+0.46ºC), 2018 (+0.40ºC), 2014 (+0.38ºC), 2013 (+0.37ºC), 2005 (+0.33ºC) & 2012 (+0.31ºC).
September 2020 has the 18th highest anomaly on the ERA5 all-month record.
With three-quarters of the year gone, 2020 is sitting in 2nd place in the rankings of warmest year-so-far tabulated below. To gain top-spot for the full calendar year from 2016, the 2020 Oct-Dec anomaly would have to average above +0.636ºC and to drop to 3rd spot below 2019 would require Oct-Dec to average less than +0.476ºC.
I want to highlight Mal’s useful response to a point I made. His post unfortunately got a little lost due to the closure of the UV thread at month’s end. Link:
Bottom line, I was right but only up to 2018, when Congress issued a large extra allocation specifically for fighting wildfire. (See above link for more.) Presumably, that has helped forest management budgets out–but there’s a big backlog to be cleared. (So to speak.)
Also getting lost at the end of the month was this comment of mine:
This would be a tad less O/T on Forced Responses, but we seem to be in between FVs. So–I have just released an album containing 14 original songs in full-band arrangements, many of which deal lyrically with issues relevant to this august site’s prime topic.
It’s on all major digital platforms, almost certainly including any you may subscribe to, so feel free to check it out!
If you’re more of a download person, well, Amazon Music has it–you can sample 30 seconds of all tracks by way of ‘tasting’:
(Spotify, Apple Music, and Youtube all work, too, with a little searching of “Doc Snow” and “Carolina Maze.”)
Russellsays
18
Sad to see The New Yorker turn into a well-written climate blog of re-runs: Elizabeth Kolbert is channelling Bill McKibben channelling Jonathan Schell
“So how hot—which is to say, how bad—will things get? One of the difficulties of making such predictions is that there are so many forms of uncertainty, from the geopolitical to the geophysical. (No one, for example, knows exactly where various “climate tipping points” lie.) That being said, I’ll offer three scenarios.”
Al Bundysays
Mrkia: fire fighting are not practical in wilderness due to the no-mechanized-tools rules
AB: what a doofus. As if laws and rules can’t be changed. Really? You are claiming that mechanical systems are impossible because laws can NEVER change? Even a sub-genius knows that laws adjust to fit whims. Much disagreement on whose whims, but your post is obviously a deliberate ploy, “we’re helpless because this rule exists and even though I scream about the need to erase all rules I will support this one to death because…..”
Solar Jim @13,
You ask for confirmation about the “Earth Energy Imbalance” and set out that this EEI should be zero, this “implying negative emissions.”
While the EEI remains positive the planet will warm, the positive EEI representing our past & future GHG emissions which have yet to act. Once our direct* GHG emissions are zero, the EEI will begin to decline through:-
(1) The natural draw-down of GHGs which for CO2 means into the oceans. For CO2, the anthropogenic CO2 increase will halve, a process taking a millennium but with the lion’s share of this draw-down appearing over the first century.
So if we stopped CO2 emissions tomorrow, we could expect an eventual CO2 level of perhaps 350ppm, *although this assumes we would not have kicked off any significant CO2 emissions from the biosphere.
And there would be the residual forcing from other GHGs to be factored in as well.
(2) Planetary warming caused by any left-over EEI which may still need satisfying. Thus, with that potential reduction to 350ppm CO2 by AD3100 (if emissions magically stopped tomorrow, if ECS=3.2ºC, and with no other legacies from AGW), we would be left with a global temperature increase of something like +1.25ºC. But that takes no account of the temperatures we would encounter before AD3100, and with continuing emissions that theoretical AD3100 increase is rising by the year.
When these two processes have run their course, we will have established our new planetary temperature. If that temperature were considered to be looking too high, that would be the point when we would require anthropogenic negative emissions. The science seems to be saying that, for AGW to be restricted to +1.5ºC and with the likely emissions we still expect before we reach zero emissions, we will require significant negative emissions prior to 2100.
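For what it’s worth, the standard logarithmic CO2-forcing relation lets one sanity-check a figure like that +1.25ºC. CO2 alone at 350ppm against a 280ppm pre-industrial baseline (the baseline is my assumption, not stated above) gives roughly +1.0ºC at ECS=3.2ºC; the residual forcing from the other GHGs mentioned above would plausibly make up the remainder. A minimal sketch:

```python
import math

# Equilibrium warming for a given CO2 level, via the standard
# logarithmic relation: dT = ECS * log2(C / C0).
def equilibrium_warming(co2_ppm, ecs=3.2, co2_baseline=280.0):
    return ecs * math.log(co2_ppm / co2_baseline) / math.log(2)

dT = equilibrium_warming(350.0)  # ~1.0 C from CO2 alone
```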
Adam Leasays
22: Unfortunately Delta looks likely to come ashore in the same area that Laura devastated; there are still a lot of blue tarps where roofs used to be. It looks less likely it will be a major hurricane at its second landfall, but the storm is forecast to get wider, which will increase the storm surge potential.
RSS has posted its TLT anomaly for September (but so far not updated the browser).
The RSS September global TLT anomaly is +0.87ºC, up from Aug’s +0.77ºC and the third highest of the year, sitting below Jan & Feb. The 2020 RSS TLT anomalies for the year-so-far sit in the range +0.74ºC to +1.02ºC.
September 2020 is the 3rd warmest on the RSS TLT record (2nd in UAH), sitting below Septembers 2019 & 2017 (both +0.89ºC) and above September 2016 (+0.82ºC), 2010 (+0.66ºC), 2015 (+0.66ºC), 2009 (+0.62ºC) & 2012 (+0.59ºC).
September 2020 has the 10th highest anomaly on the RSS TLT all-month record (=10th in UAH).
With three-quarters of the year gone, 2020 is sitting in 2nd place in the rankings of warmest year-so-far tabulated below (3rd in UAH). To gain top-spot for the full calendar year from 2016, the 2020 Oct-Dec anomaly would have to average above +0.74ºC and to drop to 3rd spot below 2019 would require Oct-Dec to average less than +0.47ºC.
A graph of RSS TLT monthly anomalies plotted year-on-year is here (2 clicks).
(And up thread the ERA5 re-analysis year-on-year monthly anomaly graph suffered a duff link. Not sure what happened but here it is again (2 clicks).
Al Bundysays
Western Hiker: Pine plantations are notoriously, incredibly flammable until they’re 60 to 80 years old,
AB: Pine heartwood is used to make fatwood kindling. Lights about as easily as paper. And pine needles are way flammable when dry. Pines are oily trees. So I believe you 120%.
And 60-80 years sounds suspiciously impractical. If the rotation is that or less, then the only “safe” period is the few years immediately after a cut.
And those pine plantations are surreal to drive past. The geometry goes on forever, so it’s like hundreds of hallways.
But, like I said, the problem is easily solved. I’m sure everyone here has all the data they need to come up with the solution. Heck, it’s as obvious as the temporal torque transfer device (which cuts the number of cylinders an engine needs for a given smoothness in half).
Yep, not actual but effective time travel is easy, almost 100% efficient, and dirt cheap, but only for torque. Send half of a cylinder’s torque into the future and the engine as a whole acts like it has twice as many half-sized cylinders.
Al Bundysays
Yo EP!
I challenge you to figure out how a temporal torque transfer device works.
I’ll even give you a clue:
If an electric motor fails, depending on design, it might result in a temporal torque transfer device.
Just a side note: hereabouts–i.e., the Carolina sandhills region–there’s a fair bit of longleaf pine restoration happening, for instance at the Battle of Camden site:
(In the pictures you can see both the currently dominant loblolly pines, and young longleaf pines. Over much of the site the loblollies have now been removed by cutting, and they will be suppressed on an ongoing basis by intentional burns.)
Point being–as Janisse Ray discussed so beautifully in her Ecology of a Cracker Childhood–the longleaf pines are fire-adapted. Historically, they dominated much of the US southeast, but struggled due to over-harvesting (they are a great source of softwood lumber) and fire suppression. So, not *all* pines are highly flammable!
Anyway, carry on!
Al Bundysays
The Guardian: The actual lag effect between halting CO2 emissions and halting temperature rise, then, is not 25 to 30 years but, per Mann, “more like three to five years.”
In short, this game-changing new scientific understanding suggests that humanity can turn down the heat almost immediately by slashing heat-trapping emissions.
AB: Look at that. A “Things are better than we thought” moment!
The JAXA Arctic Sea Ice Extent numbers look like handing 2020 another couple of days of ‘lowest SIE for the time of year’ record in the coming week. This post-minimum October situation of dipping SIE anomalies has become a frequent phenomenon recently, shown more clearly on an anomaly plot of JAXA SIE as shown here (usually 2 clicks to ‘download your attachment’).
I say “frequent” as this dip in SIE anomaly post-minimum is the same path followed in 2007, 2011, 2016, 2018, 2019 and now 2020. Perhaps it is saying that the impact of AGW and all that extra heat sloshing about up there can more easily delay the onset of the freezing season than it can reduce the minimum SIE in September. Perhaps that residual SIE at the height of the melt may be a harder nut to crack than it would simplistically appear.
Perhaps there is another bit of evidence to consider.
A plot of PIOMAS Sea Ice Volume anomalies sourced from Wipneus at Arctic Neven’s is perhaps showing the same/similar situation but in terms of the volume pre-minimum.
By the height of the melt in September, the zero anomaly is just 4,000 cu km below the recorded SIV anomaly. But the anomaly is achieving 2,000 cu km lower values in early July when there is 13,000 cu km to melt. This means the melt of SIV July-to-September in recent years is actually less than seen in earlier years. 2,000 cu km is ‘missing’ relative to earlier years and that ‘missing’ melt is a very significant part of the remaining minimum in September.
(The graph 14b (XIVb) at MARCLIMATEGRAPHS plots the appearance of the ‘zero anomaly’ on a monthly PIOMAS anomalies plot [note the different anomaly base], but a direct link is being denied by GoogleSites for some reason. Being monthly data, the dip in early July is less pronounced.)
Of course, PIOMAS is the output of a model and those July values may be the most difficult to check being in the height of the melting when melt ponds make even Sea Ice Area difficult to measure. But if the SIV values for July are robust, it does raise the question: Why would the SIV be seeing a dipping anomaly pre-minimum while SIE sees it post-minimum? Speculating, do draining melt ponds on the thin July ice prevent a lower minimum in September while the warmer seas prevent the quicker spread of the ice edge in October?
Killiansays
24 Russell: Sad to see The New Yorker turn into a well written climate blog re- re-runs : Elizabeth Kolbert is channelling Bill McKibben channelling Jonathan Schell
I fixed it for her. Please forward.
“So how hot—which is to say, how bad—will things get? One of the difficulties of making such predictions is that there are so many forms of uncertainty, HOWEVER, THE LONG-TAIL RISK IS SO EXTREME WE CANNOT RISK IT UNDER ANY CIRCUMSTANCES. We must change virtually everything in all industrialized nations.”
Killiansays
Solar Jim: Thanks to MAR for last month’s comment at #44, about the von Schuckman et al (2020) paper “Heat stored in the Earth system.” This research recommends that “Earth Energy Imbalance” become the fundamental metric to measure humanity’s response.
Hmmm… The energy balance could be achieved in ways that are utterly unsustainable and completely unjust. It’s an indicator, not a measure of successful responses.
The only measure that matters is regenerative or not.
Adam Lea @20,
I agree entirely that seasonal totals for Atlantic hurricane ACE in earlier years could be significantly wrong. The apparent absence of lesser storms has been kicked about in the literature. I am less familiar with discussion of geographical gaps in the record, but I’m sure they have not gone without consideration.
I think we agree that the direct comparison of pre-satellite storm numbers and of ACE totals with the data we now have in the satellite era has many elephant traps to negotiate.
My thoughts/speculations on the number of consecutive ACE=+100 years were an attempt to find a way of comparing these earlier pre-satellite years with the satellite era that avoids those elephant traps.
Thus I pointed to the run of years with high seasonal ACE totals.
We have now had five ACE=+100 years in a row (2016-2020) in the satellite era. Given that seasonal ACE totals can be considered inaccurate in the pre-satellite era, the question I ask is: “How inaccurate?”
The lion’s share of any ACE total is the big long-lived storms. We may underestimate their ACE by a significant percentage but we cannot have missed such storms. So if we say that seasonal ACE totals (that predominantly comprise those big long-lasting storms) could for pre-satellite days underestimate an ACE=+100 year of the late 1940s, how much could that underestimation comprise? Could an actual ACE=+100 year be recorded today as low as ACE=90? Or ACE=80? Or ACE=70? And ditto for years back in the 1800s. And having set such a lower limit for what potentially could be an ACE=+100 year, is then 1947 elevated from ACE=88, 1948 from ACE=95 and 1949 from ACE=96, all to ACE=+100 to create a 5-year run 1947-51? And could 1895 be elevated from ACE=69 to ACE=+100 to provide a six-year run of ACE=+100 1891-96?
But if those adjustments are seen as too big, adjustment beyond credibility, then the 2016-20 run of five consecutive ACE=+100 years (and likely five ACE=+125 years) would then be unique on the record.
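To put numbers on “how inaccurate,” here is the scale of underestimate each of those cited seasonal totals would need in order to reach ACE=100 (values as quoted above):

```python
# Percentage underestimate required to lift each recorded
# pre-satellite seasonal ACE total (quoted above) up to 100.
recorded_ace = {1895: 69, 1947: 88, 1948: 95, 1949: 96}

for year, ace in sorted(recorded_ace.items()):
    boost = (100 / ace - 1) * 100
    print(f"{year}: ACE {ace} would need +{boost:.0f}%")
```

So 1947-49 need only adjustments of roughly 4-14%, while 1895 needs some 45%, which is presumably the “beyond credibility” end of the scale.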
Al Bundysays
I’ve decided to not pursue ownership of my forestry ideas. Here is a top level description:
A forest that has been recently cut is easy to defend from fire. Each year of growth makes it harder, probably all the way to the next cutting time.
Each forest has an optimal rotation rate. The characteristics of a firebreak that will almost certainly hold (with human help) can be predicted. The worst winds in fire season will likely be from a specific direction, for example (and geologic features are constants, more or less).
Rotation is 60 years? Make another firebreak 60 firebreak-widths away. Repeat.
Next year a new set of firebreaks is harvested such that a fire would likely have to get past last year’s firebreak first. Repeat yearly.
The firebreaks have mother trees and wildlife corridors. The corridors function as chokepoints for genetic transfer, creating almost but not quite separate populations. Genes that better fit the new environment will arise and become dominant in one population, and then spread to other populations via the corridors, where they’ll hook up with different advantageous mutations, providing hybrid vigor.
Islands of never-to-be-touched forest are scattered throughout the pattern (generally in the toughest terrain).
The long strips allow for a very large area of harvest per mile of service rail so the harvest can get to market at reasonable cost.
And if a fire gets past a firebreak, so what? It will feast for a few firebreak-widths, but then it has to fight through ever sparser fuel all the way to the next break.
Thoughts?
Reading one of those random stories that gets pushed to one’s desktop this morning over coffee, I was struck by this snippet from a piece on computer energy use:
…this early work was also limited by the fact that it tried to apply equilibrium statistical physics to analyze the thermodynamics of computers. The problem is that, by definition, an equilibrium system is one whose state never changes. So whatever else they are, computers are definitely nonequilibrium systems. In fact, they are often very-far-from-equilibrium systems.
Fortunately, completely independent of this early work, there have been some major breakthroughs in the past few decades in the field of nonequilibrium statistical physics (closely related to a field called “stochastic thermodynamics”). These breakthroughs allow us to analyze all kinds of issues concerning how heat, energy, and information get transformed in nonequilibrium systems.
These analyses have provided some astonishing predictions. For example, we can now calculate the (non-zero) probability that a given nanoscale system will violate the second law, reducing its entropy, in a given time interval. (We now understand that the second law does not say that the entropy of a closed system cannot decrease, only that its expected entropy cannot decrease.) There are no controversies here arising from semi-formal reasoning; instead, there are many hundreds of peer-reviewed articles in top journals, a large fraction involving experimental confirmations of theoretical predictions.
Hmmm… thermodynamics in a non-equilibrium system… like, say, Earth’s climate system.
Rule one: search the literature before asking stupid questions. (They may still be stupid, but at least they’ll be a slightly more informed brand of stupid.)
So, searching “non-equilibrium+statistical+physics+climate” (and, noticing an auto-fill suggestion, “non-equilibrium+thermodynamics+climate”) turned up a smattering of papers. Browsing from ignorance, one doesn’t know how to separate wheat from chaff, but there was this:
Prospective Study on Applications of Non-Equilibrium Thermodynamics to Climate Modeling
Yongping Wu;
Hongxing Cao;
Guolin Feng
Journal of Coastal Research (2015) (73 (10073)): 342–347. https://doi.org/10.2112/SI73-060.1
It shows several concerning signs, notably an apparent mismatch between topic and journal focus, and an abstract that frankly is poorly written. (But maybe that’s just ESL.) Plus, just exactly how obscure is the ‘Journal of Coastal Research’, anyway? Then there’s the fact the article has all of 4 citations.
Yet this–from the abstract I just impugned–was intriguing:
As the typical open non-equilibrium system, atmospheric system undoubtedly follows some of the theorems of non-equilibrium thermodynamics. Therefore, the non-equilibrium thermodynamics theory has broad applicational prospects in climate modeling and prediction.
Continuing on through the search results, it became evident that one researcher seriously focused on this and related areas is Valerio Lucarini, apparently associated with the Universities of Reading and Bologna; he has several related papers to his credit.
A non-mathematically-technical paper of his, which however seems not to have appeared except online and as a book chapter–i.e., non-peer-reviewed sources–is here:
It’s not easy reading because of the dense and occasionally jargon-laden prose, but is otherwise pretty accessible. And it makes some interesting points about climate modeling, and the epistemological criteria pertaining thereto. (Interesting to me, at least.)
But although the second search–the ‘non-equilibrium thermodynamics climate’ one–turned up 14k+ citations, a sampling of the first couple of search pages seemed to include a high percentage of papers that were either old, or apparently tangential in some way–Dottore Lucarini aside.
So, is this an area of which one can say ‘something is happening here?’ Or just a backwater, or maybe a sideshow?
Solar Jim says
MAR @ 26.
Thanks for your comments. You seem to be much more complacent about the danger to the world’s seacoasts than many scientists are. You make several controversial statements which may derive from your misunderstanding of what EEI represents.
RE: “the positive EEI representing our past & future GHG emissions which have yet to act.” This is untrue, as I understand it. The value of EEI is derived from the heat flux (actual energy transfer, not radiative forcing, RF) into the ocean in recent times. This has nothing to do with future GHG emissions, only the past. Additionally, since RF is higher than Earth Energy Imbalance, the quantity of EEI will continue to increase toward the net RF value. (Both of these are measured in watts per square meter over the entire surface of the planet.)
Regards.
Al Bundy says
Solar Jim,
I agree that the imbalance is immediate. The delay is primarily lag due to thermal mass, such as why an adobe house stays cool on a hot afternoon. The oceans, for example, represent a 1000-year partial delay, both in temperature and CO2 levels. And Greenland and Antarctica have stupendous amounts of phase-changing (and lapse rate) to give to the cause of delaying the inevitable.
Then there’s feedbacks. As things warm up, reactions that further heat things up tend to occur.
MAR knows what is going on. Typos happen.
Donald Condliffe says
Regarding the management of forests, I like to look at what real experts say. One such person is Dave Daley, who published an article, “Cry for the Mountains,” in the Chico Enterprise on September 27, 2020. Daley is a rancher and comes from a ranching family that has run cattle since 1852 in the National Forest that burned in this year’s North Complex fire. His expertise in forest management is also academic: he is a PhD who teaches forest and land management. He also is Past President of the California Cattlemen’s Association, current Chair of the California Cattle Council, Chair of the Forest Service committee for the Public Lands Council, and Chair of Federal Lands for the National Cattlemen’s Beef Association. He also is familiar with the many thousands of years of local Native American land management practices. His conclusions therefore carry great weight in my mind.
His view is that California’s fire danger has two main causes: mismanagement of the forests for eight decades AND global warming causing the mismanaged forests to become drier. Definitely NOT just one or the other. One of these, forest management, we can improve with local action and a return to more intensive management of large parts of our forests for fire control, using controlled burns following practices adapted from ancient native practices that were used by his family until the 1960s. In addition, the local tribes are proposing that intensive forest management be done using local labor and local companies, including the tribes, with preferentially awarded management contracts for local co-ops and companies. That would produce reduced fire risk, economic development, higher forest productivity, and more jobs, in particular good jobs that do not require a college degree. The other problem, global warming, requires global action, and frankly I doubt we will succeed in limiting the switch to a hotter stable state.
Donald Condliffe says
Re 37. Al, I have seen this idea before, when I attended the University of Oregon in Eugene in the 1970s. It also closely resembles the pie-sector harvesting plan then proposed, which would divide an area into slices, allowing harvest of a slice every 10 to 20 years depending on the age of tree desired. The plan was to break up the landscape into a tree-age patchwork with many fire breaks, so fires would always remain limited – the pattern that existed before European settlement, before our interventions, now seen to have gone awry.
I think the problem with idealized plans like this is that the term “ivory tower” applies in spades. Compared to people with decades of experience – informed by study of local practices going back thousands of years, by deep understanding of the local terrain, and by the accumulated wisdom of local tribes about the boundaries of climate extremes (now confirmed by paleoclimate studies that exceed all worst-case estimates currently in use) – an outsider cannot possibly prescribe better than informed locals who have a stake in the success of the plan.
Karsten V. Johansen says
Could someone enlighten me regarding which research Michael Mann is referring to here: “Using new, more elaborate computer models equipped with an interactive carbon cycle, “what we now understand is that if you stop emitting carbon right now … the oceans start to take up carbon more rapidly,” Mann says. Such ocean storage of CO2 “mostly” offsets the warming effect of the CO2 that still remains in the atmosphere. Thus, the actual lag between halting CO2 emissions and halting temperature rise is not 25 to 30 years, he explains, but “more like three to five years”.
This is “a dramatic change in our understanding” of the climate system that gives humans “more agency”, says Mann. Rather than being locked into decades of inexorably rising temperatures, humans can turn down the heat almost immediately by slashing emissions promptly. “Our destiny is determined by our behavior,” says Mann, a fact he finds “empowering”.”
Solar Jim @39,
My apologies. You are correct in spotting the error I managed @26. I say “managed” as it is not what I wanted to write; it probably resulted from editing that comment @26, which got a bit more complex as it was being written. The words “& future” somehow escaped the editor’s cut.
The Earth’s Energy Imbalance (EEI) today represents the positive Radiative Forcing (RF) from the increased GHGs (only today’s increase, not “& future” ones) that has yet to warm the planet (ΔT) such that RF = EEI + f(ΔT). Note that as the RF does its stuff and increases ΔT (which as you say mainly requires warming the oceans), the value of f(ΔT) will increase and EEI will thus decrease, eventually EEI decreasing to zero when equilibrium is achieved. (You ‘manage’ to set this the other way round @39.)
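The relation RF = EEI + f(ΔT) can be illustrated with a toy zero-dimensional energy-balance model in which f(ΔT) = λΔT. The forcing, feedback parameter, and heat capacity below are purely illustrative values of my own choosing, not numbers from this comment:

```python
# Toy zero-dimensional energy balance: RF = EEI + lam * dT.
# All three constants are illustrative assumptions, not data.
RF = 2.0      # W/m^2, fixed radiative forcing
lam = 1.2     # W/m^2/K, feedback parameter (f(dT) = lam * dT)
C = 3.2e8     # J/m^2/K, effective heat capacity (mostly ocean)

dt = 3.156e7  # one year, in seconds
T = 0.0       # warming achieved so far, K
for year in range(500):
    eei = RF - lam * T   # imbalance shrinks as the planet warms
    T += eei * dt / C    # the imbalance goes into heating

# At equilibrium EEI -> 0 and T -> RF/lam (~1.67 K here).
print(round(T, 2))
```

Note the direction of change: under fixed forcing, the imbalance decreases toward zero as warming catches up with the forcing.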
….
You also mention Sea level Rise as being connected to my understanding of EEI and my disagreements concerning SLR, presumably my disagreement with some who consider multi-metre SLR by 2100 the likely outcome. There is only a tenuous SLR/EEI connection.
In terms of SLR, I have long worried that the very significant SLR beyond 2100 is being ignored and also have challenged claims of multi-metre SLR by 2100, two separate things.
On the multi-metre SLR by 2100: a decade back, the sole serious advocate of multi-metre 2100 SLR was James Hansen; this is an issue of melting ice caps. My argument is/was that given the EEI there is a limit to the energy available to melt the volume of ice required for such a sudden SLR. When the Hansen thesis was eventually set out in the discussion paper Hansen et al (2015), ‘Ice melt, sea level rise and superstorms: evidence from paleoclimate data, climate modeling, and modern observations that 2ºC global warming could be dangerous’, you will note the modelled EEI of 4Wm^-2 is achieved through a period of significant negative ΔT. Note that in this whole multi-metre SLR business, I was not arguing against the possibility but against the absence of a workable theory to back it up.
And on the same score, I would criticise the article linked @1 & discussed @5. It is very thin on science to support the message of 15m SLR. It simply says that 3 million years ago the Arctic was ice-free with the ocean’s coasts cloaked in boreal forest and the seas 15m higher, pointing out that three million years ago was also the last time we saw CO2 at 412ppm.
My understanding is that the level of CO2 three million years ago was probably less than 400ppm (so the last time CO2 was at 412ppm would be much earlier, perhaps 13 million years ago) and that other things like the absence of the Panama Isthmus had a role in the warmer global temperature three million years ago. And in setting out such a view I have discovered folk who will passionately defend +400ppm three million years ago, this resulting in lively interchanges.
Killian says
32 Al Bundy: The Guardian: The actual lag effect between halting CO2 emissions and halting temperature rise, then, is not 25 to 30 years but, per Mann, “more like three to five years.”
In short, this game-changing new scientific understanding suggests that humanity can turn down the heat almost immediately by slashing heat-trapping emissions.
As I have said, return to 260-280 ppm, you start stabilizing the poles within decades, thus potentially avoiding many meters of SLR. So, this isn’t new news so much as finer detail.
It addresses my long-held irritation with the word “belief” as applied to science. While comprehension about how science works is perhaps partly cultural, in fact it functions pretty much without “faith” which is something else entirely. It has a side benefit of bypassing Popper and Kuhn, about whom arguments can be head-spinning. I particularly abhor the word “falsifiability” which is catmeat to science deniers. By using a term that implies its opposite, it invites misuse.
“Science is boring,” Strevens writes. “Readers of popular science see the 1 percent: the intriguing phenomena, the provocative theories, the dramatic experimental refutations or verifications.” But, he says,
“…behind these achievements . . . are long hours, days, months of tedious laboratory labor. The single greatest obstacle to successful science is the difficulty of persuading brilliant minds to give up the intellectual pleasures of continual speculation and debate, theorizing and arguing, and to turn instead to a life consisting almost entirely of the production of experimental data.”
The allocation of vast human resources to the measurement of possibly inconsequential minutiae is what makes science truly unprecedented in history. Why do scientists agree to this scheme? Why do some of the world’s most intelligent people sign on for a lifetime of pipetting?
Strevens thinks that they do it because they have no choice. They are constrained by a central regulation that governs science, which he calls the “iron rule of explanation.” The rule is simple: it tells scientists that, “if they are to participate in the scientific enterprise, they must uncover or generate new evidence to argue with”; from there, they must “conduct all disputes with reference to empirical evidence alone.” Compared with the theories proposed by Popper and Kuhn, Strevens’s rule can feel obvious and underpowered. That’s because it isn’t intellectual but procedural. “The iron rule is focused not on what scientists think,” he writes, “but on what arguments they can make in their official communications.” Still, he maintains, it is “the key to science’s success,” because it “channels hope, anger, envy, ambition, resentment—all the fires fuming in the human heart—to one end: the production of empirical evidence.”
Susan Anderson says
re my previous, it is from a NYer book review (10/5/20 issue) of this:
“The Knowledge Machine: How Irrationality Created Modern Science” (Liveright) which introduces Michael Strevens as a philosopher at New York University. These reviews often go deep.
Got a question for the blog owners (Gavin et al.):
The deniers often say the greenhouse effect “violates the second law of thermodynamics” because “you cannot have heat flow from a cold source to a warm one.” In reality, of course, 2LOT only says you can’t have NET heat flow from a cold source to a warm one. I tried to think of a simple counterexample, so I took a square meter atmosphere at 255 K and a square meter of ground at 288 K.
Both objects are assumed to be blackbodies (ε = 1). Up is considered positive. The atmosphere radiates -239.7576 joules in one second, the ground radiates 390.1051 joules. Taking S = dQ / T, we have entropy of -0.94 joules per kelvin for the atmosphere and 1.35 J/K for the Earth. The total is 0.41 J/K, which is positive. Entropy has increased for the system as a whole, so 2LOT is not violated.
My question is–is this a legitimate calculation? Can I simply take the temperature of each object as the appropriate temperature for the 2LOT equation? What mistakes have I made? I’d appreciate input from any competent person.
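The arithmetic, at least, can be checked mechanically. The sketch below just reproduces the calculation in the comment (Stefan–Boltzmann emission, then S = dQ/T at each body’s own temperature), without settling whether that entropy bookkeeping is the physically rigorous treatment:

```python
# Reproducing the back-of-envelope 2LOT check above (one second,
# one square meter, blackbody emission, up = positive).
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

T_atm, T_gnd = 255.0, 288.0
q_atm = -SIGMA * T_atm**4  # atmosphere radiates downward: ~ -239.76 J
q_gnd = SIGMA * T_gnd**4   # ground radiates upward: ~ +390.11 J

dS_atm = q_atm / T_atm     # ~ -0.94 J/K
dS_gnd = q_gnd / T_gnd     # ~ +1.35 J/K

# Total is positive, so entropy increases for the system as a whole.
print(round(dS_atm + dS_gnd, 2))  # ~ +0.41 J/K
```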
mike says
at AB at 32: I think it is possible that Mann is wrong about the time lag on heat because the feedbacks that have started would take longer to produce the cumulative heat we have bought at any particular moment than 3 to 5 years.
For instance, think about the heat buildup that arises slowly from warmed oceans melting ice and changing the albedo of the planet. I think the way the oceans absorb heat and return heat to the atmosphere are likely on longer cycles than 3 to 5 years. The fluctuating heat of the oceans will similarly be rather slow to produce all the ice melt that will follow from ocean heat storage. Heat buildup and increase from loss of ice and glacier albedo is also likely to take many years to stabilize to a point where we could say definitively that the heat increase has stopped.
But, and this is the important point, there is no indication or reason to believe that our species is able to stop increasing GHG emissions to the atmosphere, so the whole discussion about time lag is of academic interest; it does not appear to have any real-world application for human society.
I would love to be wrong about this, but let’s face it, we have known that ghg emissions are a problem for a long time and our emissions have just continued to grow instead of moving toward a net zero condition. The discussion of what happens if we stop emitting greenhouse gases does not appear to have a real world application.
I think it makes more sense to ask questions about when our cumulative emissions start to blow the wheels off the economic engine and the activities that produce emissions. We are much more likely to test those scenarios than we are to test how long heat continues to build after we hit net zero.
this is how we have done over the past ten years:
Last Week September 20 – 26, 2020 411.00 ppm
1 Year Ago September 20 – 26, 2019 408.34 ppm
10 Years Ago September 20 – 26, 2010 386.81 ppm
Mike
nigelj says
“Climate change is largely responsible for a doubling in the number of natural disasters since 2000, the United Nations said on Monday, warning that the planet was becoming uninhabitable for millions of people…..”
Killian says
30 to 50 feet higher. Hmmm…
Happy Chuseok.
https://www.yahoo.com/news/arctic-hasnt-warm-3-million-122739770.html
dhogaza says
Mr. KIA:
“Tell us how the US Forest Service, and the National Park Service should clear the undergrowth in forests that are Congressionally Designated Wilderness where use of mechanized tools is prohibited by law? It isn’t possible: if you try it, lawsuits will be filed, the state and federal air quality folks will fine you if you burn it, etc. Now you know the rest of the story. ;)”
Most wildfires, including forest fires, are human caused.
Wilderness, on average, gets a lot less human activity than the rest of our National Forests, and is less likely to have human-caused fires within it. No roads means no dragging trailer chains or locking brakes or vehicle fires. Most backpackers nowadays use portable stoves and don’t build fires, and campfires are banned during fire season, a ban enforced by wilderness rangers.
So most large wildfires don’t begin in designated wilderness areas; rather, fires outside of wilderness burn into them.
Mechanized treatments, from thinning to dead timber removal to plain old logging are themselves a source of human-caused fires. And forests managed for timber through clearcutting burn more readily than mature forests with closed canopies, because the latter shade the undergrowth and soil, which aids moisture retention. Clearcuts after replanting take decades to close the canopy, they lose moisture readily, are shrubby, and burn quickly and easily.
Here in California, the Dolan Fire in Big Sur, which has burned a fair chunk of the Ventana Wilderness, was human caused (arson, and, no, not by antifa). Within the wilderness and bordering forest land it’s been largely an understory burn, which is not a bad thing. Most of the fire fighting efforts were, as usual, centered around saving a couple of towns along highway 1.
You want to reduce the frequency of catastrophic fires in a world where they are more likely to blow up rapidly due to climate change?
Get rid of the human-caused fires. That would, for now, more than offset the increased odds due to climate change.
Trying to make designated wilderness a bogeyman is just ideological crap.
barry says
Having participated in the study promoted here about blogosphere perceptions of climate change risk, I received a copy of the study.
https://www.mdpi.com/2071-1050/12/19/7990/htm
Probably worth a post?
My grade D takeaway from this and a decade’s worth of slugging it out in the blogosphere is that it’s hard to predict who will be a climate change ‘denier’ or ‘activist’.
MA Rodger says
UAH TLT has posted for September 2020 with an anomaly of +0.57ºC, the second highest anomaly of the year so far. The UAH TLT monthly anomalies for 2020 Jan-Sept sit in the range +0.38ºC to +0.75ºC.
Sept 2020 is the second warmest Sept on the UAH TLT record behind 2019 (+0.61ºC) and sitting ahead of Sept 2017 (+0.56ºC), 2016 (+0.47ºC), 1998 (+0.44ºC), 2010 (+0.36ºC) & 2009 (+0.26ºC).
Sept 2020 has the =10th highest anomaly in the all-month UAH TLT record.
The 2020 year-to-date average anomaly sits 3rd in the ranking tabled below. To gain warmest year accolade from 2016 by year’s end would require the Oct-Dec average to top +0.58ºC, to climb to 2nd above 1998* would require the Oct-Dec average to top +0.41ºC, and to slip to 3rd below 2019 would require Oct-Dec to average less than a rather chilly +0.23ºC. [*1998 still appears near the top of TLT rankings because TLT temperatures are much boosted by El Niño, far more than the surface temperatures. Even in the RSS TLT record which is less trend-defying than UAH, the El Niño-boosted 1998 still sits as 6th warmest year.]
…….. Jan-Sept Ave … Annual Ave ..Annual ranking
2016 .. +0.57ºC … … … +0.53ºC … … … 1st
1998 .. +0.56ºC … … … +0.48ºC … … … 2nd
2020 .. +0.51ºC
2019 .. +0.41ºC … … … +0.44ºC … … … 3rd
2010 .. +0.39ºC … … … +0.33ºC … … … 5th
2017 .. +0.38ºC … … … +0.40ºC … … … 4th
2002 .. +0.24ºC … … … +0.22ºC … … … 8th
2015 .. +0.22ºC … … … +0.27ºC … … … 6th
2018 .. +0.22ºC … … … +0.23ºC … … … 7th
2005 .. +0.20ºC … … … +0.20ºC … … … 9th
2007 .. +0.20ºC … … … +0.16ºC … … … 12th
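As a cross-check on the Oct-Dec thresholds quoted above, the required quarterly average follows from simple averaging of the (rounded) table anomalies, so the results land within a couple of hundredths of the quoted figures:

```python
# What Oct-Dec must average for 2020's annual mean to reach a target,
# given the Jan-Sept 2020 average of +0.51 deg C (rounded table values).
def oct_dec_needed(target_annual, jan_sep_avg=0.51):
    """Oct-Dec average required for the 12-month mean to hit target."""
    return (target_annual * 12 - jan_sep_avg * 9) / 3

print(round(oct_dec_needed(0.53), 2))  # beat 2016 for 1st: ~ +0.59
print(round(oct_dec_needed(0.48), 2))  # beat 1998 for 2nd: ~ +0.39
print(round(oct_dec_needed(0.44), 2))  # stay ahead of 2019: ~ +0.23
```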
John Pollack says
Killian @1 Pretty good news article. My one significant reservation was with the final paragraph. It suggests that avoiding a Pliocene CO2 world “will require big steps now to decrease fossil fuel use and turn down Earth’s thermostat.” A severe understatement, in my opinion.
It seems to me that since we’re already at a Pliocene CO2 level, and busily adding more, we aren’t going to get out of this by merely decreasing fossil fuel use. Granted, if we stopped burning fossil fuel tomorrow, some of the atmospheric CO2 would go into long-term ocean storage. We could pull out maybe another 40 or 50 ppm by recarbonizing the soil reserves that we’ve drawn down, and by massive reforestation. However, that would be the work of generations.
Before that happens, we’re looking at who-knows-how-many more ppm in the atmosphere from fossil fuel burning. Then, we get to massive feedbacks to the warming resulting from permafrost thaw, peat oxidation, methane releases, and whatever other carbon reservoirs have been stashed away and building up for over 3M years of ice age climates.
One thing that is really clear is that the climate system is permanently out of equilibrium on the time scale of our civilization, so we’ll be dealing with one unanticipated consequence after another. I’m not saying that we shouldn’t be doing anything (although what we should be doing belongs in Solutions.) I’m saying that we shouldn’t pretend that we’ll be dealing with anything like a return to equilibrium until after the ice sheets finish however much melting we commit them to undergo, and the feedbacks resulting from that melting.
Russell says
Can Denmark end the climate crisis by capturing carbon in nachos?
https://vvattsupwiththat.blogspot.com/2020/09/can-co2-be-sequestered-in-nachos.html
Al Bundy says
InsideClimateNews:
“The intensified layering, called ocean stratification, is happening ______ than scientists expected, an international team of researchers reported in the study, published Sept. 28 in the journal Nature Climate Change. And that means the negative impacts will arrive _____ and also be ____ than expected, said Mann, a co-author of the study.”
So, assuming a competent and unbiased scientific establishment the odds that the blanked out words are “slower”, “later”, and “less” should be 50%.
Who here will bet, even when given 10:1 odds, that those words actually complete the quote?
Ken Fabian says
dhogaza #2 My observation of forested areas in (fire prone) Australia is that they predominantly encompass rugged and difficult terrain – they missed being cleared for grazing or cultivation largely for that reason, i.e., by default.
The costs and difficulties of mechanical undergrowth “management” at such scales are staggering – even were such methods found to be compatible with a viable forest ecosystem or even enduring exploitation of timber. Best management method? That hasn’t been demonstrated. Best method, made illegal – presumably by unreasonable and naive extremism? That has not been demonstrated either, but it presses the “Right” buttons. Reality is, as you say, such methods can increase fire risks; tracked bulldozers are known to start fires in dry conditions just by rolling over rocks that make sparks.
We got similar arguments from climate change deniers in Australia – that “green” regulation prevented reasonable fire hazard reduction activities, and that the underlying extreme conditions, aka global warming, (therefore) didn’t have any influence. But it was BS; bushfire authorities, correctly in my view, deemed conditions unsafe. We had less than a few weeks total of “safe” conditions across all of the 2019 winter (when out-of-control fires were started by lightning strikes near here) – and the big forests around here, in global warming enhanced dry conditions, need at least 6 weeks of “safe” conditions post burning before the risk of fire breaking out is low.
Management of fire risks at the interfaces of forest and farms/towns is complicated. Development has been encouraged in dangerous places but managing fire risks is often outside the everyday experience of those who move in and there is a natural reluctance to “burn off” – where things are never under perfect control and legal liability for any unplanned spread applies.
I suspect also that in the past cool weather fires in Australia were easier to manage – because it was cooler! Dew forming overnight was a (reasonably) reliable natural fire retardant; farmers would light in the evening and go home, confident that fires would self extinguish. It really needs professional management now.
Western Hiker says
[“We’d go out to a big fire or clear cut,” North said, of his college summers working as a tree planter, “and every ten to twelve feet we’d plant another pine tree. At the end it would look just like a corn crop.”
They called it “pines in lines.”
“Pine plantations are notoriously, incredibly flammable until they’re 60 to 80 years old,” North said. Till they’re big enough for their crowns to be out of reach of flames, “if a fire goes through a 12×12 pine plantation it just gets vaporized. The trees go up like matchsticks.”]
https://tinyurl.com/y2b5k9oy
Mr. Know It All says
2- dhogaza
Agree with your comment – I did not mean to denigrate wilderness – my point is that California has a lot of wilderness (35% of federal lands in Cal are wilderness), and that fire prevention and fire fighting are not practical in wilderness due to the no-mechanized-tools rules. Yes, I am aware that most of the fires in Cal are human caused, some being arson. Similar here in the PNW. FYI, I have hiked to the summit of Mt. Manuel (sp?), in the Ventana – great views, but can be a hot climb. Wilderness stats by state:
https://fas.org/sgp/crs/misc/RL31447.pdf
In other CC news, as we enter into the new ice age, we’ve learned recently to do COVID contact tracing. Now, to fight AGW, we will be doing climate tracing to locate sources of CO2, cow farts, etc. You can’t make this stuff up:
https://www.youtube.com/watch?v=Y81EOYIIN34
mkia
Kevin McKinney says
Looking again at the Arctic sea ice, 2020 continues post-minimum to sit right between the record year of 2012 and the third lowest 2019 at a current 4.16 million km2. Presumably, we are as a result seeing more energy than most years flowing from ocean to atmosphere. We can look forward to an accelerated rate of ice regrowth as part of the dynamic, too.
Per JAXA’s VISHOP:
https://ads.nipr.ac.jp/vishop/#/extent/&time=2020-10-01%2000:00:00
The NSIDC should be out soon with their September analysis, which will rule definitively on their assessment of the minimum, and will include the September mean–a useful benchmark according to many folks.
Guest (O.) says
FYI:
Our House is Burning: Scientific and Societal Responses to Mass Extinction | Michael Benton
Mass extinctions and the future of life on Earth | Michael Benton | TEDxThessaloniki
Solar Jim says
Thanks to MAR for last month’s comment at #44, about the von Schuckman et al (2020) paper “Heat stored in the Earth system.” This research recommends that “Earth Energy Imbalance” become the fundamental metric for measuring humanity’s response. The paper reports results from years of research, including several thousand Argo floats that acquire data from various ocean depths around the world. It is important to understand that this metric is not radiative forcing. Although driven by radiative forcing, and therefore increasing, it is rather the actual current energy flux across the planet’s surface.
Although the value of EEI in the first decade of this century was about 0.6 watts per square meter, it is now measured during 2010-2018 as 0.87 W/m2. Since this applies to the entire surface, which is 5.1 x 10^14 square meters, the total heating is 4.4 x 10^14 watts, or 440,000 gigawatts (GW) (an astounding value). With an estimated 3% going into ice melt, this means 13,200 GW, the equivalent of more than thirteen thousand large 1000 MW power plants running 24/7/365, is the cause of some one trillion tons of ice converting to liquid water around the globe each year.
The preferred value of EEI for the continuance of civilization is zero, implying “negative emissions.” Have I missed anything in this “simple calculation?”
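One quick way to audit the “simple calculation”: the sketch below redoes it with the comment’s numbers, plus one assumption of mine (latent heat of fusion of ice ≈ 334 kJ/kg) to check the trillion-tons figure:

```python
# Re-running the heating arithmetic above.
EEI = 0.87     # W/m^2, 2010-2018 mean (von Schuckman et al. 2020)
AREA = 5.1e14  # Earth's surface area, m^2

total_w = EEI * AREA    # total planetary heating, W
ice_w = 0.03 * total_w  # ~3% estimated to go into melting ice

# Assumption (mine, not in the comment): latent heat of fusion of ice.
L_FUSION = 3.34e5       # J/kg
SEC_PER_YEAR = 3.156e7
melt_t = ice_w * SEC_PER_YEAR / L_FUSION / 1e3  # tonnes of ice per year

print(round(total_w / 1e9))     # ~440,000 GW
print(round(melt_t / 1e12, 2))  # ~1.26 trillion tonnes per year
```

So the quoted figures check out: roughly 440,000 GW of total heating, and on the order of a trillion tonnes of ice melt per year from the ~3% share.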
Kevin McKinney says
Pending a new FR thread, here’s an interesting (and, it seems to me, sensible) discussion of the economics of battery storage:
https://www.energy-storage.news/blogs/behind-the-numbers-the-rapidly-falling-lcoe-of-battery-storage
Takeaways:
-Declining costs are driving increased deployment;
-Increasing scale and duration feed back with point #1;
-China has an overwhelming advantage in battery manufacture (and hence cost) but this is gradually eroding;
-Still need alternatives for applications over 5-6 hours.
They didn’t discuss the integration of storage with RE generation, unfortunately; I’d have liked to hear what he thought on that score.
MA Rodger says
Adam Lea @225-Sept UV thread,
I think we do perhaps address different aspects of the Atlantic hurricane record. I am thinking back to pre-satellite days while you are thinking of more recent times.
I was thinking in terms of the problems of comparing recent level of hurricane activity with the early 20th century and even the late 19th century.
The reliability of the annual ACE for that early period is surely very poor. For instance, did 1933 truly better 2005 for annual ACE? Or conversely, was it even more the record-holder than it appears on the record? And if this yields a level of uncertainty for a single year’s ACE, the rolling average of annual ACE over longer periods will also be unreliable.
The situation is even more of a problem when Storm Numbers are considered. Short lived storms, or storms briefly becoming hurricane-force or major hurricane force would make a nonsense of such accounting.
My thought was that, while ACE may be inaccurate, the lion’s share of the annual total is always major storms. (Perhaps 2020 with its very large number of storms is becoming the exception here.) Thus annual ACE from early years could be assumed not wildly inaccurate. And if this were so, a 5-year run with ACE>100 is seemingly exceptional.
Of course, El Niño will suppress hurricane activity but as you discuss it is not the only factor and forecasters can be caught by surprise by low hurricane activity, as in 2013.
And likely wrapped up in the same considerations is an explanation for the hurricane activity of the late 1990s to early 2000s not being matched by the activity in the years since.
Certainly there have been more years with low activity, but in terms of years with high ACE, the most recent years with higher SST than 1995-2005 (or was it higher where it mattered?) do not show anything as strongly as 1995-2005 in the ratings. (And of course for the satellite era these ACE values are accurately comparable.) The top ten years for ACE since 1995 run:-
2005, 1995, 2004, 2017, 1998, 1999, 2003, 1996, 2010, 2008.
And in those 2017 was high because of a seriously crazy September.
So I do think there is some learning to be had about what has been the drivers/suppressors of hurricane activity over the last 25 years. But, as explained above, in the SeptUV thread I was addressing earlier times.
Al Bundy says
Ken F: The costs and difficulties of mechanical undergrowth “management” at such scales are staggering – even were such methods found to be compatible with a viable forest ecosystem or even enduring exploitation of timber.
AB: Actually, rational forest management pays for itself and provides plenty of forest products while turbocharging evolution (so forests can evolve to the new climate) and preventing serious forest fires.
Like engines and democracy, if ya insist that old failed techniques are the only possible way, then you are holding back progress.
I’ve developed a forest management system that actually works. Perhaps we can talk about it sometime, Ken.
William B Jackson says
#10 You may not be able to make this stuff up but as you have proven “you” can definitely completely misunderstand reality. Plus you have a habit of cherry picking nonsense.
David B. Benson says
For real, by Elizabeth Kolbert:
https://www.newyorker.com/news/annals-of-a-warming-planet/three-scenarios-for-the-future-of-climate-change
Bad, worse or worst.
Waytoolatenow says
New Zealand bushfire that demolished village leads to climate crisis debate
Lake Ōhau village is located at the foothills of the Ben Ohau mountain range, and is home to just 15 permanent residents but its numbers swell significantly during the holiday season. On Sunday morning, a fire tore through the foothills and into the village, forcing 90 people to evacuate.
Although no one was killed or seriously injured, about 50 of the village’s 70 homes were destroyed, as was 4,600 hectares of land, of which 1,900 are conservation estate.
As firefighters continue to subdue the blaze, and residents pick through the ruins of their homes, attention is turning to how and why the fire began, with emergency crews on the scene saying the ferocity of the blaze was rare for New Zealand, and usually only seen in bushfires in Australia, or California.
Climate scientists have warned longer and hotter summers as a result of the climate crisis are making such scenarios more likely, while local officials and industry groups have blamed land management practices.
more
https://www.theguardian.com/world/2020/oct/06/new-zealand-bushfire-that-demolished-village-leads-to-climate-crisis-debate
Adam Lea says
MA Rodger @15: Regarding 1933, I’d estimate the true ACE could easily be higher than the recorded ACE. There is a notable lack of tropical cyclone genesis in the eastern Atlantic (only two long trackers formed near the Cape Verde islands), and almost all the storms seem to have formed on the doorstep of the Caribbean islands and the U.S. Were there really no storms in that part of the Atlantic, or could there have been one or more significant storms which formed and recurved quickly, that were missed because they didn’t hit land or any ships? Hurricane Lorenzo last year is the type of storm that could have easily been missed in the early 20th century.
MA Rodger says
Copernicus ERA5 re-analysis has been posted for September with a global anomaly of +0.63ºC, up from Aug’s +0.44ºC and also above last 4 months (May-Aug) but sitting below Jan-Apr. 2020 ERA5 anomalies for the year-so-far sit in the range +0.80ºC to +0.44ºC.
September 2020 is the warmest on the ERA5 record, sitting above September 2019 (+0.58ºC), 2016 (+0.55ºC), 2017 (+0.48ºC), 2015 (+0.46ºC), 2018 (+0.40ºC), 2014 (+0.38ºC), 2013 (+0.37ºC), 2005 (+0.33ºC) & 2012 (+0.31ºC).
September 2020 has the 18th highest anomaly on the ERA5 all-month record.
With three-quarters of the year gone, 2020 is sitting in 2nd place in the rankings of warmest year-so-far tabulated below. To gain top-spot for the full calendar year from 2016, the 2020 Oct-Dec anomaly would have to average above +0.636ºC and to drop to 3rd spot below 2019 would require Oct-Dec to average less than +0.476ºC.
…….. Jan-Sept Ave … Annual Ave ..Annual ranking
2016 .. +0.65ºC … … … +0.63ºC … … … 1st
2020 .. +0.63ºC
2019 .. +0.56ºC … … … +0.59ºC … … … 2nd
2017 .. +0.53ºC … … … +0.54ºC … … … 3rd
2018 .. +0.43ºC … … … +0.46ºC … … … 4th
2015 .. +0.37ºC … … … +0.45ºC … … … 5th
2010 .. +0.34ºC … … … +0.32ºC … … … 6th
2014 .. +0.28ºC … … … +0.30ºC … … … 7th
1998 .. +0.27ºC … … … +0.21ºC … … … 14th
2005 .. +0.26ºC … … … +0.29ºC … … … 8th
2007 .. +0.25ºC … … … +0.23ºC … … … 12th
A graph of ERA5 monthly anomalies plotted year-on-year is here (usually 2 clicks to ‘download your attachment’).
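The "Oct-Dec would have to average" thresholds above come from a simple weighted mean over months; a sketch follows. Note that MAR's +0.636ºC and +0.476ºC are evidently computed from unrounded data, so the rounded table values reproduce them only approximately.

```python
# annual = (9 * jan_sep + 3 * oct_dec) / 12, solved for oct_dec.
def oct_dec_needed(target_annual, jan_sep):
    """Oct-Dec average needed for the calendar year to reach target_annual."""
    return (12 * target_annual - 9 * jan_sep) / 3

# Rounded table values: 2020 Jan-Sept +0.63ºC,
# 2016 annual +0.63ºC, 2019 annual +0.59ºC.
beat_2016 = oct_dec_needed(0.63, 0.63)        # ~ +0.63 (stated: +0.636)
drop_below_2019 = oct_dec_needed(0.59, 0.63)  # ~ +0.47 (stated: +0.476)
```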
Kevin McKinney says
…and yet again:
https://weather.com/storms/hurricane/news/2020-10-07-hurricane-delta-forecast-us-gulf-coast-yucatan-peninsula-0
Hit Cat 4 Tuesday night before (luckily) weakening to Cat 2 prior to slamming the Yucatan.
But watch out, Louisiana!
Kevin McKinney says
I want to highlight Mal’s useful response to a point I made. His post unfortunately got a little lost due to the closure of the UV thread at month’s end. Link:
https://www.realclimate.org/index.php/archives/2020/09/unforced-variations-sep-2020/comment-page-5/#comment-777645
Bottom line, I was right but only up to 2018, when Congress issued a large extra allocation specifically for fighting wildfire. (See above link for more.) Presumably, that has helped forest management budgets out–but there’s a big backlog to be cleared. (So to speak.)
Also getting lost at the end of the month was this comment of mine:
(Spotify, Apple Music, and Youtube all work, too, with a little searching of “Doc Snow” and “Carolina Maze.”)
Russell says
18
Sad to see The New Yorker turn into a well written climate blog re- re-runs : Elizabeth Kolbert is channelling Bill McKibben channelling Jonathan Schell
“So how hot—which is to say, how bad—will things get? One of the difficulties of making such predictions is that there are so many forms of uncertainty, from the geopolitical to the geophysical. (No one, for example, knows exactly where various “climate tipping points” lie.) That being said, I’ll offer three scenarios.”
Al Bundy says
Mrkia: fire fighting are not practical in wilderness due to the no-mechanized-tools rules
AB: what a doofus. As if laws and rules can’t be changed. Really? You are claiming that mechanical systems are impossible because laws can NEVER change? Even a sub-genius knows that laws adjust to fit whims. Much disagreement on whose whims, but your post is obviously a deliberate ploy, “we’re helpless because this rule exists and even though I scream about the need to erase all rules I will support this one to death because…..”
finish the sentence, dweeb.
MA Rodger says
Solar Jim @13,
You ask for confirmation about the “Earth Energy Imbalance” and set out that this EEI should be zero, this “implying negative emissions.”
While the EEI remains positive the planet will warm, the positive EEI representing our past & future GHG emissions which have yet to act. Once our direct* GHG emissions are zero, the EEI will begin to decline through:-
(1) The natural draw-down of GHGs which for CO2 means into the oceans. For CO2, the anthropogenic CO2 increase will halve, a process taking a millennium but with the lion’s share of this draw-down appearing over the first century.
So if we stopped CO2 emissions tomorrow, we could expect an eventual CO2 level of perhaps 350ppm, *although this assumes we would not have kicked off any significant CO2 emissions from the biosphere.
And there would be the residual forcing from other GHGs to be factored in as well.
(2) Planetary warming caused by any left-over EEI which may still need satisfying. Thus, with that potential reduction to 350ppm CO2 by AD3100, if emissions magically stopped tomorrow and if ECS=3.2ºC and with no other legacies from AGW, we would be left with a global temperature increase of something like +1.25ºC. But that takes no account of the temperatures we would encounter before AD3100, and with continuing emissions that theoretical AD3100 increase is rising by the year.
When these two processes have run their course, we will have established our new planetary temperature. If that temperature were considered to be looking too high, that would be the point when we would require anthropogenic negative emissions. The science seems to be saying that, for AGW to be restricted to +1.5ºC and with the likely emissions we still expect before we reach zero emissions, we will require significant negative emissions prior to 2100.
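The "perhaps 350ppm" figure in (1) follows from simple round-number arithmetic, assuming natural draw-down eventually removes about half of the anthropogenic increase (illustrative numbers, not model output):

```python
# Halving the anthropogenic CO2 increase above pre-industrial levels.
preindustrial = 280.0                               # ppm
current = 412.0                                     # ppm, roughly today's level
eventual = current - (current - preindustrial) / 2  # 346 ppm, i.e. "perhaps 350"
```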
Adam Lea says
22: Unfortunately Delta looks likely to come ashore in the same area that Laura devastated, where there are still a lot of blue tarps where roofs used to be. Looking less likely it will be a major hurricane at its second landfall, but the storm is forecast to get wider, which will increase the storm surge potential.
MA Rodger says
RSS has posted its TLT anomaly for September (but so far not updated the browser).
The RSS September global TLT anomaly is +0.87ºC, up from Aug’s +0.77ºC and the third highest of the year, sitting below Jan & Feb. 2020 RSS TLT anomalies for the year-so-far sit in the range +0.74ºC to +1.02ºC.
September 2020 is the 3rd warmest on the RSS TLT record (2nd in UAH), sitting below Septembers 2019 & 2017 (both +0.89ºC) and above September 2016 (+0.82ºC), 2010 (+0.66ºC), 2015 (+0.66ºC), 2009 (+0.62ºC) & 2012 (+0.59ºC).
September 2020 has the 10th highest anomaly on the RSS TLT all-month record (=10th in UAH).
With three-quarters of the year gone, 2020 is sitting in 2nd place in the rankings of warmest year-so-far tabulated below (3rd in UAH). To gain top-spot for the full calendar year from 2016, the 2020 Oct-Dec anomaly would have to average above +0.74ºC and to drop to 3rd spot below 2019 would require Oct-Dec to average less than +0.47ºC.
…….. Jan-Sept Ave … Annual Ave ..Annual ranking
2016 .. +0.87ºC … … … +0.80ºC … … … 1st
2020 .. +0.83ºC
2019 .. +0.74ºC … … … +0.74ºC … … … 2nd
2017 .. +0.69ºC … … … +0.63ºC … … … 4th
2010 .. +0.67ºC … … … +0.68ºC … … … 3rd
2015 .. +0.66ºC … … … +0.58ºC … … … 6th
1998 .. +0.55ºC … … … +0.61ºC … … … 5th
2018 .. +0.53ºC … … … +0.54ºC … … … 7th
2014 .. +0.48ºC … … … +0.47ºC … … … 9th
2005 .. +0.47ºC … … … +0.48ºC … … … 8th
2014 .. +0.46ºC … … … +0.42ºC … … … 11th
A graph of the last-decade’s SAT & TLT monthly anomalies is here (usually 2 clicks to ‘download your attachment’).
A graph of RSS TLT monthly anomalies plotted year-on-year is here (2 clicks).
(And up thread the ERA5 re-analysis year-on-year monthly anomaly graph suffered a duff link. Not sure what happened but here it is again (2 clicks).)
Al Bundy says
Western Hiker: Pine plantations are notoriously, incredibly flammable until they’re 60 to 80 years old,
AB: Pine heartwood is used to make fatwood kindling. Lights about as easily as paper. And pine needles are way flammable when dry. Pines are oily trees. So I believe you 120%.
And 60-80 years sounds suspiciously impractical. If the rotation is that or less, then the only “safe” period is the few years immediately after a cut.
And those pine plantations are surreal to drive past. The geometry goes on forever, so it’s like hundreds of hallways.
But, like I said, the problem is easily solved. I’m sure everyone here has all the data they need to come up with the solution. Heck, it’s as obvious as the temporal torque transfer device (which cuts the number of cylinders an engine needs for a given smoothness in half).
Yep, not actual but effective time travel is easy, almost 100% efficient, and dirt cheap, but only for torque. Send half of a cylinder’s torque into the future and the engine as a whole acts like it has twice as many half-sized cylinders.
Al Bundy says
Yo EP!
I challenge you to figure out how a temporal torque transfer device works.
I’ll even give you a clue:
If an electric motor fails, depending on design, it might result in a temporal torque transfer device.
Kevin McKinney says
#29–
Just a side note: hereabouts–i.e., the Carolina sandhills region–there’s a fair bit of longleaf pine restoration happening, for instance at the Battle of Camden site:
https://www.historiccamden.org/camden-battlefield/
(In the pictures you can see both the currently dominant loblolly pines, and young longleaf pines. Over much of the site the loblollies have now been removed by cutting, and they will be suppressed on an ongoing basis by intentional burns.)
Point being–as Janisse Ray discussed so beautifully in her Ecology of a Cracker Childhood–the longleaf pines are fire-adapted–historically, they dominated much of the US southeast, but struggled due to over-harvesting (they are a great source of softwood lumber) and fire suppression. So, not *all* pines are highly flammable!
Anyway, carry on!
Al Bundy says
The Guardian: The actual lag effect between halting CO2 emissions and halting temperature rise, then, is not 25 to 30 years but, per Mann, “more like three to five years.”
In short, this game-changing new scientific understanding suggests that humanity can turn down the heat almost immediately by slashing heat-trapping emissions.
AB: Look at that. A “Things are better than we thought” moment!
MA Rodger says
The JAXA Arctic Sea Ice Extent numbers look like handing 2020 another couple of days of ‘lowest SIE for the time of year’ record in the coming week. This post-minimum October situation of dipping SIE anomalies has become a frequent phenomenon recently, shown more clearly on an anomaly plot of JAXA SIE as shown here (usually 2 clicks to ‘download your attachment’).
I say “frequent” as this dip in SIE anomaly post-minimum is the same path followed in 2007, 2011, 2016, 2018, 2019 and now 2020. And perhaps it seems to be saying that the impact of AGW and all that extra heat sloshing about up there can more easily delay the onset of the freezing season than it can reduce the minimum SIE in September. Perhaps that residual SIE at the height of the melt may be a harder nut to crack than it would simplistically appear.
Perhaps there is another bit of evidence to consider.
A plot of PIOMAS Sea Ice Volume anomalies sourced from Wipneus at Arctic Neven’s is perhaps showing the same/similar situation but in terms of the volume pre-minimum.
By the height of the melt in September, the zero anomaly is just 4,000 cu km below the recorded SIV anomaly. But the anomaly is achieving 2,000 cu km lower values in early July when there is 13,000 cu km to melt. This means the melt of SIV July-to-September in recent years is actually less than seen in earlier years. 2,000 cu km is ‘missing’ relative to earlier years and that ‘missing’ melt is a very significant part of the remaining minimum in September.
(The graph 14b (XIVb) at MARCLIMATEGRAPHS plots the appearance of the ‘zero anomaly’ on a monthly PIOMAS anomalies plot [note the different anomaly base], but a direct link is being denied by GoogleSites for some reason. Being monthly data, the dip in early July is less pronounced.)
Of course, PIOMAS is the output of a model and those July values may be the most difficult to check being in the height of the melting when melt ponds make even Sea Ice Area difficult to measure. But if the SIV values for July are robust, it does raise the question: Why would the SIV be seeing a dipping anomaly pre-minimum while SIE sees it post-minimum? Speculating, do draining melt ponds on the thin July ice prevent a lower minimum in September while the warmer seas prevent the quicker spread of the ice edge in October?
Killian says
24 Russelll: Sad to see The New Yorker turn into a well written climate blog re- re-runs : Elizabeth Kolbert is channelling Bill McKibben channelling Jonathan Schell
I fixed it for her. Please forward.
“So how hot—which is to say, how bad—will things get? One of the difficulties of making such predictions is that there are so many forms of uncertainty, HOWEVER, THE LONG-TAIL RISK IS SO EXTREME WE CANNOT RISK IT UNDER ANY CIRCUMSTANCES. We must change virtually everything in all industrialized nations.”
Killian says
Solar Jim: Thanks to MAR for last month’s comment at #44, about the von Schuckman et al (2020) paper “Heat stored in the Earth system.” This research recommends that “Earth Energy Imbalance” become the fundamental metric to measure humanity’s response.
Hmmm… The energy balance could be achieved in ways that are utterly unsustainable and completely unjust. It’s an indicator, not a measure of successful responses.
The only measure that matters is regenerative or not.
MA Rodger says
Adam Lea @20,
I agree entirely that seasonal totals for Atlantic hurricane ACE in earlier years could be significantly wrong. The apparent absence of lesser storms has been kicked about in the literature. I am less familiar with discussion of geographical absences, but I’m sure they have not gone without consideration.
I think we agree that the direct comparison of pre-satellite storm numbers and of ACE totals with the data we now have in the satellite era has many elephant traps to negotiate.
My thoughts on the number of consecutive ACE=+100 years were speculation on a way to compare these earlier pre-satellite years with the satellite era that avoided these elephant traps.
Thus I pointed to the run of years with high seasonal ACE totals.
We have now had five ACE=+100 in a row 2016-2020 in the satellite era. Given seasonal ACE totals can be considered inaccurate in the pre-satellite era, the question I ask is “How inaccurate?”
The lion’s share of any ACE total is the big long-lived storms. We may underestimate their ACE by a significant percentage but we cannot have missed such storms. So if we say that seasonal ACE totals (that predominantly comprise those big long-lasting storms) could for pre-satellite days underestimate an ACE=+100 year of the late 1940s, how much could that underestimation comprise? Could an actual ACE=+100 year be recorded today as low as ACE=90? Or ACE=80? Or ACE=70? And ditto for years back in the 1800s. And having set such a lower limit for what potentially could be an ACE=+100 year, is then 1947 elevated from ACE=88, 1948 from ACE=95 and 1949 from ACE=96, all to ACE=+100 to create a 5-year run 1947-51? And could 1895 be elevated from ACE=69 to ACE=+100 to provide a six-year run of ACE=+100 1891-96?
But if those adjustments are seen as too big, adjustment beyond credibility, then the 2016-20 run of five consecutive ACE=+100 years (and likely five ACE=+125 years) would then be unique on the record.
Al Bundy says
I’ve decided to not pursue ownership of my forestry ideas. Here is a top level description:
A forest that has been recently cut is easy to defend from fire. Each year of growth makes it harder, probably all the way to the next cutting time.
Each forest has an optimal rotation rate. The characteristics of a firebreak that will almost certainly hold (with human help) can be predicted. The worst winds in fire season will likely be from a specific direction, for example (and geologic features are constants, more or less).
Rotation is 60 years? Make another firebreak 60 firebreak-widths away. Repeat.
Next year a new set of firebreaks is harvested such that a fire would likely have to get past last year’s firebreak first. Repeat yearly.
The firebreaks have mother trees and wildlife corridors. The corridors function as chokepoints for genetic transfer, creating almost but not quite separate populations. Genes that better fit the new environment will arise and become dominant in one population, and then spread to other populations via the corridors, where they’ll hook up with different advantageous mutations, providing hybrid vigor.
Islands of never-to-be-touched forest are scattered throughout the pattern (generally in the toughest terrain).
The long strips allow for a very large area of harvest per mile of service rail so the harvest can get to market at reasonable cost.
And if a fire gets past a firebreak, so what? It will feast for a few firebreak-widths, but then it has to fight through ever sparser fuel all the way to the next break.
Thoughts?
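One way to read the scheme: as a minimal editor's sketch (the numbers are illustrative assumptions, not part of Al's description), the strip rotation can be encoded as a schedule in which strip i is freshly cut, and hence a firebreak, every ROTATION years:

```python
ROTATION = 60  # years between cuts of the same strip (assumed rotation rate)

def years_since_harvest(strip, year):
    """Age of regrowth on a strip; strip i is cut in years i, i+60, i+120, ..."""
    return (year - strip) % ROTATION

def fuel_profile(year, n_strips=120):
    """Fuel ages across a transect of strips in a given year."""
    return [years_since_harvest(s, year) for s in range(n_strips)]

# In year 10, strips 10 and 70 are fresh firebreaks, 60 strip-widths apart.
# Each fresh break sits next to last year's break, so a fire crossing one
# break meets a gradient of fuel ages rather than uniform old growth.
ages = fuel_profile(year=10)
fresh_breaks = [s for s, a in enumerate(ages) if a == 0]
```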
Kevin McKinney says
Reading one of those random stories that gets pushed to one’s desktop this morning over coffee, I was struck by this snippet from a piece on computer energy use:
Hmmm… thermodynamics in a non-equilibrium system… like, say, Earth’s climate system.
Rule one: search the literature before asking stupid questions. (They may still be stupid, but at least they’ll be a slightly more informed brand of stupid.)
So, searching “non-equilibrium+statistical+physics+climate” and (noticing an auto-fill suggestion “non-equilibrium+thermodynamics+climate” turned up a smattering of papers. Browsing from ignorance, one doesn’t know how to separate wheat from chaff, but there was this:
It shows several concerning signs, notably an apparent mismatch between topic and journal focus, and an abstract that frankly is poorly written. (But maybe that’s just ESL.) Plus, just exactly how obscure is the ‘Journal of Coastal Research’, anyway? Then there’s the fact the article has all of 4 citations.
Yet this–from the abstract I just impugned–was intriguing:
Continuing on through the search results, it became evident that one researcher seriously focused on this and related areas is Valerio Lucarini, apparently associated with the Universities of Reading and Bologna; he has several related papers to his credit.
A non-mathematically-technical paper of his, which however seems not to have appeared except online and as a book chapter–i.e., non-peer-reviewed sources–is here:
https://www.researchgate.net/profile/Valerio_Lucarini/publication/51910325_Modelling_Complexity_the_case_of_Climate_Science/links/0f317535f84199ea0f000000.pdf
It’s not easy reading because of the dense and occasionally jargon-laden prose, but is otherwise pretty accessible. And it makes some interesting points about climate modeling, and the epistemological criteria pertaining thereto. (Interesting to me, at least.)
But although the second search–the ‘non-equilibrium thermodynamics climate’ one–turned up 14k+ citations, a sampling of the first couple of search pages seemed to include a high percentage of papers that were either old, or apparently tangential in some way–Dottore Lucarini aside.
So, is this an area of which one can say ‘something is happening here?’ Or just a backwater, or maybe a sideshow?
Solar Jim says
MAR @ 26.
Thanks for your comments. You seem to be much more complacent about the danger to the world’s seacoasts than many scientists are. You make several controversial statements which may derive from your misunderstanding of what EEI represents.
RE: “the positive EEI representing our past & future GHG emissions which have yet to act.” This is untrue, as I understand it. The value of EEI is derived from the heat flux (actual energy transfer, not radiative forcing, RF) into the ocean in recent times. This has nothing to do with future GHG emissions, only the past. Additionally, since RF is higher than the Earth Energy Imbalance, the quantity of EEI will continue to increase toward the net RF value. (Both of these are measured in watts per square meter over the entire surface of the planet.)
Regards.
Al Bundy says
Solar Jim,
I agree that the imbalance is immediate. The delay is primarily lag due to thermal mass, such as why an adobe house stays cool on a hot afternoon. The oceans, for example, represent a 1000 year partial delay, both in temperature and CO2 levels. And Greenland and Antarctica have stupendous amounts of phase-changing (and lapse rate) to give to the cause of delaying the inevitable.
Then there’s feedbacks. As things warm up, reactions that further heat things up tend to occur.
MAR knows what is going on. Typos happen.
Donald Condliffe says
Regarding the management of forests, I like to look at what real experts say. One such person is Dave Daley, who published an article, “Cry for the Mountains,” in the Chico Enterprise on September 27, 2020. Daley is a rancher and comes from a ranching family that has run cattle since 1852 in the National Forest that burned in this year’s North Complex fire. His expertise on forest management is also academic: he is a PhD who teaches forest and land management. He is also Past President of the California Cattlemen’s Association, current Chair of the California Cattle Council, Chair of the Forest Service committee for the Public Lands Council, and Chair of Federal Lands for the National Cattlemen’s Beef Association, and he is familiar with the many thousands of years of local Native American land management practices. His conclusions therefore carry great weight in my mind. His view is that California’s fire danger has two main causes: mismanagement of the forests for 8 decades AND global warming causing the mismanaged forests to become drier. Definitely NOT just one or the other. One of these, forest management, we can improve with local action and a return to more intensive management of large parts of our forests for fire control, using controlled burns following practices adapted from ancient native practices that his family used until the 1960s. In addition, the local tribes are proposing that intensive forest management be done using local labor and local companies, including the tribes, with preferentially awarded management contracts for local co-ops and companies, producing reduced fire risk, economic development, higher forest productivity and more jobs, in particular good jobs that do not require a college degree. The other problem, global warming, requires global action, and frankly I doubt we will succeed in limiting the switch to a hotter stable state.
Donald Condliffe says
Re 37. Al I have seen this idea before when I attended the University of Oregon in Eugene in the 1970s. It also resembles closely the pie sector harvesting plan then proposed that would divide an area into slices allowing harvest of a slice every 10 to 20 years depending on the age of tree desired. The plan was to break up the landscape into a tree age patchwork with many fire breaks so fires would always remain limited. The pattern that existed before European settlement. Before our interventions, now seen to have gone awry.
I think the problem with idealized plans like this is that the term “ivory tower” applies in spades. Compared to people with decades of experience, informed by study of local practices stretching back thousands of years, by deep understanding of the local terrain, and by the wisdom accumulated by local tribes about the boundaries of climate extremes (now confirmed by paleoclimate studies that exceed all worst-case estimates currently in use), an outsider cannot possibly prescribe better than informed locals who have a stake in the success of the plan.
Karsten V. Johansen says
Could someone enlighten me regarding which research Michael Mann is referring to here: “Using new, more elaborate computer models equipped with an interactive carbon cycle, “what we now understand is that if you stop emitting carbon right now … the oceans start to take up carbon more rapidly,” Mann says. Such ocean storage of CO2 “mostly” offsets the warming effect of the CO2 that still remains in the atmosphere. Thus, the actual lag between halting CO2 emissions and halting temperature rise is not 25 to 30 years, he explains, but “more like three to five years”.
This is “a dramatic change in our understanding” of the climate system that gives humans “more agency”, says Mann. Rather than being locked into decades of inexorably rising temperatures, humans can turn down the heat almost immediately by slashing emissions promptly. “Our destiny is determined by our behavior,” says Mann, a fact he finds “empowering”.”
https://www.theguardian.com/us-news/2020/oct/02/donald-trump-climate-change-michael-mann-interview
Is it this:
https://scitechdaily.com/our-oceans-are-capturing-more-carbon-than-expected-underestimated-by-up-to-900000000-metric-tons-of-carbon-per-year/
https://www.researchgate.net/publication/344122812_Revised_estimates_of_ocean-atmosphere_CO_2_flux_are_consistent_with_ocean_carbon_inventory ?
And how does this connect with this:
https://scitechdaily.com/ocean-may-absorb-less-co2-as-man-made-carbon-emissions-are-cut/ ?
MA Rodger says
Solar Jim @39,
My apologies. You are correct in spotting the error I managed @26. I say “managed” as it is not what I wanted to write; it probably resulted from editing that comment @26, which got a bit more complex as it was being written. The words “& future” somehow escaped the editor’s cut.
The Earth’s Energy Imbalance (EEI) today represents the positive Radiative Forcing (RF) from the increased GHGs (only today’s increase, not “& future” ones) that has yet to warm the planet (ΔT) such that RF = EEI + f(ΔT). Note that as the RF does its stuff and increases ΔT (which as you say mainly requires warming the oceans), the value of f(ΔT) will increase and EEI will thus decrease, eventually EEI decreasing to zero when equilibrium is achieved. (You ‘manage’ to set this the other way round @39.)
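The relation above can be illustrated with a linear feedback, f(ΔT) = λ·ΔT. In this editor's sketch, the feedback parameter λ is an assumption (≈1.16 W/m²/K, roughly consistent with ECS ≈ 3.2ºC for a 3.7 W/m² CO2 doubling), and the forcing value is purely illustrative:

```python
lam = 3.7 / 3.2  # W/m^2 per K, assumed climate feedback parameter

def eei(rf, dT):
    """Earth Energy Imbalance remaining after warming dT under forcing rf.

    Rearranges RF = EEI + f(dT), with f(dT) = lam * dT.
    """
    return rf - lam * dT

rf_now = 2.0                   # W/m^2, illustrative present net forcing
equilibrium_dT = rf_now / lam  # warming at which EEI has fallen to zero (~1.7 K)
```

As ΔT grows toward RF/λ, EEI shrinks to zero: a larger f(ΔT) term means a smaller remaining imbalance, which is the direction of the relationship being clarified here.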
….
You also mention Sea level Rise as being connected to my understanding of EEI and my disagreements concerning SLR, presumably my disagreement with some who consider multi-metre SLR by 2100 the likely outcome. There is only a tenuous SLR/EEI connection.
In terms of SLR, I have long worried that the very significant SLR beyond 2100 is being ignored and also have challenged claims of multi-metre SLR by 2100, two separate things.
On the multi-metre SLR by 2100: a decade back, the sole serious advocate of multi-metre 2100 SLR was James Hansen, this is an issue of melting ice caps. My argument is/was that given the EEI there is a limit to the energy available to melt the volume of ice required for such a sudden SLR. When the Hansen thesis was eventually set out in the discussion paper Hansen et al (2015)‘Ice melt, sea level rise and superstorms: evidence from paleoclimate data, climate modeling, and modern observations that 2ºC global warming could be dangerous’ you will note the modelled EEI of 4Wm^-2 is achieved through a period of significant negative ΔT. Note that in this whole multi-metre SLR business, I was not arguing against the possibility but against the absence of a workable theory to back it up.
And on the same score, I would criticise the article linked @1 & discussed @5. It is very thin on science to support the message of 15m SLR. It simply says that 3 million years ago the Arctic was ice-free with the ocean’s coasts cloaked in boreal forest and the seas 15m higher, pointing out that three million years ago was also the last time we saw CO2 at 412ppm.
My understanding is that the level of CO2 three million years ago was probably less than 400ppm (so the last time CO2 was at 412ppm would be much earlier, perhaps 13 million years ago) and that other things like the absence of the Panama Isthmus had a role in the warmer global temperature three million years ago. And in setting out such a view I have discovered folk who will passionately defend +400ppm three million years ago, this resulting in lively interchanges.
Killian says
32 Al Bundy: The Guardian: The actual lag effect between halting CO2 emissions and halting temperature rise, then, is not 25 to 30 years but, per Mann, “more like three to five years.”
In short, this game-changing new scientific understanding suggests that humanity can turn down the heat almost immediately by slashing heat-trapping emissions.
As I have said, return to 260-280 ppm, you start stabilizing the poles within decades, thus potentially avoiding many meters of SLR. So, this isn’t new news so much as finer detail.
Susan Anderson says
The New Yorker provides a good overview, “How Does Science Really Work?”:
https://www.newyorker.com/magazine/2020/10/05/how-does-science-really-work
It addresses my long-held irritation with the word “belief” as applied to science. While comprehension of how science works is perhaps partly cultural, in fact science functions pretty much without “faith”, which is something else entirely. It has the side benefit of bypassing Popper and Kuhn, about whom arguments can be head-spinning. I particularly abhor the word “falsifiability”, which is catmeat to science deniers: by using a term that implies its opposite, it invites misuse.
Susan Anderson says
re my previous, it is from a NYer book review (10/5/20 issue) of this:
“The Knowledge Machine: How Irrationality Created Modern Science” (Liveright) by Michael Strevens, a philosopher at New York University. These reviews often go deep.
Barton Paul Levenson says
Got a question for the blog owners (Gavin et al.):
The deniers often say the greenhouse effect “violates the second law of thermodynamics” because “you cannot have heat flow from a cold source to a warm one.” In reality, of course, 2LOT only says you can’t have NET heat flow from a cold source to a warm one. I tried to think of a simple counterexample, so I took a square meter of atmosphere at 255 K and a square meter of ground at 288 K.
Both objects are assumed to be blackbodies (ε = 1), with up taken as positive. The atmosphere radiates -239.7576 joules in one second (downward, toward the ground), the ground radiates 390.1051 upward. Taking ΔS = Q / T, we have an entropy change of -0.94 joules per kelvin for the atmosphere and 1.35 J/K for the Earth. The total is 0.41 J/K, which is positive: entropy has increased for the system as a whole, so 2LOT is not violated.
My question is–is this a legitimate calculation? Can I simply take the temperature of each object as the appropriate temperature for the 2LOT equation? What mistakes have I made? I’d appreciate input from any competent person.
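In case it helps anyone check the arithmetic, here is the calculation above in a few lines, alongside one alternative “net exchange” bookkeeping (ground loses the net heat at 288 K, atmosphere gains it at 255 K) that is a standard way to frame the 2LOT question, offered as a comparison rather than as a verdict on which is correct:

```python
# Reproduce the numbers in the comment, plus the "net exchange" version.
# Both bodies are 1 m^2 blackbodies radiating for 1 second.
sigma = 5.670374e-8          # Stefan-Boltzmann constant, W m^-2 K^-4
T_atm, T_gnd = 255.0, 288.0  # temperatures, K

E_atm = sigma * T_atm**4     # ~239.76 J emitted by the atmosphere
E_gnd = sigma * T_gnd**4     # ~390.11 J emitted by the ground

# Bookkeeping as in the comment: each emission at its source temperature
dS_comment = -E_atm / T_atm + E_gnd / T_gnd   # ~ +0.41 J/K

# Net-exchange bookkeeping: the ground loses (E_gnd - E_atm) net joules
# at 288 K while the atmosphere gains the same amount at 255 K
Q_net = E_gnd - E_atm                          # ~150 J net upward
dS_net = -Q_net / T_gnd + Q_net / T_atm        # ~ +0.07 J/K, still positive

print(f"comment bookkeeping: {dS_comment:+.3f} J/K")
print(f"net exchange:        {dS_net:+.3f} J/K")
```

Both bookkeepings come out positive, so on either accounting the total entropy of the pair increases and the back-radiation does not violate 2LOT.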
mike says
re AB at 32: I think it is possible that Mann is wrong about the time lag on heat, because the feedbacks that have already started would take longer than 3 to 5 years to produce the cumulative heat we have bought at any particular moment.
For instance, think about the heat buildup that arises slowly from warmed oceans melting ice and changing the albedo of the planet. The ways the oceans absorb heat and return it to the atmosphere likely run on cycles longer than 3 to 5 years, and that fluctuating ocean heat will be similarly slow to produce all the ice melt that follows from ocean heat storage. The heat increase from loss of ice and glacier albedo is also likely to take many years to stabilize to a point where we could say definitively that the heat increase has stopped.
But, and this is the important point, there is no indication or reason to believe that our species is able to stop increasing GHG emissions to the atmosphere, so the whole discussion about time lag is of academic interest; it does not appear to have any real-time application in human society.
I would love to be wrong about this, but let’s face it, we have known that ghg emissions are a problem for a long time and our emissions have just continued to grow instead of moving toward a net zero condition. The discussion of what happens if we stop emitting greenhouse gases does not appear to have a real world application.
I think it makes more sense to ask questions about when our cumulative emissions start to blow the wheels off the economic engine and the activities that produce emissions. We are much more likely to test those scenarios than we are to test how long heat continues to build after we hit net zero.
this is how we have done over the past ten years:
Last week (September 20 – 26, 2020): 411.00 ppm
1 year ago (September 20 – 26, 2019): 408.34 ppm
10 years ago (September 20 – 26, 2010): 386.81 ppm
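For what it’s worth, the growth rate implied by those three readings (my quick arithmetic on the values as quoted above):

```python
# Growth implied by the weekly CO2 values quoted above (ppm)
latest, one_yr_ago, ten_yr_ago = 411.00, 408.34, 386.81

rise_1yr = latest - one_yr_ago
rise_10yr = latest - ten_yr_ago
print(f"1-yr rise:  {rise_1yr:.2f} ppm")
print(f"10-yr rise: {rise_10yr:.2f} ppm ({rise_10yr / 10:.2f} ppm/yr average)")
```

Roughly 2.4 ppm per year averaged over the decade, and the single-year rise is above that average, which is the “emissions have just continued to grow” point in numbers.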
Mike
nigelj says
“Climate change is largely responsible for a doubling in the number of natural disasters since 2000, the United Nations said on Monday, warning that the planet was becoming uninhabitable for millions of people…”
https://www.msn.com/en-gb/news/environment/climate-change-largely-to-blame-for-a-near-doubling-of-natural-disasters-since-year-2000-says-un/ar-BB19WS5R?li=BBoPWjQ