New addition: Download an annotated pdf of the Fraser report. An interactive pdf file, to be read on the screen, is here, and a printable version is here. Suggestions for further commenting are welcome. Additions to the pdf have to be short, and tied to particular pieces of text or figures. And of course we will only incorporate comments that we deem to be scientifically sound and cogent.
*****************
While most of the world’s climate scientists were following the IPCC fest last week, a few contrarians left out in the cold were trying to organize their own party.
An unofficial, “Independent Summary for Policymakers” (ISPM) of the IPCC Fourth Assessment report has been delivered by the Fraser Institute. It’s a long, imposing-looking document, resembling, come to think of it, the formatting of the real Summary for Policymakers (SPM) document that was released on Friday after final negotiations of the IPCC in Paris last week. The Fraser Institute has assembled an awesome team of 10 authors, including such RC favorites as tilter-against-windmills-and-hockey-sticks Ross McKitrick, and other luminaries such as William Kininmonth, MSc, M.Admin — whose most recent paper is “Don’t be Gored into Going Along” in the Oct-Nov issue of Power Engineer. To be fair, he did publish a paper on weather forecasting, back in 1973. According to the press release, the London kickoff event will be graced by the presence of “noted environmentalist” David Bellamy. It’s true he’s “noted,” but what he’s noted for is his blatant fabrication of numbers purporting to show that the world’s glaciers are advancing rather than retreating, as reported here.
Why go to all the trouble of producing an “independent” summary? The authors illuminate us with this wisdom regarding the official Summary for Policymakers: “A further problem is that the Summary for Policy Makers attached to the IPCC Report is produced, not by the scientific writers and reviewers, but by a process of negotiation among unnamed bureaucratic delegates from sponsoring governments.” This statement (charitably) shows that the Fraser Institute authors are profoundly ignorant of the IPCC process. In fact, the actual authors of the official SPM are virtually all scientists, and are publicly acknowledged. Moreover, the lead authors of the individual chapters are represented in the writing process leading to the SPM, and their job is to defend the basic science in their chapters. As lead author Gerald Meehl remarked to one of us on his way to Paris: “Scientists have to be ok, they have the last check. If they think the science is not represented, then they can send it back to the breakout groups.”
A common accusation at the time of the Third Assessment Report was that the SPM didn’t reflect the science in the rest of the report. A special National Academy panel was convened at the request of President GW Bush to consider this and other issues. The Panel found no significant disconnect between the SPM and the body of the report. The procedure followed this time is essentially the same as that used for previous IPCC reports.
One of the strangest sections of the Fraser Institute report is the one in which the authors attempt to throw dirt on the general concept of radiative forcing. Radiative forcing is nothing more than an application of the principle of conservation of energy, looking at the way a greenhouse gas alters the energy balance of a planet. The use of energy conservation arguments of this type has been standard practice in physics at least since the time of Fourier. We have heard certain vice presidents dismiss “Energy Conservation” as merely a matter of personal virtue, but we have never before heard people who purport to be scientists write off the whole utility of “Conservation of Energy.” From what is written in the Fraser report, it is not even clear that the authors understand the first thing about how radiative transfer calculations are done. They criticize the radiative forcing concept because it “fails to take into account the lifetime of greenhouse gases” — as if we really needed to know anything more about CO2 in this regard than that it stays around for centuries to millennia. They say that radiative forcing “is computed by assuming a linear relationship between certain climatic forcing agents and particular averages of temperature data.” Nonsense. It is computed using detailed calculations of absorption and emission of infrared radiation, based on laboratory measurements carried out with exquisite accuracy, and meticulously checked against real atmospheric observations.
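To make this concrete, consider a minimal sketch (our illustration, for orientation only) based on the widely used simplified expression for CO2 forcing from Myhre et al. (1998). That expression is itself a fit to detailed line-by-line radiative transfer calculations of the kind described above; note that it is logarithmic in concentration and makes no reference whatsoever to averages of temperature data.

```python
import math

def co2_radiative_forcing(c_ppm, c_ref_ppm=278.0):
    """Simplified CO2 radiative forcing in W/m^2 (Myhre et al., 1998).

    This logarithmic fit summarizes detailed radiative transfer
    calculations; it is shown here only to illustrate that forcing
    depends on gas concentration, not on fitted temperature averages.
    """
    return 5.35 * math.log(c_ppm / c_ref_ppm)

# Doubling CO2 from a preindustrial ~278 ppm yields roughly 3.7 W/m^2:
print(round(co2_radiative_forcing(2 * 278.0), 2))  # 3.71
```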
Hockey-stick bashing and solar-explains-all advocacy are favorite activities of the denialist camp, so it is no surprise to see both themes amply represented in the Fraser Institute report. In neither case does the Fraser report break new ground in bad behavior. It’s just more of the same old same old. On climate of the past millennium, the Fraser report misrepresents the recent National Research Council report, which concluded quite the opposite of what the Fraser report claims it concluded: The National Research Council, like the official SPM, affirms that recent warming really does appear anomalous in light of the past millennium. The Fraser report obscures this point by cleansing the recent period of warming from their graphs. The discussion of solar variability consists of a lot of vague talk about unexplored possibilities, while skirting the basic problem with solar variability as an explanation of recent warming: There is no observed trend in solar activity of a type that could explain recent warming, and if the problem were an unobserved trend in solar ultraviolet, it would make the stratosphere (where UV is absorbed by ozone) trend warmer relative to a constant-solar baseline. In reality, the stratosphere is cooling strongly, and at about the rate the models predict.
The basic approach taken by the Fraser Institute Report is to fling a lot of mud at the models and hope that at least some of it sticks. Of course, if one looks at enough details one is bound to find some areas where there is a mismatch between models and reality. Modellers do this all the time, as a way of improving the representation of physical processes. However, to highlight a few shortcomings without asking what their implications might be for climate sensitivity, or whether the mismatch might be due to data problems rather than model problems (as in the case of tropical lapse rate), gives a distorted picture of the state of the art. An examination of the model shortcomings in the light of the vast range of important things they get right leaves the fundamental premise of the cause of warming unchallenged, and to see why, one needs to turn to a balanced assessment of the science such as represented in the full IPCC report.
The Fraser Institute authors also raise the curious objection that models have not been “formally proven” to be suitable for predicting the future. We are not sure what it would mean to “formally prove” such a thing (Kurt Gödel, are you listening?), but the specific objection raised in the Fraser report makes no sense: the authors suggest that the number of tunable parameters in models is so great that it may exceed the degrees of freedom in the data being “fit.” In reality, there are at most a dozen or two parameters that modellers touch; most of these are constrained to certain limits by data, and there are physical limitations to what one can do to the output by changing such parameters. In contrast, adding up time series of temperature and precipitation and pressure as a function of latitude and longitude, seasonal cycles, surface radiation balance, ocean heat storage, ENSO events, past climates, and vertical structure, there are literally thousands of observational constraints involved in the evaluation of model behavior.
There are so many bizarre statements in the Fraser Institute report that some of us think that spotting them could serve as a good final exam in an elementary course on climate change. Take your pick. The report states that “The IPCC gives limited consideration to aerosols …” whereas aerosols have been a key part of the scenarios since the Second Assessment Report, were the key to explaining the interrupted mid-century warming, and cannot in any way be mangled so as to spuriously give the warming of the past decades. The ISPM regales us with tales of natural global warming in the distant past, without pointing out that these happened over millions of years, often had massive consequences nonetheless, and were linked to processes like continental drift which are unlikely to be part of the explanation of the recent warming. The Fraser report describes the climate changes of the past century as “minor” (a value-laden and subjective term if ever there was one), failing to realize that climate change so far has been the fire alarm, not the fire. The climate of 2100 is not forecast to be mild.
We could go on, but why bother? We’ll leave off with a quote: “most places have observed slight increases in rain and/or snow cover”
Actually, consulting the draft of Chapter 4, snow cover kinda looks like it’s been decreasing, not increasing. But take a look at the artful use of “and/or”. The sentence is not “formally” wrong. Superb! When you hear “ISPM,” just think “Incorrect Summary for Policymakers.”
Note: In the interests of timeliness, this commentary has been based on a January 8 draft of the “ISPM” which was leaked to us. If the final released version differs substantively from what we have seen so far, the changes (for better or worse) will be discussed in the comments.
Ike Solem says
RE#150,
You say: “From a basis of physical plausibility, for starters, the models are solving equations that cannot be derived from first principles nor represent any observed any experimentally determined phenomena.”
Well, here is a list of some of the ocean circulation models in use: http://stommel.tamu.edu/~baum/ocean_models.html
Furthermore, you obviously don’t understand how the models work, or the difference between a short-term weather model and a long-term climate model, which are indeed based on things such as the physical equations of motion. For a brief introduction, see:
http://www-das.uwyo.edu/~geerts/cwx/notes/chap12/nwp_gcm.html
A general circulation model (also known as a global climate model, both labels are abbreviated as GCM) uses the same equations of motion as a numerical weather prediction (NWP) model, but the purpose is to numerically simulate changes in climate as a result of slow changes in some boundary conditions (such as the solar constant) or physical parameters (such as the greenhouse gas concentration). NWP models are used to predict the weather in the short (1-3 days) and medium (4-10 days) range future. GCMs are run much longer, for years on end, long enough to learn about the climate in a statistical sense (i.e. the means and variability). A good NWP model accurately predicts the movement and evolution of disturbances such as frontal systems and tropical cyclones. A GCM should do this as well, but all types of models err so much after some time (e.g. 2 weeks) that they become useless from a perspective of weather foresight. The quality of a GCM is judged, among other things, by the quality of the statistics of tropical or extratropical disturbances.
An error in the sea surface temperature by a few deg C, or a small but systematic bias in cloudiness throughout the model, matter little to a NWP model. For a GCM these factors are important, because they matter over a long term. GCMs ignore fluctuating conditions when considering long-term changes, whereas NWP models take no notice of very slow processes.
Regarding the parameterization of mesoscale eddies, let me refer you back to “On the Mixing Coefficient in the Parameterization of Bolus Velocity,” JPO 1999
From the abstract: “Mesoscale eddies in the ocean play an important role in the ocean circulation. In order to simulate the ocean circulation, mesoscale eddies must be included explicitly or parameterized.” The issue you raised was how to include such features in GCMs, but you only threw out a sentence of jargon: “bi-harmonic parameterization of lateral viscosity”, which sounds ‘scientific’ but is meaningless out of context.
Thus, the only things you say (of a scientific nature) in your post are just wrong. The rest seems designed to create the appearance of scientific controversy.
For those readers who are actually interested in the science, this paper provides a good estimate of the situation in the oceans in 2000:
http://ocean.mit.edu/~cwunsch/papersonline/ganachwunschnature.pdf
“Improved estimates of global ocean circulation, heat transport and mixing from hydrographic data,” Alexandre Ganachaud & Carl Wunsch, Nature 2000 v408 23 Nov.
Again, they point out the need for more comprehensive observations in the oceans:
…limitations now lie primarily in the uncertainty introduced by true oceanic variability from the daily to the interannual. Significant improvements in the present numbers will occur only through the use of data sets permitting true temporal averaging of the oceanic circulation. For instance, the present solution has large uncertainties due to undersampling of the highly variable Brazil Current and Pacific-Indian throughflow. Additional observations there would greatly improve accuracy.
Sashka says
Re: 148
First of all, I’d like to thank Ray for taking time to answer. Second, I’d like to point out the sheer difference between the quality of his response and what preceded it. I hope it is abundantly obvious to any unprejudiced observer. Third, I’d like to state explicitly that I’m in full agreement with the last paragraph.
I absolutely disagree with the claim in the second paragraph that using muni bonds is tantamount to a tax break. Whatever they do, it’s just a choice of financing that was affected by existing tax laws. The laws, of course, were not designed to help any specific industry. In fact, this particular provision resulted from a Supreme Court decision on constitutional grounds.
I also disagree that investment in R&D necessarily constitutes a subsidy. But I do agree that depletion allowance is a subsidy – this is something I didn’t know about. I stand corrected, thank you very much.
Steve Bloom says
Re #149: I should add that an underlying assumption made by many is that the transition to a sustainable energy economy (necessary for a massive reduction in CO2 emissions) must have a high net cost. California, e.g., is now basing policy on a different assumption (based on a careful analysis). Today I see that Barclays Bank, that well-known hotbed of radical greens, has come up with a similar view.
Sashka says
Re: 151
[deliberate mis-interpretation deleted]
For the rest of the readers, let me just point to GFDL web site
http://www.gfdl.noaa.gov/~lat/webpages/om/om_webpage.html
where MOM description begins with the following:
The Modular Ocean Model (MOM) is a three-dimensional, z-coordinate, B-grid, primitive equation ocean circulation model. It is designed primarily as a tool for studying the ocean climate system.
To be sure, read the Wiki page on primitive equations
http://en.wikipedia.org/wiki/Primitive_equations
One important quote from this page:
When a statement of the conservation of water vapor substance is included, these six equations form the basis for any numerical weather prediction scheme.
I hope we are clear on this now.
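For readers who would like to see what “primitive equations” actually denotes, here is a compact sketch in hydrostatic, Boussinesq form (the notation is ours, not taken from the Wiki page or the MOM documentation):

```latex
\begin{align}
  \frac{D\mathbf{u}_h}{Dt} + f\,\hat{\mathbf{z}}\times\mathbf{u}_h
      &= -\frac{1}{\rho_0}\nabla_h p + \mathbf{F}
      && \text{(horizontal momentum)} \\
  \frac{\partial p}{\partial z} &= -\rho g
      && \text{(hydrostatic balance)} \\
  \nabla\cdot\mathbf{u} &= 0
      && \text{(continuity, Boussinesq)} \\
  \frac{D\theta}{Dt} &= Q_\theta
      && \text{(temperature/tracer equation)}
\end{align}
```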
Finally, let me quote from another climate paper “Formulation of an ocean model for global climate simulations” available on GFDL site:
http://nomads.gfdl.noaa.gov/CM2.X/references/oceanscience_1_45.pdf
Many modelers have traditionally taken a Prandtl number (ratio of viscosity to diffusivity) on the order 1-10. In OM3, we choose a depth independent background vertical viscosity of 10^-4 m^2 s^-1. The level of background viscosity can also affect the equatorial currents, as discussed in Large et al. (2001). There is no theoretical or observational justification for this value of the vertical viscosity.
That was regarding suspicious parameterizations, of course.
Michael Tobis says
re #138: People claiming credentials ought to identify themselves.
re #147: A doctorate in meteorology is as much a credential in climate science as is possible short of active participation in the field. Though we should be allowed to defend ourselves, we are not and should not be immune from criticism, especially from people with relevant backgrounds.
There are few if any doctoral programs that are called “climatology” or “climate science” or such. Most applicable credentials would be in meteorology, oceanography or geophysics.
Ike Solem says
RE#154,
“Sashka”, your argument isn’t very clear – as I understand it, you are saying that the parameterizations used in the ocean circulation components of general circulation models (with respect to mesoscale eddy effects) are ‘incorrect’ – but then you go on to say that ALL climate models are ‘incorrect’ because they don’t use equations of motion? You seem to misunderstand my comment that you quote – I was saying that BOTH climate and weather models rely on equations of motion.
From the link, http://www-das.uwyo.edu/~geerts/cwx/notes/chap12/nwp_gcm.html ,
Here’s what climate and weather models have in common:
1) Physics: equations of motion (plus radiative transfer equations, water conservation equations, etc.)
2) Computer methods: Finite difference expression of continuous equations, or spectral representation; run prognostically (a minimal sketch of the finite-difference idea follows this list).
3) Output: state variables and motion of the atmosphere in 3 dimensions.
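As a minimal illustration of item 2 (nothing resembling a real GCM, just the simplest possible discretized equation), here is a first-order upwind finite-difference solution of 1-D advection, stepped forward prognostically in time:

```python
import numpy as np

# Toy example: advection equation du/dt + c*du/dx = 0 on a periodic
# domain, discretized with a first-order upwind scheme in space and
# forward (Euler) differences in time.
nx, c, dx, dt = 100, 1.0, 1.0, 0.5           # CFL number c*dt/dx = 0.5
x = np.arange(nx)
u = np.exp(-0.01 * (x - 50.0) ** 2)          # initial Gaussian pulse

for _ in range(150):
    u = u - c * dt / dx * (u - np.roll(u, 1))  # upwind difference (c > 0)

# The pulse has advected c*dt*150 = 75 grid points (with periodic wrap,
# its peak is now near index 25), smeared by numerical diffusion:
print(u.argmax())
```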
Now, one of the differences is that changes in the ocean are major factors in climate, and not in weather (because of the timescales) – so weather models use current ocean conditions as an ‘external forcing’ of the atmospheric system, whereas ocean circulation is itself an important ‘internal component’ of climate models, which responds slowly to external forcings such as the radiative effect of increases in atmospheric CO2.
This is particularly important in the polar regions, and as an example of this issue (which is still apparently quite uncertain), see this paper: http://www.ldeo.columbia.edu/res/div/ocp/pub/martinson/fulltext.pdf
“Re-evaluating Antarctic sea-ice variability and its teleconnections in a GISS global climate model with improved sea ice and ocean processes” – Jiping Liu, Xiaojun Yuan, Douglas G. Martinson and David Rind, IJC 2004, vol. 24
However, that uncertainty should not be reassuring, particularly if you look at the warming trends in the Arctic over the past few years, based on the 1961-1990 baseline – even in the middle of the Arctic winter, there are strong temperature anomalies in the Arctic: http://www.bom.gov.au/bmrc/ocean/results/SST_anals/SSTA_20070128.gif
These were even more pronounced last summer.
So, we are seeing accelerating global warming in the Arctic, as far as I can tell – do you disagree?
By the way, what’s your opinion on the correct choice of baseline for anomaly calculations? 1951-1980, 1961-1990, 1971-2000, or 1980-1999?
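As a side note for other readers: with synthetic data it is easy to see that the baseline choice only shifts where zero sits; it does not change trends. A minimal sketch (made-up numbers, purely illustrative):

```python
import numpy as np

# Made-up annual mean temperatures (deg C), 1951-2006, illustration only.
years = np.arange(1951, 2007)
rng = np.random.default_rng(0)
temps = 14.0 + 0.015 * (years - 1951) + rng.normal(0.0, 0.1, years.size)

def anomalies(base_start, base_end):
    """Anomalies relative to the mean over the chosen baseline period."""
    base = temps[(years >= base_start) & (years <= base_end)].mean()
    return temps - base

# Different baselines shift every anomaly by the same constant offset,
# so the warming trend is identical; only the zero line moves.
for period in [(1951, 1980), (1961, 1990), (1971, 2000)]:
    a = anomalies(*period)
    print(period, round(float(a[-1]), 2))
```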
Hank Roberts says
Look for patterns. Once is happenstance, twice is coincidence, anything further is, well, suggestive (of course, lacking IP numbers, you can never be sure).
+sashka +”climate change”
Enough, already.
Steve Bloom says
Re #154: Maybe you could explain the significance of that last GFDL quote. In particular, how does varying the vertical viscosity affect results?
Sashka says
I am going to try Gavin’s advice, just once. I do find Barton’s post #158 offensive, inflammatory, and without redeeming value.
[Response: Last chance people. I removed the latest deliberate confusion (both Sashka’s provocation and Barton’s response), but that’s it. Either make an effort to communicate or this whole thread gets shut down. -gavin]
Ike Solem says
From the original post, we have
They say that radiative forcing “is computed by assuming a linear relationship between certain climatic forcing agents and particular averages of temperature data.” Nonsense. It is computed using detailed calculations of absorption and emission of infrared radiation, based on laboratory measurements carried out with exquisite accuracy, and meticulously checked against real atmospheric observations.
I think that the attacks by Sashka on the ocean component of the coupled general circulation models are an attempt to extend that demonstrably false statement about radiative forcing to the issue of ocean circulation, where data is much harder to come by.
Even the IPCC report says that they can’t come to a consensus on the meridional overturning circulation effects due to a lack of data.
Historically, this issue has been brought up (in the movie The Day After Tomorrow, for example) as a scenario in which the sinking of cold salty water in the North Atlantic ‘shuts down’ the conveyor system that brings warmth to the North Atlantic. However, as many have noted, the Gulf Stream is largely wind-driven, and the atmospheric component of poleward heat transport seems to be increasing (yes?).
The real issue may be that bottom water formation will slow. Ganachaud and Wunsch ( http://ocean.mit.edu/~cwunsch/papersonline/ganachwunschnature.pdf ) provide a description of the 2000 state of the oceans. If bottom water formation slows, that means that less oxygen will enter the deep ocean – and that just might account for the low-oxygen water bodies observed off the Oregon coast over the past five years. The huge increase in human nitrogen fixation over the past century might also play a role in the potential ‘eutrophication of the oceans’.
I have no idea how reduced bottom water formation would affect the ability of the oceans to absorb both heat and CO2, but it does seem that the failure to set up ocean data collection systems a decade ago was a monumental mistake (unless you are a denialist who thinks that the absence of evidence is evidence of absence).
mark s says
RE #157
Oooh. direct hit mate! Follow the thread, and the purpose is clear.
Respect for that, Hank. Excellent work.
TAC says
Kudos to David Archer and the Group for making the effort to create an annotated version of the Fraser Institute report. It helps immensely to understand what the debate is about.
Alex Tolley says
WSJ (Feb 9th) has an editorial that tries to discredit the press reports on the AEI letter. AEI is now claiming the letter was an invitation to a symposium on GW policies. Exxon claims it had no knowledge of the letter until the UK press reported on it.
Has anyone actually posted a copy of this letter? Can Exxon’s denials and AEI’s backpedaling be definitively debunked?
J.C.H says
I’m an ExxMob shareholder. The link on their site is not working, but I believe this is what ExxMob had up:
“ExxonMobil’s Response to Recent Media Articles Concerning the American Enterprise Institute
February 5, 2007 — We wish to make it clear that ExxonMobil had no knowledge of allegations made concerning the American Enterprise Institute offering payments for articles critical of the IPCC 4th Assessment Report of Climate Science. We, and many other corporations, fund AEI for the purpose of promoting active policy debate but we do not control their views or actions. We believe that the release of the IPCC Fourth Assessment Report of Climate Science is an important contribution to the issue of climate change. The IPCC report process is valuable in that it facilitates the sharing of global scientific knowledge and encourages further inquiry on the important issue of climate change. We are taking action on many fronts to address the risks of climate change. These include partnerships with vehicle manufacturers to reduce emissions; researching hydrogen-fuelled vehicles, energy efficiency in our own refineries and supporting Stanford University on groundbreaking research to reduce greenhouse gas emissions.”
There is a difference between what ExxMob funds and what its shareholders and employees fund. There may also be a difference in what results they demand from the AEI. ExxMob is but one oil corporation, and I would suspect the above statement is largely true. The industry and its shareholders and employees are much larger players, and the oil culture leans heavily toward contributing to conservative thinkers with the expectation of friendly results.
RC is an excellent website. I’ve learned a great deal. Not all of us are part of the denial culture, which, as of the last few weeks, is just as run aground as the Valdez. Congratulations, you’ve won.
Stephen Berg says
Excellent work with the annotations, David! That must have taken a while!
[Response:A few nice comments like this make it all worth while. Cheers! David]
Peter Winters says
Re: Bellamy
I did a survey after the Earth Summit in 1992, which was sponsored by the Conservation Foundation (for which David Bellamy is the president), on what “UK environmental decision-makers” thought about the summit, and what should be done next. Respondents were from NGOs, government and business. The report “The Road from Rio” was published in 1993, and had a foreword by Jonathan Porritt. One of the findings was that environmental decision-makers were particularly concerned about “protecting the atmosphere”. When I discussed this with David Bellamy in early 1993, he told me that, in his view, global warming was not happening and that we were heading for another ice age.
At that time, I knew very little about global warming and, as he was such an eminent scientist, I took his word for it! Now, I tend to think it is all wrapped up in his dislike of wind farms.
Hank Roberts says
Very, very good.
As a nudge generally for scientists writing.
— consider the basic advice for business writers
“Paragraph … Idea … Paragraph … Idea …”
http://www.esc.edu/esconline/across_esc/WritingResources.nsf/frames/Ten+Commandments+of+Business+Writing?OpenDocument
“esc.edu” — great name for a website, even before opening it
Ed G. says
re #164, try this path
http://www.exxon.mobil.com/Corporate/Newsroom/NewsReleases/corp_nr_mr_climate_aei.asp
Ark says
And another one: in the Times Online: “An experiment that hints we are wrong on climate change” by a former editor of the New Scientist http://www.timesonline.co.uk/tol/news/uk/article1363818.ece
[Response: RC archive on Solar Forcing. -mike]
lars says
experimental evidence…. not a hypothesis, not a theory….
An experiment that hints we are wrong on climate change
When politicians and journalists declare that the science of global warming is settled, they show a regrettable ignorance about how science works. We were treated to another dose of it recently when the experts of the Intergovernmental Panel on Climate Change issued the Summary for Policymakers that puts the political spin on an unfinished scientific dossier on climate change due for publication in a few months’ time. They declared that most of the rise in temperatures since the mid-20th century is very likely due to man-made greenhouse gases.
The small print explains “very likely” as meaning that the experts who made the judgment felt 90% sure about it. Older readers may recall a press conference at Harwell in 1958 when Sir John Cockcroft, Britain’s top nuclear physicist, said he was 90% certain that his lads had achieved controlled nuclear fusion. It turned out that he was wrong. More positively, a 10% uncertainty in any theory is a wide open breach for any latterday Galileo or Einstein to storm through with a better idea. That is how science really works.
http://www.timesonline.co.uk/tol/news/uk/article1363818.ece
lars says
Cosmic rays blamed for global warming
Man-made climate change may be happening at a far slower rate than has been claimed, according to controversial new research.
Scientists say that cosmic rays from outer space play a far greater role in changing the Earth’s climate than global warming experts previously thought.
In a book, to be published this week, they claim that fluctuations in the number of cosmic rays hitting the atmosphere directly alter the amount of cloud covering the planet.
High levels of cloud cover blankets the Earth and reflects radiated heat from the Sun back out into space, causing the planet to cool.
Henrik Svensmark, a weather scientist at the Danish National Space Centre who led the team behind the research, believes that the planet is experiencing a natural period of low cloud cover due to fewer cosmic rays entering the atmosphere.
This, he says, is responsible for much of the global warming we are experiencing.
http://www.telegraph.co.uk/news/main.jhtml?xml=/news/2007/02/11/warm11.xml
John L. McCormick says
Lars is on the loose again. Who left the door open?
Jon Beharry says
There is a typo in the first line of the post, Fraser is incorrectly spelled Frasier. Keep up the good work!
[Response: Fixed, thanks. -gavin]
Lynn Vincentnathan says
Just got to a computer where I had a chance to read a bit of the Fraser report with annotations. Great work, David.
The bit about Arctic sea ice reducing more before 1990 than after got me thinking that there should be a slowdown in the amount of ice melting, since there is increasingly less ice to melt & fewer edge meters from which to melt — even if the ocean/atmosphere is getting warmer. I.e., if there were the same amount of ice to melt now, more of it would be melting.
So these are the types of contrarian deceptions to be expected in the future. I can see their headlines after all the ice is melted: “The ice has stopped melting, ergo GW is a hoax.”
Neal J. King says
I had a specific question about one point of the ISPM, for which the comment did not seem clear enough. I’m referring to:
5.3b On average, models that assume strong greenhouse warming project the tropical troposphere to warm faster than the surface. Current data do not support these forecasts.
to which the RealClimate critique was:
The ground surface is where we live, where we know the temperature changes the best, and the forecast has been spot on so far.
a) Does this answer imply that there is a discrepancy between expectation and measurements for the tropical troposphere?
b) If so, do you assume that there is some kind of instrument error?
c) Or is there any other dimension of explanation?
My perhaps naive impression is that this point seems to speak to the basic mechanism of the radiative forcing of the enhanced greenhouse effect.
[Response: The trends in the tropical lapse rate are small and hard to detect, and the tropics are not as well observed as other places. The US Climate Assessment report thinks it is fairly likely that the supposed discrepancy is due to instrument error. Even if it is real, the implications are actually more rather than less scary. If the lower atmosphere warms without as much upper warming as expected, that actually constitutes an enhanced greenhouse effect, which allows the surface to warm more than it otherwise might. Further, a situation like this allows more convective energy to build up, if it goes on for long. That has scary implications for hurricanes and tropical storms. Given the small trends in this quantity so far, it’s possible that a modest change in the surface energy budget (e.g. through evaporation or a slight change in cloudiness) could take care of it. It’s certainly not a global warming killer, at the level observed so far. –raypierre]
Neal J. King says
re: 175
Thanks, Raypierre.
I think this is a point worth being a little bit more definite about in the text, because my first impression is that, since the enhanced greenhouse effect (EGE) works by means of “frustrating” the cooling from the top of the troposphere, the temperature increase in the upper troposphere should be driving/leading the temperature increase at ground level. (And I recall Lindzen making a similar comment somewhere as well.) I would therefore suggest that the note in the annotated ISPM be modified to explain more clearly why this should not be considered a serious embarrassment.
It should not be necessary to go into implications about hurricanes or increased sensitivity: Merely to be clear as to why this apparent discrepancy doesn’t threaten the basic explanation for the EGE.