There are two new papers in Nature this week that go right to the heart of the conversation about extreme events and their potential relationship to climate change. This is a complex issue, and one not well suited to soundbite quotes and headlines, so we’ll try to give a flavour of what the issues are and what new directions these new papers are pointing towards.
Let’s start with some very basic, but oft-confused points:
- Not all extremes are the same. Discussions of ‘changes in extremes’ in general without specifying exactly what is being discussed are meaningless. A tornado is an extreme event, but one whose causes, sensitivity to change and impacts have nothing to do with those related to an ice storm, or a heat wave or cold air outbreak or a drought.
- There is no theory or result that indicates that climate change increases extremes in general. This is a corollary of the previous statement – each kind of extreme needs to be looked at specifically – and often regionally as well.
- Some extremes will become more common in future (and some less so). We will discuss the specifics below.
- Attribution of extremes is hard. There are limited observational data to start with, insufficient testing of climate model simulations of extremes, and (so far) limited assessment of model projections.
The two new papers deal with the attribution of a single flood event (Pall et al), and the attribution of increased intensity of rainfall across the Northern Hemisphere (Min et al). While these issues are linked, they are quite distinct, and the two approaches are very different too.
The aim of the Pall et al paper was to examine a specific event – floods in the UK in Oct/Nov 2000. Normally, a single event doesn’t provide enough information to do any attribution, but Pall et al set up a very large ensemble of runs starting from roughly the same initial conditions to see how often the flooding event occurred. Note that flooding was defined as more than just intense rainfall – the authors tracked runoff and streamflow as part of their modelled setup. Then they repeated the same experiments with pre-industrial conditions (less CO2 and cooler temperatures). If the number of times a flooding event occurred increased in the present-day setup, you can estimate how much more likely the event was made by climate change. The results gave varying numbers, but in nine out of ten cases the chance increased by more than 20%, and in two out of three cases by more than 90%. This kind of fractional attribution (if an event is 50% more likely with anthropogenic effects, that implies it is 33% attributable) has also been applied to the 2003 European heatwave, and will undoubtedly be applied more often in future. One neat and interesting feature of these experiments was that they used the climateprediction.net setup to harness the power of the public’s idle screensaver time.
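To make the fractional-attribution arithmetic concrete, here is a minimal sketch; the probabilities used are made-up illustrative numbers, not values from Pall et al:

```python
def fraction_attributable_risk(p_present, p_preindustrial):
    """Fraction of attributable risk: FAR = 1 - P0/P1, where P1 is the
    event probability in the modelled present-day climate and P0 the
    probability in the counterfactual (pre-industrial) climate."""
    return 1.0 - p_preindustrial / p_present

# The example in the text: an event 50% more likely with anthropogenic
# effects has P1 = 1.5 * P0, so FAR = 1 - 1/1.5 = 1/3, i.e. 33% attributable.
far = fraction_attributable_risk(p_present=0.015, p_preindustrial=0.010)
print(far)  # roughly one third
```

In the ensemble setting, P1 and P0 are simply the fractions of present-day and pre-industrial runs in which the flood threshold is exceeded.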
The second paper is a more standard detection and attribution study. By looking at the signatures of climate change in precipitation intensity and comparing them to the internal variability and the observations, the researchers conclude that the probability of intense precipitation on any given day has increased by 7 percent over the last 50 years – well outside the bounds of natural variability. This result has been suggested before (e.g. in the IPCC report; Groisman et al, 2005), but this was the first proper attribution study (as far as I know). The signal seen in the data, though, while coherent and similar to that seen in the models, was consistently larger, perhaps indicating the models are not sensitive enough, though the El Niño of 1997/8 may have had an outsize effect.
Both papers were submitted in March last year, prior to the 2010 floods in Pakistan, Australia, Brazil or the Philippines, and so did not deal with any of the data or issues associated with those floods. However, while questions of attribution come up whenever something weird happens to the weather, these papers demonstrate clearly that the instant pop-attributions we are always being asked for are just not very sensible. It takes an enormous amount of work to do these kinds of tests, and they just can’t be done instantly. As they are done more often though, we will develop a better sense for the kinds of events that we can say something about, and those we can’t.
Hugh Laue says
#3 and #32 Edward Greisch
“The results are still probabilities not absolutes and so will be disappointing to many people.”
Science doesn’t deal with absolutes (which by definition have no reference points) – it only deals with probabilities,
and therefore “Andy Revkin shows his ignorance of probability & statistics once again” is perhaps not surprising.
Sou says
@ Titus, yes, as you point out the link you provided is indeed limited to a very small geographic area (two adjacent cities in south east Queensland).
This spring and summer, with the extra-strong La Nina, there have been multiple record precipitation events right across the nation, in the number of events as well as the size of the flooding (especially the vast areas flooded in Victoria and central Australia as well as Queensland). I hope BoM will prepare a report on these events as a whole and comment on whether there has been any precedent. The fact that so many records have been broken multiple times this season makes it unlikely there has been any recorded precedent.
Alexandre says
Is there any such attribution study about Katrina?
Jeffrey Davis says
“I’ve not found anything recently that has not occurred more often or to a greater extent in the past. Interested to hear if there are.”
Iowa had 2 “100 year floods” in a decade.
Kevin McKinney says
#48, Tim Joslin–
“The fact is the 2000 Oxford floods really happened whether they were predicted by 10% of the model runs or by 100%, so they weren’t made 20% or 90% more likely because of AGW, they were infinitely more likely.”
Perhaps I misunderstand, but it appears to me that this statement is you “confusing the model with the reality.” That is, the ‘x% more likely’ is modeled, not ‘real.’ Or to put it another way, what did happen does not erase what could have happened in our ex post facto calculation of the probabilities.
The corollary of what you are arguing is that attribution studies are completely impossible, since past events are always more or less ‘certain.’
(Well, not in the denierverse, I suppose, as nothing is ever at all certain there.)
SecularAnimist says
Titus wrote: “All recent ‘extreme events’ on investigation appear not to be extreme when compared to known events in the past few hundred years before AGW.”
That is simply not true. The extreme weather events occurring all over the world are in fact extreme by historical standards, and are in some cases unprecedented. And what is truly unprecedented — and truly alarming — is the fact that these extreme events are occurring all over the world at the same time.
Didactylos says
Titus:
How high would the flood be without the 1,450,000 ML of flood mitigation provided by Lake Wivenhoe?
Do I need to draw a line connecting the 1974 floods and the building of Wivenhoe Dam?
“It is anticipated that during a large flood similar in magnitude to that experienced in 1974, by using mitigation facility within Wivenhoe Dam, flood levels will be reduced downstream by an estimated 2 metres.”
As for the figures from pre-1900: apply large error bars. Big error bars.
Lynn Vincentnathan says
RE #32, we have the scientists who require p ≤ .05 (95% confidence) for their claims and the industrialists and denialists who require 99 to 101% confidence.
Why do the people (potential and real victims), the media, and policy-makers stand somewhere between this 95% and 101% confidence? They should be way out on the other side of precaution when the risks are so high (maybe requiring 50% or even 5% confidence). It just amazes me.
Hurricanes Katrina, Andrew, and all other extreme weather events that can even by a new, remote, or quasi-logical hypothesis be linked to CC over the past 50 years, in my books ARE caused (or enhanced) by CC. If someone can prove at the .01 alpha level that they are NOT…like I said, I’ll still keep mitigating, because it saves money, and it also mitigates lots of other environmental and nonenvironmental problems.
Mike says
Would it be safe to say, from a lay parson’s perspective, this: evidence of a causal link between “AGW” and some types of contemporary extreme weather events has grown, but there is not yet a consensus view?
When is a consensus view likely to appear? The next IPCC report? Or is some other body (eg NAS) planning to report on this question?
Hank Roberts says
Mike, as a “lay parson” you may not be aware of how a “consensus statement” works. Have a look at a few from many different areas of science:
http://scholar.google.com/scholar?q=%22consensus+statement%22
One Anonymous Bloke says
Mike #59 Waiting for “consensus” is a waste of time – whether or not it makes a good political comforter. It buys into the “false and misleading dichotomy” that science can be settled (or unsettled). For further reading I can recommend Dr. Gavin Schmidt’s article Unsettled Science.
One Anonymous Bloke says
Hank Roberts #60 Obviously ‘consensus’ has a scientific use that the laity are unaware of; this is another example of how the use of a word by scientists can be misrepresented. People hear “consensus” and translate it according to their lay definition. How to avoid this? Good question.
Karen Street says
Re 59 Mike,
There is consensus that it is pretty close to impossible to ascribe any particular event to climate change, but that the pattern of events makes it clear that climate change is happening.
With one event, it is possible to give some sense of how likely the event is in our current climate, how likely it might have been in preindustrial times, and so ascertain a higher or lower probability of it happening today.
Some heat waves are just hot times. Others, perhaps because they last longer, or because the temperature remains high at night so humans and other living things don’t have a chance to recuperate, seem to be higher-probability events today. Higher night temperatures, in particular, are one of the fingerprints of higher atmospheric greenhouse gas levels.
Biochar says
This just in:
NSIDC bombshell: Thawing permafrost feedback will turn Arctic from carbon sink to source in the 2020s, releasing 100 billion tons of carbon by 2100
Study underestimates impacts with conservative assumptions
http://climateprogress.org/2011/02/17/nsidc-thawing-permafrost-will-turn-from-carbon-sink-to-source-in-mid-2020s-releasing-100-billion-tons-of-carbon-by-2100/
And for the current discussion why not link to the CP post?
Two seminal Nature papers join growing body of evidence that human emissions fuel extreme weather, flooding that harm humans and the environment
http://climateprogress.org/2011/02/16/two-nature-paper-join-growing-body-of-evidence-that-human-emissions-fuel-extreme-weather-flooding-that-harm-humans-and-the-environment/
Bay Bunny says
@56 – Secular Animist. Thanks for the great quote from the WMO! I hadn’t seen that before. Very concise though scary.
Dan H. says
All in all, it is extremely difficult to attribute extreme events to climate change. Another analysis is presented here:
http://rogerpielkejr.blogspot.com/2011/02/flood-disasters-and-human-caused.html
Andronicus says
“A whole bunch of big storms, floods, droughts and fires are things that can invoke the fear necessary to get action on GW.”
Certainly true. However, that same whole bunch of fearful stuff could just as easily draw more determined inaction if it were determined that the extremes are not attributable, or minimally attributable, to AGW. I’m surprised there was no cautionary note from RC attached to the statement I have quoted above, but perhaps that caution has been expressed elsewhere on this or another thread. I hope it has. Overreaching has not been good for the cause of late, and I hope no gun is jumped on this important issue.
Septic Matthew says
51, Ray Ladbury: I have to say that it takes a special sort of studied obtuseness to look at all the events we are hearing about–unprecedented heat waves in Russia, flooding in Oz, S. America, etc. and say “Meh!”
Depending on how you define a “region” (an island, a river, a pair of non-overlapping areas east of Moscow, a city, a mountain range, a 1degree by 1degree area of the South Atlantic, a peninsula, an isthmus, a gulf, a bay) there are thousands to hundreds of thousands of “regions” of the world, partially overlapping in some cases, not always (perhaps never) statistically independent, seldom linearly related. In every region there are multiple measures every day: rainfall, accumulated rainfall over the last week or month, maximum temp, number of consecutive days with a max temp above 90F, minimum temp, max and min wind speeds, etc. With this many opportunities over a poorly defined (or undefined, or defined post hoc) sample space, it would be a rare year that did not have multiple extreme events (“unprecedented”, or “unprecedented since 1100AD”, or “unprecedented in the historical record”) under any reasonable probability model. The actual number of extreme events would fluctuate randomly from year to year.
To show from recent records that the actual frequency of extreme events has increased is hard, and non-intuitive. The two papers that are the focus of gavin’s essay are honest attempts to address the issue of whether particular events, identified post hoc, can provide reasonable evidence. In their case, hard work yielded little; the works are, however, reasonable models to try to improve upon in the future.
What you called “studied obtuseness” I would call “intellectual rigor”. “Studied obtuseness” would be the continued avoidance of statistical methods devoted to the analysis of extremes.
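The multiple-comparisons point above can be illustrated with a toy calculation; the number of effectively independent series and the Gaussian/independence assumptions are both invented for illustration, not estimates for the real climate system:

```python
import random

random.seed(1)

N_SERIES = 10_000     # assumed number of independent region/metric combinations
P_3SIGMA = 0.00135    # one-sided tail probability beyond +3 sigma for a Gaussian

# Expected number of "+3 sigma" annual values arising purely by chance,
# with no trend in the climate at all:
expected = N_SERIES * P_3SIGMA   # 13.5

# One simulated "year" of independent standard-normal annual anomalies:
observed = sum(1 for _ in range(N_SERIES) if random.gauss(0, 1) > 3)
print(expected, observed)
```

Even a modest number of effectively independent series yields a double-digit count of nominally extreme readings every year under a stationary climate, which is why formal extreme-value methods, rather than headline counting, are needed to detect a change in frequency.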
Jeffrey Davis says
” if it were determined that the extremes are not attributable, or minimally attributable, to AGW”
People say that kind of stuff all the time, along with the idea that if AGW were simply of natural origin then we shouldn’t or couldn’t do anything about it. If people believed that kind of stuff generally, they wouldn’t go to the doctor, farm, work, eat, clothe themselves, etc. Death is as natural as it gets, and yet we constantly work to put off the day of reckoning.
Edward Greisch says
67 Andronicus: “I hope no gun is jumped on this important issue.”
Surely the gun went off in 1988 when James Hansen testified before Congress.
Hank Roberts says
Edward, the NPR Science Friday just pointed out that those who do science are convinced not by any individual opinion but by the weight of the evidence.
If you point to individual items you’ll get argument about opinions.
If you point to the weight of the evidence you’ll have a chance of educating people. You can’t point to the same thing tomorrow — more evidence comes in every day.
SecularAnimist says
Karen Street wrote: “There is consensus that it is pretty close to impossible to ascribe any particular event to climate change …”
It is impossible to attribute a particular weather event to any one particular cause. All weather events, extreme or otherwise, arise from numerous interacting causes and conditions.
And anthropogenic global warming is now a pervasive and inescapable condition that influences all weather events. We are experiencing the weather of an anthropogenically warmed, and rapidly warming, Earth. There is no more “natural” weather.
Tim Joslin says
@Kevin McKinney #55:
Maybe I am confused, since I understand the rationale behind Pall et al is to make progress towards determining the AGW fraction of attributable risk (FAR) for the case of a particular single exemplar extreme real-world precipitation event, the autumn 2000 floods in England. The point is not to tell us just something about the models, or about the increased risks of flooding in a warmer world in general, though that is in fact all the study does, IMO.
Pall et al’s assessment of the AGW FAR is surely highly sensitive to modelling skill, specifically of the actual event (their “A2000 climate”). Yet they don’t discuss at any length the skill of their climate model in actually forecasting the autumn 2000 floods when initialised to April 2000. They only go as far as to show some apparent skill in the correlation of the pattern of mean 500 hPa geopotential height with precipitation (their Fig 1) and in reproducing something similar to the real-world runoff power spectrum (Fig 2). From their Fig 3, their model does not appear to be incredibly skilful, predicting runoff as or more severe than actually occurred around 20-30% of the time (and there’s no information as to whether the model under or overestimates rainfall in other years nor any discussion of interannual runoff variability or other way to assess the skill in terms of the range of outcomes above and below what actually occurred).
This whole methodology will only make sense once we have models which can accurately hindcast the extreme event with some reliability. Thinking a bit further along these lines, we could then run a suite of, in this case, A2000s and just discard outliers, say 10%, with the remainder taken as defining the extreme event parameters, in this case an average daily rainfall range (for example, maybe 0.38 to 0.44mm in the 2000 flood case). Then perhaps we could run the same model with the range of possible counterfactual (no AGW) initial values (Pall et al’s A2000N suite) and get to the FAR directly (without the awkwardness of having to talk about the increased risk of something that is known to have happened) – discarding the extreme 10% again, we would simply calculate the proportion within the event parameters as previously determined (if some more extreme remain we have problems!). There are still a few assumptions to be challenged in court, though.
The idea is to determine a FAR for AGW as might be done for smoking, say. But in the case of smoking you have data on a real population of smokers and non-smokers. You only have to make one not particularly heroic assumption, i.e. that the risks for the individual whose loved ones are bringing the law suit were typical of the population of smokers as a whole. In the case of specific events under AGW there is a population of one. Another method is needed, and IMO simply “pretending” there’s a large population by a series of parallel universe model runs is, to say the least, somewhat problematic.
Ray Ladbury says
Septic Matthew, The Russian heat wave by itself was nearly a 3 sigma event. We are having flooding in enough areas that it has played a role in bringing global food prices to a record high. This is not merely a phenomenon that affects a couple of regions. It is global, and it is just what the theory predicts. Ignoring that is not intellectual rigor, but intellectual rigor mortis.
Ray Ladbury says
Mike@59,
The consensus theory of Earth’s climate predicts just this type of increase in extreme weather events, so no, I don’t think your statement is correct. Put another way, an element of a theory becomes part of the consensus when you cannot understand or predict system behavior correctly without it. It is very likely that if you didn’t take into account the increasing energy, water vapor, etc. in the atmosphere, you could not predict the increased severe weather.
Peter Kalmus says
Wow, almost a year from submission to publication. Intense.
This has been a burning question in my mind. I’m looking forward to reading these two carefully, and to seeing more papers like this in the future. Thanks for the post.
Pete
Septic Matthew says
74, Ray Ladbury: Septic Matthew, The Russian heat wave by itself was nearly a 3 sigma event.
Since you missed my points, I’ll repeat them: Earth experiences dozens at least of 3 sigma events every year, under any realistic probability models; you really need to study up more on the statistical analysis of extremes if you are going to promote some statistical inferences based on extremes.
[Response: You mean earth experiences more than three ‘extremes’ every month?! That’s what ‘dozens’ per year would mean.–eric]
chris colose says
I find the Meehl et al 09 study pretty neat: it showed that since 2000 the ratio of the number of record high to record low temperatures is about 2:1, projected to increase to about 50:1 by the end of the century. It also goes to show that even in a warmer world you can still get record lows.
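The qualitative record-high/record-low effect is easy to reproduce with a toy simulation; all the numbers below (trend size, series length, ensemble size) are arbitrary choices for illustration and have nothing to do with the Meehl et al setup:

```python
import random

random.seed(0)

def record_counts(trend_per_step, n_steps=100, n_series=2000, sigma=1.0):
    """Count new record highs and lows across many independent noisy
    series sharing a common linear trend (a toy stand-in for station
    temperature records in a warming climate)."""
    highs = lows = 0
    for _ in range(n_series):
        hi = lo = None
        for t in range(n_steps):
            x = trend_per_step * t + random.gauss(0, sigma)
            if hi is None or x > hi:
                if hi is not None:
                    highs += 1   # new record high (first value doesn't count)
                hi = x
            if lo is None or x < lo:
                if lo is not None:
                    lows += 1    # new record low
                lo = x
    return highs, lows

h0, l0 = record_counts(0.0)    # stationary climate: ratio near 1:1
h1, l1 = record_counts(0.02)   # modest warming trend: record highs dominate
print(h0 / l0, h1 / l1)
```

Note that even with the trend, `l1` is not zero: record lows still occur, just less and less often, which matches the point that a warmer world still produces some record cold.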
Gerry Quinn says
On the face of it, one would not expect that an increase in greenhouse gases sufficient to increase the average temperature of the Earth by a degree or so would have a dramatic effect on ‘extreme events’ in general. Some types of events would probably become a bit more common in some places, and some would become less common in others.
The alternative hypothesis is that for some reason this minor addition of a trace gas systematically and consistently causes bad weather, rather than having a fairly neutral effect overall. Maybe it does, but convincing people of this will require more than anecdotal evidence about floods here and heatwaves there, in one particular year.
I suspect that our best assessments of the most likely changes in weather patterns will be a good deal less terrifying than some of the posters here would like. (Even when beneficial changes are downplayed, as is usually the case.)
[Response: Maybe we’ve only had a degree or so of warming so far, but depending on climate sensitivity and how much coal we can (or will) burn, global mean temperatures could well rise by 7C or even more. That is why there is so much interest in using the present climate change to try to get some early warning of how bad things can get. You only have “suspicions”, based on neither observations nor physics, just prejudice. Other people are working hard to try to make some sense observationally of this difficult problem. Your suspicions don’t help anybody very much, so people should just take them for what they are worth. –raypierre]
Sou says
@ Quinn #79 – How do you define ‘bad weather’? Perhaps you define a drought as ‘bad weather’. Yet a drought in, say, the UK would be a mild summer where I live. Or maybe you define ‘bad weather’ as torrential rain? Torrential rain in a tropical rainforest is the norm in the wet season, yet where I live summer is dry and torrential rain causes flash floods and month long floods and ruins crops.
Extreme or bad weather can only be defined relative to what constitutes ‘normal’ weather for a particular locality. In other words, ‘consistently bad weather’ as you’ve put it would constitute a change in the climate, which has ramifications for local agriculture, building design and construction, infrastructure such as roads, rail and water storage, indoor air conditioning (or ‘climate control’) and even the siting of dwellings and infrastructure.
The change in weather (extreme weather and unprecedented weather) is already taking place a good deal more quickly than some people expected. And the weather is measured, it’s not just anecdotes of someone gazing out their window. What constitutes terrifying for some might not for others. Staring down a wildfire is quite nerve-wracking, as is holding onto a barbed wire fence for several hours, hoping not to drown in a swiftly flowing current of water. Drought is a slower form of ‘terrifying’.
People in some localities might not be subject to extremes of weather. In other localities people are flip-flopping from one extreme to another, including unprecedented extremes. (eg In this part of the world it is flip-flopping from unprecedented heat and long drought to unprecedented rain intensity and extreme floods – and this has been going on for more than a decade.)
Hank Roberts says
SM, how about some sources? You sound like you know the literature or have written some of it. What are you reading or recommending others read?
Something like this?
Rougier: Probabilistic inference for future climate using an ensemble of climate model evaluations
maths.dur.ac.uk/stats/people/jcr/CCfinal.pdf
http://scholar.google.com/scholar?hl=en&lr=&cites=11114092939118472750&um=1&ie=UTF-8&ei=IWtfTbr8O4q4sAP3pYjHCA&sa=X&oi=science_links&ct=sl-citedby&resnum=7&ved=0CEoQzgIwBjgU
chris colose says
Septic Matthew in your comment #72 on attribution of weather causes and natural weather:
I really don’t think a synoptic or mesoscale meteorologist would agree with your remarks. In a very strict sense you may be right, but of course we don’t require that every picokelvin of the temperature gradient around a frontal system, or every microscopic and turbulent feature associated with a mid-latitude cyclone, be accounted for. Saying that we can’t attribute causes to an emerging weather system is just as absurd as saying we can’t attribute causes to climate change. Here in Wisconsin, for example, it’s usually pretty easy to identify storm systems that originate near the Rockies, perhaps due to deformation of a temperature gradient down the leeward side of the mountains.
I think it’s very easy to brush off synoptic/mesoscale dynamics and the rather powerful field of meteorology when we talk about climate, and I know at least a few people I have everyday contact with who get irritated at climate people because of this. I think a better marriage of these fields is really going to have to happen to get a better grip, especially, on the issue of extremes.
P. Lewis says
Tom Wigley wrote in his Pew Center report The Science of Climate Change: Global and U.S. Perspectives in 1999.
We have “long” since passed the point where the anthropogenic influence on global mean temperature has emerged from the background noise. And it seems to me we are on the cusp of being able to attribute an anthropogenic influence to smaller scale features with reasonable probability.
Philip Machanick says
Didactylos #57. Good points. It goes further than that. Wivenhoe exceeded nominal 100% capacity 4 times before the flood. 100% represents the level below flood mitigation capacity. The dam is completely full at 225%, and went to 191% the last time it was emptied. This is a very serious situation because the dam is not designed to be overtopped. It has sacrificial gates that would have opened before that happened, so a disaster arising from the dam collapsing is unlikely, but the fact that we arrived at that situation from a drought crisis in 2007, when the dam dropped to 15%, is a hint that we are seeing wilder climate swings than in the past. The saving of 2m no doubt relates to one event when the dam holds back a flood. I saw a claim that the total rainfall feeding into the dam was double that of the 1974 flood. It’s just plain stupid, or a consequence of only reading the Murdoch media, to claim this event was of a lesser scale than the 1974 flood.
As to whether this is a once-off event or part of a pattern, there’s still science to be done, but I wouldn’t wait if there’s action that can be taken now.
GlenFergus says
Brisbane floods:
Sou:
BOM routinely publishes monthly weather summaries, including comments on point and areal extremes. January isn’t out yet, but will appear shortly here (December is worth a read). There will also be a flood event summary, which, as I guess you know, will appear here.
Didactylos:
Not all of the Wivenhoe flood storage window was used. It peaked at a bit over 1000GL, for a total of ~1350GL in the Somerset Dam / Wivenhoe Dam cascade. I think Wivenhoe will have reduced Jan’11 by more than the 2m hindcast for Jan’74, because the proportion of the event rain on that sub-catchment was higher this time. Recall that Somerset was in place for both floods and also has a substantial flood storage window (~300GL). So if you wanted to adjust for flood mitigation, I’d be adding about 0.5m to Jan’74 and maybe 3m to Jan’11 (crude estimate from one who lives in the town and works in the field…).
Pre-1900 errors; yep, up to a point. The 1893 flood heights are known exactly. The flood of record (1841) is a bit of a guess. The first European visitor was only in 1821.
Titus:
You seem to be arguing that the Jan’11 flood was not particularly extreme. That, of course, is correct (I’d put it at only about 1 in 50, but that’s a controversial view). I think you should re-read Gavin’s brief explanation of what “extreme event” means in this context. It’s not “biggest possible” or even “biggest recorded”; really just “unusually big” (yeah, rubbery…). So in that sense Jan’11 is one, and what might be able to be said is that events like it are becoming more frequent. Any evidence of that in the Brisbane record (so far)? Nup, not on the face of it. Elsewhere? Seems to be, now.
The really odd (and little mentioned) thing about Jan’11 was that it wasn’t a cyclonic event, unlike the other big ones on the record. Attributers could do worse than have a look at that.
Ray Ladbury says
Septic Matthew, And since you missed or chose to ignore my points, I will also repeat them:
The Russian heat wave affected a very large area. It was well outside the norms for any historical event. The extent of the flooding this year has come very close to causing food insecurity for the first time since the ’80s–despite all of the advances in farming and yields.
What is more, these are precisely the sorts of events that theory predicts. I think you would have a very difficult time explaining the trends without added greenhouse forcing. Indeed that is what this research indicates.
Ray Ladbury says
Gerry Quinn,
Might I suggest that you sit down and estimate exactly how much energy an increase in temperature of 1 degree C represents for the atmosphere plus the first hundred meters or so of ocean. Now consider that this is an AVERAGE and that the distribution of energy is not uniform and is subject to fluctuations–as in any thermodynamic system. I would suggest that this increase is more than sufficient for any sort of extremes we are seeing.
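The suggested back-of-envelope estimate runs roughly as follows; all the constants are standard round textbook values (seawater heat capacity and ocean area are approximate), and the 100 m ocean depth is simply the stipulation in the comment above:

```python
# Heat required to warm the atmosphere plus the top 100 m of ocean by 1 C.
C_AIR = 1004.0        # J/(kg K), specific heat of air at constant pressure
M_ATM = 5.1e18        # kg, total mass of the atmosphere
C_SEAWATER = 4000.0   # J/(kg K), approximate specific heat of seawater
RHO_SEAWATER = 1025.0 # kg/m^3
OCEAN_AREA = 3.6e14   # m^2, ~71% of Earth's surface
DEPTH = 100.0         # m, the layer stipulated above

q_atm = C_AIR * M_ATM * 1.0                                      # ~5e21 J
q_ocean = C_SEAWATER * RHO_SEAWATER * OCEAN_AREA * DEPTH * 1.0   # ~1.5e23 J
print(q_atm, q_ocean, q_atm + q_ocean)
```

The total is on the order of 10^23 J, dominated by the ocean term; since that is an average over a fluctuating, non-uniform system, there is clearly ample energy to feed local extremes far larger than the 1 C mean shift.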
Didactylos says
Septic Matthew said “Earth experiences dozens at least of 3 sigma events every year”
This is just daft. Earth experiences millions of extreme events every year. We have no clue exactly how many. Extremes are only extreme when you look at a specific event in a specific place. This means that when comparing extremes, it is essential to compare extremes from the same class.
Thus, you cannot compare a 3 sigma record high temperature day in Springfield with a month-long heatwave covering a big chunk of eastern Europe.
So, let me close with a quote: “you really need to study up more on the statistical analysis of extremes”.
Jeffrey Davis says
While we’re talking 3 sigma events, how rare is permafrost melting? After all, it isn’t called “Oftenfrost.”
[Response: The term “Permafrost” refers to the situation where there is always a layer left deep down that remains frozen throughout the year. In any place where the surface temperature goes above freezing for part of the year, there is an active layer that melts and refreezes annually. High latitude warming manifests itself primarily in increase in the depth of the active layer. –raypierre]
Brian Dodge says
http://www2.ucar.edu/news/1036/record-high-temperatures-far-outpace-record-lows-across-us is not “…anecdotal evidence about floods here and heatwaves there.”
This whole methodology of determining how badly the dice are loaded will only make sense once we have models which can accurately hindcast each individual occurrence of snake eyes, the extreme event, with some reliability.
In other words, the “how can you predict the climate if you can’t predict the weather” argument is brought up yet again.
Alexandre says
Chemotherapy causes hair to fall out. Even so, it’s hard to pinpoint which hairs fell because of the chemo and which would have fallen out anyway, at the normal rate.
It looks like we’ll go bald claiming that we cannot determine whether it’s because of the chemo or not.
Punksta says
Chemo & baldness
The difference is we have a pretty good idea how much baldness to expect in the absence of chemo, and we can predict how much to expect with it.
Hank Roberts says
So SM, you’re claiming some expertise in ‘statistical analysis of extremes’ — reading? cites?
SecularAnimist says
Gerry Quinn wrote: “On the face of it, one would not expect that an increase in greenhouse gases sufficient to increase the average temperature of the Earth by a degree or so would have a dramatic effect on ‘extreme events’ in general.”
On the contrary. On the face of it, that’s exactly what one should expect, because it is what the climate models have been predicting, and it is now occurring just as predicted.
One Anonymous Bloke says
#91 Punksta. You have read the article, haven’t you? The one about prediction and attribution at the top of the page… Seems a bit daft to make assertions that can be proven false by using the scroll bar.
Septic Matthew says
92, Hank Roberts: So SM, you’re claiming some expertise in ‘statistical analysis of extremes’ — reading? cites?
At the risk of some repetition from my previous posts,
R. L. Smith, Extreme value analysis of environmental time series: An application to trend detection in ground-level ozone (with discussion), Statistical Science, volume 4, pp. 367-393, 1989.
S. G. Coles, An Introduction to Statistical Modeling of Extreme Values, Springer, 2001.
The 2010 Joint Statistical Meetings included some invited papers on the analysis of extremes relevant to analysis of climate change. Here is one of them:
http://www.amstat.org/meetings/jsm/2010/onlineprogram/index.cfm?fuseaction=abstract_details&abstractid=308388
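[The standard machinery in the Coles book cited above is to fit a generalized extreme value (GEV) distribution to block maxima and read off return levels. A minimal sketch using SciPy, with a synthetic annual-maximum series standing in for real data (the parameters and the Gumbel generating distribution are invented for illustration):

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(2)

# Hypothetical 100-year annual-maximum series (synthetic, Gumbel-distributed).
ann_max = rng.gumbel(loc=30.0, scale=3.0, size=100)

# Fit a GEV; scipy's shape parameter c is the negative of the usual xi.
c, loc, scale = genextreme.fit(ann_max)

# 100-year return level: the value exceeded with probability 1/100 per year.
rl100 = genextreme.isf(1.0 / 100.0, c, loc=loc, scale=scale)
print(rl100)
```

This is the kind of analysis those references develop rigorously, including the confidence intervals on return levels that a point estimate like this omits.]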
Ray Ladbury and Didactylos, I think your last posts were attempts to goad me into an emotional response. If you’ll think about them for a few days, I think you’ll see what I mean.
I think I’ll have to leave you all and master raypierre’s book. It’s a gold mine of information for a skeptical point of view: uncertainties and omissions are clearly presented, along with what is known.
ccpo says
Hello gents,
Balancing the science with public policy is such a pain in the patootie. Sadly, this sort of post is irrelevant in terms of risk assessment, because risk assessment is based on worst-case possibilities (e.g., home insurance and life insurance), where the risks are so overwhelming they must be addressed.
The worst-case scenario of me drinking 6 beers on a Friday night? A hangover. The worst-case scenario wrt climate? Civilization being shaken to its knees, or even collapsing. (One of the well-known collapse writers and researchers has turned pessimistic where he was formerly hopeful, per a private conversation; and Taleb is famously pessimistic about our ability to deal properly with risk.)
This is one of those rare times when I ask: Did you really need to post this? People are looking for ways to avoid acting, and the worse conditions get, the more they will grasp for straws. When asked, please answer honestly, but choosing proactively to post something that far too many will see as, “Scientists can’t equate squat to squat!” is perhaps worth reconsidering.
The trends are clear enough, aren’t they? Need we always deal with the provable? Or do we all really believe Pakistan and Russia and Arctic Sea Ice and Tennessee floods and Australian floods and methane clathrate emissions and… would all be happening concurrently at 1850 temps?
I’ll go close the barn door now.
Septic Matthew says
77, eric in comment: [Response: You mean earth experiences more than three ‘extremes’ every month?! That’s what ‘dozens’ per year would mean.–eric]
I missed this. The answer is yes, approximately, but the number fluctuates randomly.
The extreme is with respect to the distribution of measures in the region and season. Like the Russian heat wave, most of these are not defined in advance (with respect to region and season), but identified post-hoc and cherry-picked to suit. Concomitant with the Russian heat wave was a Siberian cool wave; bloggers chose the event that matched their AGW beliefs. Had anyone chosen in advance a region large enough to include (and average together) them both, the event would have been nothing unusual.
Another recent example was the swath of record low high temperatures in south central US. Had it been a swath of record high low temperatures it would have been exactly what is predicted by AGW. Instead, it’s just a set of locally correlated hundred-year (or 150-year) extreme events of no importance.
The situation is analogous with coincidences: they are much more common than commonly supposed, but people do not notice most of them unless something draws their attention; the coincidences noticed after Pres Kennedy was assassinated have spawned a whole genre of literature and movies.
If at this point you are thinking I am extreme (!), then start over from the top with gavin’s essay and the two focus papers. The statistical analysis of extreme events identified post-hoc is really hard.
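[Whatever one makes of the specific examples above, the statistical core of the post-hoc selection argument is easy to demonstrate: scanning many regions and reporting the most extreme one inflates the apparent rarity of the event. A quick Monte Carlo under pure noise, with a hypothetical set of 50 independent regions:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)
n_years, n_regions = 10_000, 50   # hypothetical independent regions

# Standardized regional anomalies, no trend, no correlation.
z = rng.standard_normal((n_years, n_regions))

# The single-region 1-in-100-year threshold (~2.33 sigma).
thresh = norm.isf(0.01)

p_fixed = np.mean(z[:, 0] > thresh)       # region chosen in advance
p_scan = np.mean(z.max(axis=1) > thresh)  # worst region each year, post hoc

print(p_fixed, p_scan)
```

The pre-specified region exceeds its "100-year" level about 1% of the time, as it should; the post-hoc worst region exceeds it in roughly 40% of years (1 - 0.99^50). This is why attribution studies define the event class and region before counting, rather than after.]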
Icarus says
“[Response: The WSJ is not a reliable source. There is no discussion of extreme events in the Compo et al paper at all. The 20C Reanalysis project is however very useful (albeit with caveats) and we’ll address that in a future post. – gavin]”
There was certainly nothing in the Compo et al paper to justify the WSJ’s assertion that “The weather isn’t getting weirder” and “…researchers have yet to find evidence of more-extreme weather patterns over the period, contrary to what the models predict”.
Gilbert Compo has submitted a letter to the WSJ regarding their misrepresentations of his work, which he expects to be published on Wednesday Feb 23rd – I hope they publish it in full. Look out for it.
Philip Machanick says
Icarus #99: good to see that, but any chance that the Compo letter will appear here? I don’t read the WSJ regularly, because I like to know what’s going on, and could easily miss the letter if it’s only published there.