Guest Commentary by Axel Schweiger, Ron Lindsay, and Cecilia Bitz
We have just passed the annual maximum in Arctic sea ice extent, which always occurs sometime in March. Within a month we will reach the annual maximum in Arctic sea ice volume. After that, the sea ice will begin its course towards its annual minima of both extent and volume in mid-September. This marks the beginning of the ritual of the annual sea ice watch, which includes predictions of the extent and rank of this year’s sea ice minimum, as well as discussion about the timing of its eventual demise. One of the inputs into that discussion is the “PIOMAS” ice-ocean model output of ice volume, and in particular some high-profile extrapolations of it. This is worth looking at in some detail.
Prediction methods for the sea ice minima range from ad-hoc guesses to model predictions, from statistical analyses to water-cooler speculation in the blogosphere. Many of these predictions are compiled in the SEARCH-sponsored “sea ice outlook”.
This year’s discussions, however, will be without the input of the father of modern sea ice physics, Norbert Untersteiner, who recently died at the age of 86. Much of the physics in PIOMAS and in global climate models can be traced to Norbert’s influence. Norbert was sober-minded and skeptical about the prospects of skillful short-term sea ice predictions, but even he couldn’t help being drawn into the dubious excitement around the precipitous decline of Arctic sea ice, and he regularly added his own guesstimate to the sea ice outlook. Norbert’s legacy challenges those of us who engage in predictions to prove our skill and to understand and explain the limitations of our techniques, so they are not used erroneously to misinform the public or to influence policy… more about that later and here.
PIOMAS
PIOMAS is the Pan-arctic Ice-Ocean Modeling and Assimilation System. It belongs to the class of ice-ocean models that have components for the sea ice and the ocean, but no interactive atmosphere. There is an active community (AOMIP) engaged in applying and improving these types of models for Arctic problems. Without an atmosphere, inputs that represent the atmospheric forcing (near-surface winds, temperature, humidity, and downwelling shortwave and longwave radiation) need to be provided. Typically those inputs are derived from global atmospheric reanalysis projects. The advantage of such partially-coupled models is that they can be driven by past atmospheric conditions, and the resulting simulations match the observed sea ice variability well, since the ice is strongly forced by the atmosphere.
This is in contrast to fully-coupled models, such as those used in the IPCC projections, which make their own version of the weather and can only be expected to approximate the mean, the general patterns of variability, and the long-term trajectory of the sea ice evolution. Another advantage of ice-ocean models is that they don’t have to deal with the complexities of a fully-coupled system. For example, fully-coupled models have biases in the mean wind field over the Arctic which may drive the sea ice into the wrong places, yielding unrealistic patterns of sea ice thickness. This has been a common problem with global climate models, though the recent generation of models clearly shows improvement. Because sea ice is strongly driven by the atmosphere, model predictions depend on the quality of the future atmospheric conditions. Therefore an ice-ocean model like PIOMAS is much more accurate at hindcasts, when the atmospheric conditions are simply reconstructed from observations, than at forecasts, when atmospheric conditions must be estimated. That is not to say that PIOMAS can’t be used for predictions; it can (Zhang et al. 2008, Lindsay et al. 2008, Zhang et al. 2010), but it is important to recognize that performance at hindcasts does not necessarily say much about performance at forecasts. This point often gets confused.
Figure 1: PIOMAS mean monthly Arctic sea ice volume for April and September. Dashed lines parallel to the linear fits represent one and two standard deviations from the trend. Error bars are estimated based on comparisons with thickness observations and model sensitivity studies (Schweiger et al. 2011).
PIOMAS was developed and is operated by Jinlun Zhang at the University of Washington. It is the regional version of the global ice-ocean model of Zhang and Rothrock (2003). The sea ice component represents sea ice in multiple categories of thickness and accounts for changes in thickness due to growth and melt as well as mechanical deformation of ice (Thorndike et al. 1975, Hibler 1980).
It has evolved with continual improvements, including the addition of data assimilation capabilities (Zhang et al. 2003, Lindsay et al. 2006) and the development of sister models for new applications (BIOMAS for biology) or specific regions (BESTMAS for the Bering Sea and GIOMAS for the entire globe) (publications). As a modeler working among observationalists from a variety of disciplines, Jinlun has never been short of tire-kickers who probe, push, and challenge his model from all sorts of different angles and identify warts and beauty spots. This is one of the reasons why PIOMAS has evolved into one of the premier ice-ocean models (Johnson et al. 2012), particularly when it comes to the representation of the sea-ice cover.
PIOMAS has been used in a wide range of applications, but arguably the most popular product has been the time series of total Arctic sea ice volume, which we have been putting out since March 2010 (see also Fig 1). The motivation for this time series is to visualize the fact that the long-term, Arctic-wide loss of sea ice is happening not only in extent, which is well measured by satellites, but also in thickness, which isn’t. Ice volume, the product of sea ice area and thickness, is a measure of the total loss of sea ice and of the total amount of energy involved in melting the ice. Though this is a very small part of the change in global energy content, it is regionally important, and investigations into the causes of sea ice loss need to pin down the sources of this energy.
But why use PIOMAS to show the decline in ice volume when our group of researchers has been involved in measuring, rescuing, and collecting sea ice thickness data from in-situ observations for 30-some years? The answer is that even though widespread thickness losses have been apparent from observations alone for some areas or time periods, Arctic-wide thickness losses are more difficult to document because of the sparse sampling in time and space. The problem can be visualized by constructing a “naïve” sea ice thickness time series from in-situ observations:
Figure 2: Naïve sea ice thickness time series. Sea ice thickness observations from the sea ice thickness climate data record (small grey dots), averages of all observations in a given year (large grey dots), and a 5-year running mean through those observations. The same calculation for the corresponding PIOMAS simulations at the locations and times of the observations is shown by the big red dots and line.
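For readers who want to see what such a construction involves, here is a minimal sketch (an illustration only, not the code behind Figure 2), assuming a hypothetical table of point observations with columns year and thickness_m:

import numpy as np
import pandas as pd

# Hypothetical point observations: one row per thickness measurement.
obs = pd.DataFrame({
    "year":        [1979, 1979, 1980, 1981, 1981, 1982, 1983, 1984],
    "thickness_m": [3.1,  2.8,  3.0,  2.6,  2.9,  2.7,  2.5,  2.6],
})

# "Naive" series: average every observation falling in a given year,
# ignoring where in the Arctic and when in the season it was taken.
annual = obs.groupby("year")["thickness_m"].mean()

# Centered 5-year running mean through the annual averages.
running = annual.rolling(window=5, center=True, min_periods=1).mean()
print(annual)
print(running)

Averaging PIOMAS output sampled at the same observation locations and times in the same way (the red curve in Figure 2) is what makes the model and the sparse observations directly comparable.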
Before those claiming that global warming stopped in 1998 have a field day with this figure, they should appreciate that our total volume time series and this naïve thickness time series are entirely consistent. The sampling issues arise from the fact that sea ice is highly dynamic, with lots of spatial and seasonal variability, so that measurements from individual moorings, submarine sonar tracks, and aircraft flights can only provide an incomplete picture of the evolution of the total Arctic sea ice volume. Progress towards establishing ice thickness records from satellites (ICESat, Envisat, and CryoSat-2) will change this over time, but these sources cannot extend the record back before their measurements began, and satellite retrievals of ice thickness have their own issues.
PIOMAS is not normally run as a freely-evolving model, but rather it assimilates observations. Ice concentration and sea surface temperature are currently assimilated and we have experimented with the assimilation of ice motion (Zhang et al. 2003, Lindsay et al. 2006). Assimilation helps constrain the ice extent to observations and helps improve the simulation of sea ice thickness. Ice thickness observations are not assimilated into the model. Instead, ice thickness and buoy drift data are used for model calibration and evaluation. So using a model constrained by observations is quite possibly the best we can do to establish a long-term ice volume record.
Model calibration is of course necessary. We need to determine parameters that are not well known, deal with inadequately modeled physics, and address significant biases in the forcing fields. The parameters changed in PIOMAS calibration are typically the surface albedo and roughness, and the ice strength. Once calibrated, the model can be run and evaluated against observations not included in the calibration process. Evaluation does not only mean showing that PIOMAS says something useful; it also establishes the error bars on the estimated ice thickness. To establish this uncertainty in the ice-volume record (Schweiger et al. 2011), we spent significant effort drawing on most types of available observations of ice thickness, thanks to a convenient compilation of ice thickness data (Lindsay, 2010). We have also compared PIOMAS estimates with measurements from ICESat and conducted a number of model sensitivity studies. As a result of this evaluation, our conservative estimate of the uncertainty of the linear ice volume trend from 1979 to present is about 30%. While there is much to do in improving both measurements and models to reduce the uncertainty in modeled ice volume, we can already say with great confidence that the decline in observed ice thickness is not just an effect of measurement sampling and that the total sea ice volume has been declining over the past 32 years at astonishing rates (for instance, a 75% reduction in September volume from 1979 to 2011).
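As a rough illustration of the bookkeeping behind statements like these (a sketch with made-up numbers, not the actual PIOMAS analysis), a linear trend, the residual standard deviation used for bands like those in Figure 1, and a percent decline along the trend could be computed along these lines:

import numpy as np

years = np.arange(1979, 2012)
rng = np.random.default_rng(0)
# Made-up September ice volumes in 10^3 km^3, for illustration only.
volume = 16.0 - 0.35 * (years - 1979) + rng.normal(0.0, 1.0, years.size)

# Least-squares linear trend and the residual standard deviation about it.
slope, intercept = np.polyfit(years, volume, 1)
fit = slope * years + intercept
resid_std = np.std(volume - fit, ddof=2)   # +/-1 and +/-2 of this give Fig. 1-style bands

decline = 1.0 - fit[-1] / fit[0]           # fractional reduction along the trend, 1979-2011
print(f"trend: {slope:+.2f} x 10^3 km^3 per year, residual std: {resid_std:.2f}")
print(f"decline along the trend, 1979-2011: {100 * decline:.0f}%")

Note that the ~30% figure quoted above is an uncertainty on the trend itself, based on the comparisons with observations and the sensitivity studies, not on residual scatter alone.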
Prediction
The seasonal prediction issue and the prediction of the long-term trajectory are fundamentally different problems. Seasonal prediction, say predicting September ice extent in March, is what is called an initial value problem: the September ice extent depends both on the weather, which is mostly unpredictable beyond 10 days or so, and on the state of the ocean and sea ice in March. Improving observations to better characterize that state, and improving models to carry this information forward in time, is our best hope for improving seasonal predictability. The prediction of the long-term trajectory depends on the climate forcing (greenhouse gases, aerosols, solar variability) and on how the model responds to those forcings via feedbacks. A recent model study showed that the crossover between initial-value and climate-forced predictability for sea ice occurs at about 3 years (Blanchard-Wrigglesworth et al. 2011). In other words, a model forgets the initial sea ice state after a few years, at which point the main driver of any predictable signal is the climate forcing. In fact, coupled model simulations have shown that even removing all the sea ice in a particular July has little lasting impact on the trajectory of the ice after a few years (Tietsche et al. 2011).
PIOMAS has been run in a forward mode (and hence without data assimilation) to yield seasonal predictions for the sea ice outlook (Zhang et al. 2008) and has also provided input to statistical forecasts (Lindsay et al. 2008) and fully-coupled models. We have also done experiments with PIOMAS in a climate projection mode by scaling atmospheric forcing data from a reanalysis to 2×CO2 projections from the CMIP3 models (Zhang et al. 2010). This setup provides more realistic wind fields and spatial thickness distributions but cannot account for important atmosphere-ocean feedbacks.
Global climate model projections (in CMIP3 at least) appear to underestimate sea ice extent losses with respect to observations, though this is not universally true for all models and some of them actually have ensemble spreads that are compatible with PIOMAS ice volume estimates and satellite observations of sea ice extent. With error bars provided, we can use the PIOMAS ice volume time series as a proxy record for reality and compare it against sea-ice simulations in global climate models. This provides another tool in addition to more directly observed properties for the improvement and evaluation of these models and is in our view the best use of PIOMAS in the context of predicting the long-term trajectory of sea ice.
Predictions of a seasonally ice-free Arctic Ocean
The eventual demise of the summer sea ice is a common feature of nearly every climate model projection (the exceptions are models with very inappropriate initial conditions). But the question of when the Arctic will be ‘ice-free’ is imprecise and calls for a clear definition of what ice-free means. Does it mean completely ice-free, or is there a minimum threshold implied? Does it mean the first time the summer sea ice falls below this threshold, or does it imply a probability of encountering low-ice conditions over a period of time (e.g. a high likelihood of Septembers with less than 10^6 km^2 of ice in a 10-year period)? Regardless of whether the concept is actually useful for any practical purpose (say for planning shipping across the Arctic), it is nevertheless a powerful image in communicating the dramatic changes that are under way in the Arctic.
Once ice-free is defined, predictions of when an ice-free Arctic will occur seem justified. In the published literature there are several papers specifically targeting such predictions (Zhang and Walsh, 2006, Wang and Overland, 2009, Boe et al. 2009, Zhang et al. 2010), while others include discussion about the timing of ice-free summers (e.g. Holland et al. 2006). Some address the fact that the CMIP3/IPCC AR4 simulations show sea ice declines less rapid than the observations and attempt to correct for it. Published projections, though with varying definitions of what constitutes ice-free, all project an ice-free Arctic Ocean somewhere between 2037 (Wang and Overland, 2009) and the end of the century. Predictions of earlier ice-free dates so far seem to be confined to conference presentations, media coverage, the blogosphere, and testimony before the UK parliament.
Extrapolation
A different class of predictions is based on simple extrapolation using historical sea ice extent, concentration, or volume. An example is included in the materials presented by the so-called ‘Arctic Methane Emergency Group’, who show extrapolations of PIOMAS data and warn about the potential for a seasonally ice-free Arctic Ocean in just a few years. So does it make sense to extrapolate sea ice volume for prediction? For a successful extrapolation several conditions need to be met. First, an appropriate function for the extrapolation needs to be chosen. This function needs either to be based on the underlying physics of the system or to be justified as appropriate for future projections beyond simply fitting the historical data.
But what function should one choose? Since we don’t really have data on how the trajectory of the Arctic sea ice evolves under increased greenhouse forcing, model projections may provide a guide to the shape of an appropriate function. Clearly, linear, quadratic or exponential functions do not properly reflect the flattening of the trajectory over the next few decades seen, for example, in the CCSM4 (Fig 3). The characteristic flattening of this trajectory arises, to first order, from the fact that there is an increasingly negative (damping) feedback as the sea ice thins, as described by Bitz and Roe (2004) and Armour et al. (2011). In addition, the thick ice along the northern coast of Greenland is unusually persistent because on-shore winds cause the ice to drift and pile up there. So extrapolating by fitting a function that resembles a sigmoid-shaped trajectory may make more sense, but even that, as shown in the figure, yields a much earlier prediction of an ice-free Arctic than can be expected from the CCSM4 ensemble.
Figure 3: CCSM4 AR4 ensemble and PIOMAS September mean Arctic ice volume. Exponential and sigmoid (Gompertz) fits to the PIOMAS data are shown. Note that the 1979-2011 September mean of the CCSM4 ensemble has about 30% higher sea ice volume than PIOMAS. To visualize the difficulty in choosing an appropriate extrapolation function based on PIOMAS data, we shifted the CCSM4 time series forward by 20 years to roughly match the mean ice volume over the 1979-2011 fitting period.
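To make the curve-fitting step concrete, here is a minimal sketch of the mechanics being discussed (made-up data, not the analysis behind Figure 3; with synthetic data the ordering of the implied dates need not match the figure), fitting exponential and Gompertz forms to a volume series and asking when each extrapolation drops below a nominal threshold:

import numpy as np
from scipy.optimize import curve_fit

def exponential(t, v0, tau):
    # Exponential decay toward zero.
    return v0 * np.exp(-(t - 1979.0) / tau)

def gompertz(t, v0, k, t0):
    # Sigmoid-shaped (Gompertz) decline from ~v0 toward zero, steepest near t0.
    return v0 * np.exp(-np.exp(k * (t - t0)))

years = np.arange(1979, 2012, dtype=float)
rng = np.random.default_rng(1)
# Made-up September volumes in 10^3 km^3, for illustration only.
volume = gompertz(years, 16.0, 0.08, 2020.0) + rng.normal(0.0, 0.6, years.size)

p_exp, _ = curve_fit(exponential, years, volume, p0=[16.0, 30.0], maxfev=10000)
p_gom, _ = curve_fit(gompertz, years, volume, p0=[16.0, 0.1, 2020.0], maxfev=10000)

future = np.arange(1979, 2101, dtype=float)
for name, func, p in [("exponential", exponential, p_exp), ("gompertz", gompertz, p_gom)]:
    below = future[func(future, *p) < 1.0]        # nominal "ice-free" threshold of 1000 km^3
    print(name, "first year below threshold:",
          int(below[0]) if below.size else "after 2100")

The point is simply that different curves can fit the historical period about equally well while implying very different dates, which is why the choice of function has to be justified physically rather than read off the fit.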
But there is a second issue that may foil prediction by extrapolation: the period over which the function is fit must be sufficiently long to sample the long-term natural variability in the climate system, and the goodness of fit over the fitting period may unfortunately be misleading. Whether or not this is the case for sea ice extent or volume is an open question. The sea ice trajectory shows considerable natural variability at various time scales on top of the smoother forced response to changes in greenhouse gases: periods of rapid decline are followed by periods of slower decline or even increases. By fitting a smooth function to a sea ice time series (e.g. PIOMAS) one might easily be tempted to assume that the smooth fit represents the forced (e.g. greenhouse) component and that the variation about the curve is due to natural variability. But natural variability can occur at time scales long enough to affect the fit. We have to remember that part of the observed trend is likely due to natural variability (Kay et al. 2011, Winton, 2011) and may therefore have little to do with the future evolution of the sea ice trajectory. This is visualized in Figure 4, where ensemble members from the CCSM4 AR4 runs are fit with S-shaped (Gompertz) functions using the 1979-2011 period to estimate the parameters. The differences between the ensemble members, reflecting natural variability, yield vastly different extrapolated trajectories. Natural variability at these time scales (order of 30 years) may very well make prediction by extrapolation hopeless.
Figure 4: CCSM4 AR4 ensemble with sigmoid (Gompertz) fits. Light vertical lines mark the fitting period used for each ensemble member (1979-2011).
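The same toy setup can illustrate this natural-variability point: fit the identical Gompertz form to several synthetic "members" that share one forced trajectory and differ only in their noise, and look at how much the extrapolated dates spread (again an illustration only, not the CCSM4 calculation behind Figure 4):

import numpy as np
from scipy.optimize import curve_fit

def gompertz(t, v0, k, t0):
    # Sigmoid-shaped decline from ~v0 toward zero, steepest near t0.
    return v0 * np.exp(-np.exp(k * (t - t0)))

years = np.arange(1979, 2012, dtype=float)
forced = gompertz(years, 16.0, 0.08, 2030.0)       # one shared "forced" trajectory (made up)
rng = np.random.default_rng(2)
future = np.arange(1979, 2151, dtype=float)

crossings = []
for member in range(6):
    series = forced + rng.normal(0.0, 0.8, years.size)   # member = forced response + "weather"
    try:
        p, _ = curve_fit(gompertz, years, series, p0=[16.0, 0.08, 2030.0], maxfev=10000)
    except RuntimeError:
        crossings.append(None)                     # fit did not converge for this member
        continue
    below = future[gompertz(future, *p) < 1.0]
    crossings.append(int(below[0]) if below.size else None)

print("extrapolated first year below 1 (10^3 km^3), per member:", crossings)

Even with an identical underlying forced decline, the fitted parameters, and hence the extrapolated dates, differ from member to member purely because of the "weather" in the fitting window.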
In summary, we think that expressing concern about the future of the Arctic by highlighting only the earliest estimates of an ice-free Arctic is misdirected. Instead, serious effort should be devoted to making detailed seasonal-to-interannual (initial-value) predictions with careful evaluations of their skill and better estimates of the climate-forced projections and their uncertainties, both of which are of considerable value to society. Some effort should also target the formulation of applicable and answerable questions that can help focus modeling efforts. We believe that substantially skillful prediction can only be achieved with models, and therefore effort should be given to improving predictive modeling activities. The best role of observations in prediction is to improve, test, and initialize models.
But when will the Arctic be ice free then? The answer will have to come from fully coupled climate models. Only they can account for the non-linear behavior of the trajectory of the sea ice evolution and put longer-term changes in the context of expected natural variability. The sea ice simulations in the CMIP5 models are currently being analyzed. This analysis will reveal new insights about model biases and their causes, and about the role of natural variability in long-term change. It is possible that this analysis will change the predicted timing of the “ice-free summers”, but large uncertainties will likely remain. Until then, we believe, we need to let science run its course and let previous model-based predictions of somewhere between 2040 and 2100 stand.
References
Bitz, C. M., and G. H. Roe (2004), A mechanism for the high rate of sea ice thinning in the Arctic Ocean, J. Climate, 17(18), 3623-3632.
Boe, J. L., A. Hall, and X. Qu (2009), September sea-ice cover in the Arctic Ocean projected to vanish by 2100, Nature Geoscience, 2(5), 341-343.
Hibler, W. D. (1980), Modeling a variable thickness sea ice cover, Monthly Weather Review, 108(12), 1943-1973.
Holland, M. M., C. M. Bitz, and B. Tremblay (2006), Future abrupt reductions in the summer Arctic sea ice, Geophys. Res. Lett., 33(23), 5.
Johnson, M., et al. (2012), Evaluation of Arctic sea ice thickness simulated by Arctic Ocean Model Intercomparison Project models, J. Geophys. Res., 117, C00D13.
Kay, J. E., M. M. Holland, and A. Jahn (2011), Inter-annual to multi-decadal Arctic sea ice extent trends in a warming world, Geophys. Res. Lett., 38.
Lindsay, R. W. (2010), New unified sea ice thickness climate data record, Eos Trans. AGU, 91(44), 405-416.
Lindsay, R. W., J. Zhang, A. J. Schweiger, and M. A. Steele (2008), Seasonal predictions of ice extent in the Arctic Ocean, J. Geophys. Res., 113(C2), 11.
Lindsay, R. W., and J. Zhang (2006), Assimilation of ice concentration in an ice-ocean model, Journal of Atmospheric and Oceanic Technology, 23(5), 742-749.
Rothrock, D. A., Y. Yu, and G. A. Maykut (1999), Thinning of the Arctic sea-ice cover, Geophys. Res. Lett., 26(23), 3469-3472.
Schweiger, A. J., R. Lindsay, J. Zhang, M. Steele, H. Stern, and R. Kwok (2011), Uncertainty in modeled Arctic sea ice volume, J. Geophys. Res., 116, C00D06.
Thorndike, A. S., D. A. Rothrock, G. A. Maykut, and R. Colony (1975), The thickness distribution of sea ice, J. Geophys. Res., 80(33), 4501-4513.
Tietsche, S., D. Notz, J. H. Jungclaus, and J. Marotzke (2011), Recovery mechanisms of Arctic summer sea ice, Geophys. Res. Lett., 38.
Wang, M. Y., and J. E. Overland (2009), A sea ice free summer Arctic within 30 years?, Geophys. Res. Lett., 36, 5.
Winton, M. (2000), A reformulated three-layer sea ice model, Journal of Atmospheric and Oceanic Technology, 17(4), 525-531.
Winton, M. (2011), Do climate models underestimate the sensitivity of Northern Hemisphere sea ice cover?, J. Climate, 24(15), 3924-3934.
Zhang, J., D. R. Thomas, D. A. Rothrock, R. W. Lindsay, Y. Yu, and R. Kwok (2003), Assimilation of ice motion observations and comparisons with submarine ice thickness data, J. Geophys. Res., 108(C6), 3170, doi:10.1029/2001JC001041.
Zhang, J., and D. A. Rothrock (2003), Modeling global sea ice with a thickness and enthalpy distribution model in generalized curvilinear coordinates, Monthly Weather Review, 131(5), 845-861.
Zhang, X. D., and J. E. Walsh (2006), Toward a seasonally ice-covered Arctic Ocean: Scenarios from the IPCC AR4 model simulations, J. Climate, 19(9), 1730-1747.
Zhang, J., M. Steele, and A. Schweiger (2010), Arctic sea ice response to atmospheric forcings with varying levels of anthropogenic warming and climate variability, Geophys. Res. Lett., 37, L20505.
jyyh says
Tenney Naumer lamented in #67:
“Are we then to continue to wait for the models to be perfected?
I find this sentence to be, well, I am liable to be snipped if I type what I really think about it:
“The answer will have to come from fully-coupled climate models.””
Well, at least we may find out whether the problem is P ≠ NP or P = NP. http://en.wikipedia.org/wiki/P_versus_NP_problem
I’d say that since meteorologists can predict some weather with over 50% (Bayesian) accuracy for a short period into the future, there might be a (Bayesian) solution to the case being P = NP. I’m not saying the solution will be found very early in the process, but at least it may then be added to the list of problems that belong to the set “P ≠ NP” or “P = NP”.
Chris Reynolds says
#86 Tenney Naumer,
Of course not, but without observational evidence that the PIOMAS Spring volume losses are real, their reality has to be claimed with caution. I think they’re real, based on my reading of the literature about PIOMAS. But what is observed always takes precedence over what I think. PIOMAS is the best proxy for sea ice volume we have, but it is only a proxy, not a direct measurement.
Yes but how rapidly will this process proceed? Bear in mind here that most of the October to March warming is due to surface heat fluxes. Screen and Simmonds state in their abstract that: “Arctic warming is strongest at the surface during most of the year and is primarily consistent with reductions in sea ice cover. Changes in cloud cover, in contrast, have not contributed strongly to recent warming.”
So there is little evidence of the cloud radiative feedback that may keep the Arctic temperate in future being a major player now. And changes in ice cover are driving most of Arctic temperature amplification.
Thus the claim that Arctic warming as the ice recedes risks stopping ice growth from October to March is a circular argument. The very warming that is supposed to impede ice growth over autumn/winter is a process by which the Arctic sheds heat gained through ice loss. So as that warming intensifies, more of the year’s energy gains will be lost to the atmosphere and thence to space.
Bruce Tabor says
@Chris Reynolds #75,
Chris,
While I respect your views and you are clearly better informed on the detailed science at issue here, I beg to differ on some not insignificant points.
“Being right for the wrong reasons is still being wrong”
Clearly, being right for the wrong reasons is, as a matter of logic, grammar, causality, etc., still BEING RIGHT, even if that decision was reached by a flawed process. If you think about it, in science it cannot be otherwise, as there is no such thing as a final, flawless, infallible theory of everything to guide our thinking. All science is a MODEL of reality, and all models are wrong. And whatever decisions we make, regardless of the correctness of our reasoning, we are left with the consequences of that decision, not the consequences of our reasoning.
Secondly, your “other earths” are a thought experiment – another model – like the many-worlds interpretation (http://en.wikipedia.org/wiki/Many-worlds_interpretation). The Universe could be completely deterministic (the Schroedinger equation is) while having the appearance of randomness.
The relationship to ensemble models is that the underlying physical processes are chaotic – highly sensitive to the initial conditions. We hypothesise (and it seems reasonable) that we can capture the range of possible outcomes if we could somehow run “reality” with slightly varying initial conditions. This concept is a model, a thought experiment. It is NOT reality. The models must somehow be validated in the application of interest to be useful. I accept GCMs have been validated for projection of global climate change, but their record on the cryosphere is woeful.
Thirdly, I don’t know the history of PIOMAS. As I understand it, it attempts to estimate ice area, thickness and volume. As it is largely a process of statistical interpolation, with limited projection, I assume it has been subject to considerable experimental validation. Please tell me if this is not the case!
Finally, you seem to be suggesting that unexpected weather was responsible for the low ice events of 2007, 2010 and 2011. As far as I know, El Niños like that of 2007 do occur in climate models (not that specific one), as do other large-scale weather phenomena. The whole point of ensembles is that if some random sequence of weather events can produce the low ice events, this should be seen in some models and so be included in the “confidence intervals” of the ensemble. This is the variability issue I was referring to. If I am reading the output of the ensemble models correctly, none of them suggests the possibility of the recent low ice events.
Peter Ellis says
“Reality is only one realisation of an ensemble” – a wonderful quote, but neglects a crucial point. We want to know which realisation we’re in!
The argument at its simplest is that since there are individual model runs in the CCSM4 ensemble that are just about as bad as our current reality, we can’t rule out the chance that reality will return to the CCSM4 ensemble line – i.e. the decline will slow, and the Arctic will be summer ice-free in “only” 2040-2050 or so. Even on the face of it, this seems wrong. Did the most pessimistic individual runs really subsequently return to the mean of the ensemble? Or did they remain below it for the remainder of the run?
http://psc.apl.washington.edu/wordpress/wp-content/uploads/schweiger/ice_volume/validation/Fig13b.png (looking at CCSM 20th & a1b) suggests the latter. These are presumably the worst individual runs, and they do indeed match PIOMAS extent/volume up to 2006, though not beyond. These runs zero out in ~2040, noticeably before the bulk of the ensemble. Even if the model is correct in all particulars, then if the real Earth is following a similar realisation, logically we too will come off worse than the ensemble average!
If you take the view that CCSM is correct, or nearly so, then in order to predict the real world, we need to work out which realisation is closest to reality. One approach could therefore be to break the ensemble down into its individual members, and exclude those one by one as “reality” shows us which members she’s more likely to be following. Averaging together the remaining runs – the runs that haven’t yet been ruled out by inconvenient data – seems like it would give a more accurate prediction.
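A toy sketch of that screening idea (made-up arrays, not CCSM4 output, and not an endorsement of the approach):

import numpy as np

rng = np.random.default_rng(3)
years = np.arange(1979, 2012)

# Made-up "observations" and a made-up 10-member ensemble with a weaker trend.
obs = 16.0 - 0.35 * (years - 1979) + rng.normal(0.0, 0.8, years.size)
ensemble = np.array([16.0 - 0.20 * (years - 1979) + rng.normal(0.0, 0.8, years.size)
                     for _ in range(10)])

# Root-mean-square distance of each member from the observations to date.
rmse = np.sqrt(np.mean((ensemble - obs) ** 2, axis=1))

# Keep only the members that have tracked the observations most closely,
# then average the survivors instead of the full ensemble.
keep = rmse < np.percentile(rmse, 30)
print("members kept:", int(keep.sum()), "of", len(ensemble))
print("2011 value: full mean %.1f, screened mean %.1f, obs %.1f"
      % (ensemble.mean(axis=0)[-1], ensemble[keep].mean(axis=0)[-1], obs[-1]))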
However, philosophically that approach rests on an underpinning assumption: our reality genuinely is the worst of all possible worlds, and hence we “just happen to be” tracking the most pessimistic individual model runs. This goes against the Copernican principle that we should expect to follow a kind-of-average trajectory.
An alternative approach could be to take the shape of the CCSM4 ensemble curve, which is kind-of-sigmoid, and then scale that to fit the real data. That thus assumes that reality is following an “average” trend relative to what’s expected based on AGW forcing, but that the model has underestimated the sensitivity of the response to forcing. That too would lead to a prediction substantially more pessimistic than the current ensemble mean.
Both of those approaches seem to me to be more scientifically justifiable than just taking the ensemble mean at face value.
Blaine says
RE: “Reality is only one realization of an ensemble”
Well, that’s kind of the point, isn’t it? If a model is going to purport to actually represent the real world, reality must be a reasonable member of the ensemble. If not, the model cannot reasonably be used as a basis for prediction. This writeup presents CCSM4 as a reasonable model to use for prediction (if time-shifted 20 years), but where is the CCSM4 ensemble member in which September ice volume has dropped 75% between 1979 and 2011 (even shifted 20 years)? The same goes for the model used by Tietsche; none of the ensemble members show a September ice volume drop which comes close to matching reality.
The writeup justifies this trust in models by linking to a CCSM3 model realization. OK, reality is possibly acceptable as an extreme realization of that model (though the graph isn’t terribly convincing, given the extent difference), but why should this imply that I should accept the results of other models which don’t include reality as possible realizations?
How is it that an old, low-resolution model like CCSM3 yields results which match the observed ice volume so much better than the latest models, which one would think ought to do much better? I think this is likely because ice volume variations depend strongly on the model’s stability under perturbation, and multiple errors in this parameter with different signs can end up canceling each other out and yielding plausible stability. At least some versions of CCSM3 are known to be unstable under perturbation. Also, projection of fields onto a coarse grid yields diffusion/mixing-like terms, which could compensate for parameterizations which yield too little mixing. Obviously one wants to correct all the errors, but for prediction it is still necessary to use a model which reasonably matches reality.
There are only a few reasonable explanations for the persistence of the observed anomalous amounts of sea ice. Either there’s a large decadal-scale internal variability driving it, such as a large pseudo-cyclical increase in deepwater formation, or the Arctic Ocean is near marginal stability under perturbation. See Merryfield for how the decreasing stability increases persistence. I’m not buying the writeup’s implication that we’ve been throwing nothing but sixes for the past seven years.
Jim Larsen says
Unfortunately, I’m still not seeing it. I’ll post thoughts and await correction:
This post is missing a figure. All of the CCSM4 runs and PIOMAS should be displayed in a figure with a single time line. The 20 year shift in figure 3 primarily serves to hide CCSM4’s deviation from reality’s “run”.
Figure 3’s lowest CCSM4 run has a minimum in 2011 or before of nearly triple the ice that PIOMAS has in 2011. Either PIOMAS is way off, CCSM4 is way off, or the current decline in sea ice is a series of ever-increasing extreme outliers. In order to suggest that reality is a bunch of extreme outliers, wouldn’t you have to provide mechanisms by which reality’s “run” has been affected? Saying reality is an unbroken series of increasing but nearly independent high-sigma events requires incredible evidence. An F&R2011-style analysis might have been an appropriate addition to this post. Why is PIOMAS so far below all CCSM4 runs? Is CCSM4 capable of duplicating PIOMAS? Taken at face value, Fig 3 amounts to a prediction that sea ice will increase, with expected values in 2080 25% above 2011 values.
PIOMAS can run in forecast mode. Would you please provide a few runs for 2012-2019? Wouldn’t that avoid the extrapolation issue? Certainly 2012 would be a reasonable data point. With 3-8 years left in the theoretical decline, even one year’s data can make a huge difference in probabilities.
We also have Maslowski’s model supporting the PIOMAS extrapolations. Data, extrapolations, and model all agree, yet the OPs left the most famous modelling result for sea ice out completely. Why is Maslowski’s model so flawed and CCSM4 so good even though reality seems to be following Maslowski’s model?
Fig 1 bears mentioning. The linear trend hits 1k around 2030, with the current situation being just a huge dip in the road. That sounds reasonable if supported with analysis. Of course, it still makes “2040-2100” look way optimistic.
In summary, there is a huge spread in model estimates for 1k, from 5 years (Maslowski) to no reduction at all through 2100 (CCSM4). The OPs gave no rationale for selecting any particular model, and the model they chose to display was rejected in their final estimate. I would have liked some comparison of the credibility of each model or some analysis adjusting for natural variability. Be that as it may, if I accept the OPs’ contention that coupled models and only coupled models provide decent predictions, then I conclude very tenuously that 5 years +- 3 years is the best estimate, as Maslowski’s model suggests, while noting that “not in this century”, as suggested by CCSM4, could be someone else’s estimate. Conversely, if I reject their contention, the best data we have extrapolates to 1k in 3 (exponential), 8 (sigmoid), or 29 (linear) years, with the best fits for the exponential and sigmoid extrapolations. Using both model and data, my guesstimate for 1k is 3-8 years with a fat tail out to around 30 years and no upper bound as the ice pile above Greenland and Canada could become pretty stable at 2k, for example.
Dan H. says
RE: “Reality is only one realization of an ensemble.”
It is also possible that reality is not represented in the ensemble at all. Using 30 years of data to predict forward 90 years is difficult in and of itself, let alone projecting ice values much lower than experienced during the dataset time frame. Jim hits on this a little with his guesstimates of 3 years to stable at 2k, thereby never reaching the 1k threshold.
It appears that the models are reasonably good at predicting ice in the open water areas, but have much difficulty in the semi-enclosed areas.
Ray Ladbury says
Peter Ellis: “We want to know which realisation we’re in!”
Sorry, Peter, but this is a fundamental misunderstanding of the purpose of scientific modeling, which is to attain understanding of the system being modeled. That understanding in turn provides a basis for predictions of future behavior.
Douglas says
#93 Hank Roberts: Does that mean you rule out the possibility that Shakhova and Semiletov are correct to estimate that 50 Gt of methane is eligible for abrupt release? Or does it mean you would only assess “emergency” at the point it actually does release on a big scale (i.e. much bigger than the 1 km plumes discovered last year)?
I agree other things are also going on – letting any positive feedback process get established is a mistake.
Chris Reynolds says
#103 Bruce Tabor,
Would you toss a coin (“heads this decade, tails later”) and claim you were ‘right’ if your coin toss happened to agree with what transpired? There’s a more serious consequence of being right for the wrong reasons: if there is a massive drop to well under 1M km^2 and you interpret this as evidence of a tipping point, whereas what actually happened was a succession of 2007-like atmospheric set-ups, any prognostication will be shown wrong when, in the following years, the ice grows back and the condition doesn’t maintain itself. It really does matter whether your workings are correct, because reality doesn’t give a hoot for our reasonings. Correct workings should be based on evidence and reasoning, with the goal being understanding, not being right.
No, they’re an analogy, and the underlying point is sound. Events like the ones I list were the outcome of random priming events leading to the occurrence of the final event: why didn’t the 1998 El Niño happen any number of years before or after? Why did it occur at all? This isn’t pointless theorising; it has a direct bearing on what Dr Schweiger is saying. And what he is saying is correct: it is only in our ‘realisation’ that the events I listed happened when they did, and had the cumulative impact they have had, playing their role in the succession of events we have observed. However, see the end of this reply for my view on whether we’re simply on a single realisation.
PIOMAS
Pan-Arctic Ice/Ocean Modeling and Assimilation System. It is a model of the ocean and ice over a specific domain covering a large part of the Arctic region. That’s the physics core: the ice and ocean. Into this core is introduced data covering SSTs, ice concentration, and the atmosphere. The atmosphere data that drives PIOMAS is taken from ‘reanalysis’; AFAIK they use NCEP/NCAR, but they could use other systems. So the ice-ocean physics model can be considered a black box into which you put the atmospheric factors; the box then spits out the response of the ice and ocean to the atmosphere. This is because the atmosphere is the dominant factor.
From my reading I interpret 2007 as a specific outcome of various processes.
http://dosbat.blogspot.co.uk/2012/04/musings-on-models.html
The Spring volume loss in 2010 does seem to be associated with a persistent pattern of high pressure, which doesn’t seem to be the Arctic Dipole (NCEP/NCAR), so as yet I haven’t figured out how this caused the ice volume loss reported by PIOMAS. As for 2011, I am unable to explain it.
I am awaiting this year’s Spring volume data from PIOMAS to see what happens. If there is no similar volume loss then I’ll put those years down to ‘weather’. However if we have a similar profile of volume loss as in the preceding two years then random variability looks very unlikely and I’ll be veering to the following viewpoint – that something new and radical has happened in the seasonal cycle of sea-ice loss, a new factor that in principle could have the power to make a virtually sea ice free state in September plausible this decade. I await the data before making any decision.
I should state that I am not persuaded by the idea that the observed recession of the Arctic is just one realisation of what the GCMs show, and that the difference between the models and observation is merely chance. I think that there are systematic problems with the models that make them conservative when judged against events, and that weather in the Arctic is not just random:
I see the Arctic dipole as an outcome of climate change in the Arctic:
http://dosbat.blogspot.co.uk/2011/08/arctic-dipole-sea-ice-loss.html
And an outcome that itself accelerates sea-ice loss.
http://dosbat.blogspot.co.uk/2011/09/arctic-dipole-positive-feedback.html
So I see the interpretation of atmospheric impacts on the Arctic as being random as wrong. I think there’s a pattern here, and a pattern that’s backed up by research in the literature.
But this doesn’t mean I’m at all persuaded by extrapolating curves, which I consider to be useless and uninformative. In fact I’ll go further:
You might as well draw the Arctic’s horoscope.
Geoff Beacon says
Gavin asks John Nissen in #57
Well there’s small print here – possibly some in invisible ink.
What does Gavin mean by “the science”? If it’s just the content of official peer-reviewed papers, where are the headlines – not the small print – that publicise the following?
I think the second point may be more suitable for judicial review rather than peer-review.
Chris Reynolds says
Jim Larsen,
See here:
http://psc.apl.washington.edu/BEST/PSW2007/PSW07_modelpredictions.html
Those projections are detailed in Zhang et al, 2010 “Arctic sea ice response to atmospheric forcings with varying levels of anthropogenic warming and climate variability.” http://psc.apl.washington.edu/zhang/Pubs/Zhang_etal_2010GL044988.pdf I think you may be surprised at the results; you may want to peruse that link (the first link) before reading further. I recommend checking out 2025 and 2035.
The way the future projection is done is they use randomly shuffled past years of weather (NCEP/NCAR) and add those to a ‘spine’ of warming from GCM projection. If you check the first link in my reply to Bruce Tabor above you’ll find some comments on this study at the end of that post.
However, your comment on Maslowski has combined with my scanning of the papers Hank linked to earlier in this thread: trend extrapolation using both PIOMAS and NPS is being used to assert that the Arctic is on a fast track to a seasonally sea-ice-free state. I’ll suspend my cynicism about extrapolation for the moment by accepting that the volume loss in PIOMAS and NPS seems inconsistent with what I’ve seen of GCMs.
Both NPS and PIOMAS retrospective runs use ‘observed’ data for their atmospheric components. PIOMAS seems to use NCEP/NCAR; NPS uses ECMWF. Maslowski has stated that resolution is a major factor in the ‘early ice free’ implications of volume loss. PIOMAS “has a horizontal resolution of 40 km X 40 km, 21 vertical ocean levels, and 12 thickness categories each for undeformed ice, ridged ice, ice enthalpy, and snow.” NPS “is configured using a horizontal, rotated spherical grid covering 1280×720 cells at a 1/12 degree (approximately 9km) resolution. It has 45 vertical layers” in the ocean. So whilst Maslowski has asserted that resolution is a key factor, it doesn’t seem to be decisive when PIOMAS is considered: NPS is on a roughly 9 km grid, PIOMAS on a 40 km grid.
Going back to data assimilation. Zhang et al 2010 reveals inflection points when going from historic NCEP/NCAR to a combination of GCM temperature spine and randomly shuffled NCEP/NCAR weather to represent a warming trend with weather variability. The authors note that randomly shuffling the ‘weather’ in this way will neglect contribution of trends in weather that are reinforcing the loss of ice. Part of the inflection may well be due to the GCM projection. However what both NPS and PIOMAS have in common is the assimilation, and their volume outputs have both been argued to imply an early loss of sea-ice.
So how much do changes in the Arctic atmosphere play a role in the loss of sea-ice volume and the apparent failure of the GCMs to reflect the current volume loss? Am I in a rut when I point the finger at the Arctic Dipole?
Hank Roberts says
> the papers Hank linked to
Note, as always; I know nothing, I post examples of what I find, but likely more interesting work can be found that I didn’t happen to stumble on looking with Scholar. (Those two are MA thesis papers, with Maslowski as advisor, from recent years; they can’t be the whole story of what(ever) the Navy knows.)
Hank Roberts says
Douglas says: “… the 1km plumes discovered last year”
Who has seen a first hand report of “1km plumes discovered last year”?
It’s not in their published work, that I’ve found.
Igor P Semiletov et al 2012 Environ. Res. Lett. 7 015201
doi:10.1088/1748-9326/7/1/015201
Received 5 August 2011, accepted for publication 6 December 2011
Published 4 January 2012
http://iopscience.iop.org/1748-9326/7/1/015201/article
discusses methane from coastal erosion — recent carbon, not old carbon.
They say:
“… we plan to obtain new data to answer the following overarching questions.
… How much CH4 could be released to the atmosphere from the ESAS due to degradation of sub-sea permafrost and decay of seabed deposits? What is the current state and projected future dynamics of sub-sea permafrost?
… study will require multiple year-round exploration campaigns, including drilling of sub-sea permafrost to evaluate the sediment CH4 potential ….”
Short answer: yes, it’s there, as it has been in past warmings.
Yes, drilling to evaluate hydrates for methane gas potential has started.
You can look this stuff up. E.g.: http://www.ogj.com/articles/print/volume-102/issue-6/drilling-production/japan-undertakes-ambitious-hydrate-drilling-program.html to pick one at random from many many search results. Try it.
wayne davidson says
Ray: “I believe we are facing a force that surpasses inertia–namely collective human stupidity.” Humans have been crawling out of the stupid morass for millennia; it goes in cycles, a true one, not the Accuweather climate cycle a la Bastardi. Wars seem to push us further along the creeping edge of better ways of killing, and the side effects of that insanity, the crumbs, pushed us forward. Some say the space age was different, but it was born from the cold war. I would say it’s hopeless when all scientists pack up their goods and rocket away to Alpha Centauri, but thanks to your colleagues and science there is always the luck of humans to count on, surviving millions of years by being clever or dumb. Happenstance is our middle name; there is always a chance we’ll get it right.
The methane discussion above fascinates me. I don’t think we know everything about it, since a lot of it is under sea ice at the bottom of the Arctic Ocean. There is no certainty here; there is only knowledge that it will increase in concentration. I read a whole lot of discussion but very, very little observing. It should be the opposite: often we read peer-reviewed papers on the subject, but they are mainly based on short-term observations and data acquisition. I’d rather we discuss the numbers, the actual data out there, however little there may be.
Neven’s 1900 ppb Arctic number is a start; surely there are seasons, variations, and regions with more CH4 than others. All of it is related to whether the sea ice is almost completely open, as in 2007, or frozen up rock solid, as in March. Let’s see how much methane shows up, and when and where, if there is such data. That is a crucial discussion needed before conclusions are thrown out; understanding the Arctic requires data from here.
Tenney Naumer says
@Hank
Dr. Semiletov was interviewed about the vast plumes of methane:
In an exclusive interview with The Independent, Igor Semiletov, of the Far Eastern branch of the Russian Academy of Sciences, said that he has never before witnessed the scale and force of the methane being released from beneath the Arctic seabed.
“Earlier we found torch-like structures like this but they were only tens of metres in diameter. This is the first time that we’ve found continuous, powerful and impressive seeping structures, more than 1,000 metres in diameter. It’s amazing,” Dr Semiletov said. “I was most impressed by the sheer scale and high density of the plumes. Over a relatively small area we found more than 100, but over a wider area there should be thousands of them.”
I generally collect everything I can find by Shakhova and Semiletov on my blog, so you can see it at this link:
http://climatechangepsychology.blogspot.com/search/label/Natalia%20Shakhova
or this one:
http://climatechangepsychology.blogspot.com/search/label/Igor%20Semiletov
Pete Best says
It’s really good of Real Climate to address this burning issue. This article addresses the exaggeration in the political pro-green blogs, media and otherwise-informed websites, which, if Real Climate were not here, I would be presuming to be true.
Slickly written blogs and sites, set against the wording from here, make me realise how easy it is for a not-skeptical-enough mind (scientifically, that is, before you all jump on me) to take the bait and start thinking that doom is accelerating, when indeed it is just that, as Real Climate has been saying forever, you have to make sure your time series is long enough to flush out the natural variability.
thanks RC
Chris Dudley says
“Reality is only one realization of an ensemble”
There has been some criticism of this statement. But, it is not the first time I’ve heard it. Though I went the observational route, I did get some training from Josh Barnes, one of the better astrophysical modelers. And it is true that systems which are sufficiently non-trivial to require a concerted modeling effort generally have stochastic characteristics. Close three body dynamics in many-body Newtonian simulations, for example, have to be captured and treated statistically. This is fully justified because reality does act like a member of an ensemble at this level. And, modeling in this manner gives us an opportunity to learn.
I’d like to commend Axel and his co-workers for displaying a comparison with ice volume. So many authors stick with extent alone despite the availability of PIOMAS.
Which is better? Including more physics and seeing models depart from observations or sticking with old models even though it is known they are lacking? I’d say the former because further iteration will tell us what is lacking in our understanding.
Kevin McKinney says
#111–“I think the second point may be more suitable for judicial review rather than peer-review.”
Which judiciary? According to those who talk about this ‘pressure,’ it was coming from various national governments. So who has jurisdiction? The International Court of Justice hasn’t been accepted by all parties (non-signatories include, IIRC, the US.) Who else is there?
wayne davidson says
There is no question about the newer thinness of the Arctic Ocean sea ice from mere observations in situ and from satellite pictures; I have found dozens of examples, so I am a bit perplexed by claims otherwise.
I don’t think it’s satisfying to many when one claims something without multiple back-ups; secondary observations, apparently trivial, may deny one claim or another. Natural variations are a given, a non-argument. I suggest looking very carefully at the Arctic Ocean proper, with high-resolution shots,
here:
Look very, very carefully at the full-resolution JPEG product from late March, as in the second article down on my blog, and observe the white streaks streaming all over the place; these scream out as something newish.
How we square that with the claim that the ice is somewhat thicker than a few years ago is difficult.
By the way, the 3rd article deals with El Niño; again, witness the power of observing over mere thought and analysis.
wayne davidson says
oopss here: http://eh2r.blogspot.ca/
thanks Gavin
sorry for the booboo..
Chris Dudley says
Here’s a bit of Fawlty Language doggerel that illustrates how chi-by-eye might see a sigmoid but in fact the function is linear. A periodic function with added noise goes along for a while and then dips linearly towards zero. The “annual” minimum slips below zero but owing to noise pops up again giving an impression of a gradual approach to zero. Plotting the “annual” maximum with the same sigmoid shifted up shows that this is an illusion. The script should also run in IDL.
a=findgen(1440)*2.*!pi/12.                      ; 120 years of monthly time steps, in radians
b=sin(a)                                        ; seasonal cycle of unit amplitude
plot,a,b
d=randomn(12347,120)/4.                         ; one noise value per year
c=fltarr(1440)
for i=0,119 do if i lt 40 then c(i*12:i*12+11)=4.+d(i) else c(i*12:i*12+11)=(40.-i)*3./55.+d(i)+4.
plot,a,c+b                                      ; baseline is flat for 40 years, then declines linearly
e=c+b
plot,e(indgen(120)*12+9)                        ; month index 9 = the seasonal minimum
!p.multi=[0,1,2]
f=e(indgen(120)*12+9)
f(where(f lt 0.))=0                             ; clip at zero: the minimum cannot go negative
plot,f, title="Annual minimum"
oplot,smooth(f,5)
oplot,3-3./(1+exp(-(findgen(120)-68.)/8.))      ; an eyeballed sigmoid through the clipped minima
g=e(indgen(120)*12+3)                           ; month index 3 = the seasonal maximum
plot,g,title="annual maximum"
oplot,3-3./(1+exp(-(findgen(120)-68.)/8.))+2    ; the same sigmoid shifted up no longer fits
!p.multi=0
PKthinks says
‘The Science’, or better stated the “current understanding of the science”, brings one back to “Reality is only one realization of an ensemble”,
which sounds like “the ensemble mean of individuals’ appraisals of the literature or evidence”.
A consensus, of course, in other words; but no one expects the ensemble mean to be the whole story.
[Response: Indeed, no one does. – gavin]
Tenney Naumer says
After reading all these comments, I say that what I have learned is that there are plenty of perils to modeling.
Nathan says
Re: possible methane releases, following up on some of the previous comments, Gavin Schmidt’s reply to one of them, etc.
I personally find something like the following arctic methane scenario quite astonishing to contemplate, and please take note that it neither goes against anything said recently by David Archer here at Real Climate, nor posits any mechanism that is not currently already activated towards change, and is entirely based upon already observed flux rates, conditions, etc.
Let’s imagine that just a small fraction of only the NON-HYDRATE C store around the ESAS submarine permafrost were released as methane rather quickly, either in one season, or even episodically over two or three years. Let’s make it just a mere .45%. This would be equal to 3 Gt methane. (3Gt CH4 = 2.3 Gt C, and 2.3 Gt C = ~.45% of 500Gt C, the amount estimated to be around the ESAS submarine permafrost [Shakhova, 2010]).
This is a quite credible scenario. It is certain that there are already releases coming from this source, and that there is currently significant deterioration of the state of this carbon store from influxes of warming waters, lost ice, etc. Current methane fluxes recorded in the water column around these methane hotspots are >1000x those expected from the observed atmospheric anomalies, so considerable methanotrophic activity could already be activated, making irregularities of microbial consumption potentially capable of such a release on its own. An estimated 3-5% of ESAS submarine permafrost is currently estimated to be perforated with taliks and degraded [Shakhova, 2010], and extrapolation from current hotspot releases would actually equal ~3.5 Gt/yr [Shakhova, 2010].
A pulse of 3 Gt CH4 doubles the methane increase since industrialization: we have increased methane by about +157% (700 ppb + 1100 ppb ≈ 1800 ppb), with the atmospheric burden being ~1.9 Gt (pre-industrial) + 3 Gt = ~4.9 Gt CH4 (current). With the best understanding of all indirect effects, this 3 Gt has added ~1 W/m2 [Shindell et al, 2008]. Thus, this modest methane pulse would quite quickly add about +62% to all the increased radiative forcing since industrialization.
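A quick consistency check of those numbers (assuming roughly 2.78 Tg of CH4 per ppb of global-mean mixing ratio, a conversion factor not given in the comment):

# Back-of-envelope check of the methane numbers above.
# Assumption: ~2.78 Tg CH4 per ppb of global-mean mixing ratio (not from the post).
TG_PER_PPB = 2.78

preindustrial_ppb = 700.0
current_ppb = 1800.0
pulse_gt = 3.0                                          # hypothesised ESAS release, Gt CH4

burden_pre = preindustrial_ppb * TG_PER_PPB / 1000.0    # ~1.9 Gt CH4
burden_now = current_ppb * TG_PER_PPB / 1000.0          # ~5.0 Gt CH4
pulse_ppb = pulse_gt * 1000.0 / TG_PER_PPB              # ~1080 ppb

print("burden: pre-industrial ~%.1f Gt, current ~%.1f Gt" % (burden_pre, burden_now))
print("increase since pre-industrial: ~%.0f%%" % (100.0 * (current_ppb / preindustrial_ppb - 1.0)))
print("a 3 Gt pulse is ~%.0f ppb, roughly doubling the increase to date" % pulse_ppb)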
Further, spreading it over a couple of years wouldn’t make all that much difference: the feedback effect for methane is a ~ -.2 loss rate for each +1% of methane emission rate, which holds for up to about 33% increase in emission rate. (After that, I believe the negative loss rate increases further – I think Gavin has written on this and could explicate this easily). Roughly, the release would constitute a ~+500% change, so the pulse should, I believe, last for something like double the lifetime, or more, something like two decades or more. In any case, it would last long enough that I suspect it might cause Much Ado.
Eli Rabett says
RE: “Reality is only one realization of an ensemble”
If you have a model which is consistently wrong in one direction, the conclusion is that your model is either incomplete or wrong. If the model works for many other things (GCMs) you can still confidently use a consistent offset to estimate the future till you figure out how to improve the model.
This point emphasizes the dichotomy between the two uses of models, for forecasting and for understanding.
John Nissen says
@ Hank, who said, in #95:
“They [AMEG] don’t yet get the science, as Gavin pointed out above.”
“Seriously, from what we know, so many other nasty things will have happened before a methane burp becomes an issue.”
The accusation that AMEG does not “get the science” is a serious accusation, since AMEG claims to be driven to its conclusions by the best available evidence, both on sea ice and on methane.
AMEG’s Peter Wadhams, professor of Ocean Physics at Cambridge, is the UK’s top expert on sea ice. He explains that the sea ice is thinning at an accelerating rate as the Arctic Ocean warms. The ice can get thinner and thinner while the extent remains roughly constant, as has happened since 2007; but this can’t go on for many more years, as the PIOMAS data show. At some point the extent of the ice has to collapse. But the thickness and melting of the ice are not uniform over the whole Arctic, so some residual ice will probably remain north of Greenland and around some of the islands. Thus the sea ice volume will not fall straight to zero on the PIOMAS graph, but will curve round to the right. However, the September sea ice extent can be expected to fall to a much reduced level by 2015, and possibly earlier – even this year is on the cards.
The main heat flux for the warming ocean is now coming from the albedo flip effect. Thus if the decline in extent doubles, this albedo flip warming effect doubles.
[BTW, I made a careless mistake in my posting, comment #57, in saying that the Arctic warming would double. It’s obviously only the albedo warming that doubles. But because that warming is now dominant (see comment #75), the Arctic warming will certainly accelerate.]
[Response: I still don’t see why you think that the warming will accelerate much faster than the models suggest since they include these mechanisms. – gavin]
Since the warming of the Arctic is already considered to be causing an increase in weather extremes in the Northern Hemisphere, further warming is bad news by itself.
[Response: This is still an area of ‘active research’ as they say. It is not clear to me that published attempts to assess changes in variability in recent years have a solid statistical foundation. Maybe, maybe not. – gavin]
And the loss of the sea ice will mean the loss of an entire ecosystem, with repercussions that could include the disruption of a major food chain, because of the organisms that live on the underside of the sea ice.
But we also have to consider what’s happening with the methane. You, Hank, seem to recommend that we ignore the methane, because other issues will become prominent before it “burps”. However, what you don’t seem to appreciate is the risk of methane feedback, where the warming effect of the methane leads to further methane emissions in a vicious feedback loop. Once this gets going, it will almost certainly be unstoppable, as people like Steven Chu acknowledge.
[Response: You discuss things with ‘almost certainty’ that are almost certainly highly uncertain. – gavin]
There are signs of an escalation of emissions from the East Siberian Arctic Shelf (ESAS). Igor Semiletov, a top Russian expert on the ESAS and its methane, whom I met at the AGU last December, told me that he considers the ESAS a prime candidate for abrupt climate change. To my knowledge, nobody has been able to contradict his assertion that there is up to 50 Gigatonnes of methane available for release “at any moment”, e.g. as the result of an earthquake.
[Response: Earthquakes can happen any time and so any release associated with that has nothing to do with climate change, and no amount of emissions reductions will have any effect. However, if you postulate a strong earthquake driven methane pulse mechanism, where is there any evidence that this has been important in the Holocene? – gavin]
However, Nathan has further elaborated on the serious repercussions of a much smaller burp, a mere 3 Gigatonnes; see #125.
Thus, even without further Arctic warming, we have an extremely dangerous situation with the methane. But the retreat of the sea ice and the methane emissions are mutually reinforcing. AMEG claims that dealing with this emergency situation has to involve measures including some kind of geoengineering to actually cool the Arctic, or cool the currents and rivers flowing into the Arctic.
If anybody can show that AMEG conclusions are incorrect, I would be extremely pleased to see their argument.
John
[Response: It is rather that there is nothing convincing about them. You keep ignoring the fact that there is no evidence for methane burps associated with conditions in the relatively recent past (early Holocene, Eemian) for which there is good evidence for warmer Arctic conditions than now, and you are happy to extrapolate emissions of a few Tg (at most) to values 1000 times larger on the basis of nothing very much. You are going to have to do a lot better than that. – gavin]
sidd says
Would a 3 Gt methane burp be visible in the fossil record?
[Response: It would be visible in the ice cores. Pre-industrial emissions were around 300 Tg/year (0.3 Gt), so 3Gt is a factor of 10 greater than all global emissions (and obviously a much bigger factor greater than just Arctic emissions). But at minimum that would more than double the CH4 concentrations for a decade or so. Even with diffusion in the ice core, one would see a spike associated with this in the Holocene or Eemian records. Note too that this would only be a small increase in radiative forcing – less than the sustained anthropogenic increase in methane we have already seen. – gavin]
Ray Ladbury says
Nathan and John Nissen,
OK, if these scenarios are credible, then why have they not happened before? Or if you contend that they have, where is the paleo-evidence? Or if we are to dispense with the need for evidence before we consider a scenario credible, can I at least get an “Oogah-Boogah!?”
Jim Steele says
How well did those models predict the recent expansion of Bering Sea ice that is currently 160%+ above average? A USGS 2010 paper by D. Douglas predicted that “For the Bering Sea, median March ice extent is projected to be about 25 percent less than the 1979–1988 average by mid-century and 60 percent less by the end of the century.” But if Bering ice is driven by the PDO, I suspect that prediction will fail.
This makes me curious about how sea ice loss is modeled. I am under the impression that it is driven by CO2-mediated ice loss that generates albedo changes, resulting in positive feedbacks that increase further melting. How are changes in the NAO and PDO, which can speed or slow the intrusion of warmer waters into the Arctic, modeled? I would assume those oscillations are not readily modeled, but the literature suggests several modelers believe the positive phases are controlled by CO2 and become more persistently positive with increasing CO2. However, most oscillations now appear to be headed into negative phases. I would appreciate any insight into how such ocean circulation changes are incorporated into predictions.
Geoff Beacon says
Responding to John Nissen in #95 Gavin says “Earthquakes can happen any time and so any release associated with that has nothing to do with climate change,”
I understand there is some debate about whether earthquakes could be related to climate change. Is it not the case that at the end of the last ice age, there was increased seismic activity due to changes in the weight of ice on the earth’s surface?
Currently a few hundred billion tonnes of land-based ice are being lost from the poles each year. Added to this, the seas in these regions recede as the gravitational pull of the ice is removed, redistributing that water towards the equator.
Might we see an increased frequency of earthquakes as the earth is squeezed around the equator and released at the poles?
Rob Dekker says
Sorry for being a bit late, but for starters I would like to thank Axel/Ron/Cecilia for this post on PIOMAS, and I have a question:
I understand that PIOMAS is forced by NCEP/NCAR data, which I think makes a lot of sense, since that is as close to ‘reality’ as we can get without actually measuring ice volume.
But how about “under-ice” heat flux ? How is that modeled in PIOMAS ?
As I understand it, even a small amount (a few W/m^2) of sustained heat flux significantly impacts the growth of multi-year ice in winter, and could possibly make the difference between 6–7 metre MYI and just 1–2 metre ice.
In PIOMAS, is there any change in ocean heat flux over the modeled period (1979 onward) ?
Hank Roberts says
> many other nasty things will have happened
judging from the paleo record as well as the models
> before a methane burp becomes an issue.
http://www.ipcc.ch/pdf/assessment-report/ar4/wg2/ar4-wg2-chapter19.pdf
“Assessing key vulnerabilities and the risk from climate change
For some impacts, qualitative rankings of magnitude are more appropriate than quantitative ones. … risk is defined as consequence (impact) multiplied by its likelihood (probability), the higher the probability of occurrence of an impact the higher its risk …”
So the issue is — what’s the likelihood of a methane burp before the other expected stages of warming? This is whatchacallit, a decision tree — and the methane burp is the goblin sitting out near the end of one of the branches.
But which branch — at what point in time/temperature would we be turning onto that branch?
We’d need what,
— 4 degrees C overall to see the Arctic ocean warm?
— decades for warmth to propagate to the depths
— a new kind of rapid event not discovered in the paleo work?
I’m just a reader. I can’t claim a methane burp would be any less horrific than described by the worst case claims, just that I haven’t seen anyone documenting it’s happened in the paleo record.
What I read indicates it’s _quite_likely_ the world would already be in hellish condition from known outcomes of known fossil fuel uses we now engage in, before any likelihood of a methane burp.
Storegga, last I heard, was dated late in, not prior to, the climate event around the same time, which makes earthquake-created slides seem less probable as triggers: http://hol.sagepub.com/content/21/7/1167.short
So — how fast is the vertical mixing at locations across the Arctic? That seems the key missing chunk of information. Who has that info, or raw data that could be mined for the info? Or do we need to wait for a generation of uncrewed probes that can function under the winter ice and report back to get that data?
wili says
Have people seen the latest “Climate Crock of the Week”? I bring it up here because it relates the recent spate of very extreme weather to Arctic sea ice extent (especially in the second video):
http://www.youtube.com/watch?v=_-1iBHAivmw&feature=g-all-u&context=G234d85aFAAAAAAAAAAA
sidd says
Re: methane, ice cores
I see from the Dome C data that the largest methane excursion in the last 800 kyr was around 300 ppb. This would correspond to around a 1 Gt CH4 release, or about 0.2% of Shakhova’s estimate of the 500 Gt fossil carbon store around the ESAS?
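As a rough check of that conversion (assuming ~2.78 Tg CH4 per ppb, and neglecting any decay while the excursion built up, so this is really a lower bound on the release):

```python
# Rough conversion of a 300 ppb ice-core CH4 excursion into a release mass,
# assuming ~2.78 Tg CH4 per ppb and that the excursion equals the added burden.
TG_PER_PPB = 2.78             # Tg CH4 per ppb (assumed round number)
excursion_ppb = 300.0

release_gt_ch4 = excursion_ppb * TG_PER_PPB / 1000.0   # Gt CH4
release_gt_c = release_gt_ch4 * 12.0 / 16.0            # Gt C equivalent

print(f"300 ppb ~ {release_gt_ch4:.2f} Gt CH4 ~ {release_gt_c:.2f} Gt C")
print(f"As a fraction of a 500 Gt C store: {100 * release_gt_c / 500.0:.2f}%")
```

That lands at roughly 0.8 Gt CH4, i.e. on the order of 0.1–0.2% of a 500 Gt C store – the same ballpark as the numbers above.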
sidd
Geoff Beacon says
Excuse the spelling in #131. I have been using Microsoft’s voice recognition. It’s rather good, but not perfect. But nor am I.
Nathan says
Responding to Gavin Schmidt, John Nissen, and several others too. First, here is David Archer’s answer to John Nissen’s question about a “methane burp”, in stark contrast to what Gavin wrote: “A single catastrophic release of methane might not even be visible in the ice core records.”
Elsewhere in the same paper, Archer describes how this could come from the methane trapped in the ice being smoothed through “diffusion within the firn or heterogeneous bubble closure depth,” or simply through the methane sampling not being dense enough, so that the maxima of a release could be overlooked [Archer, “Methane hydrate stability and anthropogenic climate change”, Biogeosciences, 2007]. The reference was to a 10 Gt release in that case, so it would stand to reason that a 3 Gt release could more easily slip by detection.
In terms of the comments about the Holocene record, and Gavin’s statement that there is “no evidence” of such methane burps: first, let us acknowledge that some of the world’s leading paleoclimate and methane experts HAVE seen evidence of exactly that [e.g., Nisbet, “Have sudden large releases of methane from geological reservoirs occurred since the Last Glacial Maximum, and could such releases occur again?”, Phil. Trans. of the Royal Society, 2002]. In fact, I was looking at a new paper with more such evidence only a week or so back.
On the Eemian specifically: CO2 levels are certainly the most precise and unequivocal of the climate signals we have from the past, and it is clear that they were nothing like today’s during the Eemian, never surpassing ~280 ppm – much like our pre-industrial levels. Sea levels, which matter greatly for the state of various submarine carbon stores, were of course also very different. While we normally think of higher sea levels as making hydrates more stable (explaining the lack of release then), there could also be an opposing effect, discussed in fact by Archer (Archer, 2007). One could see these differences in sea level going either way – either the shallow hydrates were kept more stable than they are now, or perhaps they actually were released more than we have yet ascertained.
But in general, since Gavin is criticizing John Nissen for presenting things that are highly uncertain as certain, I would simply note that expecting the Holocene record (or the whole Pleistocene record, for that matter) to tell us what to expect of sensitive and complex Arctic systems in our Anthropocene – at 400 ppm CO2, with Arctic methane often spiking over ~1900 ppb now (versus a little over ~700 ppb, I believe, for the Eemian) – might itself be guilty of the same problem, because it mistakenly assumes the certainty of a past analogue when really there isn’t one.
Now, in the discussion of earthquakes, I think John Nissen was simply mentioning mechanisms for abrupt methane release, and thus wrote “e.g. as from an earthquake,” and Gavin pointed out that this would have little to do with warming. John might have been better served by having pointed to landslides or slope failure, and this not only could have been related to warming, but highly relevant to the specific situation at hand that he was discussing. I quote again from David Archer:
Another mechanism for releasing methane from the sediment column is by submarine landslides. These are a normal, integral part of the ocean sedimentary system (Hampton et al., 1996; Nisbet and Piper, 1998). Submarine landslides are especially prevalent in river deltas, because of the high rate of sediment delivery, and the presence of submarine canyons.
Of course, the Lena river delta, with its rapidly warming waters, is central to the ESAS methane situation, and slope failure might be a potential mechanism for sudden releases there.
Further, I think that Gavin’s point about the earthquakes having “nothing to do with climate change”, while true in terms of causation, leaves out something central to this discussion. In the ESAS, as I mentioned above, there are many taliks now, and they are centred around fault zones, where there is geothermal heating on top of the warming of the bottom waters. Where a talik has formed from this combination of geological and radiative forces, there is plenty of free gas underneath that can migrate out easily through such openings; add the fact that this is a seismically active zone, and one can easily see how global warming could greatly amplify the effects of an earthquake at such a fault zone. So the two don’t exactly have nothing to do with each other, although the source of the earthquake is, as Gavin said, unrelated to the warming.
Now, Archer has written that, “it appears that most of the hydrate reservoir will be insulated from anthropogenic climate change. The exceptions are hydrate in permafrost soils, especially those coastal areas, and in shallow ocean sediments where methane gas is focused by subsurface migration.”
Of course, it is precisely one of those exceptions that most concerns some of us. Archer himself noted that “The Siberian margin is one example of a place where methane hydrate is melting today, presumably at an accelerated rate in response to anthropogenic warming. This is a special case,” and elsewhere that “The most vulnerable hydrate deposits in the ocean appear to be the structural type, in which methane gas flows in the subsurface, along faults or channels, perhaps to accumulate to high concentrations in domes or underneath impermeable sedimentary layers” – which is exactly the situation there. Since the amounts stored there are so large, even the deterioration of the non-hydrate cap could be of consequence by itself, and that deterioration then allows exactly the flow along faults that Archer describes.
Of course, it might be worth noting that Archer’s methane piece, his “Much Ado,” never did try to say that methane burps weren’t quite possible, but rather that they wouldn’t be particularly significant. He justified this attitude through using a time horizon of 100,000 years. This seems highly unwise, and, as I discussed in a piece on HuffPost about it, “Methane in the Twilight Zone, Episode 2,”* the more that you’re planning on doing anything about climate change – i.e., lowering GHG emissions, pulling carbon out of the system through biochar, afforestation, etc – the less sense it makes.
* http://www.huffingtonpost.com/nathan-currier/methane-in-the-twilight-z_1_b_1207619.html
ps – Archer wrote (Archer, 2007): “If 5 Gton C CH4 reached the atmosphere all at once, it would raise the atmospheric concentration by about 2.5 ppm of methane, relative to a present-day concentration of about 1.7 ppm, trapping about 0.2 W/m2.” Is that just a proofreading error? How could it be .2W/m2? Does that make sense?
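One way to sanity-check that 0.2 W/m2 number is the simplified CH4 forcing expression of Myhre et al. (1998), as used in the IPCC TAR. A minimal sketch (direct forcing only, no indirect effects, with N2O held at an assumed ~320 ppb):

```python
import math

# Sanity check of the quoted 0.2 W/m2 figure using the simplified CH4 forcing
# expression of Myhre et al. (1998) / IPCC TAR. This gives direct forcing only
# (no indirect effects); N2O is held at an assumed ~320 ppb.

def overlap(m_ppb, n_ppb):
    """CH4-N2O band-overlap term from the TAR simplified expressions."""
    return 0.47 * math.log(1 + 2.01e-5 * (m_ppb * n_ppb) ** 0.75
                           + 5.31e-15 * m_ppb * (m_ppb * n_ppb) ** 1.52)

def ch4_forcing(m_ppb, m0_ppb, n0_ppb=320.0):
    return (0.036 * (math.sqrt(m_ppb) - math.sqrt(m0_ppb))
            - (overlap(m_ppb, n0_ppb) - overlap(m0_ppb, n0_ppb)))

m0 = 1700.0           # the "present-day" CH4 in the quote, ppb
m = m0 + 2500.0       # after the quoted ~2.5 ppm jump

print(f"Direct CH4 forcing for 1700 -> 4200 ppb: {ch4_forcing(m, m0):.2f} W/m2")
```

That comes out at around 0.7 W/m2 of direct forcing for a 1.7 → 4.2 ppm jump, so at face value the 0.2 W/m2 in the quote does look hard to reproduce from the simplified expression – though it may refer to something narrower than the total direct forcing.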
Chris Reynolds says
John Nissen,
I beg to differ. I’ve done several blog posts on Arctic methane after reading a substantial sample of the available science. My criticisms of the Semiletov & Shakhova 50 Gt figure are available here. Search that page for the string “presentation is talking about” and you’ll find the relevant paragraphs. You can find the rest of my posts on methane by clicking on the label at the bottom of that post.
Some doubt about the magnitude of the 50 Gt figure isn’t the main issue, however; even a rapid degassing of a third of that would be serious. For me the main issues are:
1) Lack of such degassing events in the recent paleo-records. As Dr Schmidt points out, this would appear in ice cores, yet the warmth of the early Holocene and of the earlier Eemian and Holsteinian interglacials did not trigger a degassing – despite those periods probably being warmer than at present and, crucially in terms of ocean temperature and sediment warming, warmer for much longer than the anthropogenic warming (so far).
2) Lack of a credible process by which ocean warming can rapidly affect the bulk of the ESAS methane deposits in the sedimentary column.
3) Lack of evidence that the changes being seen are new rather than long-standing, i.e. largely a response to the inundation of the ESAS after the end of the last glacial rather than to recent anthropogenic warming.
I am certainly not saying there is not a problem, but I see the threat of the ESAS methane as more chronic than catastrophic, and see the risks attendant with the thawing of land permafrost, especially in view of the rapidity of warming in the Arctic, as the more strongly evidenced risk. But again this is likely to be chronic, not catastrophic.
I have to agree with Hank’s ‘tree analogy’; there are bigger issues to face in the short term. The risks of Arctic methane are a footnote, but not a negligible one.
As an aside, you mention that “Since the warming of the Arctic is already considered to be causing an increase in weather extremes in the Northern Hemisphere…”
Francis & Vavrus have recently published interesting findings regarding the behaviour of the jet stream. They note that this “may lead to an increased probability of extreme weather events” – note the use of ‘may’; this is early in a new line of research. Judah Cohen has done a succession of papers linking the advance of Siberian snowfall to cold winters in Europe; this is more strongly supported, and seems in turn to be linked to reduced sea ice. Finally, the recent ‘cold outbreak’ of late January in Europe seems to me to fit the mechanism of Petoukhov & Semenov, see here. Taken together (and there’s more research I’ve not mentioned here), I’d agree with Dr Schmidt’s view that the changes may not yet be statistically strong. But from my reading it is clear that the loss of Arctic sea ice is affecting NH circulation, and this suggests the effect will intensify as time goes on. With time it will become statistically significant.
Pete Best says
Re #134 – Ah yes, living in the UK and sometimes watching Horizon (our BBC sciencey documentary programme), I can see that some of this material in the popular media is now quoting scientists (Peter Wadhams – do a Google search for his name and Arctic) who appear to have said something that the media likes (the hype of doom always gets a page or two of news), or using the Global Weirding programme that Horizon showed in March to demonstrate that it’s all true and the climate is changing.
Peter Sinclair is certainly showing some footage from this programme. Personally, I did watch the Horizon: it concentrated on Texas (well you would, as it’s a place of extremes, although flooding and drought are not uncommon there), on hurricanes (an odd one really, as the jury is still out on that, but they are big storms, right), then on the Sun (the denial piece), and then on the Arctic being the culprit for the UK’s cold winters (wow, the UK gets cold winters) and for the mild weather thereafter, since the Sun is not the culprit here. All of which, although true, is not yet attributed to climate change per se, as natural variability is variable and anything can happen on short timescales.
Who knows exactly when ACC will be the culprit in weather events – I know that James Hansen has attributed the Russian and EU heatwaves of recent years as highly likely ACC-enhanced events, and maybe they are, but once the media has connected the dots (as it is starting to) we are in for some more political posturing.
Fun, but reality is more nuanced and complex than any of that, as this article points out.
Pete Best says
http://climatecrock.files.wordpress.com/2011/09/stroeve2big.jpg
Most of the information countering this article is taken from this graph. For most of us lay types it’s hard to see how the ice could recover from such a 30-year reduction, even if that impression turns out not to be true, and equally you can’t assume it will disappear even if it seems to be heading that way.
Natural variability is hard to see here.
Pete Best says
Maybe Peter Wadhams being quoted is all down to this paper:
http://www.agu.org/pubs/crossref/2011/2011JC006982.shtml
Dan H. says
Pete,
Many prefer to use the Torgny Vinje paper when referencing sea ice prior to the satellite era.
http://journals.ametsoc.org/doi/pdf/10.1175/1520-0442%282001%29014%3C0255%3AAATOSI%3E2.0.CO%3B2
Kevin McKinney says
#130–“How well did those models predict the recent expansion of Bering Sea ice that is currently 160%+ above average?”
I doubt they did at all–variability on annual timescales is probably basically ‘weather.’ Modeling won’t predict short-term variability–and that’s a limitation, not a failure.
“I am under the impression that it is driven by CO2 mediated ice-loss that generates albedo changes resulting in positive feedbacks that increase further melting.”
GCMs have oceanographic components–see Kate’s Skeptical Science post for a useful and accessible discussion of the architecture of climate models–which surely include currents as part of their ‘dynamical’ modelling. And I know that the fluxes of heat into the Arctic are a major focus of the work of Dr. Maslowski in regional-scale modeling. So I think, Jim, that your impression of the modeling is much too ‘minimal’–but I’ve just shared pretty much everything I know about it. I, too, would be very interested in more detail on this.
A useful summary of sea ice research is here:
http://www.springerlink.com/content/c4m01048200k08w3/
It’s mostly about atmospheric forcings, but there is some mention of oceanic heat fluxes in the “Discussion and Conclusions” section.
Gerdes and Lemke discuss sea-ice modeling issues in considerable technical detail in a Google Books preview here:
http://tinyurl.com/Gerdes-LemkePreview
(Of course, it has annoying lacunae, as these previews always do–including almost all of the discussion section.) It appears to be related to a paper, which is unfortunately paywalled:
http://www.springerlink.com/content/k72m58q83509787x/
wayne davidson says
#130, “How well did those models predict the recent expansion of Bering Sea ice that is currently 160%+ above average? There was a USGS 2010 paper by D. Douglas that predicted ‘For the Bering Sea, median March ice extent is projected to be about 25 percent less than the 1979–1988 average by mid-century and 60 percent less by the end of the century.’ But if Bering ice is driven by the PDO I suspect that prediction will fail.”
The model proposed here is not the concern; natural variability, as proposed above, is the culprit. Alaska was the only cold place in North America, and ice leaves a footprint from just about any weather or sea event.
What about the Barents Sea? The recipient of multiple cyclone visits – another footprint:
http://arctic.atmos.uiuc.edu/cryosphere/IMAGES/recent365.anom.region.6.html
Sea ice is such a good indicator of weather and geophysical events that I can tell where the centre of a high pressure system is, especially by looking at newly formed leads – the centre of a powerful high punches a hole through the ice!
I know both Wadhams and Ray Bradley, splendid chaps, and I think Bradley’s hypothesis resonates much more in the case of methane, because it’s the speed of the warming change that matters. Gavin’s quoting of the Holocene and past times misses that point. This speed of change is dizzying to us in the Arctic: even the snow buntings came back very early this spring, and polar bears are being seen at the North Pole on sea ice now thin enough for seals. I look for evidence of methane likewise; I think we don’t have the network resolution to catch strange sudden temperature shifts of up to 6 degrees in one hour, not due to winds, where the temperature goes up 6 degrees and then returns to “normal”, cooler values. We need to focus more, and we might see why sea ice has evolved so radically, as I show on my blog, in a mere 20 years.
‘
Hank Roberts says
> Dan H.
> Many prefer to use the Torgny Vinje paper
And if you read it, it’s obvious why “many prefer” that one.
Dan H. continues to push natural variation as the answer.
He’s become a good mimic of the way a scientist writes, lacking only facts.
The papers citing Vinje don’t support Dan H.’s claim.
“The current reduction in Arctic ice cover started in the late 19th century, consistent with the rapidly warming climate, and became very pronounced over the last three decades. This ice loss appears to be unmatched over at least the last few thousand years and unexplainable by any of the known natural variabilities.”
http://www.sciencedirect.com/science/article/pii/S0277379110000429
Quaternary Science Reviews
Volume 29, Issues 15–16, July 2010, Pages 1757–1778
Special Theme: Arctic Palaeoclimate Synthesis (PP. 1674-1790)
Axel Schweiger says
#140. Just to clarify: It is important to distinguish between attributing the differences between models (with anthropogenic forcing) and observations to natural variability and attributing the 33+ year decline in sea ice to anthropogenic forcing. While I think there is room for debate for the former, I think the latter is fairly settled.
Axel
Chris Reynolds says
Mmm,
I did think it peculiar that Dan H was recommending such an old paper when there’s far more recent work that covers a longer period and puts recent Arctic sea ice loss firmly into context.
Kinnard et al, 2011, Reconstructed changes in Arctic sea ice over the past 1,450 years.
http://www.deas.harvard.edu/climate/seminars/pdfs/Kinnardetal2011.pdf
Dan H, if you really are trying to push the natural change meme – you’re very badly out of date. That was a reasonable suspicion as late as the 1990s, but since the events of the 2000s nobody serious buys it.
Dan H. says
Hank,
Are you claiming that papers citing Vinje don’t support the claim that people prefer to reference his paper?
Peter Ellis says
If the authors are still around, I’d be grateful if they could clarify their thinking in regard to the published PIOMAS projections in Zhang et al 2010. I’m afraid I didn’t find their methodology very convincing – I know this is an extraordinarily arrogant statement coming from a molecular biologist, but there you go.
Their method runs the model forwards. Each year they pick a random historical year’s “weather”, apply that to the current state of the ice and see how it turns out. Then, for the next year, pick another random historical year’s weather, run the model forwards another year, and so on. They use two different historical pools for the resampling: either the complete 1948-2009 dataset or a more limited 1989-2009 dataset. On top of that, they apply a warming trend of either 2 degrees or 4 degrees from 2009-2050.
The problem is that in doing this, each year in their subsequent forward projection effectively “resets” the surface temperature to whatever the value was during the year the magic 8-ball picked in the resampling process that created the forcing pool, plus the amount added on to represent the 2/4 degree trend.
Physically, what that does is assume that all warming occurring during the resampled period is random, and there is no AGW trend in this time period. That is known to be an invalid assumption for both periods – in each there is a significant positive trend. For periods where there is a significant trend you can’t simply use an ensemble over this period to estimate random variation since part of the change over that period is due to the (genuine) trend rather than to random variation. Surely the data should first be detrended, then permuted, then the SAT adjusted according to the forecast model. That is, they should be estimating the CV relative to the trend.
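A minimal sketch of the detrend–permute–retrend procedure I have in mind, with synthetic placeholder numbers (the historical series and the 0.1 deg/yr future rate are illustrative only, not values from Zhang et al. 2010):

```python
import numpy as np

# Illustrative sketch: remove the trend from the historical forcing years,
# permute the residual "weather", then add back an assumed future warming
# trend instead of resetting SAT to raw historical values each year.

rng = np.random.default_rng(0)

years_hist = np.arange(1989, 2010)
sat_hist = -10.0 + 0.09 * (years_hist - years_hist[0]) \
           + rng.normal(0, 0.5, years_hist.size)        # fake historical Arctic SAT

# 1. Detrend: fit and remove the linear trend, keeping only the anomalies.
slope, intercept = np.polyfit(years_hist, sat_hist, 1)
residuals = sat_hist - (slope * years_hist + intercept)

# 2. Permute: draw future "weather" by resampling the detrended residuals.
years_future = np.arange(2010, 2051)
sampled = rng.choice(residuals, size=years_future.size, replace=True)

# 3. Re-add a trend: extend the fitted baseline with an assumed future
#    warming rate (0.1 deg C/yr here, purely a placeholder).
future_trend_per_yr = 0.1
baseline_2009 = slope * 2009 + intercept
sat_future = baseline_2009 + future_trend_per_yr * (years_future - 2009) + sampled

print(f"Fitted historical trend: {slope:.3f} deg C/yr")
print(f"Projected 2050 SAT (one realization): {sat_future[-1]:.1f} deg C")
```

The point is simply that the resampled “weather” supplies the variability, while the trend – historical and assumed future – is handled explicitly, rather than being reset each year by the raw resampled SAT.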
Scenarios A1 and B1 (i.e. forcings permuted randomly from the entire 1948–2009 record) are especially grossly affected. Just look at the second panel of figure 1 – these scenarios both effectively assume an immediate, unrealistic and unphysical drop of >2 degrees in Arctic SAT! Even with the added 2/4 degrees of warming, the projected trend in Arctic SAT doesn’t get back up to 2009 levels until 2035 in the A1 run, and never gets back to 2009 levels in the B1 run. Put another way, both these runs assume that in the next four decades there will only be four years hotter than 2009. In the real world, 2010 and 2011 were both hotter.
Both these scenarios are frankly unworthy of analysis. All they tell us is that if there is no further warming for 40 years, and Arctic temperatures remain largely at or below today’s values for another four whole decades, then we won’t lose the Arctic summer ice. No shit, Sherlock. Frankly, it’s profoundly concerning that the model projects any loss in these two scenarios: and it does. Even with these wildly optimistic assumptions, September ice area drops by around 50% by 2050 in run A1, and barely holds its current levels in run B1 despite a sustained drop in Arctic SAT. That’s chilling (pun intended).
Scenarios A2 and B2 have the same problem, but to a lesser extent, since the pool is drawn from the generally warmer 1989–2009 period. Of these two, A2 assumes that Arctic warming will continue at around the same rate as previously: that is, they added a trend of 4 degrees / 41 years = 0.0975 deg/year, comparable to the 0.093 deg/year seen in the hindcast. The fact that the actual trend seen in the A2 run was 0.084 deg/year simply reflects the stochastic nature of the resampling process. In contrast, B2 assumes that the rate of future Arctic warming will be half this. Although this range is based on IPCC GCMs, given the historical record B2 seems hard to justify. That leaves A2 as the only realistic projection, in my view.
Nevertheless, the conclusions from the A2 run are apparently encouraging, in that September ice extent stays above ~5 million km^2 through 2020, and above ~2 million km^2 through to 2050 (eyeballed from Figure 1, so not precise!). This is a robust finding, replicated in two further runs, A2′ and A2″, which I assume they ran in the knowledge that A2 is the only realistic scenario.
However, look at the volume figures. As they themselves acknowledge, these plummet.
“The September ice volumes of A2 and B2 drop to very low levels as early as around 2025 and remain almost flat in the following years when the projected September ice extents continue to fluctuate significantly annually (Figures 1d and 1e). This suggests that summer ice volume is more sensitive to AW than summer ice extent.”
At face value their numbers suggest the Arctic will be left with an extensive cover of ice (>4 million km^2) but only a small volume (<2,000 km^3): i.e. on average the ice will be less than half a metre thick.
They interpret this as showing that ice area is robust to AGW for the next few decades. I think this rather shows the model doesn't adequately represent ice breakup processes. Ice under a given thickness just fragments, gets mixed with surface waters and melts out. PIOMAS is known to overstate the thickness of thin ice anyway (Schweiger et al 2011).
The upshot is that if the Arctic ice follows their only realistic forcing scenario, by 2025 it's not going to be recognisable – it'll be an eggshell-thin layer that poses no barrier to shipping. I don't find this particularly encouraging.