Some concerns about continued monitoring of sea ice by remote sensing were raised this week in Nature News and in an article in the (UK) Observer: Donald Trump accused of obstructing satellite research into climate change. The latter headline is not really correct, but the underlying issues are real.
Climate Science
Unforced variations: Nov 2017
This month’s open thread. Lawsuits about scientific disputes, the new Climate Science Special Report from the National Climate Assessment, and (imminently) the WMO State of the Climate statement for 2017.
El Niño and the record years 1998 and 2016
2017 is set to be one of the warmest years on record. Gavin has been making regular forecasts of where 2017 will end up, and it is now set to be #2 or #3 in the list of hottest years:
With update thru September, ~80% chance of 2017 being 2nd warmest yr in the GISTEMP analysis (~20% for 3rd warmest). pic.twitter.com/k3CEM9rGHY
— Gavin Schmidt (@ClimateOfGavin) October 17, 2017
In either case it will be the warmest year on record that was not boosted by El Niño. I’ve been asked several times whether that is surprising. After all, the El Niño event, which pushed up the 2016 temperature, is well behind us. El Niño conditions prevailed in the tropical Pacific from October 2014 throughout 2015 and in the first half of 2016, giving way to a cold La Niña event in the latter half of 2016. (Note that global temperature lags El Niño variations by several months so this La Niña should have cooled 2017.) [Read more…] about El Niño and the record years 1998 and 2016
O Say Can You CO2…
Guest Commentary by Scott Denning
The Orbiting Carbon Observatory (OCO-2) was launched in 2014 to make fine-scale measurements of the total column concentration of CO2 in the atmosphere. As luck would have it, the first couple of years of data from OCO-2 documented a period with the fastest rate of CO2 increase ever measured, more than 3 ppm per year (Jacobson et al, 2016; Wang et al, 2017), during a huge El Niño event that also saw global temperatures spike to record levels.
As part of a series of OCO-2 papers being published this week, a new Science paper by Junjie Liu and colleagues used NASA’s comprehensive Carbon Monitoring System to analyze millions of measurements from OCO-2 and other satellites to map the impact of the 2015-16 El Niño on sources and sinks of CO2, providing insight into the mechanisms controlling carbon-climate feedback.
[Read more…] about O Say Can You CO2…
References
- J. Wang, N. Zeng, M. Wang, F. Jiang, H. Wang, and Z. Jiang, "Contrasting terrestrial carbon cycle responses to the two strongest El Niño events: 1997–98 and 2015–16 El Niños", 2017. http://dx.doi.org/10.5194/esd-2017-46
- J. Liu, K.W. Bowman, D.S. Schimel, N.C. Parazoo, Z. Jiang, M. Lee, A.A. Bloom, D. Wunch, C. Frankenberg, Y. Sun, C.W. O’Dell, K.R. Gurney, D. Menemenlis, M. Gierach, D. Crisp, and A. Eldering, "Contrasting carbon cycle responses of the tropical continents to the 2015–2016 El Niño", Science, vol. 358, 2017. http://dx.doi.org/10.1126/science.aam5690
1.5ºC: Geophysically impossible or not?
Guest commentary by Ben Sanderson
Millar et al’s recent paper in Nature Geoscience has provoked a lot of lively discussion, with the authors of the original paper releasing a statement to clarify that their paper did not suggest that “action to reduce greenhouse gas emissions is no longer urgent“, rather that 1.5ºC (above the pre-industrial) is not “geophysically impossible”.
The range of post-2014 allowable emissions for a 66% chance of not passing 1.5ºC in Millar et al, 200-240 GtC, implies that at current emissions levels the planet would not exceed the threshold until after 2030, whereas the AR5 analysis implies most likely exceedance before 2020. If the Millar numbers are correct, 1.5ºC changes from a fantasy to merely very difficult.
But is this statement overconfident? Last week’s post here on RealClimate raised a couple of issues which imply that both the choice of observational dataset and the chosen pre-industrial baseline period can influence the conclusion of how much warming the Earth has experienced to date. Here, I consider three aspects of the analysis and assess how they influence the conclusions of the study.
[Read more…] about 1.5ºC: Geophysically impossible or not?
…the Harde they fall.
Back in February we highlighted an obviously wrong paper by Harde which purported to scrutinize the carbon cycle. Well, thanks to a crowd-sourced effort which we helped instigate, a comprehensive scrutiny of those claims has just been published. Led by Peter Köhler, this included scientists from multiple disciplines working together to clearly report on the mistaken assumptions in the Harde paper.
The comment is excellent, and so should be well regarded, but the fact that it is a comment means that the effort will likely be sorely underappreciated. Part of the problem is the long duration of the process (almost 8 months), which means that the nonsense is mostly forgotten about by the time the comment is published. We’ve discussed trying to speed up and improve the process by having a specialized journal for comments and replications, but really the problem here is the low quality of peer review and editorial supervision that allows these pre-debunked papers to appear in the first place.
GPC is not the only (nor the worst) culprit for this kind of nonsense – indeed, we just noticed a bunch of astrology papers in the International Journal of Heat and Technology (by Nicola Scafetta [natch]). It does seem to demonstrate that you truly can publish anything somewhere.
References
- P. Köhler, J. Hauck, C. Völker, D.A. Wolf-Gladrow, M. Butzin, J.B. Halpern, K. Rice, and R.E. Zeebe, "Comment on “Scrutinizing the carbon cycle and CO2 residence time in the atmosphere” by H. Harde", Global and Planetary Change, vol. 164, pp. 67-71, 2018. http://dx.doi.org/10.1016/j.gloplacha.2017.09.015
Unforced variations: Oct 2017
Is there really still a chance for staying below 1.5 °C global warming?
There has been a bit of excitement and confusion this week about a new paper in Nature Geoscience, claiming that we can still limit global warming to below 1.5 °C above preindustrial temperatures, whilst emitting another ~800 gigatons of carbon dioxide. That’s much more than previously thought, so how come? And while that sounds like very welcome good news, is it true? Here are the key points.
Emissions budgets – a very useful concept
First of all – what the heck is an “emissions budget” for CO2? Behind this concept is the fact that the amount of global warming that is reached before temperatures stabilise depends (to a good approximation) on the cumulative emissions of CO2, i.e. the grand total that humanity has emitted. That is because any additional amount of CO2 in the atmosphere will remain there for a very long time (to the extent that our emissions this century will likely prevent the next Ice Age, due to begin 50,000 years from now). That is quite different from many atmospheric pollutants that we are used to, for example smog. When you put filters on dirty power stations, the smog will disappear. If you do this ten years later, you just have to put up with the smog for a further ten years before it goes away. Not so with CO2 and global warming. If you keep emitting CO2 for another ten years, CO2 levels in the atmosphere will increase further for another ten years, and then stay higher for centuries to come. Limiting global warming to a given level (like 1.5 °C) will therefore require more and more rapid (and thus costly) emissions reductions with every year of delay, and will simply become unattainable at some point.
It’s like having a limited amount of cake. If we eat it all in the morning, we won’t have any left in the afternoon. The debate about the size of the emissions budget is like a debate about how much cake we have left, and how long we can keep eating cake before it’s gone. Thus, the concept of an emissions budget is very useful to get the message across that the amount of CO2 that we can still emit in total (not per year) is limited if we want to stabilise global temperature at a given level, so any delay in reducing emissions can be detrimental – especially if we cross tipping points in the climate system, e.g. trigger the complete loss of the Greenland Ice Sheet. Understanding this fact is critical, even if the exact size of the budget is not known.
But of course the question arises: how large is this budget? There is not one simple answer to this, because it depends on the choice of warming limit, on what happens with climate drivers other than CO2 (other greenhouse gases, aerosols), and (given there’s uncertainties) on the probability with which you want to stay below the chosen warming limit. Hence, depending on assumptions made, different groups of scientists will estimate different budget sizes.
Computing the budget
The standard approach to computing the remaining carbon budget is:
(1) Take a bunch of climate and carbon cycle models, start them from preindustrial conditions and find out after what amount of cumulative CO2 emissions they reach 1.5 °C (or 2 °C, or whatever limit you want).
(2) Estimate from historic fossil fuel use and deforestation data how much humanity has already emitted.
The difference between those two numbers is our remaining budget. But there are some problems with this. The first is that you’re taking the difference between two large and uncertain numbers, which is not a very robust approach. Millar et al. fixed this problem by starting the budget calculation in 2015, to directly determine the remaining budget up to 1.5 °C. This is good – in fact I suggested doing just that to my colleague Malte Meinshausen back in March. Two further problems will become apparent below, when we discuss the results of Millar et al.
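The two accounting approaches above can be sketched in a few lines. All numbers here are illustrative assumptions, not values from Millar et al.; the direct-from-2015 function uses the standard proportionality between peak warming and cumulative emissions (the transient climate response to cumulative emissions, TCRE).

```python
# Sketch of the two budget-accounting approaches described above.
# All numbers are illustrative assumptions, NOT values from Millar et al.

def budget_from_preindustrial(total_budget_gtc: float, emitted_gtc: float) -> float:
    """Standard approach: the difference of two large, uncertain numbers."""
    return total_budget_gtc - emitted_gtc

def budget_from_2015(remaining_warming_c: float, tcre_c_per_1000gtc: float) -> float:
    """Direct approach: scale the warming still 'allowed' after 2015 by the
    transient climate response to cumulative emissions (TCRE)."""
    return remaining_warming_c / tcre_c_per_1000gtc * 1000.0

# ~545 GtC emitted to date (a figure discussed later in the text), and an
# assumed total 1.5 °C budget of 750 GtC:
print(budget_from_preindustrial(750.0, 545.0))  # 205.0 GtC remaining
# Assuming 0.5 °C of warming still to go and a TCRE of 2.0 °C per 1000 GtC:
print(budget_from_2015(0.5, 2.0))  # 250.0 GtC remaining
```

The point of the second function is that its answer depends only on the (smaller) remaining warming, not on two large uncertain totals.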
So what did Millar and colleagues do?
A lot of people were asking this, since it was not easy to see right away why they got such a surprisingly large emissions budget for 1.5 °C. And indeed there is no single catch-all explanation: several assumptions combined to make the budget so big.
The temperature in 2015
To compute a budget from 2015 to “1.5 °C above preindustrial”, you first need to know at what temperature level above preindustrial 2015 was. And you have to remove short-term variability, because the Paris target applies to mean climate. Millar et al. concluded that 2015 was 0.93 °C above preindustrial. That’s a first point of criticism, because this estimate (as Millar confirmed to me by email) is entirely based on the Hadley Centre temperature data, which notoriously have a huge data gap in the Arctic. (Here at RealClimate we were actually the first to discuss this problem, back in 2008.) As the Arctic has warmed far more than the global mean, this leads to an underestimate of global warming up to 2015: by 0.06 °C when compared to the Cowtan & Way data, or by 0.17 °C when compared to the Berkeley Earth data, as Zeke Hausfather shows in detail over at Carbon Brief.
Figure: Difference between modeled and observed warming in 2015, with respect to the 1861-1880 average. Observational data have had short-term variability removed following the Otto et al 2015 approach used in Millar et al 2017. Both RCP4.5 CMIP5 multimodel mean surface air temperatures (via KNMI) and blended surface air/ocean temperatures (via Cowtan et al 2015) are shown – the latter provide the proper “apples-to-apples” comparison. Chart by Carbon Brief.
As a matter of fact, as Hausfather shows in a second graph, HadCRUT4 is the outlier data set here, and given the Arctic data gap we’re pretty sure it is not the best data set. So, while the large budget of Millar et al. is based on the idea that we have 0.6 °C to go until 1.5 °C, if you believe (with good reason) that the Berkeley data are more accurate, we only have 0.4 °C to go. That immediately cuts the budget of Millar et al. from 242 GtC to 152 GtC (their Table 2). [A note on units: you need to always check whether budgets are given in billion tons of carbon (GtC) or billion tons of carbon dioxide. 1 GtC = 3.67 GtCO2 (the molar-mass ratio 44/12), so those 242 GtC are the same as 887 GtCO2.] Gavin managed to make this point in a tweet:
Headline claim from carbon budget paper that warming is 0.9ºC from pre-I is unsupported. Using globally complete estimates ~1.2ºC (in 2015) pic.twitter.com/B4iImGzeDE
— Gavin Schmidt (@ClimateOfGavin) September 20, 2017
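The unit conversion in the note above is just the molar-mass ratio of CO2 to carbon; a one-liner helps avoid mixing up GtC and GtCO2 (the function name is mine, for illustration):

```python
# GtC <-> GtCO2 conversion via the molar-mass ratio 44/12 (~3.67).
GTC_TO_GTCO2 = 44.0 / 12.0

def gtc_to_gtco2(gtc: float) -> float:
    """Convert gigatons of carbon to gigatons of carbon dioxide."""
    return gtc * GTC_TO_GTCO2

# The 242 GtC budget quoted above:
print(round(gtc_to_gtco2(242)))  # 887 GtCO2, matching the figure in the text
```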
Add to that the question of what years define the “preindustrial” baseline. Millar et al. use the period 1861-80. For example, Mike has argued that the period AD 1400-1800 would be a more appropriate preindustrial baseline (Schurer et al. 2017). That would add 0.2 °C to the anthropogenic warming that has already occurred, leaving us with just 0.2 °C and almost no budget to go until 1.5 °C. So in summary, the assumption by Millar et al. that we still have 0.6 °C to go up to 1.5 °C is at the extreme high end of how you might estimate that remaining temperature leeway, and that is one key reason why their budget is large. The second main reason follows.
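The temperature-leeway arithmetic of the last two paragraphs can be summarised in a few lines. The 2015 anomaly estimates are the ones quoted above; combining the Berkeley-based estimate with the earlier-baseline shift is simple subtraction.

```python
# Remaining warming to 1.5 °C under the dataset/baseline choices discussed.
# The 2015 anomaly values are the ones quoted in the text.
LIMIT_C = 1.5
BASELINE_SHIFT_C = 0.2  # extra warming if AD 1400-1800 defines "preindustrial"

warming_2015 = {
    "HadCRUT4 (Millar et al.)": 0.93,
    "Berkeley Earth": 0.93 + 0.17,  # Arctic data gap corrected
}

for dataset, warming in warming_2015.items():
    leeway = LIMIT_C - warming
    print(f"{dataset}: {leeway:.2f} °C to go "
          f"({leeway - BASELINE_SHIFT_C:.2f} °C with the earlier baseline)")
```

This reproduces the numbers in the text: 0.6 °C to go under the Millar et al. assumptions, 0.4 °C with the Berkeley data, and just 0.2 °C if the earlier baseline is also adopted.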
To exceed or to avoid…
Here is another problem with the budget calculation: the model scenarios used for this actually exceed 1.5 °C warming, and the 1.5 °C budget is taken as the amount emitted by the time the 1.5 °C line is crossed. Now if you stop emitting immediately at this point, global temperature will of course rise further: partly from the sheer thermal inertia of the oceans, but also because if you close down all coal power stations etc., aerosol pollution in the atmosphere, which has a sizeable cooling effect, will go way down, while CO2 stays high. So with this kind of scenario you will not limit global warming to 1.5 °C. This is called a “threshold exceedance budget” or TEB – Glen Peters has a nice explainer on that (see his Fig. 3). All the headline budget numbers of Millar et al., shown in their Tables 1 and 2, are TEBs. What we need to know, though, is the “threshold avoidance budget”, or TAB, if we want to stay below 1.5 °C.
Millar et al also used a second method to compute budgets, shown in their Figure 3. However, as Millar told me in an email, these “simple model budgets are neither TEBs nor TABs (the 66 percentile line clearly exceeds 1.5 °C in Figure 3a), they are instead net budgets between the start of 2015 and the end of 2099.” What they are is budgets that cause temperature to exceed 1.5 °C in mid-century, but then global temperature goes back down to 1.5 °C in the year 2100!
In summary, both approaches used by Millar compute budgets that do not actually keep global warming to 1.5 °C.
How some media (usual suspects in fact) misreported
We’ve seen some bizarre (well, if you know the climate denialist scene, not so bizarre) misreporting about Millar et al., focusing on the claim that climate models have supposedly overestimated global warming. Carbon Brief and Climate Feedback both have good pieces up debunking this claim, so I won’t delve into it much. Let me just mention one key aspect that has been misunderstood. Millar et al. wrote the confusing sentence: “in the mean CMIP5 response cumulative emissions do not reach 545GtC until after 2020, by which time the CMIP5 ensemble-mean human-induced warming is over 0.3 °C warmer than the central estimate for human-induced warming to 2015”. As has been noted by others, this compares model temperatures after 2020 (the point at which the models reach 545 GtC) to an observation-based temperature in 2015, and of course the latter is lower – partly because it is based on HadCRUT4 data as discussed above, but equally because two different points in time are being compared. Moreover, the standard CMIP5 climate models used here are not actually driven by emissions at all, but by atmospheric CO2 concentrations, which for the historic period are taken from observed data. So the fact that 545 GtC is reached too late doesn’t even refer to the usual climate model scenarios; it refers to estimates by carbon cycle models, which are run in an attempt to derive the emissions that would have led to the observed evolution of CO2 concentrations.
Does it all matter?
We still live in a world on a path to 3 or 4 °C global warming, waiting to finally turn the tide of rising emissions. At this point, debating whether we have 0.2 °C more or less to go until we reach 1.5 °C is an academic discussion at best, a distraction at worst. The big issue is that we need to see falling emissions globally very, very soon if we even want to stay well below 2 °C. That was agreed as the weaker goal in Paris in a consensus by 195 nations. It is high time that everyone backs this up with actions, not just words.
Technical p.s. A couple of less important technical points. The estimate of 0.93 °C above 1861-80 used by Millar et al. is an estimate of the human-caused warming. I don’t know whether the Paris agreement specifies to limit human-caused warming, or just warming, to 1.5 °C – but in practice it does not matter, since the human-caused warming component is almost exactly 100 % of the observed warming. Using the same procedure as Millar yields 0.94 °C for total observed climate warming by 2015, according to Hausfather.
However, updating the statistical model used to derive the 0.93 °C anthropogenic warming to include data up to 2016 gives an anthropogenic warming of 0.96 °C in 2015.
Weblink
Statement by Millar and coauthors pushing back against media misreporting. Quote: “We find that, to likely meet the Paris goal, emission reductions would need to begin immediately and reach zero in less than 40 years’ time.”
Impressions from the European Meteorological Society’s annual meeting in Dublin
The 2017 annual assembly of the European Meteorological Society (EMS) had a new set-up with a plenary keynote each morning. I thought some of these keynotes were very interesting. There was a talk by Florence Rabier from the European Centre for Medium-range Weather Forecasts (ECMWF), who presented the story of ensemble forecasting. Keith Seitter, the executive director of the American Meteorological Society (AMS), talked on the Wednesday about engagement with the society.
[Read more…] about Impressions from the European Meteorological Society’s annual meeting in Dublin
Why extremes are expected to change with a global warming
Joanna Walters links extreme weather events with climate change in a recent article in the Guardian; however, some reservations have been expressed about such links in past discussions.
For example, we discussed the connection between single storms and global warming in the post Hurricanes and Global Warming – Is there a connection?, the World Meteorological Organization (WMO) has issued a statement, and Mike has recently explained the connection in the Guardian.
[Read more…] about Why extremes are expected to change with a global warming