This month’s open thread for climate science topics. Please use the Forced Responses thread for solutions and politics.
Carrie carries the can too heavy for others says
147 mike says: “…but the underlying harsh reality is that the current accumulation number of 409 ppm…”
Ah yes. Well, the pedants of this world would much rather cherry-pick finely tuned data points to split hairs between 1.8 vs 2.0 ppm increases, or a 2.9 ppm increase with NO EL NINO, or get all pedantic over claims of emissions reductions versus MLO data versus global data that is 60 years old or 6 days old, while ignoring the elephant in the room and the real data that really matters: the difference between 280 ppmv and 409 ppmv. Or, for those without a fancy Excel spreadsheet, that’s a 129 ppmv increase in atmospheric CO2, caused by everyday human activity that has not yet ceased.
Anything to avoid discussing the bleeding obvious! Being seen as super clever and a know-it-all is just too damned important for some people.
mike says
Nigel at 148:
Thanks for the reference to the 2-3 year mixing time for CO2 emissions. I will choose to trust this article, and accept that time frame, but it’s too bad that Gavin or Stefan or Rasmus haven’t weighed in with a quick inline response to provide their opinion on mixing time. It’s a busy time of year, I guess.
I have trouble understanding what MAR is saying much of the time, and he frequently stoops to a bit of name-calling, so I usually just skim his posts and move on. I enjoy cordial discussions when I have time, but I won’t generally bother with folks who can’t work at being understood without engaging in name-calling. Life is too short.
So, again, I will ask: if you think you see a decrease in atmospheric CO2 accumulation that you believe is reasonably attributed to the pause in emission increases from 2014-2017, please provide the month/year for each. I would guess that with the 2 to 3 year mixing time, we have not yet seen the end of the pause (a slowing of the increase, really) that might be attributed to the 2014-2017 pause in emissions, but for those of you who feel the urge to track these kinds of numbers, okay, provide month/year.
I think there is no doubt that human efforts to reduce emissions and accumulation do in fact work. So many people driving electric cars, avoiding air travel, eating lower on the food chain, etc. It’s wonderful, and it has to mean emissions and accumulations are lower than they would otherwise have been. The problem is one of scaling and time. Is it enough, and is it happening fast enough? These questions are difficult to answer with much certainty. It’s a glass-half-empty-or-half-full type of situation. Each of us needs to sort out these value questions for ourselves and proceed accordingly.
I think the best way to approach these questions may be from a Wittgenstein, Tractatus perspective: what can be said at all can be said clearly, and what we cannot say clearly we should pass over in silence.
I think we can talk about levels of emissions, levels of accumulation, and the time frames for changes to show up between the two. But as to whether our efforts are enough, and whether we should feel doomed or hopeful, for my part I am not sure. It looks bad to me, but I prefer to follow the W approach and state and reflect on that which can be stated clearly. For me, with the time, skills and interest I have for the project, that is the accumulation of CO2e in the atmosphere.
How are we doing? I can state clearly:
Year  Nov monthly avg (ppm)  annual increase from previous year (ppm)
2018 408.02 4.38
2017 403.64 2.07
2016 401.57 3.28
2015 398.29 2.26
2014 396.03 2.33
2013 393.7 2.65
2012 391.05 2.05
2011 389 1.76
2010 387.24 2.81
2009 384.43 1.44
2008 382.99 1.87
2007 381.12 1.99
2006 379.13 2.13
2005 377 2.54
2004 374.46 1.32
2003 373.14 2.59
2002 370.55 2.1
2001 368.45 1.42
2000 367.03 1.66
1999 365.37 0.97
1998 364.4 3.58
1997 360.82 1.24
1996 359.58 1.59
1995 357.99 1.92
1994 356.07 1.95
1993 354.12 0.8
1992 353.32 0.97
1991 352.35 0.73
1990 351.62 1.37
1989 350.25 1.12
1988 349.13 2.54
1987 346.59 2.08
1986 344.51 1.38
1985 343.13 1.46
1984 341.67 1.29
1983 340.38 2.09
1982 338.29 1.21
1981 337.08 0.93
1980 336.15 1.96
1979 334.19 1.78
1978 332.41 1.14
1977 331.27 2.52
1976 328.75 0.4
1975 328.35 1.07
1974 327.28 0.1
1973 327.18 2.12
1972 325.06 1.49
1971 323.57 0.41
1970 323.16 1.38
1969 321.78 1.53
1968 320.25 0.94
1967 319.31 1.21
1966 318.1 0.8
1965 317.3 0.51
1964 316.79 0.8
1963 315.99 0.57
1962 315.42 0.04
1961 315.38 1.55
1960 313.83 0.57
1959 313.26 0.6
1958 312.66
source: https://www.co2.earth/monthly-co2
I think I can say clearly that these numbers are going in the wrong direction and that has consequences for earthlings.
If you want to feel good about things, then you compare 1977 with 2017 and say that our rate of increase in CO2 concentration fell from 2.52 ppm in 1977 to 2.07 ppm in 2017. I would agree that the numbers for 1977 and 2017 are likely stated clearly and correctly. I prefer to look at the whole record and say, hmmm… CO2 concentrations just keep rising and that has consequences for earthlings.
Cheers
Mike
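For anyone who wants to check or extend mike’s table: the “annual increase” column is just the difference of successive November monthly means. A minimal sketch, with a handful of values copied from the table above; in practice you would parse the full monthly file from co2.earth or NOAA rather than hard-coding values.

```python
# Minimal sketch: reproduce the "annual increase" column as the
# difference of successive November monthly means. The values below
# are copied from the table above (ppm at Mauna Loa); in practice you
# would parse the full monthly file from co2.earth or NOAA.
nov_avg = {
    2014: 396.03,
    2015: 398.29,
    2016: 401.57,
    2017: 403.64,
    2018: 408.02,
}

for year in sorted(nov_avg)[1:]:
    increase = nov_avg[year] - nov_avg[year - 1]
    print(f"{year}  {nov_avg[year]:.2f}  +{increase:.2f} ppm vs {year - 1}")
```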
nigelj says
https://www.youtube.com/watch?v=x1SgmFa0r04
NASA | A Year in the Life of Earth’s CO2. An ultra-high-resolution NASA computer model has given scientists a stunning new look at how carbon dioxide in the atmosphere travels around the globe.
Plumes of carbon dioxide in the simulation swirl and shift as winds disperse the greenhouse gas away from its sources. The simulation also illustrates differences in carbon dioxide levels in the northern and southern hemispheres and distinct swings in global carbon dioxide concentrations as the growth cycle of plants and trees changes with the seasons.
The carbon dioxide visualization was produced by a computer model called GEOS-5, created by scientists at NASA Goddard Space Flight Center’s Global Modeling and Assimilation Office.
Hank Roberts says
https://www.themercury.com.au/news/world/cold-pursuits-a-scientists-quest-to-uncover-antarcticas-secrets/video/93797de7b9bd8e6a37964a13962de9d4?nk=71b8aba4cec58e6deed9facc65ce6b6b-1546208593
Hank Roberts says
http://skytem.com/groundwater-mapping-in-antarctica-with-the-skytem-system/
nigelj says
Mike @152, yes I would like to see expert comment as well!
I don’t like ad hominems either, but I think MAR posts some good stuff and sounds right about the emissions issue.
His comments are hard to follow because of all the abbreviations and squeezed-together paragraphs, plus I am not a scientist. But as far as I understand it, he has looked at the 20-year trends in the atmospheric MLO data, pulled out the La Niña and El Niño years to reduce the noise, and thinks the general longer-term accelerating growth in atmospheric concentrations has at least slowed down a bit recently. As far as I can tell, he attributes this to a general slight slowdown in the rate of emissions increase since 2010 and the flattening-off over 2014-2017.
You would never be able to attribute anything directly and unequivocally to these time periods, because the period is just too short, I think, to rise fully above all the noise. But I think it’s likely that period and the previous slowdown played some part, because what else would it be? Nothing I can think of.
The problem is it’s too early to say with any great certainty, but that is how it is. I don’t think this is the same as being unclear; it just means the best we can say is that something has likely changed, though it’s not hugely certain. So I’m slightly hopeful that we have made a small difference. I’m less hopeful that we will make enough difference.
Tricky says
Hi, I’m new here. I am just a non-scientist layman. I’ve gone through a lot of articles and thousands of comments on this site and at Skeptical Science, and I just got done reading Gavin’s article and the comments re: “30 years after Hansen’s testimony”.
Based on everything I’ve read so far, this is what I’ve internalized (please correct me as needed) — all climate models are obviously dependent upon the assumed inputs of both man-driven forcings and natural forcings, which the models use in physics-based simulations to produce projections of resulting outputs. The creators of such models do not claim to have intradecadal accuracy, rather the target is skill in projecting 30 year trends. Hansen was obviously required to guess all the future forcings, which he incorporated into 3 different scenarios. His man-driven forcings included not only CO2, but also N2O, CH4 and CFCs. His CO2 forcings, in retrospect, were “pretty close” for Scenario B, but he overshot on the others because humans actually tackled those other emissions. Gavin took a stab at adjusting Hansen’s Scenario B and concluded that the adjusted results indicated a quite skillful model.
So my (perhaps dumb) question is — why not re-run the actual models with the actual man-made forcings that happened in those 3 decades, to see exactly how close the projections got for Scenario B? It seems like they might be “pretty darn close” and provide the AGW community with some strong ammunition? I know that it’s easy for me to propose this from the comfort of my armchair, seeing as I don’t have to chase down any of the resources to pull this off. But it’s a serious question and I’d like this group’s thoughts. Thanks in advance.
nigelj says
Tricky @157
“Based on everything I’ve read so far, this is what I’ve internalized (please correct me as needed) — all climate models are obviously dependent upon the assumed inputs of both man-driven forcings and natural forcings…..The creators of such models do not claim to have intradecadal accuracy, rather the target is skill in projecting 30 year trends…..So my (perhaps dumb) question is — why not re-run the actual models with the actual man-made forcings that happened in those 3 decades, to see exactly how close the projections got for Scenario B? It seems like they might be “pretty darn close” and provide the AGW community with some strong ammunition? ”
I’m a layperson, but I came across something relevant to this recently. It appears models have been re-run with real-world data on anthropogenic aerosols and on natural factors like sunspot numbers, El Niño and volcanic activity, and I assume actual real-world CO2 data. The result is that the models match real-world data really well, as below:
http://iopscience.iop.org/article/10.1088/1748-9326/aaf372/meta
“The Pause in historical context”, Stephan Lewandowsky, Kevin Cowtan, James S Risbey, Michael E Mann, Byron A Steinman, Naomi Oreskes and Stefan Rahmstorf, Environmental Research Letters, Volume 13, Number 12, published 19 December 2018. © 2018 The Author(s). Published by IOP Publishing Ltd.
MA Rodger says
mike @152,
Let me be brief and assume things I skate over can be explained at length later if required.
(1) There are parts of the atmosphere that are taking time to catch up with emissions. The Southern Hemisphere is over a year behind. And we hear that parts of the mid-troposphere are 2 to 3 years behind. A part of the atmosphere taking time to catch up will thus be delayed in capturing any change in upward man-made emissions. So what about MLO? MLO is but a few weeks (not many weeks, a few weeks) behind NH emissions/sinks as it records the annual cycle of the Northern Hemisphere summer growing season from May to September.
(2) The amount of CO2 sucked from the atmosphere each year as a result of past man-made emissions is some 6 Gt(C). Only about a tenth of this is due to the current year’s emissions. If we emit less than the underlying 5.4 Gt(C), CO2 levels will drop (a little bit). Yet as time goes by, that underlying reduction will decrease, and if emissions do not also decrease and stay below it, CO2 levels will begin rising again.
(3) The MLO data are noisy. The emissions numbers are “soft” but not impossibly so. Outside volcano years, half the noise in the MLO data is due to ENSO. Extracting that noise shows (see here; usually two clicks to download the attachment) that the 20-year trend is minutely reduced but, encouragingly, the latest 5-year trend looks to be flat (a toy version of this ENSO-removal step is sketched after this list). IMPORTANT: this indication of an end to “acceleration” still means CO2 is rising at 2+ ppm per year.
(4) The MLO data are noisy. A one-month sample is far noisier than a 4-month sample (as NOAA ESRL uses here), and using all 12 months is less noisy still. The 2018 year-to-date average is 1.98 ppm/yr (comprising Jan 1.85, Feb 1.93, Mar 2.28, Apr 1.26, May 1.60, Jun 1.95, Jul 1.64, Aug 1.92, Sep 2.13, Oct 2.36, Nov 2.88). Note that your November 2018 number set out @152 is wrong, and the numbers for earlier years are October, not November.
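The sketch referenced in point (3): one common way to “extract” ENSO from the growth rate is to regress it on an ENSO index and look at the residual trend. This is a toy illustration, not MAR’s actual method. The growth rates are taken from mike’s table @152 plus MAR’s 1.98 ppm/yr year-to-date figure for 2018, while the ENSO index values are illustrative placeholders, not a real MEI or Niño3.4 series.

```python
import numpy as np

# Toy sketch of ENSO removal, NOT MAR's actual analysis.
# growth: annual MLO CO2 growth rates (ppm/yr), from the table @152
# plus 1.98 for 2018; enso: a lagged ENSO index for the same years
# (values here are illustrative placeholders only).
years  = np.arange(2009, 2019)
growth = np.array([1.44, 2.81, 1.76, 2.05, 2.65, 2.33, 2.26, 3.28, 2.07, 1.98])
enso   = np.array([-0.5, 1.2, -1.3, -0.2, -0.3, 0.2, 1.6, 2.2, -0.3, -0.6])

# Fit growth = a*enso + b, then subtract the ENSO-correlated part.
a, b = np.polyfit(enso, growth, 1)
adjusted = growth - a * enso

# Compare raw and ENSO-adjusted linear trends in the growth rate
# itself (a nonzero slope here is "acceleration" of the CO2 rise).
raw_slope, _ = np.polyfit(years, growth, 1)
adj_slope, _ = np.polyfit(years, adjusted, 1)
print(f"ENSO coefficient: {a:.2f} ppm/yr per index unit")
print(f"growth-rate trend, raw: {raw_slope:+.3f}, adjusted: {adj_slope:+.3f} ppm/yr^2")
```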
Kevin McKinney says
Tricky, #157–
“…why not re-run the actual models with the actual man-made forcings that happened in those 3 decades, to see exactly how close the projections got for Scenario B?”
Because models have been vastly improved and expanded in the meantime, and there are much more pressing questions to be asked than how good the state of the art was in the 1980s? It’s one thing to do a relatively simple ex post facto adjustment; quite another to set up a whole modeling experiment.
On the other hand, I would think that if you were to ‘dust off’ 80s-era models sufficiently to get them working under current systems, they’d run really, really fast.
Hank Roberts says
https://www.scientificamerican.com/article/the-next-climate-frontier-predicting-a-complex-domino-effect/
nigelj says
I would love some expert feedback on this. I want to get an understanding of how we get from the energy absorption spectra of the CO2 molecule to predicting increases in global temperature, and I don’t know enough advanced maths / physics or have the time to plough through research papers. But accounts I have read for laypersons are too general and simplistic.
My understanding is it works as follows: I know you cannot simply go from the energy absorbing properties of CO2 to global warming by some simple equation. I understand experiments have been done with CO2 in a canister at high concentrations with a light source applied, and warming measured, but I believe the sort of couple of parts per million of CO2 we have in the atmosphere is too small to register on measuring devices like this, and a tube does not capture the physical properties of the atmosphere.
My understanding is that the Stefan-Boltzmann law allows us to know how much the planet would heat without any atmosphere, and we know how much the earth heats with an atmosphere, so this forms the basis for working out how much heating additional CO2 will cause. Knowing the spectral differences between CO2 and water vapour etc. allows you to calculate the more precise effect of CO2 alone, and Arrhenius calculated this. Other feedbacks are also integrated into the calculations.
Is this ballpark right or wrong, or what? People are busy, so even a simple answer would be great.
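As a first pass at the arithmetic nigelj describes, here is a minimal numerical sketch using textbook values only: the Stefan-Boltzmann law for the airless planet, the standard ΔF ≈ 5.35 ln(C/C0) approximation for CO2 forcing, and a nominal climate sensitivity parameter. The real calculation is done with line-by-line radiative transfer codes; this is only the ballpark chain of reasoning, and the sensitivity value is the uncertain, assumed part.

```python
import math

SIGMA  = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S0     = 1361.0     # solar constant, W m^-2
ALBEDO = 0.3        # planetary albedo

# 1. Effective (no-greenhouse) temperature from Stefan-Boltzmann:
#    absorbed sunlight = emitted thermal radiation.
T_eff = (S0 * (1 - ALBEDO) / (4 * SIGMA)) ** 0.25
print(f"Effective temperature: {T_eff:.0f} K")        # ~255 K
print(f"Greenhouse effect: {288 - T_eff:.0f} K")      # vs observed ~288 K surface

# 2. Radiative forcing from added CO2 (Myhre et al. 1998 approximation).
dF = 5.35 * math.log(409.0 / 280.0)                   # W m^-2
print(f"Forcing, 280 -> 409 ppm: {dF:.2f} W/m^2")

# 3. Equilibrium warming for an assumed sensitivity parameter
#    (~0.8 K per W/m^2 including feedbacks; this is the uncertain
#    part, not the spectroscopy).
LAMBDA = 0.8
print(f"Implied equilibrium warming: {LAMBDA * dF:.1f} K")
```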
mike says
Hey, Al
I need to take one thing at a time. You say:
“(1) There are parts of the atmosphere that are taking time to catch up with emissions. The Southern Hemisphere is over a year behind. And we hear that parts of the mid-troposphere are 2 to 3 years behind. A part of the atmosphere taking time to catch up will thus be delayed in capturing any change in upward man-made emissions. So what about MLO? MLO is but a few weeks (not many weeks, a few weeks) behind NH emissions/sinks as it records the annual cycle of the Northern Hemisphere summer growing season from May to September.”
So I was correct that emissions do in fact show up in the MLO numbers in less than a month? As I recall, Nigel has been looking for a reduction in MLO numbers that would reflect the emissions pause from 2014 to 2017. If MLO records NH emissions in less than a month, then Nigel should be able to work with the MLO numbers from 2014 to 2017, figure out how he wants to factor out the ENSO bump, and then start poring over those adjusted numbers looking for the emissions slowdown reflected in the MLO accumulation numbers. Would you agree?
Thanks
Mike
Nemesis says
@mike, #152
Quote:
“[mike’s full MLO table from #152, quoted in full above, snipped here]”
Muhahahaha, I will see The Fall of Empire within my lifetime and that’s all I ever wanted,
Love,
Nemesis
nigelj says
Mike @163
I have already done that exercise, more or less. About six months ago I took roughly the last 10 years of yearly MLO data and excluded just the large El Niño and La Niña years. I did an average for the last five years and compared it to the average for the previous years, and I noticed a small decrease in the rate of increase over the last five years. I’m sure I posted this somewhere.
I have lost the information, and it was just a crude five-minute back-of-the-envelope scribble. I certainly don’t claim it was great proof of anything, just that it suggested a possible change in trend. It also suggested to me the flattening-off of emissions after 2014 was real.
The point is MAR has done a much more sophisticated statistical analysis of much the same sort of thing and also found a change in the trend. I defer to that. But it looks like I was right.
Al Bundy says
Steven Emerson: “Moving the goalposts”
AB: Static goalposts have use in artificial constructs, such as a game, but they’re incredibly counterproductive and dangerous in the real world…
…going to work, goal is to be on time. Chest pain… Zoom past the hospital so as to not move the goalpost.
MA Rodger says
mike @163,
Emissions showing up at MLO in less than a month?
As the emissions data are annual and the lag appears to be just a few weeks for NH CO2 emissions to appear at MLO, for me that is effectively instantaneous.
The ESRL gif of CO2 by latitude does show some stations peaking earlier than others, and tracking down and examining the timing of those peaks may give some indication of the speed of CO2’s journey round the NH. The various animations may also show the length of the lag, the AIRS measurement version perhaps less helpful than the modelled version.
A measure of the size of the SH lag is shown in this Scripps Inst graph, which plots the difference between MLO and Antarctic CO2 measurements. The difference has grown, presumably solely due to rising human emissions, with no sign of any leveling-off by December 2018. Looking at the South Pole data over the last few years (2013-17 from ESRL), the MLO-minus-South-Pole difference divided by the annual MLO increase comes out at 1.4 years, 1.8y, 1.7y, 1.3y, 1.8y; the values still show the ENSO signal, but they yield an average of about 1.6 years (a toy version of this ratio is sketched below).
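A toy version of MAR’s ratio: the lag is the MLO-minus-South-Pole difference divided by that year’s MLO increase. The annual increases below are from mike’s table @152; the station differences are illustrative placeholders chosen to land near MAR’s figures, not the actual ESRL values.

```python
# Sketch of MAR's lag estimate: how many years "behind" MLO is the
# South Pole, given the MLO-minus-South-Pole difference and the
# annual MLO increase? Differences are illustrative placeholders.
records = [
    # (year, MLO - South Pole difference in ppm, annual MLO increase in ppm)
    (2013, 3.7, 2.65),
    (2014, 4.2, 2.33),
    (2015, 3.8, 2.26),
    (2016, 4.3, 3.28),
    (2017, 3.7, 2.07),
]

lags = [diff / rise for _, diff, rise in records]
for (year, _, _), lag in zip(records, lags):
    print(f"{year}: lag ~ {lag:.1f} years")
print(f"average lag ~ {sum(lags) / len(lags):.1f} years")
```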
Kevin McKinney says
First, Happy New Year, all!
nigel, #162–
I think your ballpark is in the right ballpark… so to speak.
A few comments, from a fellow non-expert:
“I understand experiments have been done with CO2 in a canister at high concentrations with a light source applied, and warming measured, but I believe the sort of couple of parts per million of CO2 we have in the atmosphere is too small to register on measuring devices like this, and a tube does not capture the physical properties of the atmosphere.”
Yes, the Koch-Angstrom experiments of 1901, which put a chill (again, so to speak) on the discussion around Svante Arrhenius’s modeling exercise of 1896. The latter had used observations originally made by Samuel Langley, and hence did reflect (damn, can’t get away from these half-baked puns, it seems) the optical properties of the real atmosphere. I wrote about this here:
https://hubpages.com/education/Global-Warming-Science-And-The-Dawn-Of-Flight
But it wasn’t that the effect Koch tried to measure was too small; it was that it was in a sense too large: virtually all the IR was blocked by the initial concentration, therefore additional CO2 had no effect. The greenhouse effect appeared to be ‘saturated’. (It’s an argument that denialists have tried to resurrect in our own day; no zombie left unturned.)
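A toy Beer-Lambert calculation makes Koch’s result concrete: transmission through a tube falls off as exp(−τ), so once the optical depth τ is large, doubling the absorber barely changes what reaches the detector. The τ values below are arbitrary illustrations; a real atmosphere needs line-by-line spectral treatment.

```python
import math

# Toy Beer-Lambert illustration of "saturation" in a gas tube:
# transmitted fraction = exp(-tau). Once tau is large, doubling the
# CO2 (doubling tau) barely changes what gets through, which is all
# Koch's apparatus could measure. Tau values are arbitrary.
for tau in (0.5, 1.0, 2.0, 4.0, 8.0):
    t1 = math.exp(-tau)
    t2 = math.exp(-2 * tau)   # double the absorber
    print(f"tau={tau:4.1f}: transmits {t1:.3%}, doubled CO2 transmits {t2:.3%}")
```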
Ironically enough, Arrhenius’s friend and colleague Nils Ekholm at around the same time published a study in which he put forward a more useful formulation of how the greenhouse effect really worked. His story, too, is quite interesting, and, yes, I wrote about it, too:
https://hubpages.com/education/Global-warming-science-press-and-storms
The crucial passage:
“. . . radiation from the earth into space does not go directly from the ground, but on the average from a layer of the atmosphere having a considerable height above sea-level. . . The greater is the absorbing power of the air for heat rays emitted from the ground, the higher will that layer be. But the higher the layer, the lower is its temperature relatively to the ground; and as the radiation from the layer into space is the less the lower its temperature is, it follows that the ground will be hotter the higher the radiating layer is.”
(Unlike Knut Angstrom, Ekholm was a working meteorologist; even in 1901, the lapse rate was second nature to him–heck, even William Charles Wells had known about that, back in 1812. But that’s another story.)
Spencer Weart has a very good discussion of these issues in his piece for the AIP, “The Discovery of Global Warming”; it’s available via a link on this page: the top item under the “Science Links” heading in the right-hand sidebar. But, bottom line, it isn’t actually relevant whether the CO2 is saturated from a ground-level perspective; what’s relevant is the altitude of the effective mean radiating layer. And that is a question which the Koch-Angstrom experimental setup was not designed to answer.
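Putting rough numbers on Ekholm’s mechanism: the surface sits warmer than the effective radiating layer by roughly the lapse rate times the emission height, so raising that height warms the surface. The figures below are standard textbook values (255 K emission temperature, ~6.5 K/km lapse rate, ~5 km emission height), and the 150 m “rise” is purely illustrative.

```python
# Ekholm's mechanism in one line of arithmetic: surface temperature
# is roughly the emission temperature plus lapse rate times emission
# height. Textbook values; the 150 m rise is purely illustrative.
T_EMIT = 255.0   # K, effective radiating temperature (set by the sun/albedo)
LAPSE  = 6.5     # K per km, average tropospheric lapse rate
Z_EMIT = 5.0     # km, effective emission height today

T_surface = T_EMIT + LAPSE * Z_EMIT
print(f"Surface temperature: {T_surface:.0f} K")   # ~288 K, as observed

# More CO2 pushes the emission height upward; the emission temperature
# must stay ~255 K, so the whole profile (and the surface) warms.
rise_km = 0.15
print(f"Raising the layer {rise_km*1000:.0f} m warms the surface ~{LAPSE*rise_km:.1f} K")
```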
“…the Stefan-Boltzmann law allows us to know how much the planet would heat without any atmosphere, and we know how much the earth heats with an atmosphere, so this forms the basis for working out how much heating additional CO2 will cause.”
Well, it was a long time before SB, but this was essentially what Fourier had figured out in the mid-1820s, when he made the first stab at a terrestrial heat budget. He had no inkling, though, of the mechanism by which the atmosphere warmed the Earth; he just knew that the analysis indicated the existence of such a property. That’s why he usually gets credit for discovering that a ‘greenhouse effect’ of some sort existed.
It was John Tyndall–another fabulously colorful guy, and maybe the best popular science writer of all time, certainly the best of his day in English–who in 1860 identified the cause as what we now call greenhouse gases, primarily carbon dioxide (or as he called it, ‘carbonic acid’) and water vapor–or ‘vapour’, as both nigel and I were taught to spell it. (Gavin, too, of course, and any other non-US-native anglophones around here.) He used an experimental setup which no doubt inspired the Koch-Angstrom design, and which you can read all about here:
https://hubpages.com/education/Global-Warming-Science-In-The-Age-Of-Queen-Victoria
“Knowing the spectral differences between CO2 and water vapour etc. allows you to calculate the more precise effect of CO2 alone, and Arrhenius calculated this. Other feedbacks are also integrated into the calculations.”
Arrhenius still had relatively coarse spectrographic data back in 1896. That was one of the areas that Guy Callendar realized, in the 1930s, that he could improve upon in the consideration of ‘the CO2 theory.’ He was by trade a steam technologist, and high-temperature spectrographic metrology was right up his alley, so he well knew how much better the data had gotten in the intervening 40 years since Arrhenius.
https://hubpages.com/education/Global-Warming-Science-And-The-Wars
This was part of a long thread of research, starting with William Charles Wells, and extending to Gilbert Plass in the 1960s, which I tried to summarize here:
https://hubpages.com/education/Fire-From-Heaven-Climate-Science-And-The-Element-Of-Life-Part-Two-The-Cloud-By-Night
I’d particularly point to two figures: Anders Angstrom, the son of Knut Angstrom (the designer of the ‘CO2 saturation experiment’ discussed above), who did a pretty good job of refuting his father’s work; and Walter Elsasser, who arguably should have won two Nobels but fell just outside the charmed circle on both occasions, and who developed a practical calculator for radiational cooling used for decades by forecasters. (Though Elsasser was a notable theoretician, it was impeccably based in empirical data; sometimes you have to be flexible when you are trying to pay the bills as a physicist!)
That’s one of the seriously under-appreciated points about the greenhouse effect, IMO–the extent to which it has been part of the practical, day-to-day toolkit of forecasters for decades. (Now, of course, it is far more precise than in Elsasser’s day, since the numerical models incorporate the detailed radiative transfer models first elaborated by Plass.) Every forecast you see or hear is in effect an implicit validation of greenhouse theory.