This month’s open thread for climate science discussions.
206 Responses to "Unforced variations: Mar 2011"
dhogaza says
Gavin:
My assumption is that resolution (pizza box lat long degrees and number of pizza box layers in the atmosphere) and “stuff” is driven by how much time you’re willing to spend per model run – you mention about a month to model 100 years.
So where does this “month for 100 years” come from? I assume that for AR4, AR5 etc you want to make a certain number of runs (dozens?) and that there are obviously time constraints (AR5 has a schedule) and possibly funding enters into it (you can only buy so many CPU cycles) etc etc.
Care to spend a bit of time fleshing out the details, though? Is it budget driven primarily (can only spend so much on model runs), technology driven (while the next supercomputer generation will always be faster, today’s supercomputers are what you run on today), schedule driven (AR5 deadlines), a combination, or what?
[Response: The maximum length of a useful run has stayed pretty constant over the years for a much more basic reason. The fact of the matter is that climate modeling is a continually evolving process – it is never ‘done’. But it takes time to implement, test, and evaluate improvements – a timescale of months to years. So if you are happy with a model configuration now (the parameterisations, set up, experiment etc.), you will almost certainly be unhappy with it in a year’s time (because you will have improved a bunch of stuff in the meantime, found bugs, made the code more efficient, put in new diagnostics etc.). While groups are usually ok about doing long simulations with ‘almost up-to-date’ code, they become less willing to continue experiments with codes that are known to be flawed – especially when the particular error/issue was fixed/improved months before in the development version and the code has been improved to run faster now on some new hardware etc. The rate at which new stuff gets developed, combined with the timescale of developer frustration, has been pretty constant for a long time. So groups only very rarely set up simulations that will take more than 6 months/1 year in total (including spin-up, historical runs, future scenarios, equilibration etc.). That puts a limit on the practical throughput that groups can stand (at around 100 years/a real month). – gavin]
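To make the arithmetic concrete, here is a toy throughput calculator in Python. Only the ~100 model-years-per-wall-clock-month figure comes from the response above; the individual run lengths are illustrative guesses, not any group's actual schedule:

    # Back-of-envelope wall-clock estimate for a CMIP-style experiment suite.
    # Throughput is the rough figure quoted above; run lengths are assumed.
    THROUGHPUT = 100.0  # model years per wall-clock month

    runs = {
        "spin-up": 300,          # model years (assumed)
        "historical": 155,       # e.g. 1850-2005
        "future scenario": 95,   # e.g. 2006-2100
        "equilibration": 50,     # assumed
    }

    total_years = sum(runs.values())
    print(f"{total_years} model years -> "
          f"~{total_years / THROUGHPUT:.1f} wall-clock months")
    # 600 model years -> ~6.0 wall-clock months, i.e. right at the practical
    # 6-month limit described in the response above.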
Septic Matthew says
51, Gavin in comment: [Response: The maximum length of a useful run has stayed pretty constant over the years for a much more basic reason. … That puts a limit on the practical throughput that groups can stand (at around 100 years/a real month).
I want to thank Gavin and other contributors for this interesting discussion. I look forward to more.
SteveF says
Interesting looking Milankovitch related paper in Nature this week:
http://www.nature.com/nature/journal/v471/n7336/full/nature09825.html
dhogaza says
Gavin, thanks for the insight, I appreciate it …
One Anonymous Bloke says
Re: Gavin’s inline remarks at #51. Thank you. There are obvious parallels between what you are describing and any formal design process – the capacity for improvement combined with the immediate need for product means that you are never satisfied with today’s model.
Where does this fit in? Do you expect it to have much of an impact?
john byatt says
# 49 Janin
here are the interactive maps (Australian gov) for Australian cities and predictions of SLR for three scenarios:
http://www.ozcoasts.org.au/climate/sd_visual.jsp
Barton Paul Levenson says
Here’s a progress report on my RCMs.
I’ve tried in the past to use both a sigma scheme for pressure and a relative humidity scheme for water vapor. Neither worked particularly well. Recently I put both together, and suddenly I’m getting much more realistic results. Mark that, folks–when writing a radiative-convective model of Earth’s atmosphere, use both a sigma scheme for pressure AND a relative humidity scheme for water vapor. Either by itself won’t help much.
Results of the latest run:
Ts 291.4 K (1.1% error).
Mean stratosphere temperature 227.4 K. Remember how I used to get figures in the 300s and even 400s? I’d like to drop it another 10 K or so, though, if I can do so without cheating.
Albedo 0.321. Too high; I’d be happier with something in the 0.30-0.31 range.
Surface illumination 182.8 W m^-2. Trenberth et al. 2009 get 161.2.
Water vapor content 8.95 x 10^15 kg. Definitely too low; should be 1.27 x 10^16 kg.
Tropopause 9.2 km (too low).
Stratopause 46.8 km (darned close).
Top of Atmosphere 272.8 km. I’ve now got a mesosphere as well as a stratosphere, thanks to the sigma scheme.
And, best of all,
Conservation of energy error: 0.1%.
I achieve the latter by accounting for reflection down as well as up.
My next step is to stop parameterizing reflection and try the full equation of radiative transfer with particle effects included for clouds–based on effective radius, and deriving asymmetry factor, single-scattering albedo, etc. Wish me luck…
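For anyone wanting to play along at home, a minimal sketch of the two ingredients BPL names: sigma coordinates for pressure and a prescribed relative-humidity profile for water vapor. The Manabe–Wetherald-style RH profile, the layer count, and the crude temperature profile below are assumptions for illustration, not BPL's actual code:

    import numpy as np

    # Sigma levels for pressure plus a fixed relative-humidity profile for
    # water vapor. The RH form is the classic Manabe-Wetherald (1967) one --
    # an assumption here, not necessarily what BPL's model uses.
    NLEV = 20            # number of layers (assumed)
    PS = 101325.0        # surface pressure, Pa
    RH_SURF = 0.77       # surface relative humidity (Manabe-Wetherald value)

    sigma = np.linspace(1.0, 0.01, NLEV)   # sigma = p / p_surface
    p = sigma * PS                          # pressure on each level, Pa

    # Fixed relative humidity, decreasing with height, zero near the top:
    rh = np.clip(RH_SURF * (sigma - 0.02) / (1.0 - 0.02), 0.0, 1.0)

    # Crude temperature profile (6.5 K/km lapse rate, isothermal above
    # 11 km) just to turn RH into vapor pressure; ~7.5 km scale height.
    z = -7500.0 * np.log(sigma)                         # metres, approximate
    T = np.where(z < 11000.0, 288.15 - 0.0065 * z, 216.65)

    # Saturation vapor pressure (Pa), simple Clausius-Clapeyron fit:
    e_sat = 611.2 * np.exp(17.67 * (T - 273.15) / (T - 29.65))
    e = rh * e_sat                          # actual vapor pressure, Pa
    q = 0.622 * e / (p - 0.378 * e)         # specific humidity, kg/kg

    for s, pp, tt, qq in zip(sigma[::4], p[::4], T[::4], q[::4]):
        print(f"sigma={s:5.2f}  p={pp/100.0:7.1f} hPa  "
              f"T={tt:6.1f} K  q={qq:.2e}")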
ianash says
@45, 46
I think Muller was trying to be funny by quoting Happer.
The only link I am aware of is they are both part of the JASON Defense Advisory Group.
[Response: I didn’t see this, what did he say? – gavin]
Thomas says
Gavin, I’m presuming that month per hundred years is probably running on a few hundred processors. The armchair scientist is likely to have only a handful of cores available, and probably not nearly enough memory for these bigger models to run in even if he had unlimited time.
[Response: Yes, you are correct. But the armchair scientist can usefully run the model version (or resolution) from a few years back, and might prefer to stick to relatively uncomplicated atmosphere-only simulations without interactive aerosols or chemistry, and think about experiments that don’t need a hundred years of simulation. There are plenty of those. For the ‘fully-loaded’ simulations being done for AR5, you need the massively parallel super-computers though. – gavin]
Susan Anderson says
Oops, sloppy thinking on my part, sinning above my station. Will send a note offsite. My deep apologies.
You all believe science works, but for the guy on the street, if the data are claimed to be resolved in a way that misrepresents reality, their “vote” is to not comprehend. What you all are saying helps me understand better and perhaps worry less, but I’m already convinced after years of amateur observation and study at my low level. The denialosphere is gloatingly promoting BEST as the final solution (pun intended). Seems to me a recommendation from Singer is pretty much a guarantee of a certain kind of negativity (better not say too much about that either, as I’m a lightweight and regardless of what I think of his influence, he’s not).
On checking I find the connection with Happer is fairly remote and may be coincidental – a footnote involved with the APS petition.
What is one supposed to do with a greek symbol on captcha?
Hank Roberts says
Gavin, ianash is referring to the WSJ editorial by Muller that contains a paraphrase, not a direct quote; it’s above at https://www.realclimate.org/?comments_popup=7006#comment-202158
[Response: Ahh… I hardly think that counts as an implication that Happer is involved in BEST, though that quote, and indeed, the whole op-ed do give one pause. The response might be here. – gavin]
Kevin McKinney says
“What is one supposed to do with a greek symbol on captcha?”
Request a new Captcha, which you can do by clicking on the little icon with two arrows on it–it’s above and to the right of the text box. Sometimes you’ve just got to admit defeat!
(And Captcha opines, “SSAX estallitp.” Esperanto, maybe?)
Susan Anderson says
OT, but I saw a phi and I typed phi. It worked! Used the little arrows and that worked too. Thanks.
John E. Pearson says
ianash 58: Dunno if he was trying to be funny or not. In Mullins’ book he says that in the end it doesn’t matter whether Mann’s hockey stick was wrong or not because the IPCC consensus doesn’t depend on Mann’s results. My take-away message from Mullins’ book was that (1) he hates Al Gore, (2) he’ll do anything he can to discredit Gore, but (3) he is a scientist and reality places limits on exactly how far he’ll go. When he was hawking his book on the Glenn Beck show he didn’t back down from his position: global warming is real. In his book he says: by the end of the 21st century it will (if caused by humans) grow enough to be disruptive. He also says words to the effect of don’t trust consensus science, which is a damned strange thing to tell a future president. Who should a future president listen to? Mullins? I don’t think so. I personally recommend ignoring Mullins. I highly recommend Burton Richter’s book Beyond Smoke and Mirrors which, unlike Mullins, discusses climate change and energy without arrogance and without disinformation. I’d also recommend Making Technology Work by Deutch and Lester waaaay ahead of Mullins.
[Response: sed s/Mullins/Muller/g? – gavin]
Lynn Vincentnatnathan says
RE BEST, I feel sorry for my alma mater; it’s probably extremely strapped for cash. Oil REALLY pollutes.
I feel like listening to the Garbage song, which I first heard there over at Berkeley in 1969 — http://www.youtube.com/watch?v=lmD9IJh1Cr0 (sounded better with Steele’s raw and gutteral “garbage, garbage, garbage, garbage”)
[Response: As a UC Berkeley grad myself, I will defend the school. Despite the perhaps misleading name of this project, it in no way, to my knowledge, bears the official imprimatur of UC Berkeley. Indeed, I would imagine that there are quite a few faculty and researchers at UC Berkeley (and LBL, which is more closely tied to the project) who are not too happy about the way these institutions are being tied to the project, particularly given the revelations regarding Muller’s for-profit family venture. -mike]
Steven Sullivan says
#38 John Pearson
Muller was indeed a reviewer of NAS ’06
http://books.nap.edu/openbook.php?record_id=11676&page=R11
and his precis of its findings echoes the NAS’ own press summary
“June 22 — There is sufficient evidence from tree rings, retreating glaciers, and other “proxies” to say with confidence that the last few decades of the 20th century were warmer than any comparable period in the last 400 years, according to a new National Research Council report. There is less confidence in reconstructions of surface temperatures from 1600 back to A.D. 900, and very little confidence in findings on average temperatures before then”
http://www.nationalacademies.org/morenews/20060622.html
Some of the BESTies may be suspect, but I agree with Ray Ladbury that people should wait to see what they actually produce before deciding to condemn them.
MikeCoombes says
Tamino had an article, http://tamino.wordpress.com/2011/01/20/how-fast-is-earth-warming/ , a while back in which he took global temperature data starting around 1975 from GISS, NCDC, HadCRUT3v, RSS, and UAH. He removed major sources of natural variation (the effects of el Nino, volcanoes, and the solar cycle) through some simple modelling. The adjusted annual temperatures, http://tamino.files.wordpress.com/2011/01/adj1yr.jpg, agreed very well with one another and showed much more of a monotonic increase with time.
I was wondering what would happen if one took the various climate models and did a hindcast of 1975 to 2010 and then processed the output temperature data by Tamino’s approach. I assume this can be done. How well would results of climate models agree with each other and the measured temperature data? Would the results tell you anything useful about the accuracy of the climate models?
Thanks.
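For what it’s worth, the adjustment Tamino describes boils down to a multiple regression. A minimal sketch, with synthetic series standing in for the real MEI, volcanic-aerosol and TSI indices (and without the lag terms a careful analysis would include):

    import numpy as np

    # Tamino-style adjustment: regress temperature anomalies on ENSO,
    # volcanic and solar indices plus a linear trend, then subtract the
    # fitted natural-variability terms. All series below are synthetic
    # placeholders for the real MEI, stratospheric AOD and TSI data.
    rng = np.random.default_rng(0)
    n = 432                                  # 36 years of monthly data
    t = np.arange(n) / 12.0                  # time in years

    enso = rng.normal(size=n)
    volc = np.abs(rng.normal(size=n)) * (rng.random(n) < 0.05)
    solar = np.sin(2 * np.pi * t / 11.0)     # crude 11-year cycle
    temp = (0.017 * t + 0.10 * enso - 0.30 * volc + 0.05 * solar
            + rng.normal(scale=0.10, size=n))  # synthetic "observations"

    # Design matrix: intercept, trend, and the three exogenous factors.
    X = np.column_stack([np.ones(n), t, enso, volc, solar])
    beta, *_ = np.linalg.lstsq(X, temp, rcond=None)

    adjusted = temp - X[:, 2:] @ beta[2:]    # remove ENSO/volcanic/solar
    resid_raw = temp - X[:, :2] @ beta[:2]
    resid_adj = adjusted - X[:, :2] @ beta[:2]
    print(f"fitted trend: {beta[1] * 10:.3f} K/decade; scatter about trend, "
          f"raw vs adjusted: {resid_raw.std():.3f} vs {resid_adj.std():.3f}")

Running the same recipe on model hindcast output and on the observational series, as MikeCoombes suggests, would put both on the same adjusted footing.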
Martin Vermeer says
> [Response: sed s/Mullins/Muller/g? – gavin]
Gavin, you bloody elitist… using communist software, on the taxpayers’ time. Al Gore is also paying you to write in LaTeX, right?
J Bowers says
Barry Bickmore has a three part extended critique of Roy Spencer’s ‘Great Global Warming Blunder’ (H/T to Tim Lambert):
Part 1
Part 2
Part 3
Ianash says
@ 65 (comment by mike)
See http://www.berkeleyearth.org/participating
By listing LBL as a participating institution they seem to be suggesting LBL are involved.
I fully agree – both UC Berkeley’s and LBL’s reputations are to die for. Why would the Directors want to get involved in this?
Ray Ladbury says
Susan,
The thing is that we have 5 global temperature series that all show warming–and warming trends that are quite consistent, especially if you account for volcanism, ENSO, etc. as Tamino has shown. I think one would have to work VERY hard to construct a series that didn’t show warming. And of course we have experience with that. UAH has been modified several times to correct biases and now, while lower than the other series and plagued by systematics, is consistent within errors.
I would be ecstatic if the denialists agreed to be bound by evidence–they would then become skeptics–as evidence can at worst be wrong. Wrong is correctable. Bullshit is irredeemable.
Lorius´s Car says
Hi
a blogger by the name of Tamino kindly offered some great blog posts of his about volcanoes and GHGs causing most of the early 20th century warming, with solar forcing having a smaller impact. Could anybody point me to some scientific papers elaborating more on the causes of the warming 1910-40? Thnx in advance! – LC
tamino says
If I’m not mistaken, the lead scientist of the Berkeley temperature project is Robert Rohde. He has a good reputation, and I see no evidence of prejudice (i.e., pre-judgement) on his part.
Let’s not jump the gun — we shouldn’t criticize their efforts before any results have even been produced.
P.S. Recaptcha now includes the symbol for a set union, over all sets indexed from 1 to “l”.
Jim Galasyn says
Curious to hear opinions on this study:
Earth’s greatest mass extinction caused by coal: study
“…250 million years ago, about 95 percent of life was wiped out in the sea and 70 percent on land. Researchers at the University of Calgary believe they have discovered evidence to support massive volcanic eruptions burnt significant volumes of coal, producing ash clouds that had broad impact on global oceans.
“This could literally be the smoking gun that explains the latest Permian extinction,” says Dr. Steve Grasby, adjunct professor in the U of C’s geoscience department and research scientist at Natural Resources Canada. …”
Hank Roberts says
> eruptions burnt significant volumes of coal
I recall reading some decades ago that some areas around basalt flows in the western US might be coal beds as well as discussion that the Siberian basalt flows might have done the same thing.
This does look like evidence.
Catastrophic dispersion of coal fly ash into oceans during the latest Permian extinction
Harold Pierce Jr says
I just posted these two comments over WUWT.
RE: M Z’s CS Calculation
His theoretical calculations are probably wrong.
Go to “Climate at a Glance” at the NCDC’s website. Plot the annual mean temp for Texas. The trend for the 1895 to 2010 interval is 0.00 deg F.
The conc of CO2 in dry air has increased by ca 38% since ca 1900.
If this trend continues, doubling the conc of CO2 will have no effect on Texas annual mean temperature.
I really do like Texas cherry pie!
RE: Weather Noise for Feb in Utah is 2 deg F.
I downloaded the Utah Feb monthly mean temperature for the interval 1900 to 2010 from the NCDC site just mentioned.
For each decade I computed Tmean +/- AD, where AD is the classical average deviation from the mean. I then computed the average Tmean +/- AD and got 31 +/- 3 deg F.
I propose that AD = weather noise (WN) + resolution of field thermometer (RFT). Since RFT = 1 deg F, WN = 2 deg F or ca 1 deg C.
The drawback of this method is that WN is most likely site- and season-specific and a small sample interval must be used. Nevertheless, the method provides a way to determine a component of natural variability.
Since I got zero comments, I have concluded the guys over there have no curiosity and are brain dead.
I added this comment: this calculation is best done for Tmax and Tmin separately. It is possible AD might be temp dependent.
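For the record, the decadal computation described above reduces to a few lines of Python; the synthetic February means below merely stand in for the actual NCDC Utah series (whether AD really decomposes into weather noise plus thermometer resolution is a separate question):

    import numpy as np

    # Decadal mean +/- classical average deviation, as described above.
    # Synthetic data standing in for the 1900-2010 NCDC Utah February means.
    rng = np.random.default_rng(1)
    years = np.arange(1900, 2010)
    feb_temp = 31.0 + rng.normal(scale=3.0, size=years.size)  # deg F

    stats = []
    for start in range(1900, 2010, 10):
        d = feb_temp[(years >= start) & (years < start + 10)]
        ad = np.mean(np.abs(d - d.mean()))   # classical average deviation
        stats.append((d.mean(), ad))

    means, ads = np.array(stats).T
    print(f"Tmean = {means.mean():.0f} +/- {ads.mean():.0f} deg F")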
David B. Benson says
Gavin — Thanks for the responses regarding model runs; helpful.
Eli Rabett says
As long as we are into the strange, one of the limits on how long a run can go is the blue screen of death, i.e. your run should be a lot shorter than the MTBF of the computer system it is running on. This might be a strong constraint on amateur systems also.
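Eli’s constraint is easy to quantify: if failures are roughly exponentially distributed with some mean time between failures, the chance an uncheckpointed run finishes falls off fast as the run length approaches the MTBF. The numbers below are made up for illustration:

    import math

    # Survival probability of an uncheckpointed run, assuming exponentially
    # distributed failures: P(survive) = exp(-t / MTBF). Numbers are
    # illustrative only.
    mtbf_days = 30.0
    for run_days in (1, 7, 30, 90):
        p = math.exp(-run_days / mtbf_days)
        print(f"{run_days:3d}-day run, {mtbf_days:.0f}-day MTBF: "
              f"{p:.0%} chance of finishing")

Which is why production climate models write restart files at regular intervals rather than trusting a multi-month run to survive unaided.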
Thomas says
Gavin,
I used to be a supercomputer guy (interactions between algorithms and comp architecture). Still keep up a bit with the field. I’d love to share some of my thoughts on how to take advantage of the evolution in HW that I think is coming in the next several years. If you want to contact me offline, or have a post on how to make future climate codes take maximal advantage of the coming changes, I’d love to contribute to it.
J Bowers says
According to BBC News, Glory failed to reach orbit. Bummer.
sidd says
Ice never ceases to surprise, this is another way it gets around:
http://www.bbc.co.uk/news/science-environment-12619342
I do not recall seeing this in any ice sheet model, that half the ice column can be refrozen ice from water melted elsewhere.
Amazing!
sidd
Hank Roberts says
From that BBC story:
“… water is forced up valley sides to locations of lower pressure, or into ponds in places away from retained heat in rocks, then it will rapidly turn to ice – and can stick to the bottom of the sheet above.
The survey data reveals that this add-on ice makes up 24% of the ice sheet base around Dome A ….”
Also mud – glacially ground-up rock, and whatever old sediment remains from times the area was open water – would be pushed up and out around the edges of the ice cap, the same way, I’d think.
Deconvoluter says
TV reporting in the UK of news from climate science…
is normally done in documentaries rather than news bulletins. So will the advance of Murdoch’s global empire, discussed here, influence public opinion about science?
http://www.guardian.co.uk/commentisfree/2011/mar/03/sky-news-news-corp-bskyb
Apparently Fox TV is in thrall to the pseudo-Dirac hypothesis that every fact (theory) can be made more entertaining by making it interact with its anti-fact (theory):
http://www.washingtonpost.com/wp-dyn/content/article/2010/12/15/AR2010121503181.html
One BBC commentator suggested that it will enable the Murdochs to make sufficient money to prepare for their next targets in German and even Italian TV.
Doug Proctor says
#46, EFS_Junior says: “My own analyses suggest an overall warming of ~4C since the early 1920′s and an overall warming of ~3C since the early 70′s …”
The 1K rise from circa 1922 to circa 1972 is, at 2K/century, “normal” as a recovery from the LIA. From 1972 to 2011, another 3K rise is 7.7K/century, far beyond the modelled CO2 impact in the Arctic by the CO2 rise since 1975. Something else is going on, Junior: you must be picking up a regional change as, say, in cloud cover or ocean currents. Could the data be that bad?
It is interesting to see what happens when others look at the data. I guess that is why many climate change professionals dislike the activity of bloggers: weird things can show up, but what they mean is not necessarily what they look like. And maybe that accounts for strange adjustments (or what look like “strange” adjustments).
The BEST project will satisfy the skeptics only if the data adjustments are looked at very, very hard. The AGW meme exists only if the global temperature rise since the 1970s has the extra 0.4K of warming the skeptics claim is due to uncorrected UHIE and biased urban/rural data changes. The membership of BEST will not be a problem for the skeptics; the dollar influence is not an issue here. Judith Curry still has a foot in both camps; she is just not as convinced on the certainties in AGW as she was.
On another note, there was a skeptical position about CERES and satellite records from a 2002 and 2008 paper about cloud-based albedo changes from 1988 to 2002/2008 (Palle and Goode, I believe) that said a 20% decrease in albedo gave more W/m2 than needed by CO2 forcing. But I see nothing anymore at the CERES site. Did this skeptical line of “evidence” fall off the table with more work? It was initiated by the EarthShine project, and then integrated into satellite work.
John (Burgy) Burgeson says
A couple of days ago there was a “teaser” quiz on a TV news program. The question asked whether Antarctica is growing, stable, or shrinking. The right answer was said to be “growing.” When that appeared on the screen, one person present asked me how GW could be real if that answer was correct. I have looked on several sites, including this one, but have not been able either to confirm the TV answer or to find an explanation of whether, if it is true, it is a valid piece of evidence against GW. Can anyone help?
[Response: It’s not correct. Antarctica is losing mass in the net. The GRACE data show this clearly (though there is some uncertainty in the absolute magnitude). Conceivably your TV person was thinking about accumulation in the interior which does appear to be increasing (as a function of the increase in water vapour and thus water vapour convergence in the southern polar region). But more ice is being lost at the edges (calving/surface melt/basal melt) than is being gained from increased snow accumulation. – gavin]
Hank Roberts says
For Doug Proctor:
http://www.google.com/search?q=site%3Arealclimate.org+palle
Damien says
Re #80:
There seem to be a few self-confessed geeks (myself included) that have a passing interest in the actual computing – some of the trade-offs of running models on some pretty serious hardware.
Would RC consider a post elaborating on this? What sort of modelling is now possible on new hardware not available 5 years ago? What would you like to do, but current hardware is inadequate? Do you see a future in massively parallel devices (for example, graphics cards running CUDA BLAS), or are they unsuitable for your problem? Is http://climateprediction.net/ really useful to you (you, in this instance, the scientific community), or is it a waste of CO2?
Thomas says
John @86. The popular press so often conflates sea ice and land ice that they might have been talking about sea ice extent, which has been discussed here.
Ron R. says
Just noticed this article in Science Daily: Rising Carbon Dioxide Is Causing Plants to Have Fewer Pores, Releasing Less Water to the Atmosphere
http://www.sciencedaily.com/releases/2011/03/110303111624.htm
Some passing thoughts. It says that as CO2 has risen in the atmosphere, the number and size of stomata on the undersides of leaves have dropped; they mention a figure of 34%. The focus is on a lessening of the water vapor released. Kind of seems like a Gaian thing. Since water vapor is a greenhouse gas, perhaps the earth is trying to find a way to turn down the coming heat?
On the other hand, perhaps the stomata are becoming fewer because plants don’t need as many to gather the CO2 they require, since more of it is now available?
Another thought comes to mind: if the above is wrong and fewer stomata mean that plants are actually taking in less CO2 for some reason, that would mean more CO2 left in the atmosphere. In a normal world this per-plant drop in stomata might be offset by a surge in total green growth globally, so that on average the same amount of CO2 is absorbed; in our modern world, with all the deforestation and development going on, perhaps all that carbon will be left to float around the sky, making things hot.
Just some random thoughts.
Snapple says
The man in the orange jacket is Sergei Kirpotin. He is a botanist at Tomsk State who studies the thawing of the permafrost and the release of methane gas.
http://www.youtube.com/watch?v=r2EoyrY05WY
When the CRU emails were hacked, he was the only Russian scientist I could find who spoke up and said that this was a “provocation” against the Copenhagen meeting. I have it here (translation at bottom).
http://legendofpineridge.blogspot.com/2009/12/meet-russian-scientist-from-tomsk.html
Russian Greenpeace reported what he said, but I couldn’t find any big Russian media that reported what this pretty famous Russian expert said. This shows you that there was censorship. A lot of the TV and big media are owned by Gazprom and controlled by the Kremlin.
I could be wrong, of course, because I don’t have an entire science institute housed in a parcel post mailbox like the wizard Bob (Robert Ferguson) of SPPI.
I am not some big expert, but it doesn’t take a big expert to know that there is something fishy about people whose “institutes” are located in PO boxes in Virginia malls.
I personally drove to that strip mall and saw that parcel post store, and SPPI is nothing but a mailbox. My GPS (a satellite) figured it out. Finally, I understood why it said the parcel post store was the SPPI.
I felt like Dorothy in the Wizard of Oz pulling back the curtain. SPPI has a lot of articles by Lord Monckton, who doesn’t have a clue. He even appears on the Kremlin-financed Russia Today Satellite channel to debunk climate change.
Being on Russia Today is the closest SPPI people are going to get to a satellite.
Hank Roberts says
http://climatechange.umaine.edu/news/article/2011/02/25/ancient_catastrophic_drought_leads_to_question_how_severe_can_climate_change_become__extreme_megadrought_in_afroasian_region_likely_had_consequences_for_paleolithic_cultures__c_stager
“February 24, 2011
How severe can climate change become in a warming world?
Worse than anything we’ve seen in written history, according to results of a study appearing this week in the journal Science…”
Richard Palm says
Does this article about the climate effects of aircraft contrails make any sense? I thought clouds were believed to have a net cooling effect.
http://www.suntimes.com/news/nation/4122160-418/increased-air-traffic-may-be-a-factor-in-climate-change.html
Excerpt:
“The trails — formed when moisture condenses around aircraft engine exhaust — create cirrus clouds that block solar energy from above and trap heat below. They may be contributing to warming of the Earth’s surface temperature, NASA studies show.”
The article also mentions having aircraft fly lower as a possible solution, but acknowledges that this would increase greenhouse gas emissions by reducing fuel economy.
Mark says
Geophysical Research Abstracts
Vol. 13, EGU2011-4505-1, 2011
EGU General Assembly 2011
Study of CO2, methane and water vapor sensitivity. Concludes climate sensitivity is 0.41 degrees C per doubling, far less than the average of many other studies. Highly likely they made some questionable assumptions, but I am not a planetary climate modeler. Ray Pierre?
chris colose says
Mark, I’ve only looked into this briefly, and I don’t see anything more available online than a brief description of what he did (an extended abstract). It looks like the guy uses a very simple (and in my opinion useless, though it’s still how you learn about the greenhouse effect in the undergrad curriculum) two-layer climate model. He won’t be able to capture any feedbacks correctly, if at all. I don’t see this as a useful contribution.
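For the curious, the generic N-layer “gray” model Chris is alluding to fits in a few lines; this is the textbook version, not necessarily the exact setup in the EGU abstract:

    # N-layer gray greenhouse model: each layer absorbs all longwave, and
    # energy balance gives Ts = Teff * (N + 1)**0.25. Textbook version only;
    # not necessarily the model used in the abstract under discussion.
    SIGMA = 5.670e-8            # Stefan-Boltzmann constant, W m^-2 K^-4
    S0, ALBEDO = 1361.0, 0.30   # solar constant and planetary albedo

    teff = ((S0 / 4) * (1 - ALBEDO) / SIGMA) ** 0.25
    for n_layers in (0, 1, 2):
        ts = teff * (n_layers + 1) ** 0.25
        print(f"{n_layers} layers: Ts = {ts:.1f} K")
    # 0 layers: ~255 K; 1 layer: ~303 K; 2 layers: ~335 K. No water-vapor,
    # lapse-rate, or cloud feedbacks anywhere in sight, which is the point.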
ccpo says
@Snapple says The man in the orange jacket is Sergei Kirpotin. He is a botanist at Tomsk State who studies the thawing of the permafrost and the release of methane gas.
How about a translation of the video? :-(
Sou says
@Mark #95 – there is some discussion of the paper here:
http://rabett.blogspot.com/2011/03/toy-model.html
Overall conclusion seems to be that the result is in the ballpark (i.e. a similar order of magnitude to more refined calculations), but as Chris Colose said, an interesting approach but ultimately a pointless exercise when there are much better methods for working this out and getting a more likely result.
(One commenter, Eric, drew a parallel: “The classic joke is about physicists oversimplifying this… ‘consider a spherical cow’.”)
Septic Matthew says
86, Gavin in comment: Antarctica is losing mass in the net.
I am glad that we finally got that clarified.
Mark Coupon says
http://www.youtube.com/watch?v=IaKQCY5ybMY
This is how to convince the AGW believers that something is not right at their leadership level.