As many people will have read there was a glitch in the surface temperature record reporting for October. For many Russian stations (and some others), September temperatures were apparently copied over into October, giving an erroneous positive anomaly. The error appears to have been made somewhere between the reporting by the National Weather Services and NOAA’s collation of the GHCN database. GISS, which produces one of the more visible analyses of this raw data, processed the input data as normal and ended up with an October anomaly that was too high. That analysis has now been pulled (in under 24 hours) while they await a correction of input data from NOAA (Update: now (partially) completed).
There were 90 stations for which October numbers equalled September numbers in the corrupted GHCN file for 2008 (out of 908). This compares with an average of about 16 stations each year in the last decade (some earlier years have bigger counts, but none as big as this month's, and much smaller as a percentage of stations). These other cases seem to be mostly legitimate tropical stations where there isn't much of a seasonal cycle. That makes it a little tricky to automatically scan for this problem, but putting in a check on the total number or percentage of such repeats each month is probably sensible going forward.
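For illustration only, here is a minimal sketch (in Python) of the kind of screening check just described. The data layout (a dict mapping station id to twelve monthly means) and the flagging threshold are assumptions made for the example; they are not the actual GHCN file format or NOAA's procedure.

# Minimal sketch, not NOAA's or GISS's actual code: count stations whose
# October value exactly repeats September, and flag the month if the count
# is far above the usual background of ~16 out of ~900 stations.

def count_repeated_months(stations, month_a=8, month_b=9):
    """Count stations where the value for month_b exactly equals month_a
    (0-based indices: 8 = September, 9 = October)."""
    repeats = 0
    for monthly in stations.values():
        a, b = monthly[month_a], monthly[month_b]
        if a is not None and b is not None and a == b:
            repeats += 1
    return repeats

def october_looks_suspicious(stations, max_fraction=0.05):
    """Flag the month if the fraction of exact September==October repeats
    is well above the historical background (threshold is illustrative)."""
    reporting = [m for m in stations.values()
                 if m[8] is not None and m[9] is not None]
    if not reporting:
        return False
    return count_repeated_months(stations) / len(reporting) > max_fraction

if __name__ == "__main__":
    # Toy data: one normal station and one with September copied into October.
    demo = {
        "RS000001": [-20.1, -18.0, -10.2, 0.3, 8.5, 15.0,
                     18.2, 16.1, 9.4, 2.1, -5.0, -15.3],
        "RS000002": [-22.4, -19.5, -11.0, -0.5, 7.9, 14.2,
                     17.5, 15.8, 8.8, 8.8, -6.2, -16.0],  # Sep == Oct
    }
    print(count_repeated_months(demo))      # -> 1
    print(october_looks_suspicious(demo))   # -> True (1 of 2 stations)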
It’s clearly true that the more eyes there are looking, the faster errors get noticed and fixed. The cottage industry that has sprung up to examine the daily sea ice numbers or the monthly analyses of surface and satellite temperatures has certainly increased the number of eyes and that is generally for the good. Whether it’s a discovery of an odd shift in the annual cycle in the UAH MSU-LT data, or this flub in the GHCN data, or the USHCN/GHCN merge issue last year, the extra attention has led to improvements in many products. Nothing of any consequence has changed in terms of our understanding of climate change, but a few more i’s have been dotted and t’s crossed.
But unlike in other fields of citizen-science (astronomy or phenology spring to mind), the motivation for the temperature observers is heavily weighted towards wanting to find something wrong. As we discussed last year, there is a strong yearning among some to want to wake up tomorrow and find that the globe hasn’t been warming, that the sea ice hasn’t melted, that the glaciers have not receded and that indeed, CO2 is not a greenhouse gas. Thus when mistakes occur (and with science being a human endeavour, they always will) the exuberance of the response can be breathtaking – and quite telling.
A few examples from the comments at Watts’ blog will suffice to give you a flavour of the conspiratorial thinking: “I believe they had two sets of data: One would be released if Republicans won, and another if Democrats won.”, “could this be a sneaky way to set up the BO presidency with an urgent need to regulate CO2?”, “There are a great many of us who will under no circumstance allow the oppression of government rule to pervade over our freedom—-PERIOD!!!!!!” (exclamation marks reduced enormously), “these people are blinded by their own bias”, “this sort of scientific fraud”, “Climate science on the warmer side has degenerated to competitive lying”, etc… (To be fair, there were people who made sensible comments as well).
The amount of simply made up stuff is also impressive – the GISS press release declaring October the ‘warmest ever’? Imaginary (GISS only puts out press releases on the temperature analysis at the end of the year). The headlines trumpeting this result? Non-existent. One clearly sees the relief that finally the grand conspiracy has been rumbled, that the mainstream media will get its comeuppance, and that surely now, the powers that be will listen to those voices that had been crying in the wilderness.
Alas! None of this will come to pass. In this case, someone’s programming error will be fixed and nothing will change except for the reporting of a single month’s anomaly. No heads will roll, no congressional investigations will be launched, no politicians (with one possible exception) will take note. This will undoubtedly be disappointing to many, but they should comfort themselves with the thought that the chances of this error happening again have now been diminished. Which is good, right?
In contrast to this molehill, there is an excellent story about how the scientific community really deals with serious mismatches between theory, models and data. That piece concerns the ‘ocean cooling’ story that was all the rage a year or two ago. An initial analysis of a new data source (the Argo float network) had revealed a dramatic short-term cooling of the oceans over only 3 years. The problem was that this didn’t match the sea level data, nor theoretical expectations. Nonetheless, the paper was published (somewhat undermining claims that the peer-review system is irretrievably biased) to great acclaim in sections of the blogosphere, and to more muted puzzlement elsewhere. With the community’s attention focused on this issue, it wasn’t long, however, before problems turned up not only in the Argo floats themselves, but also in some of the other measurement devices – particularly XBTs. It took a couple of years for these things to fully work themselves out, but the most recent analyses show far fewer of the artifacts that had plagued the ocean heat content analyses in the past. A classic example in fact, of science moving forward on the back of apparent mismatches. Unfortunately, the resolution ended up favoring the models over the initial data reports, and so the whole story is horribly disappointing to some.
Which brings me to my last point, the role of models. It is clear that many of the temperature watchers are doing so in order to show that the IPCC-class models are wrong in their projections. However, the direct approach of downloading those models, running them and looking for flaws is clearly either too onerous or too boring. Even downloading the output (from here or here) is eschewed in favour of firing off Freedom of Information Act requests for data already publicly available – very odd. For another example, despite a few comments about the lack of sufficient comments in the GISS ModelE code (a complaint I also often make), I am unaware of anyone actually independently finding any errors in the publicly available Feb 2004 version (and I know there are a few). Instead, the anti-model crowd focuses on the minor issues that crop up every now and again in real-time data processing hoping that, by proxy, they’ll find a problem with the models.
I say good luck to them. They’ll need it.
jcbmack says
Kevin #148, well said.
steven mosher says
RE 89. There are several issues here. One of the driving motivations behind FOSS is to increase reliability: many people looking at the same code, finding solutions, posting patches. However, Stallman and the FSF tend to argue for free software from an ethical point of view: http://www.gnu.org/philosophy/philosophy.html, more explicitly here: http://www.gnu.org/philosophy/open-source-misses-the-point.html where Free software is construed as a social movement.
Torvalds has taken the position that Open Source is preferable on pragmatic grounds. And it’s fun.
(See the title of his autobiography.) It produces better code. There is nothing in either philosophy that advocates or dictates “blank slate” development; in fact, quite the opposite: the community is driven by people reusing other people’s code. And so having good regression testing is a must. If somebody submits a patch, you have to vet the patch before it’s released. That’s what I’m driving at WRT the intermediate data files in GISTEMP. In fact, recently GISTEMP was altered to provide some intermediate data files. This was greatly appreciated. In short, replicating GISTEMP output, final output, is greatly simplified by releasing the data files created at each intermediate step of the program. Finally, there is another motivation behind all of this, and that is the drive toward reproducible research. Reproducible research goes beyond the requirements of supplying code and data. Here are a few sources.
http://www-stat.stanford.edu/~wavelab/Wavelab_850/wavelab.pdf
http://www.reproducibleresearch.org/
http://www.reproducibleresearch.org/articles.html
http://www.stanford.edu/~vcs/papers/Licensing08292008.pdf
http://perspectives.on10.net/blogs/jonudell/Roger-Barga-on-Trident-a-workbench-for-scientific-workflow/
Think about reproducible research as an executable document.
Another good source on Open Science is Peter Suber – a really funny guy in person and on the Tonight Show, and the creator of Nomic.
http://www.earlham.edu/~peters/hometoc.htm
Bottom line: the issue is not pioneer status or priority, science’s version of copyright; the issue is the most effective way of replicating GISTEMP so that the various assumptions it makes (interpolating over the polar regions, excluding certain sites, using “nightlights” instead of population to gauge rurality, merging records from adjacent sites, etc.) can be tested. You see, before one wants to measure the impact of certain assumptions made in the program, one must be able to reproduce the results of the program. The task of reproducing results, as I noted in my post, is made easier by making the intermediate files, files written by the program in question, available to the public. We know the files are written out to disk by looking at the code. It’s a simple matter to include them in the zip files.
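To make the point about intermediate files concrete, here is a minimal sketch of how a replication effort could use them: compare the file written at each step of the reference run against your own run and report where they first diverge. The step file names below are hypothetical, not the actual GISTEMP outputs.

# Minimal sketch: locate the first processing step at which a replication
# run diverges from the reference run, assuming both runs write out the
# same set of intermediate files. File names are hypothetical.
import filecmp
import os

STEP_FILES = ["step0_combined.txt", "step1_adjusted.txt",
              "step2_gridded.txt", "step3_zonal_means.txt"]

def first_divergent_step(reference_dir, replication_dir):
    """Return the name of the first intermediate file that differs,
    or None if every step matches byte for byte."""
    for name in STEP_FILES:
        ref = os.path.join(reference_dir, name)
        rep = os.path.join(replication_dir, name)
        if not filecmp.cmp(ref, rep, shallow=False):
            return name
    return None

if __name__ == "__main__":
    diverged = first_divergent_step("giss_release", "my_run")
    print(diverged or "all intermediate steps match")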
Anything less is just a polite version of “fork U”. (FOSS joke, sorry.)
petras says
I suspect that critical comments would decrease significantly if the parties that made the original mistakes (and promulgated the error) would apologize, act embarrassed, beg for forgiveness, publicly discuss the source of the errors, openly report the steps (and time line) that they are taking to rectify the problem, and gladly receive all offers of help, instead of rushing out a quick response that still contains errors. A massive public apology is in order for this train wreck.
steven mosher says
Gavin, you wrote:
[Updated response: I originally misspoke in the above comment since the data is missing from the collated file rather than non-existent in any file. I’m happy to correct any misinterpretation that might have caused. That collation is the responsibility of NOAA and any queries as to what goes into it or why should be directed to them. Now if people want to correct their insinuations that including Northern Canada data in the last update was akin to a shell game, we might be getting somewhere. – gavin]
As I pointed out in comment #23, John Goetz had determined that the data WAS available in the .dly files at NOAA and the error seemed to have occurred in collating this into the monthly numbers. By the way, thanks for giving him credit.
tamino says
I suspect that the obscenely critical comments will never stop; we’ll hear about this from Steve McIntyre and Anthony Watts until long after the Greenland Ice Sheet has disintegrated.
Although the error originated NOT with GISS but before numbers even got through their door, I’ve heard no cries for heads to roll at NOAA or NWS — just vicious attacks on GISS. The reason? Personal distaste for James Hansen and his prominent advocacy of the need to reduce greenhouse gas emissions.
If an apology is called for, it’s from those who have promoted their agenda by attempting what amounts to character assassination against GISS.
captdallas2 says
[Updated response: I originally misspoke in the above comment since the data is missing from the collated file rather than non-existent in any file. I’m happy to correct any misinterpretation that might have caused. That collation is the responsibility of NOAA and any queries as to what goes into it or why should be directed to them. Now if people want to correct their insinuations that including Northern Canada data in the last update was akin to a shell game, we might be getting somewhere. – gavin]
For someone who isn’t involved in GISTEMP I have to admire your dedication and affinity for a blog that shall remain nameless.
Cheers.
snorbert zangox says
Perhaps Gavin would respond to this.
Cosmic rays are also connected to climate change. In 1998, Henrik Svensmark of the Danish Space Research Institute filled a reaction chamber with the earth’s mix of atmospheric gases, and turned on a UV light to mimic the sun. He was amazed as the cosmic rays coming through the building’s walls quickly filled the chamber with huge numbers of microscopic, electrically charged droplets of water and sulfuric acid—the “cloud seeds” that help create low, wet, cooling clouds in the earth’s atmosphere. Since such clouds often cover 30 percent of the earth’s surface, they can play a crucial role in the planet’s warming or cooling.
Currently, the World Meteorological Organization uses the photochemical model to predict that the Antarctic springtime ozone hole will increase by another 5–10 percent by 2020. In sharp contrast, Dr. Lu says the severest ozone loss will occur over the South Pole this month—with another large ozone-triggered hole occurring around 2019.
If the South Pole gets an ozone-hole maximum in the coming weeks, it will strengthen the case for cosmic rays, and endorse a Modern Warming driven by solar variations rather than human-emitted CO2. The solar model is already endorsed by oxygen isotopes in ice cores from both Greenland and the Antarctic, by microfossils in the sediments of nine oceans and hundreds of lakes worldwide, and by cave stalagmites from every continent plus New Zealand.
The case for a solar-driven climate is also strengthened by a drop in global temperatures over the past 18 months: The temperature decline had been forecast by the sunspot index since 2000, but was not predicted by the global climate models.
The original is at http://www.cgfi.org/2008/10/03/record-south-pole-ozone-hole-predicted-by-dennis-t-avery/
Jared says
Tamino-
I suspect that criticism of poor data gathering/quality control/publication never will stop, nor should it.
No matter how you feel about them, NASA GISS is regarded as one of the premier authorities on global temperature. For such an organization, there is no excuse for not having an error-checking system in place that would prevent gaffes like this. It doesn’t matter where the data is sourced from; GISS is responsible for making sure it is accurate before they go public with it.
That is not character assassination, that is common sense and taking responsibility.
Hank Roberts says
Petras, are you talking about the worldwide collapse of the financial system, or the brief error in reporting some temperatures?
A sense of proportion suggests the former, but alas, if so, you’re in the wrong forum.
High dudgeon sometimes tries to overreach.
David B. Benson says
tamino (154) wrote “If an apology is called for, it’s from those who have promoted their agenda by attempting what amounts to character assassination against GISS.”
I agree. The whole thing is a tempest in a teapot.
Lawrence Brown says
Regarding comments that because of these errors, some should lose, or beg for, their jobs. Get real! If everyone who has made mistakes were to lose their jobs, we’d have a 99.999% unemployment rate. The only exception that comes to mind is our Chief Executive, who at a past news conference, when asked if he’d made mistakes, replied that if given a couple of weeks he might think of something.
“I used to be conceited, but now I don’t have any faults” (author unknown)-possibly someone from the API or a top executive of the American auto industry
My sense is that GISS is being scapegoated; they’re being singled out for attack because they’re one of the biggest thorns in the side of those who still wear blinders on the cause of our current changing climate.
Ron Taylor says
#154 Tamino – I could not agree more. This is the kind of nonsense that wears everyone down. Let’s just get to the facts, please, and stop looking for ways to clobber GISS, which is one of the heroic organizations in the global warming crisis (for that is what it is rapidly becoming!).
Philip Machanick says
I’ll worry about this kind of error when you guys are not quick to correct yourselves. I’ll give the inactivist bunch a teensy bit more credit when they are quick to acknowledge their errors, which are numerous and often repeated long after they’ve been pointed out.
I wonder where the people getting so excited about the occasional error do their science. Peer review is a filter, but not perfect. A lot of mistakes slip through. Competition from other scientists who try to replicate results and cross-check against other work keeps things on the straight and narrow. The fact that this error was spotted quickly and not by the people who created or posted the data is good. It means the system is working.
Blogging on the other hand has no quality checks whatsoever as noted before here (and on my own blog).
Maurizio Morabito says
If the whole thing was a “tempest in a teapot” and a “molehill”, why did it need 1,100+ words to be explained, and more than 150 comments?
[Response: More words are devoted to much more trivial topics (Paris Hilton anyone?) and I am not responsible for what people want to comment on. But perhaps you would care to give a monetary damage estimate for this little episode? Then we could compare it to a chocolate bar or a beer or something… – gavin]
Hank Roberts says
Hey, a mountain just moved:
http://www.desmogblog.com/breaking-epa-kills-us-coal-plants
sdw says
I fear many here miss the point. This is not a ‘usual’ science. The data sets, the hypotheses, the conclusions – all will have a profound effect on civilisation. Not to mention the opportunity cost, billions of dollars of research funds that could be spent in other scientific areas, such as medical research.
Outside of science, say for example in the engineering arena, there is audit, there is review, there is sound replication – this is because mistakes cannot be allowed (otherwise bridges fall down). What the climate science community needs to get used to is that they are an unusual science, and that they are the ones that need to adjust – not McIntyre, not Watts.
With respect to this topic, the sooner the data sets and the finest details of methodologies are open to absolute scrutiny (and replication) the better. If AGW is the dire problem as asserted, then this should trump individual or laboratory ‘ownership’ of algorithms and methods.
regards, sdw (scientist and engineer)
wayne davidson says
156 David, it is good to see who gets excited about this systemic human error, and not excited about trying to understand the climatic mysteries unfolding now. I would judge those who are fascinated by climate mysteries the most powerful climate experts around. I find those quick to criticize a wonderful service, which is reliable by its honesty amongst other virtues, to be experts in sarcasm by far. What do they know about climate when they spend so much time refining vengeful strategies designed to weaken the best science messengers we have?
petras says
I guess that I come from much humbler stock, and I just don’t have the ego to play in the arena. If I screw up, I certainly apologize, and I do my best to ensure that it doesn’t happen again. I feel that my credibility is built upon an honest appraisal of my own work and of others; not invectives or updates released without much reflection. My research group would have handled a botched international release of important policy-relevant data differently. I hope that there are others that feel the same. Perhaps I’m just too old fashioned for this competitive climate-science research.
RPauli says
“Humankind cannot bear very much reality” said T.S. Eliot
Denialism, like delusion, may be an understandable psychological reaction to the stress of realizing the news is very bad, possibly bleak. And reactions must be swift and painful, and no matter how much we sacrifice, we can only mitigate and adapt – not fix.
Certainly some will always have tunnel vision preventing them from seeing the looming danger ahead. The added challenge is to be polite and tolerant while continuing to attack the problem.
dhogaza says
Good engineers and good project management, coupled with adequate resources, produce better code.
Regardless of development methodology, you need people who know what they’re doing. One thing successful Open Source projects do is to run as a meritocracy. You need to attract expert compiler writers if you want to produce a truly excellent compiler (and, speaking as a professional compiler writer, I’d have to say that early versions of gcc, at least, truly sucked, though I imagine the robust development community that built up around it coupled with decent funding has fixed that).
The original version of Unix was proprietary, and excellent. I hacked on the first version released in source form (to educational institutions) back in 1974. Better or worse than Linux? There’s no way Linux would be able to run on the limited memory resources offered by the PDP-11 architecture, so from one point of view OS Linux sucks compared to proprietary Unix System 7.
If you want to build an independent, Open Source, competitor to GISS you’re going to have to attract people who are familiar with the science and the data munging techniques usual in the field. Blaming the failure of your project on a lack of cooperation by GISS staff is, well, not terribly convincing.
Speaking as someone who makes his living writing code for, and managing, an Open Source software project, and has done so for nearly a decade now. After having previously made his living developing the proprietary technology underlying a rather large list of minicomputers back in the 1970s and 80s, for a company he helped fund with sweat equity.
BTW, the OS model would not have worked in the 1970s. Few people would’ve been able to afford $50K or so for a personal development platform (I’m considering inflation here), and the international communications network really didn’t exist (Usenet opened the door but the internet made things practical) that would support a distributed development model. Proprietary or not. I’d argue that the distributed development model is necessary for OS development of large scale software.
Hank Roberts says
Hey, you’ve got friends everywhere — another major source of error in the warming figures just hit the news. I wonder why this is being reported now?
Today’s Wall St. Journal:
Toxic Cloud Masks Warming Effects
By SHAI OSTER
BEIJING — A roughly two-mile thick cloud of soot and smog hanging over most of Asia is wreaking havoc on agriculture and health but masking the effects of global warming, a United Nations study found.
The atmospheric brown cloud made of different particles resulting mostly from burning coal is causing hundreds of thousands of deaths in Asia and billions of dollars in economic losses, the study said. But it helps reduce the impact of climate change by between 20% and 80%, said the report released Thursday by the Project Atmospheric Brown Cloud, established by the United Nations Environment Program…..
—————–
Bloomberg.com writes:
The pollution makes skies from Beijing to Tehran darker by blocking sunlight as it is absorbed by particles linked with burning fossil fuels. Guangzhou, a city in southern China, has reported a 20 percent drop in sunlight since the 1970s.
The cloud may also be contributing to shifts in weather patterns, including drought in northern China and flooding the southern region of the country as well as affecting the seasonal monsoon, the UN said. At the same time, it may be having a cooling effect on the planet as some soot and biomass material reflects sunlight, the report said. ….
jcbmack says
#172, well presented… dhogaza, if you believe you can provide improvements, contact NOAA; perhaps you can assist them in quality support.
Martin Vermeer says
Gavin:
…illustrating the futility of building in specific tests for all the possible concrete mistakes you can imagine. There will always be one more that you didn’t think of.
Reminds me of the way Windows anti-virus works :-(
The sad reality of life is that any error that slips through the routine consistency checks in the GIStemp processing chain, even an after-the-fact embarrassing one, can only be found by someone sitting down and looking at the data in detail. If you don’t have the resources for this, you won’t find it, but someone with too much free time on their hands just may.
The proper question to ask is the one asked by Kevin in #138: if such errors are present in the data, how do they, alone or together, affect the estimated trends? And mathematically naive or not, the answer indeed is “not a lot”. This analysis has to be made anyway, because there are gross errors in the data. You bet there are. Trying to eliminate them all is a dream, and not even a pretty one, to paraphrase von Clausewitz. All the consistency checks do is put an upper bound on their size.
Jonas N says
Seriously,
This more and more turns into a joke.
Disregard for a second the edgy and defensive tone of Gavin, who is spending the better part of his time defending something that is not his responsibility, on the grounds that it is not the responsibility of the guys in charge either, who only happen to sit in the same building, but whose combined effort cannot, it seems, spare the time to occasionally (once a month) check whether the latest temperature compilation is completely out of bounds.
This being said about one global temperature record competing with only three other records for being the most relevant available …
OK, here’s how it goes:

First GISTEMP publishes a map with an October 2008 anomaly of 0.86°C, mostly because of a giant blood-red blob over Siberia/Russia with an anomaly of about +12°C. Warmer than September 08, in spite of rapid growth of ice and snow cover! The hottest global October ever on the record.

This is picked up immediately by people following these releases. It is found extraordinary, questioned, and eventually dismissed as faulty. Even the probable origin of the giant error is identified and reported to GISTEMP principals.

Well, after some quibbling, the data is withdrawn, pending a check and update, whereafter a new map appears. Now the red over Siberia is somewhat less, the October anomaly is stated as 0.65°C, but instead northern Canada has heated noticeably since the first version.

Again, this is quickly noticed and commented on by several laymen.

Checking in at GISTEMP again yesterday reveals yet another updated version. This time with major parts of Canada gray (= no data) or cooler again, and the October anomaly is stated as 0.61°C?

Strangest thing! And redrawing the map with only a 250 km radius, for better spatial resolution of the data, reveals essentially no Canadian data at all, nor any Antarctic data points; Siberia looks more red again, and here we are informed of an October anomaly of 0.78°C!?

C’mon! Even if one wanted to take all these different datasets/maps seriously, or more important, use them and their historical record for some more serious purpose. Or just use the data for anything of marginal value …
How would one go about that?
[Response: The numbers from various stations come in over a long period of time – sometimes months. Thus regardless of the specific glitch that occurred this month, these maps/numbers are always preliminary. In the December update, there will be more data for October, and the same will be true in January. The GISTEMP analysis is not the raw data – and it is the raw data that is the ‘historical record’. – gavin]
Mark says
Ron Taylor, #162. Is this slow? There’s a SEVEN YEAR MISTAKE that has only just been corrected in Microsoft’s code.
And how would you know that September figures were going into October figures until October was over? So we have a fortnight, four weeks tops (since halfway through you *may* be able to think “hang on, this month looks like last month”: you can’t tell on a day’s data, you know).
Now how long does it take Microsoft to patch their systems? Once a month, isn’t it? And this is deemed acceptable (else there would be no outrage at the disclosure of unfixed bugs by the white hat community).
So I ask: Is this slow?
I say not.
Mark says
Sorry, that last was meant for Philip Machanick #163. Can you change it?
Ta.
Barton Paul Levenson says
gerrym writes:
Just because you don’t know, it doesn’t follow that no one knows. Is “what [you]’ve read of the debate” from web sites like CO2science and Watts Up? Or is it from the peer-reviewed science literature?
Barton Paul Levenson says
petras writes:
Not only that, they should have to walk to the Vatican on their knees, wearing sackcloth and ashes, and carrying a candle.
Get real. The “critical comments” from people who want to stop AGW mitigation will continue whatever the scientists do.
Barton Paul Levenson says
snorbert —
The fact that cosmic rays are a mechanism for cloud formation does not mean they are a major factor in Earth’s climate. The GCR flux in Earth’s atmosphere is about 5 particles per square centimeter per second, which isn’t enough to generate a serious amount of clouds.
What’s more, the GCR flux has been stable for 50 years, so it can’t have contributed to the sharp upturn in global warming of the last 30.
dhogaza says
It’s not a screw-up to fail to anticipate every possible way in which people collecting data may make a hash of it. GISS does run sanity checks on the data, but they didn’t think of this possibility (the thoroughly unprofessional error of submitting exactly the same data two months in a row).
You might as well sharpen your knives because I can guarantee that there are other ways for the reporting agencies to screw up the data that GISS has not anticipated. Given that there are (practically speaking) infinite ways to screw up, do you honestly claim that GISS should be able to anticipate, and automatically screen for, all of them?
Why aren’t the denialists demanding an apology FROM THE PEOPLE WHO SUBMITTED THE SCREWED-UP DATA IN THE FIRST PLACE?
Those are the people who screwed up. Not GISS.
Perhaps you’re just too old fashioned to recognize attacks that are made purely for political reasons, with no view to improving the science.
In case you hadn’t noticed, climate science wasn’t affected by this snafu. The data would’ve been recognized as being incorrect in due course, and corrected. It’s no big deal in the big-picture view that scientists are concerned with.
As a scientist, you should know that all data is wrong by definition, the only question is the magnitude of the wrongness, and whether or not it is so wrong as to be useless. The politically-driven denialists are intent on trying to convince people that since occasional errors are caught by people outside GISS, then all the work done by GISS is useless. As a scientist, you should know this is bull.
Ray Ladbury says
Actually, this current episode provides an excellent example of why you don’t want just anybody poking around the data–amateurs don’t have the experience or judgment to recognize what is and is not significant. What we have is a trivial error that would have been corrected rapidly in any case, and it excites a seizure of chest-thumping in the denialosphere. If these guys are going to jump on every single error as a “smoking gun,” they have no business poking around anything scientific. Fix the frigging error, check for such errors in the future and move on. That’s how real scientists would handle it–indeed that is how it is being handled, while everyone else seems to be distracted by the antics of the short-bus crowd in the side rings.
Rod B says
Just an idle clarification for the record from the peanut gallery. One of my areas of skepticism toward AGW is the reliability and credibility of global temperature measurements. While other skeptics may jump all over the molehill/mountain example of this thread, I personally find it benign and not supportive of my skeptical concern. IMHO, grasping at straws for an AHA! moment by some of my fellow skeptics is unbecoming and unscientific.
Hank Roberts says
> When? And by whom?
By people like Santer, and McI can’t be unaware of the history here.
http://pubs.acs.org/subscribe/journals/esthag-w/2006/aug/policy/pt_santer.html
How much abuse like that do you expect anyone to take and remain open with and trusting of strangers, whiners, and namecallers like those?
One more reason science journals — not blogs and PR — are the place error correction is done. Because it’s hard enough as it is to do good science without adding more opportunities for uncompensated abuse to their lives.
Now — did you even bother to click the link? Do you have any awareness of the history here?
Every field has episodes everyone should keep in mind. That’s one.
Hank Roberts says
Gavin, if you agree with Martin’s reply recently
—
Martin Vermeer 14 November 2008 at 1:14 AM
—
May I suggest that (and a bit about the ongoing travails) be promoted as an addition to the main post?
The topic will roll on; it’d be good to keep an update with the best info at the top, for new readers coming in later.
Mark says
RodB, #182. Why? What is it that makes you decide that the numbers aren’t adding up?
Why are the AGW figures open to disapproval when you do not ask the same of people? Such as snorbert zangox (#157)? Paul M (#143), Branden (#109)? Steven Mosher’s statements in #89? Jared’s assertion in #88, your own figure pulled from the air in #86 (in which your skepticism didn’t extend to Pierre in #75 – did YOU ask whether the figures constituted GIGO in a climate sense? didn’t see one), or, indeed, whether Pierre’s assertion of affecting people’s lives in #74 had any figures to back it up.
Rob’s assertions went without comment from a “skeptic” from #66. Never heard you asking why Patrick in #65 said it was time to move on. Even Martin’s (#63) ridiculous statement didn’t raise a single peep from you.
BD’s bold assertion (#60) didn’t result in any queries from a skeptical person such as yourself and the cost estimate from ashby (#59) was unquestioned by you or your compatriots.
Hank had to step in in #57 to ask the pertinent skeptical question of Alistair (#20), though there was ample stupidity and unscientific gobbledygook for a “scientist and a skeptic” like yourself. Ellis (#55) went unqueried by you, as did gerrym (#50), sky (#29), Lyn (#12), julius (#7) and captdallas2 (#4), all of whom made statements – STATEMENTS, not considered explanations – that you did not feel compelled to question.
Again in #33 you make a demand without wondering or giving a reason why this would work in real life. You didn’t try and find the problems, you merely gave out the idea and assumed it works. Hardly skeptical, is it?
And all that just on this one thread.
A pattern emerges, though. I have not yet seen you query anyone who is trying to say that AGW doesn’t exist or that the IPCC/this site/government are overestimating the problem and underestimating the cost. You have never turned skeptic at the “you want us to live in caves!!!” rhetoric and when someone comes up with a well understood and debunked claim (hockey stick) you remain quiet.
So many cases and so strong a correlation.
Are you skeptical, or just skeptical of anything that would require you to change?
dhogaza says
Bridges don’t fall down despite “review, audit, and sound replication”?
This will be news to the good citizens of Minneapolis. Hurry up and make sure they are informed that the I-35 bridge didn’t fall down. Apparently the earth and river fell up …
jcbmack says
Just take a course in meteorology and one course covering climate; the chemistry of CO2 is pretty clear. Even if one does not blog or read sites on AGW, scientists way outside the major groups know it is a fact.
Hank Roberts says
jcbmack, can you give cites, please, for the many statements of fact you’re making? They may very well be accurate, but it’s a time sink trying to check what you say.
Given no last name, I can’t look up your publications and decide whether to trust your opinions based on your work and those citing your work. Without that track record, cites to sources are the next best thing.
It would really help.
Joseph O'Sullivan says
#165 (Hank Roberts)
“Hey, a mountain just moved” Indeed it has
And the waters may be parted:
http://dotearth.blogs.nytimes.com/2008/11/13/water-laws-may-be-used-to-fight-warming/
dp says
The Russian stations weren’t the only error. Originally it showed temperatures over Britain higher than normal, which was certainly not the case. I now see they have been corrected. What caused this?
[Response: It was the same issue. – gavin]
jcbmack says
I have given my references, the books, the journals and some of the courses I have taken; it is quite easy to see if I am accurate. #188
jcbmack says
I am just into science. I teach and tutor part time and have free time to write here and stay abreast of the latest developments in climate science, which affects us all.
Hank Roberts says
jcbmack, maybe it’s “easy” but you’re asking us to redo your work.
Your typos show you’re in a hurry; please slow down, and give sources. It’s a courtesy to the reader who most needs your help.
The next reader may be a grade schooler who deserves the best help you can offer. That would be citations.
With a good cite, any kid can ask a librarian to find the source and check what you say about it.
Else you’re just “some guy on a blog said” and making work needlessly.
Please.
David B. Benson says
jcbmack — As an example of Hank Roberts’s request to you, I recently needed to check the approximate sea level around 10,000 years ago. So I looked at
http://www.globalwarmingart.com/wiki/Wikipedia:Sea_level_rise
to discover that it was about 50 meters below the current level.
Now anybody can check the same for themself, with little effort.
[reCAPTCHA agrees, intoning “cheaper said”.]
jcbmack says
The typos come from being rushed due to being busy, but here are the references I have used or alluded to lately on Realclimate.org: Organic Chemistry by Morrison and Boyd (pick an edition you can find); Organic Chemistry by Loudon; Physical Chemistry by Peter Atkins; Advanced Inorganic Chemistry by Peter Atkins; Calculus with Applications by Daniel L. Auvil; Advanced Molecular Biology by R.M. Twyman; Fundamentals of Biochemistry: Life at the Molecular Level (the little boat) and Biochemistry (the big boat :) by Voet, Voet, and Pratt; Global Climate Systems: Patterns, Processes, and Teleconnections (2006) by Howard Bridgman, John Oliver, and Michael Glantz; Global Biogeochemical Cycles in the Climate System (2001) by Ernst Schulze, Sandy Harrison, Martin Heimann, Elisabeth Holland, and Jonathan Lloyd; The Rough Guide to Weather (2002) by the meteorologist Robert Henson; Earth magazine and http://www.earth.magazine.org; and the journal Nature and http://www.nature.com. I also keep up with reports from Scientific American for anything way outside my field or that I may have missed in other magazines and journals. Usually I quote and leave the reference or state the name of a book once or twice and then I cease citing them, but I use Environmental Organic Chemistry by René P. Schwarzenbach, Philip M. Gschwend, and Dieter M. Imboden (which I highly recommended early in my posting debut here; a thick book with a lot of information) and the Pchem textbook virtually constantly; also Statistical Mechanics by R.K. Pathria, and molecular physics and biophysics texts (pick an author you like, they all get nasty, detailed and sometimes vague :)
All of the appropriate keywords from Nasa.gov, PBS.org, and the listed publications by the moderators here at realclimate.org, most of whose books I have gotten my hands on and read :)
Physics.web graduate and post-graduate (teacher selection): Einstein, relativity, gravity, Faraday cage, atmospheric fluid dynamics, solar forcing, etc… (you get the idea); and everything on climate, weather, physics, models, etc. available on Britannica.com, but then again I also have a well-read set of Britannica anyway;
http://www.giss.nasa.gov/tools/modelE/
http://www.aip.org/history/climate/GCM.htm
jcbmack says
I also recommend for those who lack a background or are rusty on stats: Elementary Statistics by Mario F. Triola, and get a subscription to at least one of the aforementioned magazines, or, if you are really new to science, get the magazine New Scientist; read David Archer’s books and publications. Wikipedia is not just unacceptable in academic writing and in referencing for peer review; its accuracy and details are also in serious question. Britannica is superb for introductory and detailed preliminary information and anyone can read it, even if they are in grade school, if they are patient and dedicated. Also Climate in Prehistory by William James Burroughs may be of help to some, as may Global Warming and Global Politics by Matthew Paterson, Systems Biology by Bernhard O. Palsson, Systems Biology by Kathryn Johnson, and Physics in Molecular Biology by Kim Sneppen and Giovanni Zocchi. Many of these books can be found in part online via Google Books/Scholar, and some in public libraries; others, if you have ebooks linked to a major library, you should be able to find. I have these books in my personal library, and for others I can recommend on specific topics, downloaded to my computer, upon request I can get you the info.
Most of my references are NOT understandable by the typical grade school, junior high, high school, or college freshman student, but there are enough there to get them started. The rest are advanced undergraduate, graduate, and training texts for those in a given field… read and learn; if you require more advanced stuff, or less, I can lead you in the right direction :)
jcbmack says
The plane and medical analogies were extracted from: Harrison’s Principles of Internal Medicine, and DP Davies, Handling The Big Jets.
jcbmack says
Oh and I will watch the typos a little more, but this is not english class, you know what I mean even if Global Climate change is spelled as GLobael Climate chaneg:)
jcbmack says
http://www.cambridge.org/us/catalogue/catalogue.asp?isbn=9780521844192
http://www.noc.soton.ac.uk/JRD/OCCAM/
http://www.noaa.gov/ lots of info there.
Also a decent Analytical Chemistry textbook assuming one has a chem background.
It is my desire that people learn, but also people need to read, learn and form a foundation for critical thought, utilizing the tools of science and math.
David B. Benson says
I’ll defend Wikipedia’s entries related to climate as generally an accurate and clear place to begin. Of course one needs to look to the references and other sources if more substance is required.