This is a continuation of the last thread, which was getting a little unwieldy. The emails cover a 13-year period in which many things happened, and very few people are up to speed on some of the long-buried issues. So to save time, I’ve pulled a few bits out of the comment thread that shed light on the context missing from the discussion of various emails.
- Trenberth: You need to read his recent paper on quantifying the current changes in the Earth’s energy budget to realise why he is concerned about our inability currently to track small year-to-year variations in the radiative fluxes.
- Wigley: The concern with sea surface temperatures in the 1940s stems from the paper by Thompson et al (2007) which identified a spurious discontinuity in ocean temperatures. The impact of this has not yet been fully corrected for in the HadSST data set, but people still want to assess what impact it might have on any work that used the original data.
- Climate Research and peer-review: You should read about the issues from the editors (Claire Goodess, Hans von Storch) who resigned because of a breakdown of the peer review process at that journal, that came to light with the particularly egregious (and well-publicised) paper by Soon and Baliunas (2003). The publisher’s assessment is here.
Update: Pulling out some of the common points being raised in the comments.
- HARRY_read_me.txt. This is a four-year-long work log of Ian (Harry) Harris, who was working to upgrade the documentation, metadata and databases associated with the legacy CRU TS 2.1 product, which is not the same as the HadCRUT data (see Mitchell and Jones, 2003 for details). The CRU TS 3.0 is available now (via ClimateExplorer for instance), and so presumably the database problems got fixed. Anyone who has ever worked on constructing a database from dozens of individual, sometimes contradictory and inconsistently formatted datasets will share his evident frustration with how tedious that can be.
- “Redefine the peer-reviewed literature!”. Nobody actually gets to do that, and both papers discussed in that comment – McKitrick and Michaels (2004) and Kalnay and Cai (2003) – were cited and discussed in Chapter 3 of the IPCC AR4 report. As an aside, neither has stood the test of time.
- “Declines” in the MXD record. This decline was written up in Nature in 1998, where the authors suggested not using the post-1960 data. Their actual programs (in IDL script), unsurprisingly, warn against using post-1960 data. Added: Note that the ‘hide the decline’ comment was made in 1999 – 10 years ago – and has no connection whatsoever to more recent instrumental records.
- CRU data accessibility. From the date of the first FOI request to CRU (in 2007), it has been made abundantly clear that the main impediment to releasing the whole CRU archive is the small percentage of it that was given to CRU on the understanding it wouldn’t be passed on to third parties. Those restrictions are in place because of the originating organisations (the various National Met. Services around the world) and are not CRU’s to break. As of Nov 13, the umpteenth FOI request for the same data met with exactly the same response. This is an unfortunate situation, and pressure should be brought to bear on the National Met. Services to release CRU from that obligation. It is not, however, the fault of CRU. The vast majority of the data in the HadCRU records is publicly available from GHCN (v2.mean.Z).
- Suggestions that FOI-related material be deleted … are ill-advised even if not carried out. What is and is not responsive and deliverable to an FOI request is however a subject that it is very appropriate to discuss.
- Fudge factors (update) IDL code in the some of the attached files calculates and applies an artificial ‘fudge factor’ to the MXD proxies to artificially eliminate the ‘divergence pattern’. This was done for a set of experiments reported in this submitted 2004 draft by Osborn and colleagues but which was never published. Section 4.3 explains the rationale very clearly which was to test the sensitivity of the calibration of the MXD proxies should the divergence end up being anthropogenic. It has nothing to do with any temperature record, has not been used in any published reconstruction and is not the source of any hockey stick blade anywhere.
Further update: This comment from Halldór Björnsson of the Icelandic Met. Service goes right to the heart of the accessibility issue:
Re: CRU data accessibility.
National Meteorological Services (NMSs) have different rules on data exchange. The World Meteorological Organization (WMO) organizes the exchange of “basic data”, i.e. data that are needed for weather forecasts. For details on these see WMO resolution number 40 (see http://bit.ly/8jOjX1).
This document acknowledges that WMO member states can place restrictions on the dissemination of data to third parties “for reasons such as national laws or costs of production”. These restrictions are only supposed to apply to commercial use; the research and education community is supposed to have free access to all the data.
Now, for researchers this sounds open and fine. In practice it hasn’t proved to be so.
Most NMSs can also distribute all sorts of data that are classified as “additional data and products”, and restrictions can be placed on these. Such special data and products can range from regular weather data from a specific station to maps of rain intensity based on satellite and radar data. Many nations do place restrictions on such data (see the link for additional data on the above WMO-40 webpage for details).
The reasons for restricting access are often commercial: NMSs are often required by law to earn substantial income from commercial sources. In other cases it can be for national security reasons, but in many cases (in my experience) the reason simply seems to be “because we can”.
What has this got to do with CRU? The data that CRU needs for their database comes from entities that restrict access to much of their data. And even better, since the UK has submitted an exception for additional data, some nations that would otherwise provide data without question will not provide data to the UK. I know this from experience, since my nation (Iceland) did send in such conditions, and for years I had problems getting certain data from the US.
The ideal that all data should be free and open is unfortunately not adhered to by a large portion of the meteorological community. Probably only a small portion of the CRU data is “locked”, but the end effect is that all their data becomes closed. It is not their fault, and I am sure that they dislike these restrictions as much as any other researcher who has tried to get access to all data from stations in region X in country Y.
These restrictions end up wasting resources and hurting everyone. The research community (CRU included) and the public are the victims. If you don’t like it, write to your NMSs and urge them to open all their data.
I can update (further) this if there is demand. Please let me know in the comments, which, as always, should be substantive, non-insulting and on topic.
Comments continue here.
AC says
And it’s not the CRU data, let’s say – but 80% of the recent paleo stuff in the recent IPCC report came from CRU, Briffa and Mann – see Ch 6.
So where’s the “virtually certain” support for an unprecedented warming period over the last 50 years? Where’s the “virtually certain” support for “the only forcing that would explain this is human generated CO2”? If the paleo data is suspect or the error range is too large, we don’t really know how warm the globe was 1,000 years ago, much less 10,000. So the “forcings” are unclear, and the temperature variation is unclear. Not disproven, sure, but certainly not “virtually certain”.
[Response: What are you quoting? The IPCC statements are that the 20th Century has “unequivocally” warmed (which is true), and that it is 90% confident that most of the rise is due to anthropogenic effects. There is no statement that it is “virtually certain [that] the only forcing that would explain this is human generated CO2” – this appears to be a strawman of your own devising. Your other claims are similarly made up. – gavin]
If the other data is so great, why does the IPCC seem to rely almost exclusively on the CRU stuff?
[Response: Try reading the report. Or better, tally up all the references with a CRU author and compare them to the total. – gavin]
See AR4, Ch 6, table 6.1, page 469. 100% of the instrumental temperature records come from the CRU group. Of the 12 proxy-based reconstructions, 7 include CRU people or Mann, another relies on CRU data, and another states
“Therefore we stress that presently available paleoclimatic reconstructions are inadequate for making specific inferences, at hemispheric scales, about MWP warmth relative to the present anthropogenic period and that such comparisons can only still be made at the local/regional scale. ”
(D’Arrigo)
Again, why such significant reliance on just a very few people, if there are truly other reliable data sources that show “man-made forcings” and an “unprecedented rise”?
[Response: You again have it very wrong. Man-made forcings are discussed extensively in Chapter 2 (not Chapter 6), and I don’t know that anyone from CRU is involved in that side of the research. And of course if you include all the authors who’ve worked in the field of paleo-reconstructions, that will be 100% of the papers! (A conspiracy, I tell you!). – gavin]
huxley says
You said [947]: “Your point is to insist on some impossible standard so that you can avoid paying any attention to the results. My point is that there are completely open projects that if you were really concerned with evaluating you would do so. That you don’t understand my point, simply underlines the issue. – gavin]”
Gavin: How am I insisting on an impossible standard here? The entire point of the scientific method is to allow others to reproduce one’s results.
[Response: You can reproduce the CRU results using the publicly available GHCN data – as indeed GISTEMP and NCDC actually have. If you want to know that the climate has warmed, then we are done (and if you don’t like the GHCN data, look at ICOADS, or the glaciers etc.). When CRU does get permission to post its whole database I guarantee that you will still not be happy. You will ask for more data, maybe even going back to the handwritten weather report sheets from every station in the world (which of course CRU does not even have). Without that you will declare that this is still not science because you can’t go back in time and reproduce exactly the same measurement. The point is that there are limits on what can be reasonably provided (beyond the specific issue here of the NMS data), and yet your fundamentalist stance will not allow for any leeway. Everything or else. Thus it is an impossible standard. To convince me otherwise, state beforehand what amount of data you will decide is ‘enough’ data for reproducibility of the temperature record. – gavin]
[edit]
You don’t seem to understand. You and your colleagues have a PR disaster on your hands. You need to restore trust that you are acting in good faith. Demanding that someone like myself personally audit some aspect of the AGW project is beside the point and comes across as patronizing — exactly the sort of PR you don’t need now.
[Response: You are asking for something you have no idea about because it suits you politically to ignore the vast amount of information that is available. Do you need 80% to check something? 90%? 95%, 99%, 99.9%? If the answer is that nothing short of 100% of everything is sufficient, then you are claiming that the remaining 0.1% of the data is somehow determining the whole answer (even though with 80% of the data you get pretty much the same thing). Remember that just because something makes a nice rhetorical point, it doesn’t actually make it true. – gavin]
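The “with 80% of the data you get pretty much the same thing” point can be sketched with a toy subsampling calculation. Everything below is synthetic and illustrative – the station count, noise level and trend are assumptions for the sketch, not values taken from GHCN or any real archive:

```python
import numpy as np

rng = np.random.default_rng(1)
n_stations, n_years = 1000, 50
true_trend = 0.02                      # assumed shared trend, °C per year

# Synthetic station anomalies: a common trend plus independent station noise
t = np.arange(n_years)
stations = true_trend * t + rng.normal(scale=0.5, size=(n_stations, n_years))

def mean_trend(data):
    """OLS trend of the all-station average series, in °C per year."""
    return np.polyfit(t, data.mean(axis=0), 1)[0]

full = mean_trend(stations)
# Drop a random 20% of stations and recompute
keep = rng.choice(n_stations, size=int(0.8 * n_stations), replace=False)
subset = mean_trend(stations[keep])
print(f"all stations: {full:.4f} °C/yr, 80% subset: {subset:.4f} °C/yr")
```

Because the estimate averages over many stations with independent noise, removing a random fifth of them barely moves the recovered trend – which is the sense in which a withheld few percent of a large archive cannot be “determining the whole answer”.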
Michael Trigoboff says
There are two claims:
1) The climate is warming.
2) Human emissions of CO2 are causing a significant part of it.
I am discussing claim 2. The CRU Hack has called into question the quality of the software and data underlying claim 2. It is irrelevant whether or not claim 1 is true. If claim 2 is false, actions regarding human emissions of CO2 are completely beside the point.
[Response: No it doesn’t. Attribution of climate change does not hinge on tree ring data, nor does it hinge on HadCRUT and so point 2 is not affected in the least by tossing out anything for CRU that you don’t like. – gavin]
JFK says
I see some confusing discussion about “upside down” proxies from Tiljander sediment cores at CA.
Mann has indicated correctly that changing the sign of an explanatory variable has no impact on a regression model (the sign of the coefficient just flips). It seems the real issue being raised is that the Tiljander paper notes distortion of the climate signal by human influence starting c. 1720 and becoming extreme post 1930. Is it now the consensus opinion that the Tiljander series cannot be meaningfully calibrated to modern temperature records?
[Response: I have no idea. But if it can’t, then it clearly can’t be used in the approach Mann and colleagues use. Just as well then that they already tested what difference leaving it out makes (answer? very little). – gavin]
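The sign-flip point about regression can be checked with a minimal least-squares sketch (purely synthetic data – this has nothing to do with the actual Tiljander series or Mann’s code):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=200)                        # synthetic proxy series
y = 2.0 * x + rng.normal(scale=0.5, size=200)   # synthetic target series

b1, a1 = np.polyfit(x, y, 1)    # slope, intercept with the original sign
b2, a2 = np.polyfit(-x, y, 1)   # same fit with the proxy sign flipped

# The slope flips sign, but the fitted values (and hence residuals,
# explained variance, etc.) are identical.
fit1 = a1 + b1 * x
fit2 = a2 + b2 * (-x)
print(np.isclose(b1, -b2), np.allclose(fit1, fit2))  # True True
```

The regression simply absorbs the sign convention into the coefficient, so an “upside down” orientation of a predictor cannot by itself change a calibration’s fit; the substantive question remains whether the modern part of the series carries a usable climate signal at all.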
AC says
[Response: What are you quoting? The IPCC statements are that the 20th Century has “unequivocally” warmed (which is true), and that it is 90% confident that most of the rise is due to anthropogenic effects. There is no statement that it is “virtually certain [that] the only forcing that would explain this is human generated CO2” – this appears to be a strawman of your own devising. Your other claims are similarly made up. – gavin]
Here’s an exact quote “Most of the observed increase in global average temperatures since the mid-20th century is very likely due to the observed increase in anthropogenic greenhouse gas concentrations.”
How is this materially different from my statement?
[Response: Greenhouse gases are not just CO2, the statement ‘most…’ is stated to be very likely, not ‘virtually certain’, and most is not all in any case. – gavin]
[Response: You again have it very wrong. Man-made forcings are discussed extensively in Chapter 2 (not Chapter 6), and I don’t know that anyone from CRU is involved in that side of the research. And of course if you include all the authors who’ve worked in the field of paleo-reconstructions, that will be 100% of the papers! (A conspiracy, I tell you!). – gavin]
Chapter 2 sets the stage, yes, with a discussion of the issue. But the justification is really what matters, right? Sure, you can propose an explanation, but at some point this explanation must be tested against the real world.
That happens in Ch. 9. Look at 9.3 and all the space devoted to recent paleo data there. Then in Section 9.4, where the HADCRUT3 temperature data is used.
Then, in 9.7, “Combining Evidence of Anthropogenic Climate Change”, the paleo stuff is everywhere:
“The observed warming is highly significant relative to estimates of internal climate variability which, while obtained from models, are consistent with estimates obtained from both instrumental data and palaeoclimate reconstructions”
“Palaeoclimatic evidence suggests that such a widespread warming has not been observed in the NH in at least the past 1.3 kyr (Osborn and Briffa, 2006), further strengthening the evidence that the recent warming is not due to natural internal variability.”
“Detection and attribution of external influences on 20th-century and palaeoclimatic reconstructions, from both natural and anthropogenic sources (Figure 9.4 and Table 9.4), further strengthens the conclusion that the observed changes are very unusual relative to internal climate variability.”
This paleo stuff is everywhere, and it seems to come almost entirely from the CRU group – at least as the IPCC presents it.
[Response: It’s not everywhere. The whole of Chp 9 is about detection and attribution, and at best, the recent paleo stuff plays a minor supporting role compared to the bulk of the other work. For instance, D&A in the 2001 report didn’t use paleo data at all. There are no differences to the D&A in the modern period if you substitute GISTEMP or NCDC for the HadCRUT data in this context so that is a complete red-herring. – gavin]
Barani says
Somehow I feel that many simple minds are treating global warming as a mere temperature problem. To my physics eyes this is more of an entropy problem. The entropy manifestations of global warming may lead to climate extremes (e.g. more powerful hurricanes like Katrina). For that matter, one cannot even rule out an ice age from global warming, if emissions completely block out sunlight and the Earth loses heat at other wavelengths. Any system upset from its equilibrium could behave unpredictably and possibly swing to both extremes.
I take this opportunity to request that the illegal, unethical and wanton act of sabotaging scientific process is not dignified any further with response from scientific community, and leave it to law enforcement agencies to handle it like any other cyberattack.
Michael Trigoboff says
I see that you have decided to “moderate out” my longer comment regarding the mess displayed in HARRY_READ_ME.txt. This strikes me as bad faith on your part. Would you care to let me know why you did that?
[Response: Sure. It’s repetitive (the same stuff has been covered a dozen times). You didn’t get my joke (that’s ok – it wasn’t very funny anyway), and your point made no sense whatsoever. Thus it does not add any signal. To reiterate, HARRY_read_me.txt is a work log covering a four-year period of someone working on upgrading CRU TS 2.1 to CRU TS 3.0 (which is actually now available, and not the same as HadCRUT in any case). It is a very typical, frustrating and exasperating account of someone left with legacy code and databases trying to make a better product. Your implicit claim that because someone struggled to improve what existed before, what exists now must be bad, makes no sense at all. – gavin]
Michael Trigoboff says
You are missing my point.
The CRU Hack materials demonstrate an appalling lack of competent management of a particular software project. Is that the only project that was mismanaged? That one example raises the question, which can only be answered if you allow the community that understands software projects to examine all of the code and data that underpins the claim that human emissions of CO2 are causing global warming.
Will you do that? And if you don’t, what can we conclude from your refusal to do so?
[Response: Let me get this straight. You think AGW isn’t a problem because of a badly managed software product? I would be more than happy to employ this software community to manage all of our software projects if we had the money. If you like, write to NSF and NASA and request that our budgets be doubled. They may well be able to upgrade our software and make them more efficient – I’m sure that is true for GISTEMP for instance, and that would be a good thing. But if they are going to start off with the attitude that the whole enterprise is fundamentally corrupt and that they are just there to find the subroutine that massages the data under orders from our political masters, then it isn’t going to work. However, if you want to do it properly, institute an independent effort to analyse the GHCN or the SYNOP data and then use the independent replication as a test of the other approaches. That way you actually make a constructive difference without having to wade through all the legacy stuff. – gavin]
J. Bob says
942 OK Hank, governments are supported by taxes, and I pay taxes. If I have to pay a government agency for climate info, I am paying twice. Now I can live with that. However when people being paid with tax money, publish “peer reviewed papers” which affects my well being, why can’t I get the supporting data, even if I have to pay for it? And why do they have to keep this “under wraps”?
Your reply states “You’ve paid nothing, not even the time to learn what you’re talking about”. I have paid federal, state and local taxes for many years, hence I am partially supporting Mann et al. For you to make such a statement, I might assume you don’t pay taxes, hence I may be supporting you.
945 Hi John, it’s been some time since we had this discussion. While I agree it would be nice to measure the volume and density of the Arctic ice and compare long-term records, the best we can do is measure ice extent and thickness. If I remember correctly, ice extent has a 5-14% error (tc-3-1-2009, The Cryosphere) for passive microwave systems. While microwave devices have better range measurement capability than beam direction, adding ice volume would add even more errors. Since the microwave records have only been around since about 1979, we have a “blink of an eye” in time for the more accurate recent records. So while ice extent isn’t the best, it looks like it’s the best we have for the present.
AC says
[Response: It’s not everywhere. The whole of Chp 9 is about detection and attribution, and at best, the recent paleo stuff plays a minor supporting role compared to the bulk of the other work. For instance, D&A in the 2001 report didn’t use paleo data at all. There are no differences to the D&A in the modern period if you substitute GISTEMP or NCDC for the HadCRUT data in this context so that is a complete red-herring. – gavin]
It looks like the paleo stuff is used to improve the “likelihood” values. A couple of quotes from 9.4.a:
Row 1 – Warming during the past half century cannot be explained without external radiative forcing. “Main uncertainty from forcing and internal variability estimates”
Row 3 – Greenhouse gas forcing has been the dominant cause of the observed global warming over the last 50 years. “Main uncertainty from forcing and internal variability estimates”
The paleo data seems to be critical in developing these internal variability estimates. If you have a good idea of forcings for the past 2000 years, and a good idea of temperatures, you can develop supportable models of internal variability. However, if your temperature data, say, is unreliable, your internal variability estimates are going to be much less precise. Then you run the risk of having a much broader range of temperatures within the “internal variability” range, including those within the last 50 years.
So it sure does seem like paleo stuff is a critical part of the edifice. Maybe in 2001 there wasn’t a need to buttress the “likelihood” stat?
[Response: It plays a role. Previously people had used the internal variability in model control runs to make estimates of how much things vary in the absence of forcing. The paleo-based estimate, which involves subtracting out the natural forcings (which are themselves uncertain) from the reconstructions (uncertain too), comes up with similar numbers (at least in the hemispheric mean). If they had been radically different there might be more of an issue, but they aren’t. However, this will always be a difficult problem because it has to be diagnosed somehow, and yet there is no period in which this is the only thing going on. – gavin]
stella says
Science is not conducted in a vacuum. This is something scientists are taught (or at least should be) in their first year of grad school. Corporations and governments will both use and attempt to influence science to support their agendas. I personally have worked in a science department at a major university and witnessed the altering of research goals in order to chase funding from major governmental sources. The IPCC was established by the UN, a political organization. In my opinion, this is not the way it should be, but it is the way that it is. That is why scientists should hold above all else their ethical guidelines in the face of these challenges, in order to keep their science pure.
For this reason, I am shocked and appalled that the entire scientific community is not up in arms over the statements made in some of these emails. Even if they were cherry-picked, they clearly show a lack of ethics amongst the scientists involved. Yet the deafening silence among the scientific community stirs deep concern in me. Perhaps disparaging peer-review (to me, the most important step in the process of science) and excluding papers that show alternative evidence are common activities among climate scientists. I am in the social sciences, and I personally would be in shock if I saw such a cavalier attitude towards scientific principles amongst my colleagues. I certainly wouldn’t defend it.
Perhaps because climate scientists are not working directly with people, they don’t feel the need to take their guiding code of ethics and scientific principles as seriously. However, unfortunately, their research has the power to influence people’s lives in very serious ways. I would encourage scientists to take a day off from processing data models and reach out to their colleagues in the political science, philosophy, sociology, economics, and history departments at their institutions to find out, truly, what the implications of their research could be for our society, and the world. To find out why there is a sudden spike in government funding for climate change now, at this point in our history. To find out how, although their research might be intended to save lives, it could very well end up destroying them. Maybe then they would take their scientific code of ethics a bit more seriously, and be sure that what they are telling the world is 100%, categorically, no doubt, Factual Truth.
AC says
[Response: It’s not everywhere. The whole of Chp 9 is about detection and attribution, and at best, the recent paleo stuff plays a minor supporting role compared to the bulk of the other work. For instance, D&A in the 2001 report didn’t use paleo data at all. There are no differences to the D&A in the modern period if you substitute GISTEMP or NCDC for the HadCRUT data in this context so that is a complete red-herring. – gavin]
Also, look at 12.6, Concluding Remarks, in the 2001 report:
“20th century climate was unusual.
Palaeoclimatic reconstructions for the last 1,000 years (e.g., Chapter 2, Figure 2.21) indicate that the 20th century warming is highly unusual, even taking into account the large uncertainties in these reconstructions.”
and
“It is therefore unlikely (bordering on very unlikely) that natural internal variability alone can explain the changes in global climate over the 20th century (e.g., Figure 12.1).”
Looks like “unlikely” in 2001 has gone to “very unlikely” in 2006, with the inclusion of more paleo data.
Really, without paleo, it is a lot of model hand-waving – ” Recent observed changes cannot be accounted for as pure internal variability even if the amplitude of simulated internal variations is increased by a factor of two or more” – well, yes, it’s a model – the size of the factor by which you must change a simulated value has no actual bearing on whether or not the change is correct!
Gavin2 says
Gavin…
Over a week ago you posted a comment clearly stating that no data had been destroyed or lost. Now it appears you were incorrect. Or is this still not true?
We seem to be hearing it from the horse’s mouth now, so to speak. I feel you are somewhat discredited, and your attempts to validate what are obviously shady, if not wholly conclusive, emails are ridiculous.
If you are reasonable and scientific, surely you agree all information and evidence (both sides) should not only be published, but encouraged into the public domain for all to see?
[Response: What is ridiculous is the implication that because I have an inkling for why CRU can’t release its database, you automatically think I’m against releasing things into the public domain. This is completely at odds with my thinking, statements and actions over many years. Ask the people involved with the PCMDI archive, or think about why GISS ModelE is public. So, let me state it again: we should endeavour to have as much as is possible in the public domain. This helps transparency, trust and actual research. Happy? ;) But note that I do not agree with the converse, that everything that is not released cannot be trusted – which appears to be the underlying theme in many commenters’ minds. There are some valid reasons why some data and code isn’t public (the CRU case being an example of that). I stand by my statement that no data has been destroyed. The originating met offices are custodians of their raw data, not CRU. – gavin]
huxley says
You said [952]: “You are asking for something you have no idea about because it suits you politically to ignore the vast amount of information that is available. Do you need 80% to check something? 90%? 95%, 99%, 99.9%? If the answer is that nothing short of 100% of everything is sufficient, then you are claiming that the remaining 0.1% of the data is somehow determining the whole answer (even though with 80% of the data you get pretty much the same thing). Remember that just because something makes a nice rhetorical point, it doesn’t actually make it true. – gavin]”
Gavin: Just because a nice rhetorical point is inconvenient doesn’t make it false.
I’m sure it’s not easy, but given the size of computer storage and the abundance of programs for handling data, I’m surprised that a high degree of compliance is as difficult as you say.
Your post makes me doubt that AGW scientists themselves are able to replicate their own work. Is this the case?
All I’m asking is that whatever it would take for your associates to replicate their studies ought to be available to others. That doesn’t necessarily mean absolute 100.00% perfection but something workable, and something far, far more cooperative than the bunker mentality you and your associates displayed in the CRU Hack emails.
I am aware that this level of cooperation is not the case for all of science, but the stakes of global warming are high and concern all of us. [edit]
[Response: Oh dear. 95% of the data in the CRU database is in the public domain. GISTEMP uses it and gets the same basic answer. Both GISTEMP and CRU are perfectly able to reproduce their own analyses, and the differences between them are very minor – especially when it comes to the long-term trends. The issue is not the size of the data or the complexity of the task but, as has been made clear for years, the third-party NDAs that CRU has entered into. – gavin]
David Gordon says
Re: #894 John
>Are you ignoring the fact that the definitions of sunrise and sunset represent actual observable, quantifiable events?
No, but when the word is defined by the event, it’s not a theory, it’s a definition. The distinction is not trivial – when the sun comes over the horizon, there is no chance at all that it is not a sunrise, and it doesn’t depend on any theory or previously observed data, it is the actual definition of the word. Thus there is no such thing as a “theory of sunrise” as there is a “theory of gravity”. This is basic 8th grade science stuff, or at least it was when I was that age. Who knows what nonsense they teach in its stead these days?
manacker says
Re 951 and 955.
There is a double dilemma.
IPCC (AR4 Ch. 9, p.685): Climate simulations are consistent in showing that the global warming observed since 1970 can only be reproduced when models are forced with combinations of external forcings that include anthropogenic forcings. [HadCRUT: 1970-2000 linear warming = 0.4°C]
p.691: Detection and attribution as well as modelling studies indicate more uncertainty regarding the causes of the early 20th century warming than the recent warming. [HadCRUT: 1910-1944 linear warming = 0.5°C]
The models told us that it would warm at a rate of 0.2°C per decade in the 21st century, yet it has cooled at a rate of 0.1°C per decade since January 2001. [HadCRUT: 2001-2009 linear cooling = -0.1°C]
The logic
1. Our models cannot explain the early 20th century warming.
2. We know that CO2 has been part of the cause of the late 20th century warming, because our models cannot explain it otherwise.
3. Our models cannot explain the 21st century cooling.
Gavin, can you explain this dilemma?
Max
[Response: The only dilemma I have is why I bother responding to you at all since you never seem to actually learn anything. But despite my better instincts, I will try (for the benefit of other readers) to explain the confusion. Models show natural variability. On short timescales natural variability is much larger than the expected trend due to rising greenhouse gases. The time period over which you expect to detect a signal of that forced trend is over a decade (somewhere between 15 and 20 years if you want 95% confidence). Thus shorter periods of greater-than-expected or less-than-expected warming are not disconfirming; they are actually expected. (As are the plaintive cries of the contrarians who suddenly develop an attachment to the data set that gives the lowest trend, even though the day before they were claiming that nothing from that institution could be trusted. Coherence is not their strong point.) Trends longer than 15 or 20 years are predictable from the forcings, and you can do a formal attribution study to conclude that the trends in the last 50 years are well explained by rises in GHGs. In the period immediately prior, there is no such dominant forcing, and the uncertainties in the other forcings (solar, aerosols), together with higher uncertainties in the data, are sufficient to make a clear attribution difficult. It’s really not that difficult. – gavin]
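Gavin's point about short windows can be sketched numerically. The following toy simulation (with assumed numbers chosen for illustration, not any centre's actual analysis) buries a fixed forced trend of 0.02°C/yr in year-to-year noise: the spread of ordinary-least-squares trend estimates over 8-year windows is as large as the trend itself, while 30-year windows pin it down.

```python
# Toy illustration of signal detection in a noisy trend.  TRUE_TREND and
# NOISE_SD are assumptions for the sketch, not observed values.
import random

TRUE_TREND = 0.02   # deg C per year (assumed forced trend)
NOISE_SD = 0.15     # interannual variability, deg C (assumed)

def ols_slope(y):
    """Ordinary least-squares slope of y against 0..n-1."""
    n = len(y)
    xbar = (n - 1) / 2
    ybar = sum(y) / n
    num = sum((i - xbar) * (yi - ybar) for i, yi in enumerate(y))
    den = sum((i - xbar) ** 2 for i in range(n))
    return num / den

def trend_spread(window, trials=2000, rng=random.Random(0)):
    """Standard deviation of estimated trends across many synthetic records."""
    slopes = []
    for _ in range(trials):
        y = [TRUE_TREND * t + rng.gauss(0, NOISE_SD) for t in range(window)]
        slopes.append(ols_slope(y))
    mean = sum(slopes) / trials
    var = sum((s - mean) ** 2 for s in slopes) / trials
    return var ** 0.5

for window in (8, 15, 30):
    print(f"{window:2d}-yr window: trend estimate sd = {trend_spread(window):.4f} "
          f"deg C/yr (true trend = {TRUE_TREND})")
```

With these numbers the 8-year spread (~0.023°C/yr) exceeds the true trend, so short runs of cooling are expected by chance; by 30 years the spread (~0.003°C/yr) is small compared to the trend, which is why multi-decadal records are needed for attribution.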
Former Skeptic says
Some common sense from the UK Telegraph and also from the UK Spectator. Nice to see that not all op-eds in right-wing broadsheets are shooting BS out of their collective mouths.
Ray Ladbury says
OK, now let me get this straight:
These guys have appointed themselves guardians of the scientific method, and nothing is acceptable unless they approve it.
It doesn’t matter that it’s been reviewed by the National Academies, by every relevant professional organization of scientists, by DOD, by governments and organizations across the political spectrum, even by most Republican Senators and found to be valid.
And meanwhile these self-appointed guardians have demonstrated utter unfamiliarity with normal scientific practice, with the science of climate change… They refuse even to acknowledge the overwhelming evidence that demonstrates anthropogenic causation.
Uh… Right. Pull the other one. It’s got bells on it.
Moira Kemp says
If anyone wants to know how strict the Conditions of Use for restricted data can be, here are those of the UK Met Office. They make it quite clear that data cannot be passed to third parties “under any circumstances,” not even to someone “within the same institute or programme of research.”
http://badc.nerc.ac.uk/data/surface/met-nerc_agreement.html
Ron Taylor says
Gavin, you are still hanging in there with your incredible patience. But, frankly, the people who have been challenging you lately, and to whom you have responded rationally, will not be moved by your explanations. They seem to have already decided that the whole climate science enterprise is corrupt, which is utter nonsense to those of us who have followed the science over these past several years. Or perhaps they are skilled people recruited to derail the excellent work of RC on behalf of the fossil fuel industry – I really cannot say which.
My hat is off to you, but I would think it entirely appropriate for you to moderate out the repetitive accusations of malfeasance, whether direct or implied. Enough already! I look forward to getting back to the science.
Patrick 027 says
Re 965 David Gordon – the point is not that the theory predicting sunrises takes ‘sun coming over horizon’ as a given; it predicts that very action. And until each sunrise happens, it remains theory.
Michael Trigoboff says
What I’m saying is that if the software underpinnings of AGW are of the quality we saw in HARRY_READ_ME.txt, those underpinnings are rotten and not to be trusted. And that’s why I’m advocating that you “open source” all of those underpinnings.
Put your code and data up on something like SourceForge. Let the open source community look at it. You will get at least two benefits from this:
1) People will look at your code, find errors, and suggest improvements.
2) People will, if you want to allow open source to do its magic, start actually improving your code. You might want to read The Cathedral and the Bazaar by Eric S. Raymond, which would show you how open source software can produce amazing results, potentially without expenditure of money on your part.
If you do this and it turns out your code was already of high quality and worked correctly (other than Harry’s unfortunate project), everyone will know this. If it turns out your code had problems, this could be the first step towards fixing them.
If, on the other hand, you don’t do something like this, the software community will be wondering why not, and whether all of your code has been managed like what we saw in HARRY_READ_ME.txt.
Sean says
Hi Gavin,
I think your determination to respond to just about every comment here is admirable.
Apologies if someone already asked this, but I couldn’t find it in a skim of the comments above.
Can you give a couple examples of significant national meteorological services (maybe Germany or France?) that have confidentiality agreements with CRU, and who would not like it if CRU released their data?
Thanks,
Sean
RaymondT says
Gavin, With the approach of the Copenhagen meeting, we see more and more alarming articles on climate change, which is vaguely understood to be the same as AGW. Most of these articles seem to emphasize the warming of the Arctic as proof of climate change. My question is: what contribution does the radiative forcing due to CO2 make to the warming of the Arctic? Some climatologists believe that the Gulf Stream will reduce in intensity, thus cooling down northern Europe with AGW, which seems to contradict the observation of a warming Arctic. Is the Gulf Stream the main mechanism by which heat is transported from the tropics to the Arctic?
[Response: The Gulf Stream is part of that process, but there is no good evidence that there has been any significant change in the heat transported north in recent decades. CO2 and the other greenhouse gases do have an impact on the Arctic, but then so do aerosols like black carbon and ozone – both warming factors. Look up Shindell et al 2008 for a discussion of this. – gavin]
John P. Reisman (OSS Foundation) says
#959 J. Bob
As you know I’m a generalist, which merely means examining the bigger picture (context, relevance). You need to try a thought experiment and broaden your view of the contexts involved if you wish to see the reasoning here. Take the ice extent and thickness and weigh them along with long-wave infrared absorption, compare that with industrial GHG increases and expected sensitivity as currently understood. Then add that to the models and expectations and compare that to the observations in ocean and land temps, recent and paleo.
Yes, there are error bars and always will be, but the observations are matching the models. Atmospheric temperatures are warming and cooling broadly in line with expectations. That is an awful lot of coincidence… glacial melt rates, tropical regions and droughts, high/low lats with increased moisture (rain/snow), jet-stream lat. shift, TC strength increasing over time…
It seems you are still limiting your view to the short-term satellite measurements of ice volume and not putting them into the overarching context. I think that is still limiting your view. In this case, especially because NSIDC/others have not completed their longer-term analysis models on volume, we can only place the short-term observation into the context of the longer-term observations that surround it and the model expectations. That gives us at least a reasonable picture.
Any data out of context is difficult to reasonably attribute but looking at the bigger picture, the pattern is quite clear.
Also, the argument in your post #940 is “11/28/2009, Arctic sea ice extent still “nip & tuck” with 2005 & 2003. Above 2007 & 2008.”. This is even shorter than the ice volume record and yet you post it though it contradicts your argument? I believe when we last discussed it Walt from NSIDC added some comments on the ice volume.
Also, your argument in #940 does not account for government bureaucracy. I believe there are efforts underway to free up more data, but we will have to see how this plays out. Maybe if we all helped put on the pressure, things might move faster (many already are, though)?
But to let such things impede reason regarding the well-understood attribution for AGW is rather silly. Delay will eat economic potential, and that is my greatest fear. Context is key and we need the economy to address this, so early action is wisest and delayed action may very well seal our fate beyond our capacity to avert egregious ramifications.
Steve Fish says
Michael Trigoboff (several posts)
You have made multiple outraged posts regarding the hidden and awful computer code that all our lives depend upon. However, Gavin has said repeatedly that much of this code has been available for years. You can be a denier and just ask for the inaccessible programs and complain, or you can get off your ass and actually help out by working on what is available. Put your time where your mouth is.
Steve
John P. Reisman (OSS Foundation) says
#965 David Gordon
I understand your point but the subject is not the definition of sunrise or sunset but the well established ‘theory’ that these quantifiable events give us a virtual certainty that the events will continue in the foreseeable future. It was merely an illustration. Feel free to present one you feel better makes the point.
Further, we were discussing what a theory is. Your context was that AGW is just a theory thus intimating that we really can’t rely on a theory. My point was simply that a theory is also:
1 : the analysis of a set of facts in their relation to one another
We are not in 8th grade, and we are discussing what may very well be the most important subject in the modern history of humankind. You are saying there is no chance it is not a sunrise, but in reality, there is a chance that there will not be a sunrise here on earth, sometime in the future, for humans to observe… planet killer asteroid, human extinction…?
Let us not wallow in the trivial aspects of the discussion but concentrate on the context and reasoning involved: that of the importance of the AGW ‘theory’ and what we as a human race can divine from that understanding, hopefully in time to avoid too much devastation to our fellow man, our economy, and our resource capacity.
huxley says
You said: “Oh dear. 95% of the data in the CRU database is in the public domain. GISTEMP uses it and gets the same basic answer. Both GISTEMP and CRU are perfectly able to reproduce their own analyses and the differences between them are very minor – especially when it comes to the long term trends. The issue is not the size of the data or the complexity of the task but, as been made clear for years, the third-party NDAs that CRU has entered into. – gavin”
Gavin: Oh dear. You keep missing my point. I want to know whether AGW scientists can share their data and methodology so others can inspect and replicate the results.
At first you said it was impossible. Now you say that most of the data is available, though no mention of the methodology, and some of the data is bound up by NDAs — which I already knew.
Is the missing 5% data crucial? What about the methodology? Can the NDAs be overturned in the name of the common good?
[edit]
[Response: Write to your government to encourage them to release more data without NDAs. And no, the 5% isn’t crucial because you know that broad results of the GISTEMP approach gives the same result – which is confirmed by dozens of independent sources. If you want to see the sensitivity to methodology, read the papers, or play around with it yourself. Really, it’s not as hard as it seems, and you will learn a lot. – gavin]
Ron R. says
Apparently the Cretinists, ‘cuse, I mean Creationists at the Discovery Institute are “thrilled” about “Climategate” (someone was sure in a hurry to name it!). Kind of validates their extremely selective quotation methods I suppose.
http://littlegreenfootballs.com/article/35229_Discovery_Institute_Creationists_Thrilled_About_Climategate
Barani says
I see that some people are under the assumption that unless an event occurs it is not REAL. In science, an event does not have to occur in front of our eyes. It can occur in a thought-experiment. Universal laws are time-invariant. You can reverse cause and effect and those laws will remain valid. A sunrise can occur in a thought-experiment and can be deemed as good as real.
There are many events predicted by theories ahead of their occurrence. For example, nuclear fission was theoretically predicted. Superconductivity was another. If we listened to skeptics who demand physical evidence, science would go nowhere.
Global warming is another complex scenario prediction. Some evidence of it occurs in the physical world and a lot more in the logical realm. Denying it is unscientific. An overwhelming number of scientists and a wealth of recorded data exist outside of CRU to support GW theory, so it is ridiculous to challenge the integrity of thousands of scientists through unlawfully obtained personal emails.
To the other person demanding release of all data immediately: an institution cannot allow itself to be arm-twisted by cyberterrorism. All data will sooner or later appear in the public domain, either as raw data or as graphs in journal publications, in the normal course of time. Or you may follow established procedures to access data held back by researchers, explaining why it will help you. If you are a genuine researcher in the field, I am positive that fellow scientists will be more than glad to share their data with you.
gary thompson says
are you financed by david fenton?
[Response: No. – gavin]
AC says
[Response: It plays a role. Previously people had used the internal variability in model control runs to make estimates of how much things vary in the absence of forcing. The paleo-based estimate, which involves subtracting out the natural forcings (which are themselves uncertain) from the reconstructions (uncertain too) come up with similar numbers (at least in the hemispheric mean). If they had been radically different there might be more of an issue, but they aren’t. However, this will always be a difficult problem because it has to be diagnosed somehow, and yet there is no period in which this is the only thing going on. – gavin]
Yeah, that’s exactly the concern, isn’t it? Not that the paleo data is unreliable, but that it’s been selected, adapted, etc to fulfill the purpose of not being “radically different”. That the period chosen to compare to the instrumental data is artificially chosen for this purpose, and the “divergences” are if not hidden, then treated as if they are exceptions to the known association, as opposed to perhaps indicators that the association is unreliable – all because, again, everything needs to be massaged to not be “radically different” than the models…
[Response: This is paranoia. There is no evidence that anyone has massaged the variability in the proxy reconstructions to match estimates of model internal variability. None. How would you even do it if you wanted to? – gavin]
manacker says
Gavin, like Dr. Judith Curry and many others here I think you are doing an admirable job of defending the AGW premise against the few skeptical voices here.
But I do not think it makes much sense for you to defend the bad science, arrogance, data manipulation and cover-ups by those that were exposed by the leaked CRU emails.
By circling the wagons and defending these individuals you only put yourself into the same box. Let them hang out to dry and move on.
Max
[Response: Since I’m pretty much on my own up here, I’d like to know how you can circle a single wagon? But the reason why I’m doing this is to counter statements like yours. I see no evidence of ‘bad science’ in these emails – on the contrary, I see many people asking questions, challenging assumptions and doing new analyses. Neither is there any evidence of malicious data manipulation or cover-ups of the same. Your claims are simply a reflection of your confirmation bias. – gavin]
Andy Vaughan says
Is there any connection between the HARRY_read_me.txt file and this paper:
Mitchell T. D. and P. D. Jones. 2005. An improved method of constructing a database of monthly climate observations and associated high-resolution grids. International Journal of Climatology 25, 693-712.
Many thanks.
[Response: This paper was the description of the CRU TS 2.1 data. I think ‘Harry’ was tasked to move this to the next version (CRU TS 3.0) incorporating more data and tidying up the process as he went. The draft paper on the new version is not yet finished, though the data is available (via ClimateExplorer for instance). – gavin]
jef says
Gavin…I have to say that I like how you’ve stuck in there with close to 1000 comments.
Having said that, you said
“But note that I do not agree with the converse, that everything that is not released cannot be trusted – which appears to be the underlying theme in many commenters minds.”
I worked for an Aerospace Engineering company for most of my career. The general climate was that if you did a presentation for a group of Engineers they would look at every detail, especially the numbers part.
Oh happy day if they could find something, because then they could dismiss the rest of the presentation (usually politely) and then the presenter would have to go back and rework the whole thing.
This is the same culture whose motto was: 1 “oh, s****” blows away 10 “atta boys”.
Fortunately I was very careful, very detailed, so wasn’t ever caught in that position.
But there was the HR type person who once presented a Cume chart that started going up, then went down. Much hilarity ensued.
[edit]
Deech56 says
RE jef
But the point is that the engineers have the technical knowledge to critique the presentation. Would you stop what you were doing if the person bringing in the coffee asked the presenter a question that had already been considered and found wanting?
Deech56 says
RE Michael Trigoboff
As mentioned before, the modeling software has already been available, and no movement to improve it by outsiders has been apparent. Also, the open source community would have to have some understanding of the input and, frankly, considering the seeming approval of hacking into an organization’s computer system and the climate deniers in their ranks, would have to earn the trust of the end users. Just read ESR’s stuff.
Andy Vaughan says
Many thanks Gavin, much appreciated. So would it be fair to say that the methodology logged in the HARRY_read_me.txt file is (along with a lot of other things) intended to be incorporated into a future paper detailing all the limitations and assumptions in the data set, so that everyone using the data is aware of the circumstances surrounding it? The reason I ask is because the sceptics are making a great play on the ‘Harry’ file, but it seems to me that the file is a record (with some, in hindsight, incautious ‘off the cuff’ comments meant for personal consumption) of the struggles involved in pulling together and rationalizing disparate information. I.e., the proposed paper should clear everything up and put the CRU TS data in context.
stella says
“I see no evidence of ‘bad science’ in these emails – on the contrary, I see many people asking questions, challenging assumptions and doing new analyses. ”
Wow, I really can’t for the life of me figure out how you can say that. Do ethics play no role in your science? This code should be the backbone of everything that is done in science. Without it, you are just another bunch of zealots professing claims no one can really trust or stand behind:
THE CODE
Act with skill and care, keep skills up to date
Prevent corrupt practice and declare conflicts of interest
Respect and acknowledge the work of other scientists
Ensure that research is justified and lawful
Minimise impacts on people, animals and the environment
Discuss issues science raises for society
Do not mislead; present evidence honestly
Can you really tell me that the emails did not show clear violations of, and disregard for, such a code? As I said earlier, when you defend such behavior, it really makes me wonder what sort of practices have become the norm in today’s science. And it makes me question every new finding that is presented as fact.
[Response: Do please calm down. I assure you that no animals were harmed during the writing of those emails. Despite what you might be reading, there is no evidence of anyone misleading anyone (except by some of the contrarians who are discussed), while there is plenty of evidence of people acting with skill and care in doing their research and working on papers or the IPCC report. I agree that some people were not discussed with respect, but respect in science is earned and doesn’t generally extend in private conversations to people who’ve slandered you. – gavin]
whatAboutBob says
Gavin,
Your response in comment #289 is incorrect (or at best uninformed); please see
http://pielkeclimatesci.wordpress.com/
[Response: Not sure what you are pointing to specifically, but I stand by that statement. The relevant IPCC summary (Chp 2. p185) is as follows:
and (p184)
Thus while there are complexities and uncertainties involved, the best estimate is that LCC has been a cooling effect historically. I still don’t know where the US statistic that was quoted in #289 comes from. – gavin]
DaveG says
About the FOI requests.
Why do people like Steve McIntyre want CRU’s raw data? If CRU handed over the meteorological office data (assuming a way had been found around the contractual problem) how would McIntyre know that the data was the genuine raw data? He wouldn’t, would he? Surely, if McIntyre, or anyone else, wants the raw meteorological office data, he would be wiser to get it from the meteorological offices themselves. They own it, and McIntyre would know that the data had not been tampered with by CRU (not that I’m suggesting that CRU would do that).
Is this just a case of McIntyre, knowing that CRU cannot release all of the data due to contract stipulations, trying to embarrass CRU by repeatedly asking for the info to be released and then shouting loud and often about “secrecy” and “withholding data”? He doesn’t really want the data. He just wants to make CRU squirm.
Bart Verheggen says
Stella (989),
It seems that many of the more angry commenters (I guess that includes you) are very uneasy with scientists adhering to the 6th commandment of ‘THE CODE’:
“Discuss issues science raises for society”
pjclarke says
Is this just a case of McIntyre, knowing that CRU cannot release all of the data due to contract stipulations, trying to embarrass CRU by repeatedly asking for the info to be released and then shouting loud and often about “secrecy” and “withholding data”? He doesn’t really want the data. He just wants to make CRU squirm.
Well, according to the Nature ‘Climate Feedback’ blog, he and his associates submitted 58 FOI requests over a 6-day period that included a weekend. That works out to nearly 10 requests a day. This is not science, this is spam!
Stefan Andersson says
Gavin, thank you for these two concise posts. I had just begun to read up on the “hack”, and these posts I believe have given me a good clear picture of the situation. I do hope this debacle blows over soon. Hopefully there will be something positive coming out of the process. Thank you for your work.
Ray Ladbury says
Stella, perhaps you would care to point out just where you think the CRU scientists are in violation of said code. I see no evidence of any of these in the emails – merely harassed and frustrated scientists trying to do their job in the face of irrational and ideologically motivated resistance.
Moira Kemp says
Sean — 29 November 2009 @ 8:59 PM
“Can you give a couple examples of significant national meteorological services (maybe Germany or France?) that have confidentiality agreements with CRU, and who would not like it if CRU released their data?”
I don’t know who exactly has these agreements – I think I read a figure of 150 MOs somewhere? – but, as an example, the UK Met Office confidentiality agreement is here –
http://badc.nerc.ac.uk/data/surface/met-nerc_agreement.html
The German CA (in English) is here –
http://www.dwd.de/bvbw/appmanager/bvbw/dwdwwwDesktop?_nfpb=true&_pageLabel=dwdwww_spezielle_nutzer&T76002gsbDocumentPath=Navigation%2FOeffentlichkeit%2FDatenservice%2FAllgemein%2FForschung%2Fnutzungsbedingungen__ful__node.html__nnn%3Dtrue
I can’t see an English version of the French site and I don’t speak French –
http://france.meteofrance.com/
I’m not sure it would be responsible to issue a list of the individual MOs at this point. Some of them are probably small operations, but dealing with very serious issues, such as tropical cyclones for instance, they don’t need to be the centre of a press storm. This is being dealt with through the UKMO who are their representatives in the UK. Some things just take time to sort out.
Moira Kemp says
Stella 30 November 2009 @ 8:17 AM
Stella, I wonder, have you taken the time to read through everything Gavin has written about this since last weekend? There is a great deal of it, I know, but I do recommend you do so. I think you will find there is plenty here to give you, at the very least, pause for thought.
As you are in the Social Sciences, you might find the caution the science historian Spencer Weart recommends worth thinking about too:
http://voices.washingtonpost.com/capitalweathergang/2009/11/perspective_on_a_climate_scien.html
Joe says
Poor old Harry.
Loving your off-hand dismissal of HARRY_READ_ME.txt. You give a perfectly plausible explanation, no doubt aimed at avoiding too much technicality in order to keep your reassurances accessible to all.
While that is understandable, what isn’t understandable to anyone with any programming knowledge is how the code used (admittedly in the past) to maintain your database could contain basic school-boy errors such as overflows, or the wanton ignoring of run-time errors in data handling.
It’s clear to anyone who knows anything about coding or databases that any data that’s come anywhere near code like that is likely to be corrupt beyond salvation and that the only way to ensure its integrity would be to go back to the original sources and start again, more carefully this time. Only, it seems you haven’t retained the original data.
Give Harry a medal for his efforts (no, really) and a pay-rise for not going public with the hacked-about mess himself. But don’t even try to suggest that he could salvage a reliable and integral database from that train-wreck!
[Response: Haven’t you ever had code that bombed when applied to a new situation that you thought was similar to the one it did work for? Or make a coding error that only showed up intermittently? And did you not eventually fix them? Code can be debugged you know… and given the existence of the CRU TS 3.0 product, that code must have been. – gavin]
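The kind of bug Gavin describes, code that works on the data it was written for and then bombs on a new dataset, can be sketched in a few lines. This is a hypothetical toy (nothing to do with CRU's actual code, and the MISSING flag value is an assumption): a gridding helper that averages station readings in a cell, crashes the first time a cell with no reporting stations appears, and is then debugged.

```python
# Hypothetical sketch of a latent bug and its fix; not real CRU code.

def cell_mean_v1(readings):
    """Original version: average the station readings in one grid cell.
    Works fine on the original dataset, but raises ZeroDivisionError the
    first time a cell with no reporting stations comes along."""
    return sum(readings) / len(readings)

MISSING = -99.0  # conventional missing-data flag (assumed for this sketch)

def cell_mean_v2(readings):
    """Debugged version: an empty cell yields the missing-data flag
    instead of crashing the whole gridding run."""
    if not readings:
        return MISSING
    return sum(readings) / len(readings)

print(cell_mean_v2([14.2, 13.8, 14.5]))  # a populated cell: ordinary mean
print(cell_mean_v2([]))                  # an empty cell: -99.0, not a crash
```

The point of the sketch is Gavin's: the existence of the bug in an intermediate work log does not mean the final product retained it, since intermittent failures like this are exactly what a debugging pass removes.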
Hank Roberts says
> I’m not sure it would be responsible to issue a list of the
> individual MOs at this point. Some of them are probably small
> operations, but dealing with very serious issues, such as tropical
> cyclones for instance, they don’t need to be the centre of a press storm.
Yup. There have long been lists at caca sites. Don’t trust them.
dhogaza says
NASA GISS Model E is open source and has been online for quite some time.
How long do we have to wait before the “open source magic” starts? How many more years? Why hasn’t it started already?