The 1991 Science paper by Friis-Christensen & Lassen, work by Henrik Svensmark (Physical Review Letters), and calculations done by Scafetta & West (in the journals Geophysical Research Letters, Journal of Geophysical Research, and Physics Today) have inspired the idea that the recent warming is due to changes in the sun, rather than greenhouse gases.
We have discussed these papers before here on RealClimate (here, here, and here), and I think it’s fair to say that these studies have been fairly influential one way or the other. But has anybody ever seen the details of the methods used, or the data? I believe that a full disclosure of their codes and data would really boost the confidence in their work, if they were sound. So if they believe so strongly that their work is solid, why not more transparency?
There is a recent story in the British paper The Independent, where Friis-Christensen and Svensmark responded to the criticism forwarded by Peter Laut (here). All this would perhaps be unnecessary if they had disclosed their codes and data.
Gavin and I published a paper in Journal of Geophysical Research, where we tested the general approach used by Scafetta & West, and tried to repeat their analysis. We were up-front about our lack of success in a 100% replication of their work, but we argue that any pronounced effect – as claimed by Scafetta & West – should be detectable even if the set-up is not 100% identical.
However, Scafetta does not accept our analysis and has criticized me for lacking knowledge about wavelet analysis – he tells me to read the text books. So I asked him to post his code openly on the Internet so that others could repeat our test with their code. That should settle our controversy.
After repeated requests, he told me that he doesn’t really understand why I’m not able to write my own program to reproduce the calculations (actually, I did in the paper together with Gavin, but Scafetta wouldn’t accept our analysis), and keeps insulting me by telling me to take a course on wavelet analysis. Furthermore, he stated that there “are several other and even more serious problems” in our work. I figure then that the easiest way to get to the bottom of this issue is to repeat our tests with his code.
A replication in general doesn’t require full disclosure of source code, because the description in the paper should be sufficient – though in this case it clearly wasn’t. So, to save us having to do it all over again and perhaps missing some other little detail – in addition to using an algorithm that Scafetta is happy with – it’s worth getting the code with which to validate our efforts.
It should be a common courtesy to provide methods requested by other scientists in order to speedily get to the essence of the issue, and not to waste time with the minutiae of which year is picked to end the analysis.
The reason why Gavin and I were not able to repeat Scafetta’s analysis in exact detail is that his papers didn’t disclose all the necessary details. The first point he raised was that we used periodic instead of reflection boundaries. The fact that the paper referred to the expression ‘1/2 A sin (2 pi t)’ to describe the temperatures or solar forcing would normally suggest that they used periodic rather than reflection boundaries. There was no information in the paper about reflection boundaries. But this is no big deal, as we have subsequently repeated the analysis with reflection boundaries, and that doesn’t alter our conclusions.
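For readers unsure what this distinction means in practice, here is a toy illustration of the two boundary treatments – it is just padding a made-up series with numpy, not the code from either analysis:

import numpy as np
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # made-up series
print(np.pad(x, 3, mode="wrap"))          # periodic:   [3. 4. 5. 1. 2. 3. 4. 5. 1. 2. 3.]
print(np.pad(x, 3, mode="reflect"))       # reflection: [4. 3. 2. 1. 2. 3. 4. 5. 4. 3. 2.]

The choice matters because a band-pass filter ‘sees’ whatever values are used to extend the series beyond its ends, and the two treatments can behave quite differently near the start and end of the record.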
After further communication, we found out that Scafetta re-sampled the data in such a way that the center of the wavelet band-pass filter was located exactly on the 11 and 22 year solar cycles, which were the frequencies of interest. He also informed me that a reasonable choice for the year at which the reflection boundary is placed would be 2002-3, when the sun experienced a maximum for both the 11 and 22 year cycles. This information was not provided in the papers.
I’m no psychic, so I couldn’t have guessed that all this was needed to reproduce his result. But since Scafetta has lost faith in my ability to repeat his work, that is an even greater reason to disclose his code so that others can have a go.
For the record, we did not just use wavelets to filter the data – we obtained the same conclusion with an ordinary band-pass filter.
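For those who want to try this themselves, here is a minimal sketch of the kind of band-pass isolation we are talking about, using a made-up series and an ordinary Butterworth filter – it is neither Scafetta’s wavelet algorithm nor the exact filter we used:

import numpy as np
from scipy.signal import butter, sosfiltfilt

years = np.arange(0, 150, 1 / 12.0)                # 150 years of monthly time steps
temp = 0.1 * np.sin(2 * np.pi * years / 11.0) + 0.2 * np.random.randn(years.size)  # toy 'temperature'

# Pass band around the ~11-year cycle (periods of 9-13 years), 12 samples per year
sos = butter(4, [1 / 13.0, 1 / 9.0], btype="bandpass", fs=12.0, output="sos")
band = sosfiltfilt(sos, temp)                      # zero-phase band-pass filtered series

The exercise is then to compare the amplitude and phase of such a filtered temperature series with the corresponding solar signal.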
Dave says
Gavin, welcome back to the scientific community. I guess this makes you a skeptic. (No, I won’t say that you are a “denier,” as that would be untrue and unprofessional.) Now maybe you can empathize with the rest of the scientific community that has been calling for this kind of transparency in other areas of climate science.
green r&d mgr says
Hold everyone to this standard.
Nothing is worth a damn unless it is reviewable and independently verifiable.
That includes the current views, for and against.
Be open about the error margins in the measurements, every adjustment and the calculations.
If others starting from the raw data cannot replicate every single step, it is not robust science.
If the final result does not include a clear and verifiable error margin that explicitly accounts for every step from initial measurement, every adjustment and any uncertainties introduced by every algorithm, then it is not robust science.
Conclusions about trends that are within the error margins are questionable.
I have seen both sides consistently overstate the confidence in their conclusions by leaving this information out.
So a call for code is well justified. However, you must hold the current views to the same standards. Many of the conclusions being pushed by all sides appear to be unjustified given the size of the +/- errors in the data and the methods.
Omitting or understating this information may help an author sell a particular point of view for a while, but it is self-defeating, as it delays work on more robust methods to get real answers.
Steve Fitzpatrick says
Yes, Nicola Scafetta should release his code, even if it is poorly commented, ugly spaghetti.
My experience is that things which seem “obvious” to the writer of code will often turn out to be non-obvious to someone else, and so essentially impossible to independently replicate; the devil really does more often than not reside in the details.
Hank Roberts says
Mike W, people can go on about this at great length. Do you have a blog? Maybe you could do it in a topic actually intended for that kind of discussion? I dropped a few phrases from your postings into Google and found this is a popular question to be asking lately, elsewhere.
MangoChutneyUK says
I think you are absolutely right in demanding the code from Scafetta; surely the publishing journal has a view on this too?
Rob says
While I am a hardcore denialist (currently enjoying the happy end of the Copenhagen circus), I second the opinion that Svensmark’s paper shall be ignored until it is entirely clear how it was created. Code, data – everything must be freed. As always – ha! guys ;-)
[Response: Scafetta, not Svensmark. – gavin]
Rob says
Gavin@52
“My personal opinion is that FOIA requests are often used just a form of harrassement…”
Don’t think such harassment would be effective; the law is written to allow obviously nonsensical requests to be ignored. In Sweden we have had these kinds of laws forever, and I have never heard of any problem.
Gavin@55
[Response: The hacked documents were not ‘the CRU code’ in any real sense. They are random bits and pieces of research code that had been emailed to people at various times. I wouldn’t read too much into any bugs you found there given that you have no idea how old the code was or whether it was subsequently fixed (or abandoned). – gavin]
This is arm waving, you could just as well assume all the “Harry code” was used. That’s what I do! Why do you think otherwise, why should I think otherwise?
[Response: Because a) the Harry document isn’t code – it’s a work log, and b) the dataset he was working on (CRU TS 3.0) has been released with the documented problems resolved. These documents are nothing even close to a complete and current archive of all production code. – gavin]
Dennis Hamilton says
Thanks for the post, Doug (response 61) – nice article, but again I think this needs more math and science help from the AGW experts. I can accept that the sun has cyclical variations that span very small timeframes in geological terms – got it (at least based on our current, limited observations). Can we postulate that other natural cycles that contribute to or reduce global warming forces have the same cyclical variations? We then postulate that CO2 additions by humans have a definitive ceiling (don’t care what percentage you use) and that the CO2 additions by humans climb arithmetically (a straight line). Let’s say for giggles that there are 100 material natural cyclical forces that influence global climate and 1 man-made force – on geological time scales there would be occurrences when cyclical forces combine to produce super-heating/super-cooling periods (we know this is true given known fossil records), and even with a constant increase of CO2 there will be periods of super-cooling. The warming we have seen covers a very, very small timeframe – my concern is that (Post 81, Chris Crawford – please re-read my post) a change of 1% in a very massive number (solar radiation) can have extraordinary effects when combined with other cyclical forces, contrasted with even a large change in a very small number (the CO2 percentage in the atmosphere) of constant force. The postulation is overly simple because we continue to discover meaningful and material natural forces (jet streams, ocean currents, the number of government AGW studies, etc…), so the certainty expressed about AGW seems contrary to the relative uncertainty about how that huge blast furnace in our sky affects all the other cyclical forces we know of or have yet to discover. So again, Gavin, are you confident that the sun’s effects have been adequately discounted?
t_p_hamilton says
MikeW asks: “Eli et al, how is the original momentum of the absorbed photon conserved? It annoys me a little that it’s rarely mentioned and often just waved away.”
There is an old spectroscopy technique based on this very phenomenon (for photons with much higher momenta, by the way) – Mössbauer spectroscopy.
Sycorame says
How is this different from Steve McIntyre’s requests for code that were being denied, and people saying it is ridiculous to ask for code?
Steve Fish says
Comment by Allen — 18 December 2009 @ 3:41 AM:
Your moral equivalence argument is vacuous. Who has denied methods and data to whom? I think that, without understanding the real issues, you just popped in here to drop your little empty moral bomb for the fun of it. Maybe not, surprise me.
Steve
Completely Fed Up says
It’s different because Steve McIntyre is a hypocrite.
He demands others do something he isn’t willing to do.
When did Gavin et al ever demand source code from Steve?
After Steve had made a huge stink over his requests being ignored (which is what you do with ignorant queries).
Silk says
“The warming we have seen covers a very, very small timeframe – my concern is that (Post 81, Chris Crawford – please re-read my post) a change of 1% in a very massive number (solar radiation) can have extraordinary effects when combined with other cyclical forces, contrasted with even a large change in a very small number (the CO2 percentage in the atmosphere) of constant force”
Do you know what a change of 1% in solar activity does to forcing?
Do you know what a change from 280 ppm to 550 ppm of CO2 does to forcing?
Because Gavin and the rest of the climate community DO KNOW (there are mountains of work on this), and the solar effect is small compared to the GHG effect.
Read the IPCC Report. AR4, Working Group 1. It’s all in there.
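If you want the back-of-envelope version (approximate numbers only, using the standard simplified expressions, and remembering that the observed solar-cycle variation is closer to 0.1% than 1%):

import math
S0, albedo = 1361.0, 0.3                     # approximate solar constant (W/m^2) and planetary albedo
dF_sun = 0.01 * S0 * (1 - albedo) / 4.0      # forcing from a hypothetical 1% change in solar output
dF_co2 = 5.35 * math.log(550.0 / 280.0)      # simplified CO2 expression (Myhre et al. 1998)
print(f"1% solar change:    {dF_sun:.1f} W/m^2")   # ~2.4 W/m^2
print(f"CO2 280 -> 550 ppm: {dF_co2:.1f} W/m^2")   # ~3.6 W/m^2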
BlogReader says
Gavin #98 If someone says they used linear regression, you can do that yourself. If they say they used an EOF decomposition of the sea level pressure field, then you can do it yourself.
In that case the program could look something like this:
data = (read from file)
linearReg = WellKnownProgram.linearRegres(data.column(3))
At least with Java (Apache Commons, Maven repos) and Perl (CPAN) there are well-known libraries out there that everyone uses for free. Many eyeballs have looked them over. Probably the same with Fortran and R.
Now what if the temperature was really in column 4, and column 3 held the smoothed-out temperature reading? No amount of saying “we used linear regression” is going to catch that error.
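For instance (a made-up sketch, not anyone’s actual analysis code), both of the fits below satisfy the description “we used linear regression”, yet they give different trends if column 3 holds a smoothed series and column 4 the raw one:

import numpy as np

data = np.loadtxt("temps.txt")     # hypothetical file: col 0 = year, col 3 = smoothed temp, col 4 = raw temp
slope_col3 = np.polyfit(data[:, 0], data[:, 3], 1)[0]   # linear trend fitted to the smoothed column
slope_col4 = np.polyfit(data[:, 0], data[:, 4], 1)[0]   # linear trend fitted to the intended raw column
print(slope_col3, slope_col4)      # both are 'linear regression'; only one answers the question asked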
[Response: Of course not, but this kind of error is pretty easy to spot in an independent replication too. For instance, the only case in which I know this happened is from Craig Loehle and was spotted without any code being posted (though the data was). – gavin]
Bill DeMott says
I’ve just been grading some undergraduate biology exams that involved calculations and interpretations, and I came across answers that provide insight into the reasoning of people who talk about “small, subtle changes” in atmospheric CO2 over the last century or so, as in #41 above.
The results looked something like this:

              Coefficient A    Coefficient B
Species 1     0.46             0.0581
Species 2     0.40             0.0050
Some students said that the variation in Coefficient A is bigger (and more important), because (0.46 – 0.40) > (0.0581 – 0.0050). However, this overlooks the fact that coefficient B varies by over an order of magnitude (more than 1,000 percent), while the variation in coefficient A is closer to 10%. I had just taken for granted that everyone would find the variation in coefficient B more significant. The notion of a coefficient of variation (standard deviation/mean) is important here.
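To make it concrete (a quick back-of-envelope, not part of the actual exam):

import numpy as np

a = np.array([0.46, 0.40])                       # Coefficient A for the two species
b = np.array([0.0581, 0.0050])                   # Coefficient B for the two species
for name, x in (("A", a), ("B", b)):
    cv = np.std(x, ddof=1) / np.mean(x)          # coefficient of variation = SD / mean
    print(f"Coefficient {name}: CV = {cv:.2f}")  # A: ~0.10, B: ~1.19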
Mark Schaffer says
Sycorame asks: “How is this different from Steve McIntyre’s requests for code that were being denied, and people saying it is ridiculous to ask for code?”
If you take a look at the link to the data sources right on the main page of this website, you will find that his requests constituted harassment and were not an attempt at honest debate because everything needed was always available. Did you just fall off the turnip truck and miss all the prior discussions here on this very topic?
BillB says
Lucia’s site is aflame with “deniers” joining you in your request that Scafetta send you his code.
Rob says
Gavin@107
[Response: Because a) the Harry document isn’t code – it’s a work log, and b) the dataset he was working on (CRU TS 3.0) has been released with the documented problems resolved. These documents are nothing even close to a complete and current archive of all production code. – gavin]
The Harry document is a log, logging what’s going on with the code. Don’t you remember the file where a funny hard-coded array was superimposed onto the measurement data? I think it was called the fudge factor or similar. And to preempt: I do not mean the one where this particular code was commented out, but the other one.
My opinion is that the fudging in that case is so twisted that there must be something behind it. One doesn’t even think such thoughts without having an intent. What could possibly make me believe otherwise? Answer: the full CRU code, the full CRU data, logs of which data were included and which were excluded, and the rationale behind such decisions. (Remember what the Russians now say – that CRU did some heavy cherry-picking of data from Russia.) I don’t know what’s true about the Russian claim, but when all these claims are put together it makes a substantial pile of no-no’s. That’s about it, not your arm waving. No offense, but this thread is about releasing code.
[Response: Now you are confused. The Harry log was not the code that had the ‘fudge factor’ comment – that code was related to the tree ring MXD calibration tests that were (in the end) never published. Not sure how this thread is related to posting code that was never used in a publication. – gavin]
Nick O. says
Maybe if they won’t release their code they would be willing to release their e-mails?
And maybe those e-mails could include any discussions of how they view the work of those scientists who generally support the view that human activity is contributing to global warming?
Or is that too much to ask? After all, it was okay to release the UEA traffic, wasn’t it?
Or maybe the people who hacked the UEA e-mails think – for balance and open discussion, of course – that they should also hack Scafetta and West’s e-mails and code? Presumably that would prove just how correct and scientifically robust their methods and data are.
All in the interests of free and open discussion, of course …
Dean says
#110: “How is this different from Steve McIntyre’s requests for code that were being denied, and people saying it is ridiculous to ask for code?”
Because they made a good-faith effort to replicate the results on their own based on the descriptions that were initially provided. And because the author of the original paper criticized their methods without providing details. They didn’t start by asking for code, they asked for it when they failed to replicate the results.
Gary Rissling says
Considering that CERN is conducting the $14 million CLOUD experiment, which will – at least in part – confirm or reject Friis-Christensen & Lassen’s work, we will likely at least have clear, unbiased, empirical information about the correlation between cosmic rays/cloud formation and climate change. Should this experiment determine that cosmic rays have a greater influence over global temperatures than anthropogenic CO2, will the proponents of AGW be objective enough to accept it?
[Response: Yes, of course. Just don’t hold your breath for the experiment to determine that. See our commentary on this, here.]
DVG says
RealClimate authors could do the entire climate science industry and the rest of the world a favor by publicly proposing that ALL published studies involving data/code be published with simultaneous full disclosure of the data/code involved, and that would-be peer reviewers refuse to review – and publishers refuse to publish – if the data/code is not so disclosed.
Persistent full disclosure, at least in the long run, creates credibility. And that’s what we need much more of.
What about it, RealClimate authors?
Doug Bostrom says
Dennis Hamilton says: 18 December 2009 at 10:28 AM
“Monster waves”, in a different domain. Sure, but nothing like that has so far turned up, apparently, and not for lack of trying to find it. I’m speaking from a point of view that is abysmally and truly ignorant compared to professionals in the field, but I suppose a lot of candidates could be ruled out due to insufficient power, time span of influence, etc. Using the “monster wave” analogy, they come and go rather quickly.
Bear in mind, the article I cited is a very tight, concise but necessarily compressed synopsis of a very long and focused effort by a lot of researchers bringing a lot of analysis to bear on a lot of observations. To really get a head start on the whole thing, you’ll probably need to dig into the research cited in Weart’s writeup.
Martin Vermeer says
Sycorame #110, the difference is very simple: a scientific paper should be presented in such a way that someone ‘versed in the art’ is able to replicate its results.
Don’t you think Gavin and Rasmus qualify? They are ‘versed in the art’ yet didn’t manage to replicate, and not for lack of trying. That is the point at which to ask for more info.
Steve McIntyre OTOH is not ‘versed in the art’ and not on the way to becoming. Don’t take my word for it: he failed to replicate papers where real scientists succeeded, and that’s a pattern. He demands endless hand-holding without ever producing anything of scientific interest.
For enlightenment, read this, and links therein:
http://rabett.blogspot.com/2009/12/needy-in-almost-unnoticed-interview.html
mondo says
Re #111. CFU. You surely are being ironic?
ADR says
I have some *very* basic questions from someone who is trying to understand AGW:
– What percentage of CO2 does the atmosphere contain in total? 0.038 percent
– What percentage of that CO2 is natural and what percentage is man-made? The value in the 1800s was about 0.028 percent. So about 30% of the current number can be considered ‘man made’
– By what methods is CO2 measured to know how much is being emitted by humans vs. natural? Read this
– Are natural CO2 emissions steady? If not, how much does it vary? There are seasonal variations, and smaller interannual variations, all of which are completely swamped by human-caused increases. See e.g. here
– Compared to other greenhouse gases, what percentage of total GHGs does man-made CO2 affect global warming? Roughly half
If these questions have been stated before somewhere that is easy to understand for the average person please post a link. Thanks.
[Response: Answers in bold.–eric]
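For anyone who wants to check the ‘about 30%’ figure against those numbers, the arithmetic is a one-liner (back-of-envelope only):

co2_now, co2_pre = 0.038, 0.028      # percent of the atmosphere by volume, approximate
frac = (co2_now - co2_pre) / co2_now
print(f"{frac:.0%} of today's CO2 is above the pre-industrial level")   # ~26%, i.e. roughly 30%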
Doc Savage Fan says
Good point, and I couldn’t agree with you more… transparency is a good thing for science. BTW, the information you linked on GCR is quite dated… any chance that you’ll address this subject in light of what’s going on at CERN with CLOUD? I saw some of the chamber calibration results and wonder what you think. It appears to be very clear that GCR does in fact modulate clouds.
Dennis Hamilton says
Silk (Post 112)
OK – your answer is that you and those who are wise and knowledgeable (I assume that is 100% of the climatology/scientific community) have properly calibrated the effects of solar radiation and are extremely comfortable that, based on known forcing elements, the solar input is relatively minor compared to that of CO2. The IPCC report discounting solar forcing using 5 pages of a 234-page subgroup document is not what I call 100% certainty via the scientific method (again, the magic tree rings cure all proxy problems). I would expect you good sirs to spend a lot more energy on this discussion. ALL energy, warmth and life starts and will eventually end with the sun – I am simply asking Gavin (notice Silk has committed you on this) whether you are 100% sure that solar radiation forcing has been accounted for and appropriately discounted, given that we are not 100% sure about absorption rates and other potential cyclical factors that may be affected by increased radiation. I remain unconvinced that the amount of solar/magnetic radiation changing by 1% does not have a huge impact on thermodynamic exchanges in the atmosphere and terra firma.
SecularAnimist says
Meanwhile, in Copenhagen, it appears that The Powers That Be are struggling to agree on a plan for emissions reductions that are already known to be insufficient to limit the global temperature increase to 2 degrees Celsius, an increase which is already known to be too great to avoid catastrophic effects (this being self-evident given that we are already seeing catastrophic effects from the warming that has already occurred).
chris says
Doc Savage Fan
You need to be more specific about what particular result you’re referring to. The more general point is that one can hardly use the observations of gamma ray-induced microparticle formation in a chamber with an atmospheric mimic to say very much about the ability of GCR to “modulate clouds”.
Barton Paul Levenson says
Dennis: For information on why it isn’t the sun causing the current global warming, start here:
http://BartonPaulLevenson.com/Sun.html
Ani says
Just a remark from an old weather forecaster and solar analyst, with the understanding that in forecasting the weather the main worry is the next 24 hours, and if my knee ached in the morning it might influence a forecast for rain in the afternoon. In the 80’s, when I heard about global warming, I brushed it off to solar activity. When activity was high in the 11 year cycle, it seemed to me that the longwave pattern would change over the U.S. – my guess was the positioning of the low pressure off of Alaska. The east and west coasts would be warmer and the midwest would be colder, with the possibility of polar outbreaks in the winter. When solar activity was low (cooler), the midwest was warmer and the coasts cooler, with the possibility of summer winds and fire in California. If this makes sense, then over the last 5-6 years the midwest should have been warmer and the coasts cooler. It seems to me that I could pick up the 11 year cycle in temps and weather at different locations; also the farmers’ almanac used 5 year climatology, which sometimes made them better than the government, since they use 1 year. Of course, throw in El Nino and then I just get confused. But anyway, what I’m trying to get at is that slight variations in temp seem to be able to make big differences in regional weather. What worries me is that the earth should have been slightly cooler the last 5 years, and with the increase in solar activity now we may have interesting things happening in the next few years.
Barton Paul Levenson says
Doc: It appears to be very clear that GCR does in fact modulate clouds.
BPL: It’s not clear at all. Read the asinine “The Chilling Stars” opus by Svensmark and Calder. Published in 2007. All the charts of GCR-cloud correlations cut off at 1995. Can you guess why?
Ray Ladbury says
Dennis Hamilton says “I remain unconvinced that the amount of solar/magnetic radiation changing by 1% does not have a huge impact on thermodynamic exchanges in the atmosphere and terra firma.”
Would that be because you don’t understand the Stefan-Boltzmann Law?
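For the record, the arithmetic is short (rough numbers, no greenhouse effect and no feedbacks, so purely illustrative):

sigma = 5.670e-8                                 # Stefan-Boltzmann constant, W m^-2 K^-4
S0, albedo = 1361.0, 0.3                         # approximate solar constant and planetary albedo

def t_eff(S):                                    # effective (no-greenhouse) temperature
    return (S * (1 - albedo) / (4 * sigma)) ** 0.25

print(t_eff(S0), t_eff(1.01 * S0) - t_eff(S0))   # ~255 K baseline; a 1% solar change shifts it only ~0.6 K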
NoPreview NoName says
ADR wrote:
eric responded:
Eric, I’m surprised you answered that. The question is very unclear. What global warming? The increase from preindustrial times or compared to Earth in a vacuum? What GHG? Including or excluding water? Literally the question doesn’t even make sense. Maybe he was asking about how much of anthropogenic carbon dioxide is still in the atmosphere, although I don’t think so.
And you messed up the bolding.
Ken W says
Greg Leisner (73)
Greg, debates (often with cursing and raised voices) are not at all uncommon at scientific meetings. That’s definitely not limited to climate science. Scientists being convinced of their own evidence and strongly defending it is a great asset to the scientific process. The best ideas rise to the top and those that don’t hold up to careful peer scrutiny die off.
Even after there are well established and accepted theories in a field, scientists will continue to debate intensely on relatively minor points. That’s the current state in the field of climate science. While there will always be a few contrarians (as there were in Evolution, Plate Tectonics, etc.), the climate science debates now are far removed from the nonsensical debates you see on various blogs (e.g. Is it really warming? Are humans really responsible for the increased atmospheric CO2? etc.)
Alexandre says
Great post. I’m stunned these guys even have the nerve to keep codes away from scrutiny after all this Climategate noise.
ADR says
— Eric:
Thanks for the response. So to clarify (correct me if I’m wrong): The atmosphere is about 0.038% CO2. Of that, about 30% is man-made CO2. So, that translates into roughly 0.011% of the total atmosphere being man-made CO2.
— NoPreview NoName:
What I was trying to ask is, how much of an *impact* does CO2 have compared to other GHGs (as a percentage), even though the percentage of CO2 seems small. Eric said about 50%, which is a big factor.
[Response: But don’t confuse yourself here. Finding out that it’s ‘only 0.011%’ doesn’t tell you anything. If I gave you a glass of water that was 0.011% Draino, would you drink it? How about if I gave you a glass of water that was 0.000000001% plutonium?-eric]
Mike W says
I would like to ask Eli, Ray, Rod and Spaceman some more questions and continue probing their juicy, juicy brains but I don’t want to keep straying off the thread topic. Is there anywhere on RC where I can throw my own questions up?
ADR says
— Eric:
I’m not confused, I didn’t use the term “only” (you put that in there), and I understand your Draino example in that context. I just wanted to understand the basics and you provided them. Thanks.
Phil. Felton says
Mike W says:
18 December 2009 at 1:48 AM
Thank you guys, always exciting to be able to draw back the curtain on the world like this!
Rod, Eli (was hoping you’d contribute, I read some great stuff of yours written in 2007 in response to a similar question) Spaceman; great answers! Will need some time to digest them and do some more reading, may have some more qualifying questions.
Figured there would have to be an asymmetry for it to work, the message I’m taking away is that it is “easier” for vibration to “become” translational energy than it is for translational energy to become vibrational energy. I’m just reading a bit about equipartition and the ‘freezing out’ of rotational and vibrational modes….I think I see how it fits into the picture here.
Of course, now I want to know how much of the upwelling IR is thermalised? and “wasted” as translational energy, and how much is re-radiated.
In the lower troposphere the mean time between collisions is orders of magnitude smaller than the radiative lifetime of the excited state (ro-vibrational); consequently virtually all the absorbed energy is given up to surrounding molecules as translational energy. It’s only higher in the atmosphere (e.g. the stratosphere) that significant radiative exchange takes place, which is responsible for cooling by GHGs (mostly CO2, there being no H2O there).
If you want to find some numbers, look up papers on ‘CO2 lasers’. CO2 lasers rely on resonant collisional activation of excited vibrational states by excited N2 (excited by electric discharge).
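As a rough order-of-magnitude comparison (the exact values depend on pressure, temperature and the band in question, so treat these numbers as illustrative only):

t_collision = 1e-9       # s: typical time between molecular collisions near the surface
t_radiative = 1.0        # s: order of the radiative lifetime of the CO2 15-micron excited state
print(f"~{t_radiative / t_collision:.0e} collisions per radiative lifetime")   # ~1e+09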
dhogaza says
Do Roy Spencer and John Christy make the source code used to create the UAH MSU temperature reconstructions public?
Doug Bostrom says
Comment by Dennis Hamilton — 18 December 2009 @ 2:28 PM
Not any chance here that your entry question was rhetorical, disingenuous? Contrasted with your first post on the topic, you seem entirely committed to what has rapidly evolved into a posture.
Have you actually read any of the documents referenced by Weart? If not, why are you asking Gavin for his input?
RB says
ADR #138, a useful analogy that I came across is that milk is opaque because of its fat content and skim milk has < 1% fat.
Donald Oats says
The blog article is a little picky, for Scafetta and West do not need to release their code since they are wrong. After all, who is to say that you aren’t the correct party; and performing both a wavelet and a band-filtering method means you have corroborated your analysis by comparing two different techniques.
Therefore, by the inverted (perverted) rules of denialism, it is you who must publish all of the code and data for S&W to pick over and go “No, it isn’t periodic, use reflecting boundaries.” etc. See? No matter what you do, you are the ones who must release code and data. Isn’t it grand :-P
Good blog as always.
Dan Hughes says
eric @ 138
Plutonium Toxicity:
“Toxicity
Isotopes and compounds of plutonium are toxic to highly toxic due to their radioactivity. Contamination by plutonium oxide (spontaneously oxidized plutonium) has resulted from a number of military nuclear accidents where nuclear weapons have burned.[66] However, based on chemical toxicity alone, the element is less dangerous than arsenic or cyanide and about the same as caffeine.[67][68]
Plutonium is more dangerous when inhaled than when ingested. The risk of lung cancer increases once the total dose equivalent of inhaled radiation exceeds 400 mSv.[69] The U.S. Department of Energy estimates that the lifetime cancer risk for inhaling 5,000 plutonium particles, each about 3 microns wide, to be 1% over the background U.S. average.[70] It is not absorbed into the body efficiently when ingested; only 0.04% of plutonium oxide is absorbed after ingestion.[26] When plutonium is absorbed into the body, it is excreted very slowly, with a biological half-life of 200 years.[71] Plutonium has a metallic taste.[72]
The alpha radiation plutonium emits does not penetrate the skin but can irradiate internal organs when plutonium is inhaled or ingested.[26] Particularly at risk are the skeleton, where it is likely to be absorbed by the bone surface, and the liver, where it collects and becomes concentrated.[25]
Considerably larger amounts may cause acute radiation poisoning and death if ingested or inhaled; however, no human is known to have died because of inhaling or ingesting plutonium, and many people have measurable amounts of plutonium in their bodies.[68]”
Steve Fish says
Comment by Dennis Hamilton — 18 December 2009 @ 2:28 PM:
I understand what you are getting at, but first I caution you that no scientist worth his salt will give you 100% certainty on anything – not even whether the sun will come up tomorrow. Our star is the source of all life and it is a very big fusion reactor nearby, but it is, thankfully, very consistent. A one percent difference in the sun’s output would be very noticeable.
Measuring our sun’s output is easy; all you have to do is point a measuring device at it. Relatively simple calculations show that our earth without any greenhouse gasses would be an iceball. Only 280 ppm of CO2 makes it comfortable and a very small difference in CO2 doesn’t have much of an effect, relative to winter/summer differences, but it does affect worldwide weather patterns and polar ice dramatically. The earth’s temperature is delicately balanced, such that it is just right for our current ecology, and this is the reason why there are dramatic changes in geological time that are affected by relatively subtle changes in solar energy and other forcings.
Your misconceptions are why Doug Bostrom has suggested that you read Spencer Weart’s “Discovery of Global Warming,” found by way of the “start here” button at the top of the main Real Climate page. Look under– “Informed, but in need of more detail.” You will find it to be an interesting read. Come back to ask questions.
Steve
Ray Ladbury says
Phil Felton says, “Figured there would have to be an asymmetry for it to work, the message I’m taking away is that it is “easier” for vibration to “become” translational energy than it is for translational energy to become vibrational energy.”
The asymmetry is in the energies. Few gas molecules in the atmosphere (~1% or so I think) have sufficient energy to excite the vibrational mode of CO2. When such a mode is excited by an IR photon, the vibrational mode has an excess of energy compared to what would be expected via equipartition. As such it will tend to de-excite collisionally–and the long life of the vibrational state means it has plenty of time to do so.
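A quick Boltzmann estimate for the 667 cm^-1 bending mode gives a few percent, the same general ballpark (rough numbers, ignoring degeneracy and band structure):

import math
h, c, k = 6.626e-34, 2.998e8, 1.381e-23      # Planck constant, speed of light, Boltzmann constant (SI)
nu_tilde = 667e2                             # CO2 bending mode wavenumber, 667 cm^-1 in m^-1
T = 288.0                                    # near-surface temperature, K
frac = math.exp(-h * c * nu_tilde / (k * T)) # Boltzmann factor exp(-E/kT) for that excitation energy
print(f"{frac:.1%}")                         # a few percent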
Ray Ladbury says
Alexandre says “Great post. I’m stunned these guys even have the nerve to keep codes away from scrutiny after all this Climategate noise.”
Let’s not get too excited. Remember, the Internet is only a decade and a half old. Prior to that there was no way to easily share datasets. And sharing code is still something I have doubts about–it seems like an excellent way to propagate errors into formerly independent analyses.
[Response: My emphasis –eric]
Spaceman Spiff says
Mike W (post #139):
Your best bet is to consult a textbook on atmospheric radiation and climate. You can start with David Archer’s “Global Warming: Understanding the Forecast” ( http://geoflop.uchicago.edu/forecast/docs/index.html ). Or watch his lectures from this past Fall: http://geoflop.uchicago.edu/forecast/docs/lectures.html . Another fine introductory book is “Global Warming: The Complete Forecast”. These are science textbooks at an introductory level: how does Earth’s climate work, and what are the data pertaining to climate change?
If, after digesting one or both of these, you think you’re ready to swim in deeper waters, then you might try Grant Petty’s book, “A First Course in Atmospheric Radiation” (parts of the 1st Ed. including all figures are online: http://www.sundogpublishing.com/AtmosRad/index.html ). It’s a textbook of the relevant physics of radiation transfer in Earth’s atmosphere; climate is a subordinate issue, and climate change is hardly mentioned. Topics include radiation transfer, absorption, scattering, molecular physics, opacities, etc. There may well be a better book at a level intermediate between the above; this one is soon going to press, but is still available from the author’s website: http://geosci.uchicago.edu/~rtp1/ClimateBook/ClimateBook.html . This latter text is again about the physics of Earth’s climate, and climate change is largely restricted to paleoclimates.
At that point, you might then have a first year graduate student’s knowledge in Earth’s climate. :-)