In a new GRL paper, Svensmark et al. claim that the liquid water content in low clouds is reduced after Forbush decreases (FD), and that for the most influential FD events, the liquid water content in the oceanic atmosphere can diminish by as much as 7%. In particular, they argue that there is a substantial decline in liquid water clouds, apparently tracking the declining flux of galactic cosmic rays (GCR) and reaching a minimum some days after the drop in GCR levels. The implication would be that GCR can affect climate by modulating low-level cloudiness. The analysis is based on various remote sensing products.
The hypothesis is this: a rapid reduction in GCR, due to a FD, results in reduced ionization of the atmosphere, and hence fewer cloud drops and less liquid water in low clouds. Their analysis of various remote sensing products suggests that the opacity due to aerosols (measured in terms of the Angstrom exponent) reaches a minimum ~5 days after the FD, and that the cloud liquid water content (CWC) reaches a minimum ~7 days after the FD. They also observe that the CWC minimum takes place ~4 days after the fine-aerosol minimum (the numbers here don’t seem to add up).
The paper is based on a small selection of events and on specific choices of events and bandwidths. It does not provide any proof that GCR affect low clouds; at best, it can lend support to this hypothesis. There are still a lot of hurdles to clear before one can call it proof.
One requirement for successful scientific progress in general is that new explanations or proposed mechanisms must fit within the big picture and be consistent with other observations. They must also be able to explain other relevant aspects. A thorough understanding of the broader subject is therefore often necessary to put the new pieces into the larger context. It is typical of non-experts not to place their ideas in the context of the bigger picture.
If we look at the big picture, one immediate question is why it should take days for the alleged minimum in CWC to be visible? The lifetime of clouds is usually thought to be on the order of hours, and it is likely that most of the CWC has precipitated out or re-evaporated within a day after the cloud has formed.
In this context, the FD is supposed to suppress the formation of new cloud condensation nuclei (CCN), and the time lag of the response must reflect the lifetime of the clouds and the time it takes for new ultra-fine molecule clusters (tiny aerosols) to grow into CCN.
The next question, then, is why the process through which ultra-fine molecule clusters grow by a factor of ~1000 to become CCN takes several days, while the clouds themselves have a shorter lifetime.
There is also a recent study in GRL by Pierce and Adams on modeling CCN (discussed in a comment in Science on May 1st, 2009), which is directly relevant to Svensmark et al.‘s hypothesis but is not cited in their paper.
Pierce and Adams argue that the theory is not able to explain the growth from tiny molecule clusters to CCN. The work by Svensmark et al. is therefore not very convincing if they do not discuss these issues, on which their hypothesis hinges, even if the paper by Pierce and Adams was too recent to be cited in their paper.
But Svensmark et al. also fail to make reference to another relevant paper, by Erlykin et al. (published January 2009), which argues that any effect on climate is more likely to come directly from solar activity than from GCR, because the variations in GCR lag the variations in temperature.
Furthermore, there are two recent papers in the Philosophical Transactions A of the Royal Society, ‘Enhancement of cloud formation by droplet charging‘ and ‘Discrimination between cosmic ray and solar irradiance effects on clouds, and evidence for geophysical modulation of cloud thickness‘, that are relevant for this study. Both support the notion that GCR may affect cloudiness, but in ways different from those Svensmark et al. propose. The first of these studies focuses on time scales of minutes and hours, rather than days. It is difficult to explain how changes in the current densities taking place minutes to hours after solar storms could have a lasting effect 4-9 days later.
There are many micro-physical processes known to be involved in low clouds, each affecting the cloud droplet spectra, the CWC, and the cloud lifetimes. Such processes include collision and coalescence, mixing processes, winds, phase changes, heat transfer (e.g., diffusive and radiative), chemical reactions, precipitation, and temperature effects. The ambient temperature determines the balance between the amount of liquid water and that of water vapour.
On the more technical side, the paper does not communicate well why 340 nm and 440 nm should be the magic numbers for the remote sensing data and the Angstrom exponents calculated from the Aerosol Robotic Network (AERONET). There are also measurements at other wavelengths, and Svensmark et al. do not explain why these particular choices are best for the type of aerosols they want to study.
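For reference, the Angstrom exponent is usually estimated from the aerosol optical depth (AOD) measured at two wavelengths. The minimal sketch below uses made-up AOD values, not numbers from Svensmark et al. or AERONET; it only illustrates the standard calculation for a 340/440 nm pair.

```python
# Illustrative sketch (not from the paper): the Angstrom exponent alpha follows from
# AOD at two wavelengths via  alpha = -ln(tau1/tau2) / ln(lambda1/lambda2).
# Larger alpha indicates a relatively larger contribution from fine aerosols.
import numpy as np

def angstrom_exponent(tau1, tau2, lambda1_nm, lambda2_nm):
    """Angstrom exponent from AOD measured at two wavelengths (in nm)."""
    return -np.log(tau1 / tau2) / np.log(lambda1_nm / lambda2_nm)

# Hypothetical AOD values at the two channels discussed above:
print(angstrom_exponent(0.12, 0.10, 340.0, 440.0))  # ~0.71 for these made-up numbers
```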
For a real effect, one would expect to see a response in the whole chain of the CCN-formation, from the smallest to the largest aerosols. So, what about the particles of other sizes (or different Angstrom exponents) than those Svensmark et al. have examined? Are they affected in the same way, or is there a reason to believe that the particles grow in jumps and spurts?
If one looks long enough at a large set of data, it is often possible to discern patterns that arise purely by chance. For instance, ancient scholars thought they found meaningful patterns in the constellations of stars in the sky. Svensmark et al. selected a smaller number of FDs than Kristjansson et al. (published in 2008), who found no clear effect of GCR on cloudiness.
Also, statistics based on only 26 data points, or only 5 events as presented in the paper, are bound to involve a great deal of uncertainty, especially in a noisy environment such as the atmosphere. It is important to ask: could the similarities arise from pure coincidence?
Applying filtering to the data can sometimes bias the results. Svensmark et al. applied Gaussian smoothing with a width of 2 days (and a maximum of 10 days) to reduce fluctuations. But did it reduce the ‘right’ fluctuations? If the aerosols need days to form CCN and hence clouds, wouldn’t there be an inherent time scale of several days? And is this accounted for in the Monte-Carlo simulations they carried out to establish the confidence limits? By requiring the minimum to take place in the interval 0-20 days after the FD, and defining the base reference as 15 to 5 days before the FD, a lot is already given. How sensitive are the results to these choices? The paper does not explore this.
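To make the question about the Monte-Carlo test concrete, here is a minimal sketch of the kind of null test one would like to see: surrogate noise series smoothed with the same ~2-day Gaussian kernel, with the minimum searched for in the same 0-20 day window and referenced to the same pre-FD baseline. The noise model, the series length, and all numbers are illustrative assumptions, not Svensmark et al.’s actual procedure.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

rng = np.random.default_rng(0)

def min_depth(series, base=slice(5, 16), window=slice(20, 41)):
    """Depth of the deepest minimum in the 0-20 day post-FD window,
    measured relative to the mean of a -15 to -5 day baseline."""
    return series[base].mean() - series[window].min()

# Hypothetical daily composite running from day -20 to day +20 around the FD
# (41 points); here it is just smoothed noise standing in for real data.
observed = gaussian_filter1d(rng.normal(size=41), sigma=2.0)
obs_depth = min_depth(observed)

# Monte-Carlo null test: how often does pure noise, smoothed the same way,
# produce an equally deep minimum somewhere in the 0-20 day window?
n_trials = 10_000
hits = sum(min_depth(gaussian_filter1d(rng.normal(size=41), sigma=2.0)) >= obs_depth
           for _ in range(n_trials))
print("chance probability:", hits / n_trials)
```

The point is that both the smoothing and the freedom to place the minimum anywhere in a 20-day window make spurious ‘dips’ more likely, and the significance test has to reflect that.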
For a claimed ‘FD strength of 100%’ (whatever that means), the change in cloud fraction was found to be on the order of 4% ± 2%, which, they argue, is ‘slightly larger than the changes observed during a solar cycle’ of ~2%. This is not a very precise statement. And when the FD is only given as a percentage, it is difficult to check the consistency of the numbers. For example, is there any consistency between the changes in the level of GCR and cloud fraction between solar minimum and maximum, and during a FD? And how does cloud fraction relate to CWC?
Svensmark et al. used the South Pole neutron monitor, with a cut-off rigidity of 0.06 GV, to define the FDs; this monitor is also sensitive to low-energy particles from space. Higher energies are necessary for GCR to reach the lower latitudes on Earth, and the flux tends to diminish with higher energy. Hence, the South Pole monitor is not necessarily a good indicator for the higher-energy GCR that could potentially influence stratiform clouds at low latitudes.
In their first figure, they show a composite of the 5 strongest FD events. But how robust are these results? Does an inclusion of the 13 strongest FD events or only the 3 leading events alter the picture?
Svensmark et al. claim that the results are statistically significant at the 5% level, but in the quantitative comparison (their second figure) of the effect of the FD magnitude in each of the four data sets studied, it is clear that there is strong scatter and that the data points do not lie neatly on a line. It therefore looks as if the statistical test was biased, because the fit is not very impressive.
The GRL paper claims to focus on maritime clouds, but it is reasonable to question whether this is true, since the air moves some distance in 4-9 days (the time between the FD and the minimum in CWC) due to the winds. This suggests that the initial ionization probably takes place over regions other than where the CWC minima are located 4-9 days afterward. It would be more convincing if the study accounted for the geographical patterns and the advection by the winds.
Does the width of the minimum peak reveal time scales associated with the clouds? The shape of the minimum suggests that some reduction starts shortly after the FD and reaches a minimum after several days. For some data, however, the reduction phase is slower; for others, the recovery phase is slower. The width of the minimum is 7-12 days. Do these variations reflect part of the uncertainty of the analysis, or is there some real information there?
The paper does not discuss the lack of a trend in GCR at moderate energy levels, or the role GCR play in climate change. They have done that before (see previous posts here, here, and here), and it is wise to leave out statements that do not have scientific support. But it seems they are looking for ways to back up their older claim, and news reports and the press release on their paper make the outrageous claim that GCR have been demonstrated to play an important role in recent global warming.
A recent analysis carried out by myself and Gavin, and published in JGR, compares the response to solar forcing in the GISS GCM (ER) with the observations. Our analysis suggests that the GCM provides a realistic response in terms of the global mean temperature, well within the bounds of uncertainty, which are large when linear methods are applied to analyse a chaotic system. The model does not include the GCR mechanism, and the general agreement between model and observations is therefore consistent with the effect of GCR on clouds being minor in terms of global warming.
As an aside to this issue, there have been some new developments regarding GCR, galaxy dynamics, and our climate (see the commentary at environmentalresearchweb.org), discussed previously here.
Mark says
“I will now claim that the Sun will rise in the NNE 100 years from now. Either accept that or prove I’m wrong. The onus is yours.
Comment by Rod B”
You made the claim. Prove it.
But claiming that AGW is caused by GCRs requires proof of that statement.
But Rod B, you’re being deliberately obtuse.
Mark says
“Denialists are not going to cross your eyes and dot your t’s if you step out into the public and start telling them that clouds are going to boil the seas and fry the land.”
And only you’ve said that, Bateman.
No climate scientist investigating AGW.
YOU.
Mark says
“I’m still having difficulty in understanding how c02 traps heat in the atmosphere. It doesn’t have heat capacity itself, but transfers heat”
P Wilson, why do you think that CO2 has no heat capacity?
Every other gas does.
Mark says
“Its a theoretical question: I received an email from the MET Office here in the UK asking them why the temperature plummeted during the solar eclipse.”
If it’s a theoretical question, then why did you have them answer it with “CO2 doesn’t have heat capacity”?
If they really answered that question with that, it isn’t a theoretical question, it’s a real one and I’d have to ask why you got the email asking the Met Office why it gets cooler when there’s an eclipse.
And if CO2 had no capacity for heat, it would be much colder with the sun out of the way than it is.
J. Bob says
#243 – Brian – The purpose was to show a simple example of how Fourier methods can be used to do some pretty good filtering, or to extract information from a “noisy” signal. In this case, looking at certain frequencies in a time series (such as temperature) and comparing it visually with another signal (sunspots). You can also use these methods for correlation analysis, but that’s another story.
One of the nice things about Fourier convolution is that it reduces the “phase delay” that many filtering methods introduce (Butterworth, Chebyshev, etc.), and it can work better at the end points than, say, moving averages. However, the best approach is to use a variety of tools, so as to cross-check the results.
So in the example presented, the English data from 1659-2008 was converted to the frequency domain (via the Fourier transform [FFT]), after detrending and checking for “pre-whitening”. It was then filtered through a “mask” or “kernel”. The mask removed the frequencies below 0.06 and above 0.12 cycles/year. The signal was then converted back to the time domain using the inverse Fourier transform. The resultant graph shows the filtered temperature data around the 0.1 cycles/year area, compared with sunspot activity. The spectral plot shows the positive-frequency content of the filtered temperature.
The Bandpass in the caption refers to the mask between 0-0.06 cycles/yr, while the lowpass refers to the mask above 0.12 cycles/yr. And yes, I would say the original data has some noise in it. (A minimal code sketch of this band-pass procedure follows below.)
Does that help?
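A minimal illustration of the procedure described above: detrend, transform with an FFT, zero out everything outside the 0.06-0.12 cycles/yr band, and transform back. The annual series below is synthetic (an ~11-year cycle plus noise) standing in for the 1659-2008 record, and the code is only a sketch of the general method, not the actual analysis.

```python
import numpy as np

# Made-up annual series standing in for the 1659-2008 record, purely for illustration.
years = np.arange(1659, 2009)
rng = np.random.default_rng(1)
temps = 9.0 + 0.3 * np.sin(2 * np.pi * years / 11.0) + 0.2 * rng.normal(size=years.size)

# 1) Detrend with a linear fit, 2) transform to the frequency domain,
# 3) apply the mask (keep only 0.06-0.12 cycles/yr, i.e. roughly 8-17 year periods),
# 4) transform back to the time domain. Zeroing coefficients this way is a
# zero-phase operation, which is the "no phase delay" point made above.
detrended = temps - np.polyval(np.polyfit(years, temps, 1), years)
spectrum = np.fft.rfft(detrended)
freqs = np.fft.rfftfreq(detrended.size, d=1.0)   # cycles per year (annual sampling)
mask = (freqs >= 0.06) & (freqs <= 0.12)
filtered = np.fft.irfft(spectrum * mask, n=detrended.size)
```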
Hank Roberts says
There’s a P. Wilson posting many long-debunked claims about physics over at:
http://www.spectator.co.uk/the-magazine/features/3755623/meet-the-man-who-has-exposed-the-great-climate-change-con-trick.thtml
Brian Dodge says
@ Carsten Brinch — 9 August 2009 @ 6:18 AM
Svensmark PR said “Our team at the Danish National Space Center has discovered that the relatively few cosmic rays that reach sea-level play a big part in the everyday weather. They help to make low-level clouds, which largely regulate the Earth’s surface temperature.”
Or “The Chilling Stars: A New Theory of Climate Change by Henrik Svensmark and Nigel Calder”, which reportedly claims “that the individual water droplets that make up a cloud form mostly where ions (charged particles) have been created by passing cosmic ray particles”, which offer “plausible explanations of the history of the Earth hundreds of millions of years ago” and “can explain warming without invoking man-made CO2.” Which has become a mantra for denialists.
http://www.grassrootinstitute.org/system/old/GrassrootPerspective/ChillingStars.shtml
Svensmark doesn’t just offer an explanation to anomalies in temperature, but a whole new theory for Climate Change, with a lot of hyperbole* along the way.
*Definition: exaggeration Synonyms: PR, amplification, big talk, coloring, distortion, embellishment, embroidering, enlargement, hype, laying it on thick, magnification, overstatement, tall talk. http://thesaurus.reference.com/browse/hyperbole
Kevin McKinney says
FYI, everybody, BobFJ has been spamming me with “back door” copies of exchanges with the moderators. I have asked him to stop.
I am tired of his evidently inflated sense of self-importance, and plan to avoid doing anything that feeds it–which at this point includes responding to any of his posts.
Just me.
Let’s talk about something else.
Patrick 027 says
PS – the idea that aerodynamics says bees can’t fly:
The aerodynamic formulas that say bees can’t fly are approximate formulas that are quite inaccurate on the scale of bees, though they apply well to airplanes. Different processes and factors come into and go out of dominance as scales change – a small vortex like a tornado can be approximately in cyclostrophic balance, whereas large-scale atmospheric motions tend to be approximately in geostrophic balance, to a first approximation. Useful approximations are important but do not apply universally as the full physical laws would.
chris says
re #246
which particular “anomalies in temperature which cannot be explained by CO2, CH4 or other greenhouse gas effects” were you (and Svensmark) thinking of, Carsten? Presumably they’re ones that also cannot be explained by solar irradiance variations, volcanic effects, ocean circulation, etc., or some combination of these..
..can you give some examples?
Rod B says
Mark (251), my point exactly…
Paul says
Patrick 027 — 7 August 2009 You wrote: “Like how the sun’s mass was determined by planetary orbital characteristics was an extrapolation?”
Kevin McKinney – 8 August 2009 You wrote: “The deduction and calculation WRT AGW operate upon vast volumes of real-world data, obtained at considerable effort and expense, and parsed very carefully indeed. The observed responses of the various climate systems let us know that the “deductions” are largely correct.”
The estimation of the sun’s mass is a good example to consider in terms of the “canonical form” of the scientific process. Observation: (Newton) This apple has just hit me on the head. (Brilliant) hypothesis: Gravitational attraction is proportional to the product of the masses and to the inverse square of the distance. A mathematical model for gravitation. Test: Extensive laboratory experiments with the capacity to falsify the hypothesis in good Popperian fashion. Astronomical observations which (along with the rest of Newtonian mechanics) generalised Kepler’s third law. Result: no falsification of the hypothesis from observations in the lab and celestial data, and hence the acceptance of a model which would be considered valid and useful for several hundred years. The fact that Newtonian mechanics was shown to have limited validity as one approaches the speed of light in no way detracts from the beauty and utility of Newton’s laws of motion and of universal gravitation. Since this mathematical model passed the critical test of being compatible with all observations, its application to a calculation of the mass of a planetary body is a scientifically credible “extrapolated deduction”.
Now, let us compare the canonical form of the validation of (Patrick 027’s example of) Newtonian mechanics with that of the hypothesis “manmade CO2 causes dangerous global warming”. Observation(s): CO2 is an effective absorber and emitter of LW in certain wavebands. An increase in CO2 should therefore inhibit radiative cooling and hence cause an overall rise in tropospheric temperatures, and a decrease in stratospheric or TOA emission temperatures. The surface temperature has been getting warmer since the early 18th century, and a lot warmer since the 1970s. Hypothesis: Manmade CO2 causes dangerous global warming. To avoid hair-splitting, and to permit falsifiability of the hypothesis, I will define “dangerous” here to mean more than 1.2 deg K per doubling of CO2 (an arbitrary definition which can certainly be challenged, but one intended in context here to consider whether in reality we see positive or negative feedback relative to a K&T clear-skies radiative calculation for CO2 on its own). Test: Because of the complexity of the system, vertical radiative calculations are parameterised into GCMs. GCMs (generally, although not true of all elements for all models) assume a constant relative humidity, a neutral or positive feedback from cloud formation, an aerosol level during the decades from the 40s to the 70s which is used as a matching parameter to explain the observed temperature decrease over that period, and that the only variation introduced by the sun is via direct radiative forcing due to variation in TSI (no accounting for any amplification effects due to GCR or increased stratospheric absorption of UV). OK so far. From these models, it is concluded that the hypothesis is validated.
Now a necessary condition for validating any numerical model is that it matches the observed data. Kevin McKinney argues that this has been done. With the greatest respect to Kevin McKinney, I have been looking for this now through scientific papers for about 4 months, and have not found it. On the contrary, I see (a) the constant relative humidity assumption is validated only at surface levels (where one expects it to be because of the primary model match to surface temperature and the near-surface availability of water); mid and upper tropospheric levels appear to show a flat or declining trend, and since most of the positive feedback effect on CO2 comes from water vapour, this appears to be a critical failing in the argument for large positive feedback and needs to be resolved; (b) several papers have reported on the fact that mid- and upper-tropospheric tropical temperature measurements are significantly different from the increased rate of rise predicted by the AR4 GCMs relative to surface temperature measurements; (c) one well-researched paper has seriously examined regional predictions from GCMs – which have negligible correlation to observed measurements – and has questioned whether one can sensibly believe an aggregate result from such models; (d) Antarctic cooling and increased ice extent do not match model projections; (e) the recent post-2001 cooling trend was not predicted by any of the AR3 models, and there is still no good explanation of why we see simultaneously a cooling trend and a flat or declining total ocean heat content over a period where there was a step-change increase in manmade production of CO2 and no volcanic activity to provide an aerosol explanation of the short-term temperature decrease; (f) ERBE data strongly suggest that all of the GCMs are underestimating the actual level of outgoing longwave radiation (surely a critical primary matching parameter?); (g) Lindzen (2009) uses the ERBE non-scanner data to deduce a negative feedback; (h) stratospheric cooling is a necessary but not sufficient condition for validation of the hypothesis, and there has been a cooling trend observable over the last 30 years; however, examination of the trend is disturbing, since it reveals a series of step changes associated with volcanic events. If the temperature trend is broken down naturally into periods of cooling/flat/heating, then, rather remarkably from a statistical perspective, there are very few occasions when we have seen simultaneously a decrease in stratospheric temperatures and an increase in SST, as predicted by the GCMs.
I would have to conclude from this that the statement “The observed responses of the various climate systems let us know that the ‘deductions’ are largely correct” is a triumph of optimism over reality.
Patrick 027 says
Re 256 – “Spectator” – is this the UK version of Washington Times? Holy @$%@! I’m on the verge of thinking there’s no hope for these people – education would be a lost cause. We’ll just have to keep enough of a political majority and steamroll over them.
Jacob Mack says
P. Wilson, if you feel focused and adventurous, read the intro here: Elements of Physical Chemistry by Peter Atkins and Julio de Paula, and then peruse chapter 19 (and I do mean peruse, it gets complex at this point, but the qualitative nature is explained too, with some simple numerical data as well), perhaps after reviewing chapter 1, which is the properties-of-gases chapter. Then pick up a good physical or world geography book and review the atmosphere, the oceans, and the coupling of the two. Robert Christopherson is excellent. Oh, and see here: http://www.fas.org/spp/military/docops/afwa/ocean-U1.htm for an excellent overview of water bodies, covering of course evaporation, conduction, convection, salinity, radiation, specific heat, and heat capacity, among other integral topics.
tamino says
Re: #262 (Paul)
When you say things like “the recent post-2001 cooling trend” it reveals that when it comes to trends, you’re clueless.
When you further suggest that this is evidence that computer models are inadequate, it reveals that you’re similarly clueless about models. They show exactly the kind of variation we’ve observed since 2001. It’s not a trend, it’s noise.
All you’ve done is construct an elaborate fantasy that enables you to deny the reality and danger of global warming. Pity.
Patrick 027 says
Paul –
You are quite incorrect on several fronts.
Much more is known than you think. There is the paleoclimatic record. GCMs do not assume approximately constant relative humidity, etc. – those are results of the GCMs. Water vapor does seem to be increasing in the upper troposphere. Then there is the snow/ice albedo feedback. Recent trends are within the range of what is to be expected given interannual variability. That’s not necessarily all, but I’ve got to go now.
Patrick 027 says
… And I should point out that the greenhouse gas absorption spectrum’s effect on Earth’s radiation to space can be directly observed with satellites.
Jacob Mack says
Paul #262: There is no “cooling trend” post 2001, as this is not a long enough time period to constitute a climate trend. Relative humidity is not just an “assumption,” as water vapor levels do tend towards equilibrium. So while it is true that air can “hold” more water vapor (not exactly true, as it is not holding per se; see here: hyperphysics.phy-astr.gsu.edu/HBASE/Kinetic/relhum.html), this does not equate to either an immediate “runaway” effect or an equilibration that leads to zero net warming effect. There is of course equilibration, and specific humidity does go on the incline and decline within observable parameters in most cases, and some GCMs do include fluctuating SH as well. However, you must also consider adiabatic processes in saturated air.
For more detailed info see this: Physics of Climate by José Pinto Peixoto and Abraham H. Oort, starting on page 54 on Google Books; the aforementioned Peter Atkins and Robert Christopherson references would also serve you well (Atkins, 2009 is on Google Books, but Christopherson is not found in good previews).
More recent: The Earth’s Atmosphere by Kshudiram Saha (2008), starting on page 44. There are later chapters covering this in more detail; try your hand at a few calculations and concepts, then if you have time we can discuss. The whole book is well written and accurate, so enjoy what you can. (Also, chapter 3.3 takes a look at the specific heat of gases.)
This is a brief response, but I wish to continue this discussion with you as I think you are inquisitive, but lacking crucial data and understanding. Lindzen has time and again been falsified as well, but we will hold off on that aspect of this discussion, if you choose to accept.
Jacob Mack says
Paul # 22, see my #264 post first before the latest installment addressed directly to you. Those references are helpful for you too.
Chris Colose says
Off-topic, but I thought RC readers might get a kick out of this
http://chriscolose.wordpress.com/2009/08/09/victorian-paleontologist-challenges-climate-change-and-is-letting-the-aps-know-about-it/
Jacob Mack says
Paul # 262, not 22…
Hank Roberts says
Paul, you say that you’ve been looking for 4 months for climate papers; what did you find?
Give us a list of the reading you’ve done that you found and how you found it; that will help make sense.
(So would more paragraph breaks, but citing your sources is the main thing you can do to be distinguishable from the people who just post beliefs here).
Does this describe one of the papers you found?
http://www.grist.org/article/Looking-for-validation/
Hank Roberts says
PS to paul — sometimes a useful exercise: take the terms used, and search
First Google
failing+”large+positive+feedback”+humidity+climate
Recognize any of the top hits that you get there?
(Actually put double quotes around the string “large positive feedback” when you do the search; that breaks the link so you can’t click on a search that includes a quoted string; apparently a WordPress bug.)
Well, then do the same with Google Scholar
Interesting difference, isn’t it? You find you’re in a different ballpark.
Now try the exact same search with Google Image Search
q=failing “large positive feedback” humidity climate
Do you find you’re back in the first ballpark again? Recognize the same blog names as from the first Google search?
So it’s interesting to know what your sources are and how you find them; that often determines your view of what’s out there to be found. The same people who pop up in the ordinary Google search also really completely dominate the image search.
Innarestin’ innit?
Rely on Scholar to start with.
Brian Dodge says
re 255 J. Bob — 9 August 2009 @ 9:49 AM
You mean like this?
http://www.woodfortrees.org/plot/noise/from:1900/to:2000/fourier/low-pass:11/high-pass:6/inverse-fourier/offset:-0.1/scale:2/plot/sidc-ssn/from:1900/to:2000/scale:0.001/fourier/inverse-fourier/plot/hadcrut3vgl/from:1900/to:2000/fourier/low-pass:11/high-pass:6/inverse-fourier/offset:-0.1
“You can also use these methods for correlation analysis, but that’s another story.”
I downloaded the numbers using the “raw data” link at the bottom of the woodfortrees page, plugged them into a google docs spreadsheet, and ran the correlation.
sunspot number to noise correlation = -0.126
sunspot number to temperature correlation = 0.063
Not exactly a Tung & Camp level of significance.
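For anyone who wants to reproduce this kind of check, a minimal sketch of the correlation step is below; the file names and the two-column layout are assumptions about what the “raw data” export looks like, not a documented woodfortrees format.

```python
import numpy as np

# Assumed file layout: two columns (fractional year, value) saved from the
# "raw data" link; the file names here are placeholders.
sunspots = np.loadtxt("sidc-ssn.txt")[:, 1]
temps = np.loadtxt("hadcrut3_filtered.txt")[:, 1]

n = min(sunspots.size, temps.size)               # align the two series by length
r = np.corrcoef(sunspots[:n], temps[:n])[0, 1]   # Pearson correlation coefficient
print(f"sunspot-temperature correlation: {r:.3f}")
```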
sidd says
I probably shouldn’t ask, but may I please have citations from Mr. Paul illuminating his loooong screed discrediting CO2-forced global warming?
The closest thing I see to a citation is Lindzen (2009), which I sincerely hope is not a reference to Lindzen’s address to the ICCC confederacy of dunces…
Mark says
re 259, that’s the point, isn’t it, though. Despite having a system that could not have bees flying, the models used WERE good enough to make aeroplanes and helicopters that *could* fly.
Mark says
“#243 – Brian – The purpose was to show a simple example of how Fourier methods can be used to do some pretty good filtering”
But filtering merely fits a pattern to a set of data.
Rather like the fitting of solar activity to the temperature data. Works when you find it (because you’re looking for a pattern and if it’s not there, you’ll look for a different pattern) but doesn’t predict anything.
Your Fourier analysis is “interesting” insofar as it exercises Fourier analysis.
It is completely uninteresting when it comes to what’s going on in the climate.
Carsten Brinch says
Hank and others called for answers and references. I sent it (should have had number 255), but the moderator obviously didn’t find it worth publishing. Which I absolutely do not understand.
Kindly
Carsten Brinch
[Response: sorry. I think I may have deleted that by mistake. Repost if you want. – gavin]
Mark says
RodB 261, that may be your point, but it has nothing to do with anything else on this thread.
Someone here says that GCRs are a cause and doesn’t think they need to prove it, just say it.
So your “point” applies to THEM: they make the statement and must prove it. The current CO2 AGW theory DOES explain what’s going on and IS proven. If GCR are doing it, they have to prove it, not just go “I’m not going to dot your i’s and cross your t’s”. Because neither are we. We aren’t going to dot Svensmark’s i’s or cross his t’s. And he hasn’t done it either. And we do not see him having proven anything, just fitted curves and made some assumptions with no reason behind them other than “we need to fit this curve”.
If someone wants to support Svensmark’s paper they need to dot the i’s and cross the t’s for that paper. Without that, the idea is unproven and unphysical.
CTG says
Re 262 Paul.
I assume the length of your post was an attempt to put as many inaccuracies into one post as possible.
I’ll just deal with one, as I’m sure others will be along soon to pick apart the rest of your diatribe.
You say “GCMs assume a constant relative humidity”
From the RealClimate model FAQ:
“Do models assume a constant relative humidity?
No. Relative humidity is a diagnostic of the models’ temperature and water distribution and will vary according to the dynamics, convection etc. However, many processes that remove water from the atmosphere (i.e. cloud formation and rainfall) have a clear functional dependence on the relative humidity rather than the total amount of water (i.e. clouds form when air parcels are saturated at their local temperature, not when humidity reaches X g/m3). These leads to the phenomenon observed in the models and the real world that long-term mean relative humidity is pretty stable. In models it varies by a couple of percent over temperature changes that lead to specific humidity (the total amount of water) changing by much larger amounts. Thus a good estimate of the model relative humidity response is that it is roughly constant, similar to the situation seen in observations. But this is a derived result, not an assumption. You can see for yourself here (select Relative Humidty (%) from the diagnostics).”
If you have spent 4 months researching this, but can still only regurgitate long-debunked sceptic talking points, it doesn’t say much for your research abilities. Took me all of 30 seconds to find that.
Ray Ladbury says
@262 Paul, now that was just sad. First, anthropogenic causation of warming is not a “hypothesis”. Rather, it is a consequence of what we know about climate. Constant humidity is not an “assumption.” Rather, it is a consequence of the physics and will be true on average. Finally, you claim that because you haven’t been able to find validation of the models, the models are falsified. OK, anybody want to help Paul spot the logical fallacy in this argument?
You know, Paul, when you get tired of beating up on your little straw men, the science will be there for you. So will reality. Amazing how well they coincide.
Kevin McKinney says
Paul, #262–I’ll just mention one point: the ocean cooling thing mentioned in your point (e) didn’t work out. Instrumental bias…
Here’s the actual paper; there’s a nice account of the story on Earth Observatory.
http://oceans.pmel.noaa.gov/Pdf/hc_bias_jtech_v3.pdf
Abstract:
“Two significant instrument biases have been identified in the in situ profile data used to estimate globally integrated upper-ocean heat content. A large cold bias was discovered in a small fraction of Argo floats along with a smaller but more prevalent warm bias in eXpendable BathyThermograph (XBT) data. These biases appear to have caused the bulk of the upper-ocean cooling signal reported by Lyman et al. (2006) between 2003 and 2005. These systematic data errors are significantly larger than sampling errors in recent years, and are the dominant sources of error in recent estimates of globally integrated upper-ocean heat content variability. The bias in the XBT data is found to be consistent with errors in the fall-rate equations, suggesting a physical explanation for that bias. With biased profiles discarded, no significant warming or cooling is observed in upper-ocean heat content between 2003 and 2006.”
Rene says
RE: #265 (Tamino)
We’ve now had ~10 years of non-increasing temperatures. How much longer would this need to continue before the upward trend hypothesis programmed into the models is abandoned? Would another 10 or 20 years of no upward trend do it?
DavidK says
Rene #283
Do you really think an “upward trend hypothesis (is) programmed into the models”?
Take a look here;
http://tamino.wordpress.com/2009/06/26/embarrassing-questions/
Tamino’s good at stats, but I would be surprised if the “trend” doesn’t point skywards with a vengeance by 2015 (I’ll lay odds it will only be another year or two).
Richard Steckis says
Ray Ladbury says:
“@262 Paul, now that was just sad. First, anthropogenic causation of warming is not a “hypothesis”. Rather, it is a consequence of what we know about climate. Constant humidity is not an “assumption.” Rather, it is a consequence of the physics and will be true on average.”
1. Anthropogenic warming IS a hypothesis. There has been no study to date that has advanced it to the theoretical stage let alone to fact.
2. It is likely that humans have had an impact on the climate. Not by the production of small amounts of GHGs compared to nature but by the act of modifying the landscape through land clearing, agriculture, the building of large cities etc.
3. You talk about water vapour feedback and constant humidity being a consequence of the physics as if that puts paid to any contrary view. However, how the physics actually works in a complex and dynamical system such as the climate system is not fully understood. If it were, there would be no further need for climate research. The fact is that what may be an easy explanation in theoretical physics may be a more complex and dynamic problem in the real world.
[Response: We don’t know everything so therefore we must know nothing. Brilliant. – gavin]
Rene says
OK, so after two more years you would begin to ask questions, and after six more Tamino would.
What are other people’s timelines?
And does the IPCC have one?
J. Bob says
274-Brian
That’s basically the idea. I assume the harmonic numbers are the cut-off frequencies. I’ve not spent as much time at wood4trees as I would like, but Paul and I have discussed using Fourier methods with phase-compensated recursive filters to tease more information out of some of the raw data sets. It looks like you can “detrend” the data prior to applying the transform, to reduce potential errors. It would be good to look at the spectral plot, if possible, to note any significant frequencies.
I assume the red chart is just filtered random noise.
No it’s not the best of correlations, nor is a single data set. It would be fun to toss in some other items like the NAO.
Hank Roberts says
Rene, see Robert Grumbine’s answer to your question about how long it takes to detect a trend:
http://moregrumbinescience.blogspot.com/2009/01/results-on-deciding-trends.html
Mark says
“We’ve now had ~10 years of non-increasing temperatures.”
Except they HAVE been increasing.
The last 10 years’ average is 0.2°C warmer than the average of the previous 10 years.
Petro says
RodB stated:
“His conclusion … is based in part on observation and supported by numerous other studies (which I can’t verify)”
Indeed, Rod, can you ever verify anything, except occasional quotes from articles you understand zip about?
There is no straw so thin that a denialist won’t hang onto it.
John P. Reisman (OSS Foundation) says
#274 J. Bob
I have a general question for you. Since the world is now cooling according to many sources on the internets:
– Why are we losing ice mass in the Arctic?
– Why is there global glacial mass loss?
– Why has the temperature been observed to rise 1.2 degrees F?
Rather than getting lost in the noise, let’s just hear your explanation for these three.
I don’t think you realize it (or maybe you do), but you, and many others, are nitpicking at things that are not relevant to the recognizable signal, separate from the noise.
Richard Steckis says
Gavin says:
“[Response: We don’t know everything so therefore we must know nothing. Brilliant. – gavin]”
Gavin. I said nothing of the sort. Please be mindful that we actually do know very little about the natural world. That is why we become scientists. To learn and understand.
Having said that, there is the common quote that “it is a wise man who knows enough to know that he knows nothing.”
Rod B says
Petro (290), how on earth could I be able to verify someone else’s sources? How on earth could you? Or what possible difference could it make? Talk about your strawman!
Hank Roberts says
Steckis — there is? Where?
Quote the string and paste it into Google.
John P. Reisman (OSS Foundation) says
#285 Richard Steckis
Huh, you’re a scientist?
http://www.merriam-webster.com/dictionary/theory
Gravity IS a hypothesis:
3 : the general or abstract principles of a body of fact, a science, or an art
AGW IS a theory:
1 : the analysis of a set of facts in their relation to one another
Patrick 027 says
Richard Steckis:
There is a lot we don’t know, of course. But:
1. some of the most useful portions of our bodies of knowledge are useful approximations (in fluid dynamics in particular – different density, viscosity, electromagnetism, Coriolis effect, space and time scales bring various different patterns of behavior into dominance; for some purposes, radiation can be ignored for weather processes over the course of hours or days, but radiation is essential for longer-term climate conditions; you need General Relativity to describe two neutron stars in a tight orbit, but Newtonian mechanics works just fine for designing buildings; the curvature of the Earth is not so important for local maps).
2. Even if/when what we know is a drop in the bucket compared to what there is, it can still be plenty for some purposes. Consider that we have been able to rely on computers without experimental confirmation or falsification of superstrings for quite some time.
There are mountains of data on climate. A lot is known. How much more do you need to be convinced?
Mark says
“Please be mindful that we actually do know very little about the natural world.”
We do?
Please tell us how wide our lack of knowledge is.
Can’t, can you.
Arguing from personal incredulity.
David B. Benson says
Richard Steckis (285) — Here is how BPL briefly puts it.
Barton Paul Levenson:
1. CO2 is a greenhouse gas (Tyndall 1859).
2. CO2 is rising (Keeling et al. 1958).
3. The new CO2 is mainly from burning fossil fuels (Suess 1955).
4. Temperature is rising (NASA GISS, Hadley CRU, UAH, RSS, etc.).
5. The increase in temperature correlates with the increase in CO2 (60–76% for temp. anomaly and ln CO2 for 1880-2007). See
http://bartonpaullevenson.com/Correlation.html
Looks soundly established to me. Also, I recommend you to read “The Discovery of Global Warming” by Spencer Weart:
http://www.aip.org/history/climate/index.html
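As an illustration of the calculation behind point 5 in the list above, here is a minimal sketch that correlates a temperature anomaly series with ln(CO2). The two series below are synthetic stand-ins, not the actual GISS/CRU and CO2 records, so the printed numbers mean nothing; the point is only the form of the calculation.

```python
import numpy as np

# Synthetic stand-ins (NOT real data): a CO2 curve rising from ~290 to ~385 ppm
# over 1880-2007 and a temperature anomaly that tracks ln(CO2) plus noise.
years = np.arange(1880, 2008)
co2_ppm = 290.0 + 95.0 * ((years - 1880) / 127.0) ** 2
rng = np.random.default_rng(2)
temp_anomaly = 3.0 * np.log(co2_ppm / co2_ppm[0]) - 0.3 + 0.1 * rng.normal(size=years.size)

# Correlate the anomaly with ln(CO2); r**2 is the "explained variance"
# quoted as a percentage in point 5 above.
r = np.corrcoef(np.log(co2_ppm), temp_anomaly)[0, 1]
print(f"r = {r:.2f}, r^2 = {r**2:.2f}")
```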
Carsten Brinch says
Carsten Brinch says:
10 August 2009 at 2:06 AM
Hank and others called for answers and references. I sent it (should have had number 255), but the moderator obviously didn’t find it worth publishing. Which I absolutely do not understand.
Kindly
Carsten Brinch
[Response: sorry. I think I may have deleted that by mistake. Repost if you want. – gavin]
That’s ok Gavin. But in the country where I live – it’s called Denmark – we don’t usually copy our posts before we send them. We treat each other’s words/notes with respect. I have never experienced a note being deleted by mistake here. And I have contributed to public debate for more than 20 years. In several countries.
You’re not wrong! We’re not wrong. We’re just different. And that’s ok. But I can’t repost what’s gone! It’s a sort of laid-back attitude that I do not quite understand. So I must try harder.
Kindly
Carsten Brinch
(Therapist)
Jacob Mack says
Paul, it seems you have plenty of reading material and posts to read. If you have further questions or comments, this is where to find good data and papers.