We’ve had a policy of (mostly) not commenting on the various drafts, misquotes and mistaken readings of the Fourth Assessment report (“AR4” to those in the acronym loop) of the IPCC. Now that the summary for policy makers (or “SPM”) has actually been published though, we can discuss the substance of the report without having to worry that the details will change. This post will only be our first cut at talking about the whole report. We plan on going chapter by chapter, hopefully explaining the key issues and the remaining key uncertainties over the next few months. This report will be referenced repeatedly over the next few years, and so we can take the time to do a reasonable job explaining what’s in it and why.
First of all, given the science that has been done since the Third Assessment Report (“TAR”) of 2001 – much of which has been discussed here – no one should be surprised that AR4 comes to a stronger conclusion. In particular, the report concludes that human influences on climate are ‘very likely’ (> 90% chance) already detectable in the observational record, up from ‘likely’ (> 66% chance) in the TAR. Key results here include the simulations of the 20th century by the latest state-of-the-art climate models, which demonstrate that recent trends cannot be explained without including human-related increases in greenhouse gases, along with consistent evidence of ocean heating, sea ice melting, glacier melting and ecosystem shifts. This makes the projections of larger continued changes ‘in the pipeline’ (particularly under “business as usual” scenarios) essentially indisputable.
Given all of the hoopla since the TAR, many of us were curious to see what the new report would have to say about paleoclimate reconstructions of the past 1000 years. Contrarians will no doubt be disappointed here. The conclusions have been significantly strengthened relative to what was in the TAR, something that of course should have been expected given the numerous additional studies that have since been done that all point in the same direction. The conclusion that large-scale recent warmth likely exceeds the range seen in past centuries has been extended from the past 1000 years in the TAR, to the past 1300 years in the current report, and the confidence in this conclusion has been upped from “likely” in the TAR to “very likely” in the current report for the past half millennium. This is just one of the many independent lines of evidence now pointing towards a clear anthropogenic influence on climate, but given all of the others, the paleoclimate reconstructions are now even less the central pillar of evidence for the human influence on climate than they have been incorrectly portrayed to be.
The uncertainties in the science mainly involve the precise nature of the changes to be expected, particularly with respect to sea level rise, El Niño changes and regional hydrological change – drought frequency and snow pack melt, mid-latitude storms, and of course, hurricanes. It can be fun parsing the discussions on these topics (and we expect there will be substantial press comment on them), but that shouldn’t distract from the main and far more solid conclusions above.
The process of finalising the SPM (which is well described here and here) is something that can seem a little odd. Government representatives from all participating nations take the draft summary (as written by the lead authors of the individual chapters) and discuss whether the text truly reflects the underlying science in the main report. The key here is to note that what the lead authors originally came up with is not necessarily the clearest or least ambiguous language, and so the governments (for whom the report is being written) are perfectly entitled to insist that the language be modified so that the conclusions are correctly understood by them and the scientists. It is also key to note that the scientists have to be happy that the final language that is agreed conforms with the underlying science in the technical chapters. The advantage of this process is that everyone involved is absolutely clear what is meant by each sentence. Recall after the National Academies report on surface temperature reconstructions there was much discussion about the definition of ‘plausible’. That kind of thing shouldn’t happen with AR4.
The SPM process also serves a very useful political purpose. Specifically, it allows the governments involved to feel as though they ‘own’ part of the report. This makes it very difficult to later turn around and dismiss it on the basis that it was all written by someone else. This gives the governments a vested interest in making this report as good as it can be (given the uncertainties). There are in fact plenty of safeguards (not least the scientists present) to ensure that the report is not slanted in any one preferred direction. However, the downside is that it can mistakenly appear as if the whole summary is simply up for negotiation. That would be a false conclusion – the negotiations, such as they are, are in fact heavily constrained by the underlying science.
Finally, a few people have asked why the SPM is being released now while the main report is not due to be published for a couple of months. There are a number of reasons. Firstly, the Paris meeting has been such a public affair that holding back the SPM until the main report is ready would probably be pointless. Secondly, the main report itself has not yet been proof-read, and there has not yet been enough time to include observational data up to the end of 2006. One final point is that improvements in the clarity of the language in the SPM should be propagated back to the individual chapters in order to remove any superficial ambiguity. The science content will not change.
Had it been up to us, we’d have tried to get everything together so that both could be released at the same time, but maybe that would have been impossible. We note that the Arctic Climate Impact Assessment in 2004 followed a similar procedure – which led to some confusion initially, since statements in the summary were not referenced.
How good have previous IPCC reports been at projecting the future? Actually, over the last 16 years (since the first report in 1990), they’ve been remarkably good on CO2 changes and temperature changes, but they actually underpredicted sea level changes.
When it comes to specific discussions, the two that are going to be most in the news are the projections of sea level rise and hurricanes. These issues contain a number of “known unknowns” – things that we know we don’t know. For sea level rise the unknown is how large an effect dynamic shifts in the ice sheets will be. These dynamic changes have already been observed, but are outside the range of what the ice sheet models can deal with (see this previous discussion). That means that their contribution to sea level rise is rather uncertain, but with the uncertainty all on the side of making things worse (see this recent paper for an assessment (Rahmstorf, Science 2007)). The language in the SPM acknowledges this, stating:
“Dynamical processes related to ice flow not included in current models but suggested by recent observations could increase the vulnerability of the ice sheets to warming, increasing future sea level rise. Understanding of these processes is limited and there is no consensus on their magnitude.”
Note that some media have been comparing apples with pears here: they claimed IPCC has reduced its upper sea level limit from 88 to 59 cm, but the former number from the TAR did include this ice dynamics uncertainty, while the latter from the AR4 does not, precisely because this issue is now considered more uncertain and possibly more serious than before.
On the hurricane/tropical storm issue, the language is quite nuanced, as one might expect from a consensus document. The link between SST and tropical storm intensity is clearly acknowledged, but so is the gap between model projections and analyses of cyclone observations: “The apparent increase in the proportion of very intense storms since 1970 in some regions is much larger than simulated by current models for that period.”
We will address some of these issues and how well we think they did in specific posts over the next few weeks. There’s a lot of stuff here, and even we need time to digest it!
Steve Bloom says
Re #197: Martin, you keep overstating the significance of the Roesch results. Can you quote something from the paper that backs up your POV? The abstract sure doesn’t seem to:
“Surface albedo (ALB), snow cover fraction (SCF) and snow water equivalent (SWE) of state-of-the-art coupled climate models are compared and validated against ground-based and remote-sensed climatologies. Most IPCC AR4 climate models predict excessive snow mass in spring and suffer from a delayed spring snow melt while the onset of the snow accumulation is generally well captured. This positive SWE bias is mainly caused by too heavy snowfall during the winter and spring season. Seasonal cycles of snow cover area (SCA) at continental scales are captured reasonably well by most participating models. Two models clearly overestimate SCA over both Eurasia and North America. Year-to-year variations are reasonably well captured over both Eurasia and North America in winter and spring. The most pronounced underestimation in the interannual SCA variability is generally simulated during snow melt. The pronounced negative SCA trend that has been observed from 1979 to 2000 is only partly reproduced in the AR4 model simulations. Furthermore, the computed trends show a large spread among the models. Results from time slice simulations with the ECHAM5 climate model suggest that accurate sea surface temperatures are vital for correctly predicting SCA trends. Simulated global mean annual surface albedos are slightly above the remote-sensed surface albedo estimates. The participating AR4 models generally reproduce the seasonal cycle of the surface albedo with sufficient accuracy while systematic albedo biases are predicted over both snow-free and snow-covered areas, with the latter being distinctly more pronounced. The study shows that the surface albedo over snow-covered forests is probably too high in various state-of-the-art global climate models. The analysis demonstrates that positive biases in SCA are not necessarily related to positive albedo biases. 
Furthermore, an overestimation of area-averaged SWEs is not necessarily related to positive SCA anomalies since the relationship between SWE and SCF is highly nonlinear.”
Also, to say that the WG1 SPM “ignored the draft reviewers” on this point is a rather strong statement. Evidence for that?
Steve Bloom says
Re #200: Dave, the AR process is designed to be conservative, and this is a good example of why. New research that only has one or two years of data behind it (as is the case with the dynamical melting) and appears in the last year before publication isn’t going to be reflected very well, and the problem is compounded by the long lag between ARs. OTOH I think the climate science community recognizes the importance of this research, and as we saw with the well-timed Science article (that got picked up in much of the coverage) can be relied upon to give it appropriate emphasis. That emphasis will increase greatly if the next year or two of data bear out the apparent trend, and at that point I don’t think it will be a problem that the AR4 didn’t say much about it.
Eric (skeptic) says
Re 186, 188: Steve, not sure. But my 2nd link in #138 is obviously inadequate since atmospheric CO2 can’t go to zero over time without fixation. I thought I was only considering diffusion and transfer in the impulse response. Hank, thanks for the link, that will get me started.
SteveB says
The Report says that “very likely” means more than 90%. But what does 90% mean, in a scientific sense? How was it calculated? I know they don’t mean that 9 out of 10 worlds would have a certain level of warming. I also don’t think it means that 9 out of 10 people on the panel believe something, or that 9 out of 10 computer models find something. So I am puzzled. How did they derive this 90% number?
BarbieDoll Moment says
RE: “rate of oceanic CO2 fixing”
Have a try at these.
DOE (1994) Handbook of methods for the analysis of the various parameters of the carbon dioxide system in sea water. Version 2, A. G. Dickson & C. Goyet, eds. ORNL/CDIAC-74
http://andrew.ucsd.edu/co2qc/handbook.html
“Program Developed for CO2 System Calculations (Ernie Lewis and Doug Wallace of Brookhaven National Laboratory. ORNL/CDIAC-105)”
http://cdiac.ornl.gov/oceans/co2rprt.html
And you could also refer to
Global Ocean Data Analysis Project
http://cdiac.ornl.gov/oceans/glodap/Glopresult.htm
Oceanographic Numeric Data Packages
http://cdiac.ornl.gov/oceans/doc.html
llewelly says
Steve Reynolds, you link to the draft. It is not the final document; it cannot resolve this confusion.
Rod B. says
re the speculation that rising sea levels from GW will inundate 2000 Indonesian islands: Doesn’t the sea level around Indonesia already vary by 1/2 to 1 meter just from the El Niño oscillations?
Pat Neuman says
‘There is so much ice here that if it all melted, sea levels globally would rise hugely – perhaps as much as 80m. Say goodbye to London, New York, Sydney, Bangkok, Rio… in fact, the majority of the world’s major cities.
But will it happen?’
http://news.bbc.co.uk/2/hi/science/nature/4315968.stm
– My answer is yes. I don’t know the specific timing of the jumps to higher levels but I’m confident that the governments should begin major action to get ready for them. – comment on Newsvine at:
http://stevenwandrews.newsvine.com/_news/2007/02/05/553769-climate-change-in-graphics#c513798
Rod B. says
Just to clarify re #s 95, 100, et al, which is kind of a side issue but I think important to these discussions. The basis of the debate from many/most skeptics is, of course, economic. Despite the frequent pooh-poohing of the economic consequences by many AGW proponents (some, it seems, even delight in them), those consequences are in fact tremendous. (Most of) us skeptics naturally and properly think this should not be accepted on the warnings of a few (O.K. a lot of) scientists without extreme scrutiny, even if they prove correct. I see nothing perverse or pernicious about that. I see it as responsible.
As an aside, (most of) us skeptics prefer not to “attack” the science/scientists. Scrutiny and attack are not the same thing. Though I admit the process, unfortunately, seems to be progressing to attack – re-attack – attack back, and is leaving scientific discourse, heated as it might be, in the dust.
Steve Reynolds says
llewelly> Steve Reynolds, you link to the draft. It is not the final document; it cannot resolve this confusion.
OK, but it appears to me that either the SPM is wrong or the draft figure 2.4 is wrong on the issue of radiative forcing % change. I guess we will find out in May how IPCC handles it.
Is RC staff who have seen a more recent draft allowed to comment on this?
Pavel Chichikov says
Re: # 34
” Would the moderators consider deleting the ignorant, sneering, hostile, insulting, content-free and completely worthless remarks from the flame troll identifying himself as “juandos”? Such drivel belongs on Free Republic or some other right-wing hangout, not here. ”
I don’t agree at all, since I found the response to juandos’ insinuation most valuable. Now, if people accuse IPCC scientists of screwing the world for money I have some information to use.
Rob Jacob says
I’d like to use this thread to make a small complaint about the IPCC website and the confusion I think they’ve sown about what was released last Friday.
The graphic at the top of that page says “The first volume will be released” and above that is “Paris, 2 February 07”. This makes one think it’s WG1 that will be released. But we know it’s just the SPM of AR4’s WG1 report.
And it’s needlessly difficult to find out when the actual WG1 report will be released. You first have to click on “About IPCC” and then “Working Group I”, which takes you to this page where you finally find that the report will be on the web in May and in book form in late June.
A final annoyance (and I hope embarrassment for the IPCC webmasters), if you click on “Calendar of Events” on the main IPCC website you go to a calendar of meetings…for 2006.
Great job on the publicity leading up to the SPM, turning off the lights on the Eiffel Tower and all that. And this was just the SPM. What will they do for the real WG1 report?
Ike Solem says
RE#183 and the role of physics in climate,
It’s true that climate involves a lot of physics, but it also involves a lot of biology – so climate science falls into the ‘interdisciplinary category’. One of the most interesting and noteworthy pieces of work on this is the recent report, http://www.sciencedaily.com/releases/2007/01/070131204349.htm which describes the use of the tropospheric emission spectrometer to get a unique picture of water vapor and ozone distributions.
The abstract on their research is at Nature, Feb 1 2007, “Importance of rain evaporation and continental convection in the tropical water cycle”, John Worden, David Noone and Kevin Bowman and The Tropospheric Emission Spectrometer science team and data contributors.
To quote from the ScienceDaily report, “The team also found evidence that water transported upwards by thunderstorm activity over land originates from both plant “exhalation” in large forests and evaporation over nearby oceans. The balance between these two different sources tells us how vegetation interacts with climate and helps maintain regional rainfall levels.
“This link between vegetation, hydrology and climate has implications for how societies choose to manage their ecological resources,” said Noone. “Our measurements provide a baseline against which future changes in vegetation-climate interactions can be measured.”
The link between the biosphere and the physical climate system has always been a matter of contention, but here we see how the different traditional branches of science all cooperate in the study of this system, from engineering to physics to biology and chemistry. This also demonstrates the threat of deforestation on a global basis to the climate system, i.e. less net uptake of CO2 as well as regional drying and drought.
It also demonstrates the importance of continuing to monitor the Earth from space – and since one of the main uncertainties in the IPCC report was the meridional overturning circulation changes, a network of ocean sensors is also needed. However, funding is lacking and many satellites need to be replaced, according to the National Academy of Sciences:
Aging weather satellite fleet at risk: According to a new study, crucial weather and environmental satellites soon will fail, and their replacements are insufficient and behind schedule.
This is an unbelievable state of affairs. Is it gross incompetence, or a deliberate effort to prevent data from being collected?
Klaus Ragaller says
As a former, now retired, researcher I follow with great respect the highly professional and fast-advancing work on a scientific understanding of the climate. I feel a responsibility to help communicate the results to the broader public. A completely out-of-perspective emphasis is put by many skeptics on the IPCC’s 90% statement of human influence. A lot of people try to hide behind the unspecified 10%. I think that examples of scenarios in this 10% space, and the underlying unrealistic combinations of parameters, would be helpful. Is this possible? I would be glad if someone from the research community could comment on this.
Martin Lewitt says
Re: Steve Bloom #201,
To quote from the Roesch full text:
“The mean annual surface albedo of the 15 AR4 models amounts to 0.140 with a standard deviation of 0.013. All AR4 models are slightly above the mean of PINKER (0.124) and ISCCP-FD (0.121). However, on a global scale, differences among the models, as well as the biases between the models and the remote-sensed climatologies, are small. Three (MRI-CGCM2.3.2, INM-CM3.0, and CSIRO-Mk3.0) out of the 15 AR4 models are more than 1 standard deviation above the all-model mean; two models (GISS-EH and PCM)are more than 1 standard deviation below the all-model average.”
Notice that ALL of the models are above the surface albedo data, even those that are more than one standard deviation below the all-model mean. I don’t know what led Roesch to characterize the model errors as “small”. Perhaps if he had characterized them as large, or as several times the net annual global energy imbalance, he would have had a harder time getting published. The positive surface albedo biases of the models are 0.140 minus 0.124 and 0.140 minus 0.121 against the two data sets respectively. Apply these globally/annually averaged surface albedo biases to the corresponding solar surface flux of 198 W/m^2 and you get correlated model biases of 2.8 to 3.8 W/m^2.
As to my “ignored the draft reviewers” claim: I am one of the draft reviewers, and pointed out that this correlated bias was several times the energy imbalances we are trying to attribute and project. I pointed out that the paper was relevant not only to Chapter 9, for which it had been submitted, but also to Chapter 10 on global projections. My calculations at the time were based on TOA fluxes, but in a follow-up letter I translated that to surface fluxes, and suggested that if they couldn’t bring themselves to drop Chapter 10 since the models had been invalidated, they should at least delay AR4 until the models could be corrected and the scenarios rerun.
The solar surface flux is from:
http://www.grida.no/climate/ipcc_tar/wg1/fig1-2.htm
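The bias arithmetic quoted above can be sketched as follows. The albedo and flux figures are those cited from Roesch and the TAR figure (treat them as assumptions); note that with these inputs the lower bound comes out nearer 3.2 W/m^2 than the 2.8 quoted:

```python
# Sketch of the surface-flux bias implied by the quoted albedo biases.
# All input figures are taken from the comment above, not verified here.
MODEL_MEAN_ALBEDO = 0.140   # 15-model AR4 mean (Roesch)
PINKER_ALBEDO = 0.124       # PINKER remote-sensed estimate
ISCCP_ALBEDO = 0.121        # ISCCP-FD remote-sensed estimate
SOLAR_SURFACE_FLUX = 198.0  # W/m^2, global/annual average (TAR Fig 1.2)

def albedo_bias_flux(model_albedo, obs_albedo, flux=SOLAR_SURFACE_FLUX):
    """Reflected-flux bias (W/m^2) implied by an albedo bias."""
    return (model_albedo - obs_albedo) * flux

low = albedo_bias_flux(MODEL_MEAN_ALBEDO, PINKER_ALBEDO)   # ~3.2 W/m^2
high = albedo_bias_flux(MODEL_MEAN_ALBEDO, ISCCP_ALBEDO)   # ~3.8 W/m^2
print(f"implied bias range: {low:.1f} to {high:.1f} W/m^2")
```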
Charles Muller says
#204 Steve, you should read the Guidance Notes for AR4 lead authors, linked below. Likelihood (point 14) expresses either a quantitative analysis or an elicitation of expert views. The second kind of likelihood is of course more subjective.
http://www.ipcc.ch/activity/uncertaintyguidancenote.pdf
Charles Muller says
#183
Tamino : “Regarding the sun: its variations are indeed far too small to account for the temperature increase in the modern global warming era. But they are not negligible, and are included in all realistic climate models. Without accounting for solar variations, the models fail to match the early-20th-century warming very well; with them, the models match the entire 20th century with stunning accuracy.”
Yes, but have a look at Table 10.2.1 in the Second Draft. It summarizes the radiative forcing agents the models take account of, for the 20th century simulations and 21st century projections. If I understand correctly, solar forcing is omitted or held constant in the great majority of models. So how do you expect these models to be correct, notably for the 1915-45 warming?
Risto Linturi says
#90 and #157, I wish to add another viewpoint to the usability of coal. Our current infrastructure prefers oil and gas, but much of the growth in China is based directly on coal. Coal can be turned into oil and gas – the overhead cost is about 15 dollars per barrel, and the investment requirement is about 50 thousand dollars per barrel/day of capacity, calculated from the latest plant being built in China. The total investment cost to double today’s natural oil production using coal would be only a few trillion dollars. There are already several large plants being built for this purpose in China. These investments are generally considered profitable when the price of crude oil stays above 60-70 dollars per barrel. The higher the price, the faster the return on investment. When crude oil is over 90 dollars, this industry has such a high return on investment that there is scarcely money left for any other investments, even though they would also be profitable. So the scenario that alternatives to fossil fuels will become economically attractive as oil gets scarce is not clear. And if you ask where all these trillions come from – it is easy. Oil producers get several trillion each year. There is money; it is sucked away from every other profitable investment scenario.
This is not a good omen for voluntary controls. If I look at the different scenarios from the viewpoint of a general futurist, it seems that A1FI is the most likely, based on existing trends and facts. All lower scenarios include wishful thinking that we could get our act together and put strong enough mandatory global controls in place. That is naturally my wish also, but since the time of the TAR, according to the Shell Global Scenarios to 2025, the energy efficiency of the world has decreased, and I claim we have followed A1FI and still are. Converting coal to oil will only make things worse from the efficiency perspective. I understand that this is a discussion for climate issues, but as the different scenarios in the IPCC reports are so crucially different, it might be a good idea to start a discussion on the scenarios and the assumptions behind them – and perhaps analyse which scenario we have followed since they were first introduced in IPCC reports.
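The payback arithmetic implied by these figures can be sketched as follows. This is a rough illustration only, using the per-barrel overhead and per-barrel/day capital cost quoted above, and ignoring coal price, financing and any other operating expenses:

```python
# Rough coal-to-liquids (CTL) payback sketch using the figures quoted above.
# These are illustrative assumptions, not verified plant economics.
CAPEX_PER_BPD = 50_000.0   # $ of investment per barrel/day of capacity
OVERHEAD_PER_BBL = 15.0    # $ conversion overhead per barrel

def payback_years(oil_price):
    """Simple payback time in years for one barrel/day of CTL capacity."""
    margin = oil_price - OVERHEAD_PER_BBL   # gross $ per barrel
    if margin <= 0:
        return float("inf")
    return CAPEX_PER_BPD / (365.0 * margin)

for price in (60, 70, 90):
    print(f"oil at ${price}/bbl -> payback ~{payback_years(price):.1f} years")
```

The higher the oil price, the shorter the payback, which is the sense in which returns on CTL crowd out other investment once oil is expensive.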
Eli Rabett says
Re 95 and 120, not only does the problem of climate change have a high procrastination penalty, in the context of US politics it has immediate political penalties. The US lost its chance to actually do something at low economic cost, not in this administration, but in the 1990s. The Republican Congress bears much of the blame, but the decision of the Clinton/Gore administration to push the science while spending no political capital on actually doing anything was a disaster whose fruits we reap today.
John L. McCormick says
RE #218
Risto, you provide a stark view of the post-oil world feeding greater demand as living standards improve in two of the most populated nations. I strongly agree with your comment. Coal to liquids and gas are here and now.
RC threads are challenged when contributors try to launch comprehensive discussions on oil-gas-coal-nuclear options because they attract the simple solution advocates of energy efficiency and renewables. Regrettably those approaches are not sufficient to meet the AGW future and increasing demands of an expanding global population. And, yes, they help and will more so when a carbon tax is finally implemented.
The dedicated sponsors of this web page (contributing their valuable time) are not strong on energy policy but, in the modeling work, they rely upon projections borne out of scenarios. I do not have much faith in the IPCC scenarios because IMO they do not give enough focus to coal as the alternative to oil and gas. It is inevitable because the nearly 500 million car global fleet will not run on wind and solar (OK, a few will).
If you are making a motion to the RC managers, I second it and urge them to invite contributors such as Jae Edmonds to lead the discussion.
Jeff Weffer says
Just noting that solar variability is vastly understated.
Just look at this chart. The 11-year cycle varies by about 1 W/m2 over the cycle (averaged) but the daily/annual irradiance varies by as much as 4 W/m2.
If you look at solar irradiance proxies over time, the average varies by as much as 4 W/m2. Converting this variance into percentage terms (0.1 per cent, which is actually 1-4 W/m2) just masks the total energy variation, which is more than that of CO2 currently.
http://en.wikipedia.org/wiki/Image:Solar-cycle-data.png
[Response: Wrong. You are (again) confusing TSI variation with the radiative forcing which is smaller by a factor of 0.7/4 = 0.175 (due to albedo and geometric effects). -gavin]
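A minimal sketch of the TSI-to-forcing conversion in the response above: the factor of 4 is geometric (sunlight intercepted on a disc, spread over a sphere) and the factor of 0.7 accounts for planetary albedo. The CO2 comparison figure is the AR4 value of about 1.66 W/m^2.

```python
# Convert a change in total solar irradiance (TSI) to a radiative forcing.
# Factor 1/4: sunlight is intercepted on a disc but spread over a sphere.
# Factor 0.7: ~30% of incoming sunlight is reflected (planetary albedo ~0.3).
PLANETARY_ALBEDO = 0.3

def tsi_to_forcing(delta_tsi):
    """Radiative forcing (W/m^2) from a TSI change (W/m^2)."""
    return delta_tsi * (1.0 - PLANETARY_ALBEDO) / 4.0

# 11-yr solar cycle amplitude of ~1 W/m^2 in TSI -> 0.175 W/m^2 forcing.
print(tsi_to_forcing(1.0))
# Even a 4 W/m^2 TSI swing -> 0.7 W/m^2 forcing, well below the AR4
# estimate of ~1.66 W/m^2 for CO2.
print(tsi_to_forcing(4.0))
```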
tamino says
Re: #217
When you say “Table 10.2.1 in Second Draft” what exactly are you referring to?
If you go to this page, and scroll down to the third graph (actually part “a” of the 2nd figure), you’ll see that the time series of effective forcing due to solar irradiance not only shows a secular increase in the early 20th century, it also shows a cyclic variation (due to the solar cycle). You can even get the numeric data here.
Nick Gotts says
Re #95 “I think there is a constructive role in the debate for conservative economic opinion. We certainly will not be able to address climate change if our economy is weakened.”
I’m very dubious about the first sentence here, although I suppose it could depend what kind of conservative economic opinion you mean: serious measures against climate change are going to require at the very least controls on the operation of “free markets” of a scale and depth unprecedented in capitalist economies. This is why the right-wing thinktanks are full of denialists: anthropogenic climate change shows up their economic nostrums (if everyone follows their own selfish interests, and governments stop interfering, the magic of the market will make us all rich and happy) for the nonsense they are. The second sentence is also doubtful: it is the strongest economies that are contributing most to the problem. Suppose some new disease were to wipe out humanity next week, removing the economy entirely: result, so far as I can see, a fall in GHG emissions faster than anyone has proposed trying to achieve. On a smaller scale, a 1929-style crash and subsequent slump would at least temporarily reduce emissions. There are plenty of good reasons not to want this to happen, but it would give us a few more years to bring emissions under control. Of course one can argue that a slowdown would reduce investment in new technologies, but until we can be confident such investment will reduce rather than increase emissions, I think the truth is more or less the opposite of what you claim: the stronger the economy, the faster the problem will get harder to solve. The basic problems are not technological, but political: unless and until both governments and publics of the major emitters are ready for serious action, emissions will go on climbing unless there is a slump; once they are, the technical problems are surmountable.
Paul M says
How about using a 15 mile long nanoshooter orbiting the earth that collects CO2 and methane and sends it escape velocity to mars? This would help Mars’ atmosphere and rid ours of that pesky stuff that is heating us up. It would give many scientists research jobs and could be funded by the upcoming carbon tax imposed on large users.
Barton Paul Levenson says
[[ It is inevitable because the nearly 500 million car global fleet will not run on wind and solar ]]
Unless the wind and solar are used to generate hydrogen, or plug-in electric cars come into widespread use.
Barton Paul Levenson says
[[How about using a 15 mile long nanoshooter orbiting the earth that collects CO2 and methane and sends it escape velocity to mars? This would help Mars’ atmosphere and rid ours of that pesky stuff that is heating us up. It would give many scientists research jobs and could be funded by the upcoming carbon tax imposed on large users. ]]
Huh? What? Come again?
Michael Tobis says
The reader discussion on the New York Times (free registration required, I believe) is very revealing about the public reaction to the SPM and anthropogenic climate change issues.
http://questions.blogs.nytimes.com/2007/02/02/a-reader-forum-on-climate-change/
It seems to present a broad sample of opinion, perhaps rather broader than we see on focused sites like this one. There is quite a range of understanding, as might be expected, though the “truth must be somewhere in between these two extremes” camp is perhaps underrepresented. People repeating even the most baseless fabrications of the naysayers present an interesting if discouraging sample. For example, the NYT discussion seems to have reached a vague consensus that glaciers are growing worldwide. Those who are familiar with climate science and unfamiliar with climate politics would do well to grit their teeth and plow through the discussion to really understand what we are up against.
I think that climate change is in itself very important, of course, but I also think the way that the debate has proceeded is itself a matter of great concern.
Even if we manage to muddle through the greenhouse problem, the ways in which organized promulgation of misinformation can so effectively damage public discourse will continue to limit the capacity of democratic process to cope with the increasing complexity we face. Humanity has become the dominant force in an increasingly artificial environment. Willy-nilly, we have replaced nature. It’s hard to understand how we can do a decent job of it if the forces of confusion and doubt can be so much more effectively funded and armed in the public debate than the forces of reason and evidence.
For me it comes down to this. If private interests are encouraged to devote as much PR effort as may be profitable to promulgate their self-interest, while public agencies are effectively enjoined from making comparable efforts to elucidate and promote the public interest on the grounds that it might intersect with their self-interest, how is the public interest to be effectively weighed in the public conversation?
Dan says
re: 224. Why does a version of Marvin the Martian’s “Illudium Q-36 Explosive Space Modulator” come to mind when reading that? ;-)
Martin Lewitt says
Re: My own post #215,
Upon rereading Roesch (again), I find that his characterization of the globally and annually averaged albedo bias as “small” probably means “small” relative to the much larger local albedo errors he documented in the Snow Cover Area and other data. The globally averaged albedo biases of 0.016 and 0.019 are “small” compared to the local albedo changes, as high as 0.7 or 0.8, when going from a Snow Cover Area to one that isn’t. Roesch found that even within Snow Cover Areas, the model albedos were significantly higher when those areas were forested.
Although Roesch discussed how these discrepancies might be diagnostic of the problems the AR4 models have, especially at northern latitudes, he did not appear to notice the significance of the impact on their global attribution and projection. He also found positive surface albedo biases in the tropical deserts, but did not discuss the impact of these at all, probably because he is somewhat of a snow specialist. I am curious about what it is that is confounding the modelers there.
Steve Bloom says
Move over, Wally! Former skeptic (and NHC head) Hugh Willoughby has come up with the new best AGW metaphor:
“Hugh Willoughby, senior scientist at the International Hurricane Research Center at Florida International University, a one-time global warming skeptic himself (15 years ago, before the data became overwhelming), said Monday that what worries him, more than the known problems cited by the panel, are what he calls ‘the unknown unknowns,’ the unanticipated climate changes.
“Weather patterns and ocean currents, the product of ‘unevenly heated rotating fluids,’ are already difficult enough to predict. ‘It’s like being at a bank when a crack addict robs it. You don’t want to get the crack addict excited.‘”
Ike Solem says
RE#218 and #220,
The issue of coal-to-liquid and ‘clean coal’ also illustrates some of the problems at preeminent US universities. Take (as the outstanding example) Stanford University’s Global Climate and Energy Project (GCEP), which looks pretty good at first glance – a program designed to lower greenhouse gas emissions.
However, the details are disturbing. The lead sponsor for the program is ExxonMobil, and not only do they (along with GE, Schlumberger oil services, and Toyota) control decisions on which projects are to be funded, they also control all patents created through the research:
From the agreement:
6.05 Subject to paragraphs 6.07 and 8.04, the University and each Sponsor will have, without restriction and in its sole discretion and without conferring with or accounting to anyone, a perpetual, nonexclusive, worldwide, irrevocable, royalty-free right and license to use, disclose, publish, republish, distribute, copy, prepare derivative works, sell, or otherwise transfer without limitation to any third party, whether affiliated or not, all or any part of the Project Technology, with or without extending to that third party the right to sublicense, sell, or otherwise transfer the Project Technology to other third parties.
It’s hard to find exactly how they distribute funds, but their support for Advanced Coal certainly does not seem to fit their stated goal of ‘reducing greenhouse emissions’.
In any case, universities engaged in renewable energy research using any public funds should be required to make any patents produced through such research available to all interested parties using non-exclusive licensing of university-owned intellectual property – in other words, everyone gets to use the knowledge produced.
Harold Brooks says
Re: 230
For clarification, Hugh Willoughby was never head of the National Hurricane Center. He was head of the Hurricane Research Division of the Atlantic Oceanographic and Meteorological Laboratory before going to Florida International University.
Paul M says
This is why the world is going to fry like an ant under a dirty seven-year-old’s magnifying glass. Fifteen years ago, Hugh Willoughby, whatever he was (head of hurricane research, head of the hurricane center, or a Hooters hurricane drink mixer) was ostensibly a global warming skeptic. Now he is an advocate for change. That may make a good made-for-TV movie, and his anecdotal analogies to a bank robber are cute, but it does absolutely nothing for remedying the global crisis we have. So get a chuckle out of Hugh’s cute remark, but remember: Rome isn’t burning, the planet is burning. A twenty-year-old student today can see that the earth’s climate is changing, but fifteen years ago might have had a different opinion while he or she was peeing on the kindergarten teacher’s floor. Just as they may have been playing policeman or fireman fifteen years ago, Hugh was playing climatologist, and he got it wrong. Again, call a novelist, it is quite human, but it does nothing for change. Stop the mental masturbation with the coal and oil CO2 answers; that is not even close to the answer. We need a new paradigm and we need it fast. Unfortunately I don’t think humans, or at least anyone over the age of 16, are going to find it. So someone better start thinking outside of the outside of the box, or be prepared to take a dirt nap with an electric-blanket feel.
David B. Benson says
Re #208(?): Pat Neuman — Thanks for the link to the BBC article. The number given there for the melting of the West Antarctic ice sheet, 5 m, is seriously smaller than the estimates in the Wikipedia article.
As an amateur, I’ll opine that the East Antarctic ice sheet is not going to melt, even under ‘business-as-usual’ scenarios. So supposing that just the West Antarctic ice sheet and the Greenland ice sheet melt, both totally, that gives a sea stand rise of 12 meters.
That will be bad enough…
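The 12-meter figure above is easy to sanity-check: the sea-level equivalent of an ice sheet is just its ice volume, converted to liquid water, spread over the ocean surface. A minimal sketch, with rough round-number volumes and areas that are assumptions of mine rather than figures from the report:

```python
# Back-of-envelope sea-level equivalent (SLE) of the Greenland and West
# Antarctic ice sheets. Volumes and areas are rough round numbers.

OCEAN_AREA_KM2 = 3.61e8   # global ocean surface area, km^2
RHO_ICE = 917.0           # density of glacial ice, kg/m^3
RHO_WATER = 1000.0        # density of fresh meltwater, kg/m^3

def sle_meters(ice_volume_km3: float) -> float:
    """Sea-level equivalent in meters for a given ice volume."""
    water_volume_km3 = ice_volume_km3 * RHO_ICE / RHO_WATER
    return water_volume_km3 / OCEAN_AREA_KM2 * 1000.0  # km -> m

greenland = sle_meters(2.9e6)   # ~2.9 million km^3 of ice (assumed)
wais      = sle_meters(2.2e6)   # West Antarctic, above-flotation part (assumed)

print(round(greenland, 1), round(wais, 1), round(greenland + wais, 1))
```

With these assumed volumes the total lands near 13 m, in the same ballpark as the 12 m quoted above; the spread in published estimates for the West Antarctic share explains most of the difference.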
Ike Solem says
The sea level change issue does indeed seem to be understated – from the IPCC summary paleoclimate section, we learn that global sea levels in the last interglacial were very likely about 4 to 6 m higher than they are today, mainly due to retreat of polar ice; polar temperatures were 3°–5°C higher than they are today.
So, the question then is: will it really take 1000 years for the Greenland and West Antarctic Ice Sheets to respond to the higher temperatures? Temperature anomalies relative to the 1961-1990 baseline period in the Arctic are already at 4°C in the summertime:
http://www.bom.gov.au/bmrc/ocean/results/SST_anals/SSTA_20060730.gif
There is no evidence that this trend is going to return to lower values; warming equatorial waters continue to export heat to polar regions, and more water vapor in the atmosphere means an increase in latent heat transport.
So, what is the response time of the Greenland Ice Sheet under these conditions? When does the buffering capacity of the Antarctic Ice Sheet on southern polar temperatures get overridden? Antarctic sea ice extent is probably a good indicator of this, but detailed knowledge of temperature changes in the Southern Circumpolar current would also be good to have.
It’s worth going back and looking at the 3rd IPCC 2001: “Results from ice sheet models for the last 500 years indicate an ongoing adjustment to the glacial-interglacial transition of Greenland and Antarctica together of 0.0 to 0.5 mm/yr. These ranges are consistent. We therefore take the ongoing contribution of the ice sheets to sea level rise in the 20th and 21st centuries in response to earlier climate change as 0.0 to 0.5 mm/yr. This is additional to the effect of 20th century and future climate change.”
While this current version is the summary, and not the detailed report, they probably should have included a statement on the failure of ice models to predict the dynamic behavior of the ice sheets. Is it possible that the response time is closer to 100 years than to 1000 years?
Julia R says
Question about Table SPM-1: The column labeled “Likelihood of human contribution” rates all the phenomena as “Likely” or “More likely than not”. While on page 8, under “Understanding and Attributing Climate Change” it says that “most of the observed increase in globally averaged temperatures since the mid-20th century is very likely due to the observed increase in anthropogenic greenhouse gas concentrations.” Then it refers back to SPM-1. Can someone explain the discrepancy? How is “human contribution” different from “anthropogenic gases” and how are we supposed to use SPM-1 as a supporting table to this paragraph?
And by the way, thanks for this posting. It’s great to have a place to post questions to better understand the document. And many thanks to any of you who’ve been involved in IPCC. This is very important work and it’s heartening to see so many come together like this and work to get real information out so we can actually hope to address the problems.
pete best says
Apparently 70 countries across the globe have large coal reserves, amounting to around 155 years’ worth at present usage levels, compared to 60% of oil and gas reserves being found in the Middle East. As demand for energy surges in the coming years and as oil and gas peak, coal will be asked to fill the energy gap, but as James Hansen states, without clean coal technology using coal would be madness and would only serve to burn available stocks faster. Coal could well run out by 2100 if usage levels increase significantly for whatever reason, as they surely will unless viable alternatives are found. So do we embrace climate change and hope that the use of all fossil fuels ends up producing viable alternatives, or do we go for efficiency gains, of which there are many, to prolong the life of fossil fuels, bring online current alternatives to offset increased demand in the coming years, and hope and pray that ethanol and hydrogen can save the day before 2150-2200, when we run out of fossil fuels anyway?
I opt for the latter: energy efficiency gains coupled with all viable alternatives, then R&D into creating the real energy future (if it exists), and also hope for fusion to work at some point, before either it all becomes too expensive and politics takes over, or we get the energy breakthroughs we require.
lars says
Global Warming is not due to human contribution of Carbon Dioxide
Global Warming: The Cold, Hard Facts?
Dr. Tim Ball, Chairman of the Natural Resources Stewardship Project (www.nrsp.com), is a Victoria-based environmental consultant and former climatology professor at the University of Winnipeg. He can be reached at letters@canadafreepress.com
Global Warming, as we think we know it, doesn’t exist. And I am not the only one trying to make people open up their eyes and see the truth. But few listen, despite the fact that I was the first Canadian Ph.D. in Climatology and I have an extensive background in climatology, especially the reconstruction of past climates and the impact of climate change on human history and the human condition. Few listen, even though I have a Ph.D. (Doctor of Science) from the University of London, England, and was a climatology professor at the University of Winnipeg. For some reason (actually for many), the world is not listening. Here is why.
No sensible person seeks conflict, especially with governments, but if we don’t pursue the truth, we are lost as individuals and as a society. That is why I insist on saying that there is no evidence that we are, or could ever cause global climate change. And, recently, Yuri A. Izrael, Vice President of the United Nations sponsored Intergovernmental Panel on Climate Change (IPCC) confirmed this statement. So how has the world come to believe that something is wrong?
http://www.canadafreepress.com/2007/global-warming020507.htm
Charles Muller says
# tamino 222
> When you say “Table 10.2.1 in Second Draft” what exactly are you referring to?
The table in the Second Draft of AR4 (released in April 2006), chapter 10, pg 10, titled “Radiative forcing agents in the multi-model global climate projections”. This table offers a complete list of the factors included by models for “the simulations of the 20th century and of the future”.
If you check the column “Solar”, you observe that GISS-ER and GISS-EH do include a “time-varying forcing”, but all the others are “constant” or “omitted”. As you explained, I often read in attribution-detection papers that the 1900-1950 warming cannot be simulated by anthropogenic factors alone (but possibly by natural factors alone, according to Min and Hense 2006, cited below). So my question is: how does a majority of AR4 models correctly simulate 20th-century trends with a constant or omitted solar forcing? Maybe this table is confusing (no solar factor is logical for projections, because we cannot anticipate the sun’s behavior, but it is quite strange for the 1900-2000 simulations).
Min, S.-K., A. Hense (2006), A Bayesian assessment of climate change using multi-model ensembles. Part I: Global mean surface temperature, J. Climate, 19, 3237-3256
Charles Muller says
Some questions. I often read from RC the “no trend in past 50 yrs of solar activity” argument. I remember a Rasmus article here on that point, and also a guest text by Muscheler. But should we consider this argument a “likely”, “very likely”, “more likely than not”… assertion? More broadly, a “consensus position” (in spite of Solanki, Usoskin, Krivova, Scafetta & West, etc.)? And, from a more theoretical point of view, what is more important for a comprehensive view of climate response: trends between each cycle (19/20, 20/21, 22/23, etc.), or multidecadal comparisons (cycles 19-22 compared to 15-18, for example)?
In fact, the AR4 SPM has aggravated my lack of understanding about the solar factor. Model intercomparisons usually conclude that we need natural forcings in order to simulate 1900-50 (and eventually a minor part of 1950-2000). But if the Maunder/Modern difference is 0.2-0.4 K TOA (and 1750-2000 is 0.1 K TOA), I guess the 1900-50 solar forcing is totally negligible. So, what are the “natural forcings” necessary to simulate 1900-50 (0.41 K warming between 1916 and 1945 according to NASA GISTEMP, not so far from the 0.49 K on the same basis for 1977-2006)?
Steve Bloom says
Re #232: Thanks for the correction, Harold. I’ve confused them before, but hopefully now I’ll remember.
Ike Solem says
RE#240,
You say, “Model intercomparisons usually conclude that we need natural forcings in order to simulate 1900-50 (and eventually a minor part of 1950-2000).” You seem to be implying that solar forcing is needed to explain the record, which simply isn’t the case.
The issue relates to the following statement within the IPCC 2007 summary report: (pg 9)
It is very unlikely that climate changes of at least the seven centuries prior to 1950 were due to variability generated within the climate system alone. A significant fraction of the reconstructed Northern Hemisphere interdecadal temperature variability over those centuries is very likely attributable to volcanic eruptions and changes in solar irradiance, and it is likely that anthropogenic forcing contributed to the early 20th century warming evident in these records.
If we compare this to the paleoclimate summary on pg 8:
Average Northern Hemisphere temperatures during the second half of the 20th century were very likely higher than during any other 50-year period in the last 500 years and likely the highest in at least the past 1300 years. Some recent studies indicate greater variability in Northern Hemisphere temperatures than suggested in the TAR, particularly finding that cooler periods existed in the 12th to 14th, 17th, and 19th centuries. Warmer periods prior to the 20th century are within the uncertainty range given in the TAR.
First of all, this means that internal climate variability doesn’t explain the current warming trend, and secondly, that on a historical basis changes in solar irradiance wouldn’t be expected to explain the current warming trend either.
There was a previous post on this issue at RC: Did the Sun hit record highs over the last few decades? Guest commentary by Raimund Muscheler
“The 14C tree ring records indicate that today’s solar activity is high but not exceptional during the last 1000 years.”
and also:
“Regardless of any discussion about solar irradiance in past centuries, the sunspot record and neutron monitor data (which can be compared with radionuclide records) show that solar activity has not increased since the 1950s and is therefore unlikely to be able to explain the recent warming.”
In addition, wouldn’t solar forcing be expected to warm the stratosphere, which is actually cooling?
Charles Muller says
#242 Ike
– I agree with your quote of the SPM AR4: “significant fraction of the reconstructed Northern Hemisphere interdecadal temperature variability over those centuries is very likely attributable to volcanic eruptions and changes in solar irradiance”. So we’re OK: there is a significant contribution of solar forcing to climate during the past 7 centuries (I don’t speak of internal or intrinsic or chaotic climate variability, but of a radiative forcing from a natural or anthropogenic factor, in classical IPCC style, natural being either volcanic or solar).
– Stratosphere cooling…
> …is recently measured (1978-present),
> …may be associated with ozone depletion (remember, the great concern of the 80s and 90s, and still recently setting records: the less ozone, the less UV reaction, the less stratospheric warming, I guess; and there is not much cooling in the UAH record over the past 10 years),
> …eventually informs us of solar cycle 21-23 trends (no problem, weak if any), not of cycles 19-23 compared to earlier ones (no reason to expect a linear and direct response of surface temperature; otherwise we should observe a semi-11-year response in each cycle, and we don’t observe it, at least for global surface T).
– If the solar irradiance change 1750-2000 implies a 0.1 W/m2 TOA forcing (the new AR4 estimate), it’s negligible (smaller than the difference between a minimum and a maximum of the Schwabe cycle: 1 W/m2 TSI > 0.18 W/m2 TOA). So why should I expect any solar influence on climate, after all? The “significant” contribution of solar forcing is therefore a problem.
– R. Muscheler says one thing, Solanki or Usoskin other things. As a layman, how can I favour one and dismiss the other? Solanki leads the Max Planck Institute for solar studies; I suppose we should pay attention to his conclusions, shouldn’t we?
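The “1 W/m2 TSI > 0.18 W/m2 TOA” conversion used in the comment above follows from simple geometry plus albedo: the Earth intercepts sunlight over a disc (πr²) but the forcing is averaged over the whole sphere (4πr²), and roughly 30% of incoming sunlight is reflected. A quick sketch (the albedo value is the usual round figure, an assumption on my part):

```python
# Convert a change in total solar irradiance (TSI) to a globally averaged
# top-of-atmosphere radiative forcing: divide by 4 (disc vs. sphere) and
# multiply by (1 - albedo) for the fraction actually absorbed.

ALBEDO = 0.30  # planetary albedo, rough round figure

def toa_forcing(delta_tsi: float) -> float:
    """Globally averaged TOA forcing (W/m2) for a TSI change (W/m2)."""
    return delta_tsi * (1.0 - ALBEDO) / 4.0

# ~1 W/m2 min-to-max swing over a Schwabe cycle:
print(toa_forcing(1.0))  # 0.175 W/m2, i.e. the ~0.18 quoted above

# Conversely, a 0.1 W/m2 forcing for 1750-2000 implies a long-term TSI
# change of only ~0.57 W/m2:
print(0.1 * 4.0 / (1.0 - ALBEDO))
```

This is why a 0.1 W/m2 long-term forcing looks so small next to the intracycle swing: the implied secular TSI change is barely half the 11-year variation.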
Ike Solem says
RE#243
Yes, but temperature variability over that period is significantly less, with dips into cooler periods – unless you have problems with the paleoclimatology-based temperature reconstruction?
The issue is not “Solanki vs. Muscheler” but rather what the data & analysis shows – it’s not a question of competing expert opinions – especially on this particular site, we should be able to work through the various arguments.
See Climate: The Vanishing Solar Factor, by Dan Whipple Boulder CO (UPI) Jul 26, 2004.
What did Solanki & Usoskin actually do? They analyzed the concentration of Beryllium-10 (an isotope of Be produced by cosmic-ray radiation in the atmosphere) in ice cores. However, they only analyzed two cores, one from Greenland and one from Antarctica, and only used the last 100 years from Greenland, not from Antarctica.
We can also take Solanki’s own comments on this issue:
In the 2002 Harold Jeffreys Lecture to the Royal Astronomical Society in London, Solanki said: “After 1980, however, the Earth’s temperature exhibits a remarkably steep rise, while the sun’s irradiance displays at the most a weak secular trend. Hence the sun cannot be the dominant source of this latest temperature increase, with man-made greenhouse gases being the likely dominant alternative.”
As well as those of Raimund Muscheler and Caspar Ammann:
“My conclusion about past solar activity based on radionuclide records would be the following: Solar activity was relatively high during the last 50 years, but there were similar periods during the last 1,000 years,” Muscheler said.
Ammann added: “If you would take averages of all the ice cores, you would not get this increase in (solar activity) in the last 50 years, but it would stay relatively flat. It is only one core that shows the rise. This is not the common feature of all of them.”
There are multiple other lines of evidence that the solar forcing factor is actually overestimated in climate models; I suggest reading the above article, and reconsidering your position on this issue.
Ike Solem says
After reading over the IPCC Fourth Assessment SPM, I have to say there is something I find disturbing, to say the least – and if someone could explain it to me, please do.
Why are they using the time period 1980-1999 as the baseline for their “temperature change predictions”?
pg10
Table SPM-2
Figure SPM-5
Figure SPM-6
I’ve been wondering why NOAA switched their baseline from the widely accepted 1961-1990 baseline period to the 1971-2000 baseline, and now the IPCC has switched their baseline up to the 1980-1999 period, an even bigger distortion? I still haven’t gotten an answer from NOAA as to who made this change or what their rationale was, and there is no stated rationale for this change in the IPCC report either.
I mean, let’s go back to the 2001 IPCC report and look at their discussion of appropriate baselines:
Climate Change 2001, Working Group I: The Scientific Basis – Section 13.3: Defining the Baseline
13.3.1 The Choice of Baseline Period
The choice of baseline period has often been governed by availability of the required climate data. Examples of adopted baseline periods include 1931 to 1960 (Leemans and Solomon, 1993), 1951 to 1980 (Smith and Pitts, 1997), or 1961 to 1990 (Kittel et al., 1995; Hulme et al., 1999b).
There may be climatological reasons to favour earlier baseline periods over later ones (IPCC, 1994). For example, later periods such as 1961 to 1990 are likely to have larger anthropogenic trends embedded in the climate data, especially the effects of sulphate aerosols over regions such as Europe and eastern USA (Karl et al., 1996). In this regard, the ‘ideal’ baseline period would be in the 19th century when anthropogenic effects on global climate were negligible. Most impact assessments, however, seek to determine the effect of climate change with respect to ‘the present’, and therefore recent baseline periods such as 1961 to 1990 are usually favoured. A further attraction of using 1961 to 1990 is that observational climate data coverage and availability are generally better for this period compared to earlier ones.
Whatever baseline period is adopted, it is important to acknowledge that there are differences between climatological averages based on century-long data (e.g., Legates and Wilmott, 1990) and those based on sub-periods. Moreover, different 30-year periods have been shown to exhibit differences in regional annual mean baseline temperature and precipitation of up to ±0.5ºC and ±15% respectively (Hulme and New, 1997; Visser et al., 2000; see also Chapter 2).
How can you compare the 2001 report to the 2007 report if you change the baseline?
I’m honestly flabbergasted – if someone, anyone could explain or justify this, I’d love to hear about it.
Michael Mott says
6.05 Subject to paragraphs 6.07 and 8.04, the University and each Sponsor will have, without restriction and in its sole discretion and without conferring with or accounting to anyone, a perpetual, nonexclusive, worldwide, irrevocable, royalty-free right and license to use, disclose, publish, republish, distribute, copy, prepare derivative works, sell, or otherwise transfer without limitation to any third party, whether affiliated or not, all or any part of the Project Technology, with or without extending to that third party the right to sublicense, sell, or otherwise transfer the Project Technology to other third parties.
This is the sort of language that is the reason for the current state of the planet, in my humble opinion. It does nothing for the layman. It is the layman who ultimately makes the decisions that will shape our future. The everyman will vote in the next election, wherever on the planet. And the everyman has tremendous power, not because they are strong or powerful or influential, but because they are numerous. You folks who are at the helm of science need to give the everyman something that they really can understand. The present situation of the climate is one that most of us can observe in our own backyards.
One does not need to be a rocket scientist to see this. In the last 20 years I have seen with my own eyes the melting back of glaciers in the Canadian Rocky Mountains that I was taught would take hundreds of years. I am reminded of the launching of the Titanic: “she is unsinkable,” to paraphrase the thinking at the time. Well, we all know what happened to the Titanic!
The next governments of the global community need to hear in a loud voice what we the everyman have to tell them. We need the information about the science in clear, understandable language that will help us make the right decisions regarding our future.
regards Michael
Urs Neu says
Re 236
“Very likely” is mentioned in relation to the human impact on global mean temperature. Table SPM-1, however, is about extreme events. The mention of Table SPM-1 on page 8 only refers to the last sentence: discernible human influences now extend to other aspects of climate (than global mean temperature), including some of the phenomena listed in Table SPM-1. While the attribution of global mean temperature is “very likely”, the attribution of most of the extremes is (so far) not more than “likely”.
Re 245
Ike, you are confusing two different things. The 1961-1990 baseline period is for records of climate anomalies. That means you compare measurements to a reference period to calculate anomalies.
The 1980-1999 baseline period, however, refers to the starting point of model projections, which is 1990, as it also was in the TAR. Model results for a certain year (1990, 2030, 2100, etc.) always represent the mean over 10-30 years around this year, to account for interannual variations, which are of course also present in the models. In the TAR, projections of temperature increase were always given with respect to 1990 (which in fact was the 1980-1999 mean). In AR4 they have not changed the baseline, but only made explicit what they in fact calculate. Thus warming from 1990 to 2100 in the TAR is the same as from 1980-1999 to 2090-2099 in AR4.
1990 is the latest starting point representing today’s state if you have to calculate a mean around that year, or at least it was in 1995 and 2001. But for reasons of comparison, this starting point has not been changed.
pete best says
Re #238, So Lars what is it due to then or is there no warming at all and all of the instruments are somehow calibrated incorrectly?
Why would thousands of climate scientists be wrong and you right, is what I always ask, and why do RC and others always manage to successfully defend their position against the skeptics?
Adam says
Re 245: NOAA probably switched to the 1971-2000 baseline because (IIRC) that’s WMO policy. Thirty year baselines are used in Meteorology, and every ten years or so, the baseline is moved on by ten years. This is because most of the time, the met organisation wants to indicate anomalies to the population based on what is relevant. As twenty or thirty year old baselines become beyond the recollection of those for whom you’re forecasting or reporting, there is little point in using them. This is especially true in a changing climate.
If you are doing long-term comparisons, then it is useful to stick with a standard baseline, but that becomes climatology rather than the grey area where meteorology & climatology mix. Besides it shouldn’t change the graph much, but shift it on the y-axis (earlier anomalies will tend to be more negative, and more recent ones less positive).
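The last point above, that rebaselining only shifts the graph on the y-axis, can be demonstrated with a toy anomaly calculation (the temperature series below is made up, purely for illustration):

```python
# A fake series of annual mean temperatures, 1961-2005, rising linearly.
temps = {1961 + i: 14.0 + 0.01 * i for i in range(45)}

def anomalies(temps, start, end):
    """Anomalies relative to the mean over the [start, end] baseline."""
    base = sum(temps[y] for y in range(start, end + 1)) / (end - start + 1)
    return {y: t - base for y, t in temps.items()}

a_old = anomalies(temps, 1961, 1990)  # 1961-1990 baseline
a_new = anomalies(temps, 1971, 2000)  # 1971-2000 baseline

# The two series differ by the same constant in every year: the
# difference between the two baseline means. The shape is unchanged.
offsets = {round(a_old[y] - a_new[y], 6) for y in temps}
print(offsets)  # a single value
```

For this series the set contains only the value 0.1, the difference between the two 30-year means, confirming that the choice of baseline cannot change trends, only the zero line.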
Charles Muller says
#244, Ike
I think you, rather than me, should reconsider the case more closely and be more cautious before drawing definitive conclusions. After all, the IPCC SPM itself acknowledges a low level of understanding of solar forcing. Any reason to think our real level is medium or high?
The Muscheler and Solanki “dispute” (Nature, 2005) shows that they differ in the statistical analysis of the trends / correlations (neutron monitors, Cheltenham chamber, C14, Be10, Ti44, etc.) of Q and phi. If you precisely know which interpretation is closest to reality, chapeau! Below are the PDFs of the debate.
Unusual activity of the Sun during recent decades compared to the
previous 11,000 years S. K. Solanki, I. G. Usoskin, et al. Nature 431,
28/10/04. http://cc.oulu.fi/~usoskin/personal/nature02995.pdf
Muscheler responds: R. Muscheler et al. Nature 436, 28/7/05.
http://www.cgd.ucar.edu/ccr/raimund/publications/Muscheler_et_al_Natu…
Solanki et al. reply. Reply to: R. Muscheler et al. Nature 436, 28/7/05.
http://cc.oulu.fi/~usoskin/personal/sola_nature05.pdf
Same remarks for the “overestimation” of solar forcing. See comments #217 and #239 on how the GCMs currently take account of it (if my interpretation of Table 10.2.1 in the Second Draft is OK). And some model intercomparisons (e.g. Stott 2003) observed, on the contrary, an underestimation of the solar factor.
My point is not solely the precise detection-attribution of the 1950-2005 warming but, more broadly, the compatibility between a very weak solar forcing 1750-2000 (0.1 W/m2, comparable to the minimum-maximum intracycle difference) and a significant influence on surface temperature trends during the past 250 years (#240, #243). You do not precisely answer these questions. If your own hypothesis is no influence at all (or nearly none), that’s coherent.