By Stefan Rahmstorf and Martin Vermeer
The scientific sea level discussion has moved a long way since the last IPCC report was published in 2007 (see our post back then). The Copenhagen Synthesis Report recently concluded that “The updated estimates of the future global mean sea level rise are about double the IPCC projections from 2007”. New Scientist last month ran a nice article on the state of the science, very much in the same vein. But now Mark Siddall, Thomas Stocker and Peter Clark have countered this trend in an article in Nature Geoscience, projecting a global rise of only 7 to 82 cm from 2000 to the end of this century.
Coastal erosion: Like the Dominican Republic, many island nations are
particularly vulnerable to sea level rise. (Photo: S.R.)
Semi-empirical sea level models
Siddall et al. use a semi-empirical approach similar to the one Stefan proposed in Science in 2007 (let’s call that R07) and to Grinsted et al. (2009), which we discussed here. What are the similarities and where do the differences come from?
For short time scales and small temperature changes everything becomes linear and the two new approaches are mathematically equivalent to R07 (see footnote 1). They can all be described by the simple equation:
dS/dt = a ΔT(t) + b      (Eq. 1)

dS/dt is the rate of change of sea level S, ΔT is the warming above some baseline temperature, and a and b are constants. The baseline temperature can be chosen arbitrarily since any constant temperature offset can be absorbed into b. This becomes clear with an example: Assume you want to compute sea level rise from 1900-2000, using as input a temperature time series like the global GISS data. A clever choice of baseline temperature would then be the temperature around 1900 (averaged over 20 years or so, we’re not interested in weather variability here). Then you can integrate the equation from 1900 to 2000 to get sea level relative to 1900:

S(t) = a ∫ ΔT(t’) dt’ + b · (t – 1900)      (Eq. 2)

(with the integral running from 1900 to t).
There are two contributions to 20th C sea level rise: one from the warming in the 20th Century (let’s call this the “new rise”), and a sea level rise that results from any climate changes prior to 1900, at a rate b that was already present in 1900 (let’s call this the “old rise”). This rate is constant for 1900-2000 since the response time scale of sea level is implicitly assumed to be very long in Eq. 1. A simple matlab/octave code is provided below (2).
If you’re only interested in the total rise for 1900-2000, the temperature integral over the GISS data set is 25 ºC years, which is just another way of saying that the mean temperature of the 20th Century was 0.25 ºC above the 1900 baseline. The sea level rise over the 20th Century is thus:

S(1900-2000) = a · 25 ºC years + b · 100 years = 25a + 100b      (Eq. 3)
Compared to Eq. 1, both new studies introduce an element of non-linearity. In the approach of Grinsted et al., sea level rise may flatten off (compared to what Eq. 1 gives) on time scales as short as a century, since they use a single equilibration time scale τ for sea level, with estimates ranging from 200 years to 1200 years. It is a valid idea that part of sea level rise responds on such time scales, but this is unlikely to be the full story given the long response time of big ice sheets.
Siddall et al. in contrast find a time scale of 2900 years, but introduce a non-linearity in the equilibrium response of sea level to temperature (see their curve in Fig. 1 and footnote 3 below): it flattens off strongly for warm temperatures. The reason for both the long time scale and the shape of their equilibrium curve is that this curve is dominated by ice volume changes. The flattening at the warm end is because sea level has little scope to rise much further once the Earth has run out of ice. However, their model is constructed so that this equilibrium curve determines the rate of sea level rise right from the beginning of melting, when the shortage of ice arising later should not play a role yet. Hence, we consider this nonlinearity, which is partly responsible for the lower future projections compared to R07, physically unrealistic. In contrast, there are some good reasons for the assumption of linearity (see below).
Comparison of model parameters
But back to the linear case and Eq. 1: how do the parameter choices compare? a is a (more or less) universal constant linking sea level to temperature changes, one could call it the sea level sensitivity. b is more situation-specific in that it depends both on the chosen temperature baseline and the time history of previous climate changes, so one has to be very careful when comparing b between different models.
For R07, and referenced to a baseline temperature for the year 1900, we get a = 0.34 cm/ºC/year and b = 0.077 cm/year. Corresponding values of Grinsted et al. are shown in the table (thanks to Aslak for giving those to us!).
For Siddall et al., a = s/τ, where s is the slope of their equilibrium sea level curve, which near present temperatures is 4.8 meters per ºC, and τ is the response time scale. Thus a = 480 cm per ºC / 2900 years ≈ 0.17 cm/ºC/year, and b = 0.04 cm/year (see table). The latter can be concluded from the fact that their 19th Century sea level rise, with flat temperatures (ΔT(t) = 0), is 4 cm. Thus, in the model of Siddall et al., sea level (near the present climate) is only half as sensitive to warming as in R07. This is a second reason why their projection is lower than R07.
Model | a [cm/ºC/year] | b [cm/year] | “new rise” [cm] (25a) | “old rise” [cm] (100b) | 25a+100b [cm] | total model rise [cm]
----- | -------------- | ----------- | --------------------- | ---------------------- | ------------- | ---------------------
Rahmstorf | 0.34 | 0.077 | 8.5 | 7.7 | 16.2 | 16.2
Grinsted et al “historical” | 0.30 | 0.141 | 7.5 | 14.1 | 21.6 | 21.3
Grinsted et al “Moberg” | 0.63 | 0.085 | (15.8) | (8.5) | (24.3) | 20.6
Siddall et al | 0.17 | 0.04 | 4.3 | 4 | 8.3 |
Performance for 20th Century sea level rise
For the 20th Century we can compute the “new” sea level rise due to 20th Century warming and the “old” rise due to earlier climate changes from Eq. 3. The results are shown in the table. From Grinsted et al, we show two versions fitted to different data sets, one only to “historical” data using the Jevrejeva et al. (2006) sea level from 1850, and one using the Moberg et al. (2006) temperature reconstruction with the extended Amsterdam sea level record starting in the year 1700.
First note that “old” and “new” rise are of similar magnitude for the 20th Century because of the small average warming of 0.25 ºC. But it is the a-term in Eq. (2) that matters for the future, since with future warming the temperature integral becomes many times larger. It is thus important to realise that the total 20th Century rise is not a useful data constraint on a, because one can get this right for any value of a as long as b is chosen accordingly. To constrain the value of a – which dominates the 21st Century projections — one needs to look at the “new rise”. How much has sea level rise accelerated over the 20th Century, in response to rising temperatures? That determines how much it will accelerate in future when warming continues.
The Rahmstorf model and the Grinsted “historical” case are by definition in excellent agreement with 20th Century data (and get similar values of a) since they have been tuned to those. The main difference arises from the differences between the two sea level data sets used: Church and White (2006) by Rahmstorf, Jevrejeva et al. (2006) by Grinsted et al. Since the “historical” case of Grinsted et al. finds a ~1200-year response time scale, these two models are almost fully equivalent on a century time scale (e^(-100/1200) = 0.92) and give nearly the same results. The total model rise in the last column is just 1.5 percent less than that based on the linear Eq. 3 because of that finite response time scale.
For the Grinsted “Moberg” case the response time scale is only ~210 years, hence our linear approximation already becomes poor on a century time scale (e^(-100/210) = 0.62, and the total rise is 15% less than the linear estimate), which is why we give the linear estimates only in brackets for comparison here.
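As a quick check of the two factors just quoted, here is a two-line octave sketch (the time scales are those given above):

% response time scales [years]: Grinsted et al "historical" and "Moberg" cases
tau = [1200 210];
exp(-100./tau)    % gives roughly 0.92 and 0.62, the factors quoted above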
The rise predicted by Siddall et al is much lower. That is not surprising, since their parameters were fitted to the slow changes of the big ice sheets (time scale τ=2900 years) and don’t “see” the early response caused by thermal expansion and mountain glaciers, which makes up most of the 20th Century sea level rise. What is surprising, though, is that Siddall et al. in their paper claim that their parameter values reproduce 20th Century sea level rise. This appears to be a calculation error (4); this will be resolved in the peer-reviewed literature. Our values in the above table are computed correctly (in our understanding) using the same parameters as used by the authors in generating their Fig.3. Their model with the parameters fitted to glacial-interglacial data thus underestimates 20th Century sea level rise by a factor of two.
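For readers who want to reproduce the linear part of the table, here is a minimal octave sketch of Eq. 3 using the parameter values listed above; it gives only the linear estimates (the “total model rise” column additionally reflects the finite response time scales discussed in the last two paragraphs).

% Reproduce the linear 20th Century estimates of the table (Eq. 3: 25a + 100b)
% order: Rahmstorf, Grinsted "historical", Grinsted "Moberg", Siddall et al
a = [0.34 0.30 0.63 0.17];     % sea level sensitivity [cm/degC/year]
b = [0.077 0.141 0.085 0.04];  % pre-1900 ("old") rise rate [cm/year]
new_rise = 25*a                % rise from 20th C warming: 8.5, 7.5, 15.8, 4.3 cm
old_rise = 100*b               % rise from earlier changes: 7.7, 14.1, 8.5, 4 cm
total_linear = new_rise + old_rise   % 16.2, 21.6, 24.3, 8.3 cm, as in the table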
Future projections
It thus looks like R07 and Grinsted et al. both reproduce 20th Century sea level rise and both get similar projections for the 21st Century. Siddall et al. get much lower projections but also strongly under-estimate 20th Century sea level rise. We suspect this will hold more generally: it would seem hard to reproduce the 20th Century evolution (including acceleration) but then get very different results for the 21st Century, with the basic semi-empirical approach common to these three papers.
In fact, the lower part of their 7-82 cm range appears to be rather implausible. At the current rate, 7 cm of sea level rise since 2000 will be reached already in 2020 (see graph). And Eq. 1 guarantees one thing for any positive value of a: if the 21st Century is warmer than the 20th, then sea level must rise faster. In fact the ratio of new sea level rise in the 21st Century to new sea level rise in the 20th Century according to Eq. 2 is not dependent on a or b and is simply equal to the ratio of the century-mean temperatures, T21/T20 (both measured again relative to the 1900 baseline). For the “coldest” IPCC-scenario (1.1 ºC warming for 2000-2100) this ratio is 1.3 ºC / 0.25 ºC = 5.2. Thus even in the most optimistic IPCC case, the linear semi-empirical approach predicts about five times the “new” sea level rise found for the 20th Century, regardless of parameter uncertainty. In our view, when presenting numbers to the public scientists need to be equally cautious about erring on the low as they are on the high side. For society, after all, under-estimating global warming is likely the greater danger.
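To put numbers on this ratio argument, here is a back-of-the-envelope octave sketch using the century-mean temperatures quoted above (both relative to the 1900 baseline) and the 20th Century “new rise” values from the table; it simply applies the linear logic of Eq. 2 and is meant as an illustration, not a projection.

% Ratio of 21st to 20th Century "new rise" in the linear model, coldest IPCC scenario
T20 = 0.25;  T21 = 1.3;          % century-mean warming above the 1900 baseline [degC]
ratio = T21/T20                  % = 5.2, independent of a and b
new20 = [8.5 7.5 4.3];           % 20th C "new rise" [cm]: R07, Grinsted "historical", Siddall
new21 = ratio*new20              % roughly 44, 39 and 22 cm of "new rise" for 2000-2100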
Does the world have to be linear?
How do we know that the relationship between temperature rise and sea level rate is linear, also for the several degrees to be expected, when the 20th century has only given us a foretaste of 0.7 degrees? The short answer is: we don’t.
A slightly longer answer is this. First we need to distinguish two things: linearity in temperature (at a given point in time, and all else being equal), and linearity as the system evolves over time. The two are conflated in the real world, because temperature is increasing over time.
Linearity in temperature is a very reasonable assumption often used by glaciologists. It is based on a heat flow argument: the global temperature anomaly represents a heat flow imbalance. Some of the excess heat will go into slowly warming the deep ocean, some will be used to melt land ice, a tiny little bit will hang around in the atmosphere to be picked up by the surface station network. If the anomaly is 2 ºC, the heat flow imbalance should be double that caused by a 1 ºC anomaly. That idea is supported by the fact that the warming pattern basically stays the same: a 4 ºC global warming scenario basically has the same spatial pattern as a 2 ºC global warming scenario, only the numbers are twice as big (cf. Figure SPM6 of the IPCC report). It’s the same for the heating requirement of your house: if the temperature difference to the outside is twice as big, it will lose twice the amount of heat and you need twice the heating power to keep it warm. It’s this “linearity in temperature” assumption that the Siddall et al. approach rejects.
Linearity over time is quite a different matter. There are many reasons why this cannot hold indefinitely, even though it seems to work well for the past 120 years at least. R07 already discusses this and mentions that glaciers will simply run out of ice after some time. Grinsted et al. took this into account with a finite time scale. We agree with this approach – we merely have some reservations about whether it can be done with a single time scale, and whether the data they used really allow this time scale to be constrained. And there are arguments (e.g. by Jim Hansen) that over time the ice loss may be faster than the linear approach suggests, once the ice gets wet and soft and starts sliding. So ultimately we do not know how much longer the system will behave in an approximately linear fashion, and we do not know yet whether the real sea level rise will then be slower or faster than suggested by the linear approach of Eq. 1.
Getting soft? Meltwater lake and streams on the Greenland Ice Sheet near 68ºN at 1000 meters altitude. Photo by Ian Joughin.
Can paleoclimatic data help us?
Is there hope that, with a modified method, we may successfully constrain sea level rise in the 21st Century from paleoclimatic data? Let us spell out what the question is: How will sea level in the present climate state respond on a century time scale to a rapid global warming? We highlight three aspects here.
Present climate state. It is likely that a different climate state (e.g. the glacial with its huge northern ice sheets) has a very different sea level sensitivity than the present. Siddall et al. tried to account for that with their equilibrium sea level curve – but we think the final equilibrium state does not contain the required information about the initial transient sensitivity.
Century time scale. Sea level responds on various time scales – years for the ocean mixed layer thermal expansion, decades for mountain glaciers, centuries for deep ocean expansion, and millennia for big ice sheets. Tuning a model to data dominated by a particular time scale – e.g. the multi-century time scale of Grinsted et al. or the multi-millennia time scale of Siddall et al. – does not mean the results carry over to a shorter time scale of interest.
Global warming. We need to know how sea level – oceans, mountain glaciers, big ice sheets all taken together – responds to a globally near-uniform forcing (like greenhouse gas or solar activity changes). Glacial-interglacial climate changes are forced by big and highly regional and seasonal orbital insolation changes and do not provide this information. Siddall et al. use a local temperature curve from Greenland and assume there is a constant conversion factor to global-mean temperature that applies across the ages and across different mechanisms of climate change. This problem is not discussed much in the paper; it is implicit in their non-dimensional temperature, which is normalised by the glacial-Holocene temperature difference. Their best guess for this is 4.2 ºC (as an aside, our published best guess is 5.8 ºC, well outside the uncertainty range considered by Siddall et al). But is a 20-degree change in Greenland temperature simply equivalent to a 4.2-degree global change? And how does local temperature translate into a global temperature for Dansgaard-Oeschger events, which are generally assumed to be caused by ocean circulation changes and lead to a temperature seesaw effect between the northern and southern hemispheres? What if we used their amplitude to normalise temperature – given their imprint on global mean temperature is approximately zero?
Overall, we find these problems extremely daunting. For a good constraint for the 21st Century, one would need sufficiently accurate paleoclimatic data that reflect a sea level rise (a drop would not do – ice melts much faster than it grows) on a century time scale in response to a global forcing, preferably from a climate state similar to ours – notably with a similar distribution of ice on the planet. If anyone is aware of suitable data, we’d be most interested to hear about them!
Update (8 Sept): We have now received the computer code of Siddall et al (thanks to Mark for sending it). It confirms our analysis above. The code effectively assumes that the warming over each century applies for the whole century. I.e., the time step for the 20th Century assumes the whole century was 0.74 ºC warmer than 1900, rather than just an average of 0.25 ºC warmer as discussed above. When this is corrected, the 20th Century rise reduces from 15 cm to 8 cm in the model (consistent with our linear estimate given above). The 21st Century projections ranging from 32-48 cm in their Table 1 (best estimates) reduce to 24-32 cm.
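To see the size of the numerical effect, here is a minimal octave sketch of the time-step issue described above, using the linearised model (Eq. 1) with the Siddall et al. parameter values from the table; the 0.74 ºC and 0.25 ºC figures are those quoted in this update, and the sketch only illustrates the mechanism, not their actual (non-linear) model.

% Effect of the time-step choice on the modelled 20th Century rise (linearised, Eq. 1)
a = 0.17;  b = 0.04;              % Siddall et al values [cm/degC/year], [cm/year]
% one 100-year step, applying the end-of-century warming to the whole century:
S_coarse = (a*0.74 + b)*100       % about 16.6 cm, in the ballpark of their 15 cm
% annual steps are equivalent (for a linear model) to using the century-mean warming:
S_fine = (a*0.25 + b)*100         % about 8.3 cm, consistent with the table above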
Martin Vermeer is a geodesist at the Helsinki University of Technology in Finland.
Footnotes
(1) Siddall et al. use two steps. First they determine an equilibrium sea level for each temperature (their Eq 1, and shown in their Fig. 1). Second, they assume an exponential approach of sea level to this equilibrium value in their Eq. 2, which (slightly simplified, for the case of rising sea level) reads:
dS/dt = (Se(T) – S(t)) / τ.
Here S is the current sea level (a function of time t), Se the equilibrium sea level (a function of temperature T), and τ the time scale over which this equilibrium is approached (which they find to be 2900 years).
Now imagine the temperature rises. Then Se(T) increases, causing a rise in sea level dS/dt. If you only look at short time scales like 100 years (a tiny fraction of those 2900 years response time), S(t) can be considered constant, so the equation simplifies to
dS/dt = Se(T)/ τ + constant.
Now Se(T) is a non-linear function, but for small temperature changes (like 1 ºC) this can be approximated well by a linear dependence Se(T) = s * T + constant. Which gives us
dS/dt = s/τ * T + constant, i.e. Eq (1) in the main post above.
R07 on the other hand used:
dS/dt = a * (T – T0), which is also Eq. (1) above.
Note that a = s/τ and b = –a*T0 in our notation.
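As an illustration of why this linearisation is harmless on a century time scale when τ = 2900 years, here is a small octave sketch comparing the full relaxation equation with its linearised form for an idealised warming of 1 ºC over 100 years (a stand-in for the real forcing history, not the input actually used by Siddall et al.); Se is taken as linear with slope s near the present climate, with Se(0) = S(0) = 0 for simplicity.

% Full relaxation dS/dt = (Se(T) - S)/tau versus its linearisation, over one century
s = 480;  tau = 2900;             % slope near present [cm/degC] and time scale [years]
dt = 1;  t = (0:dt:100)';         % one hundred years, annual steps
T = t/100;                        % idealised warming ramp: 0 to 1 degC
S = 0;
for k = 2:length(t)               % forward Euler integration of the full equation
  S = S + dt*(s*T(k) - S)/tau;
end
S_full = S                        % about 8.3 cm
S_lin = (s/tau)*dt*sum(T)         % linearised version: about 8.4 cm
% the two differ by only about 1%, because 100 years is tiny compared to tau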
(2) Here is a very basic matlab/octave script that computes a sea level curve from a given temperature curve according to Eq. 2 above. The full matlab script used in R07, including the data files, is available as supporting online material from Science.
% Semi-empirical sea level model - very basic version
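% assumes tempg is an annual global-mean temperature series (e.g. the GISS data
% mentioned above), starting in 1880 so that indices 11:30 span the years 1890-1909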
T1900=mean(tempg(11:30)); T=tempg-T1900;
a=0.34; % sea level sensitivity parameter [cm/degree/year]
b=0.077; % note this value depends on a and on the temperature
% baseline, here the mean 1890-1909
% rate of rise - here you need to put in an annual temperature time series T
% with same baseline as chosen for fitting b!
dSdt = a*T + b;
% integrate this to get sea level over the period covered by the temperature series
S = cumsum(dSdt); plot(S);
(3) Here is a matlab/octave script to compute the equilibrium sea level curve of Siddall et al. Note the parameters differ in some cases from those given in the paper – we obtained the correct ones from Mark Siddall.
% Siddall et al equilibrium sea level curve, their Fig. 1, NGRIP scenario
A = 15.436083479092469;
b = 0.012630000000000;
c = 0.760400212014386;
d = -73.952809369848552;
Tdash=[-1.5:.05:2];
% Equilibrium sea level curve
Se=A*asinh((Tdash+c)/b) + d;
% Tangent at current temperature
dSe=A/sqrt(1+((0+c)/b)^2)/b;
Se0= A*asinh((0+c)/b) + d;
Te=dSe*Tdash + Se0;
plot(Tdash, Se, 'b', Tdash, Te, 'c', Tdash, 0.0*Se, 'k', [0 0], [-150 40], 'k')
xlabel('Dimensionless temperature')
ylabel('Equilibrium sea level (m)')
fprintf(1, 'Slope: %f m/K, Sensitivity: %f cm/K/year, zero offset: %f m\n\n', dSe/4.2, 100*dSe/4.2/2900, Se0);
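The printed slope and sensitivity, about 4.8 m/ºC and 0.17 cm/ºC/year once the dimensionless temperature is converted back with the 4.2 ºC glacial-Holocene difference, are the values of s and a = s/τ quoted in the main text.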
(4) We had not yet received the code at the time of writing, but based on correspondence with the authors we conclude that for their values in Fig. 3 and Table 1, Siddall et al. integrated sea level with 100-year time steps with a highly inaccurate numerical method, thus greatly overestimating the a-term. In their supporting online information they show a different calculation for the 20th Century with annual time steps (their Fig. 5SI). This is numerically correct, giving an a-term of about 4 cm, but uses a different value of b close to 0.12 cm/year to obtain the correct total 20th Century rise.
References
Rahmstorf, S. Response to comments on “A semi-empirical approach to projecting future sea-level rise”. Science 317 (2007).
Siddall, M., Stocker, T. F. & Clark, P. U. Constraints on future sea-level rise from past sea-level change. Nature Geoscience (advance online publication, 26 July 2009).
Nicolas Nierenberg says
#145, cce, the figure you are referring to is 10.34 (not 10.43, as that doesn’t exist) and contains projections out to the year 3000. Perhaps that is what Dr. Pachauri was referring to, but I don’t think most people would think that his remarks were referencing projections for the years 2200 to 3000. I also don’t see any way to use that chart to connect 2C to that estimate. The models that have an equilibrium around 2C seem to have a thermal expansion of less than .7M.
#147 Hank, I am aware that Rahmstorf came up with .5 to 1.4, but it certainly wasn’t just thermal expansion, and it also certainly isn’t the IPCC consensus. As I have pointed out in earlier posts on this thread the Rahmstorf estimate doesn’t even pass an internal test of withholding part of the data to estimate the rest.
CTG says
Re 149. Martin, you’re just not thinking like a denialist. In J.Bob’s universe, there can never be under-estimates, only over-estimates, and a guesstimate that is 1% wrong is just as bad as a guesstimate that is 100% wrong. You just need to let go of any notion of logic or sanity, and then everything makes sense.
John P. Reisman (OSS Foundation) says
#151 Nicolas Nierenberg
Nicholas, I don’t know precisely what he is referring to, but I also understand that he is watching the leading edge of the science rather intently. Maybe he was not referring only to AR4? If he was speaking extemporaneously then maybe he was mixing knowledge from AR4 with new work? Honestly I don’t know. You might consider the possibility that he has seen something that is not published that may have further relevance. Maybe you should consider being a little patient? The 5th report is coming but good science apparently takes time.
Personally, although unscientific, I think it is not unreasonable to see that we could be in the 2 to 5 meter range in 2100. The sensitivities do seem to indicate that the system in general is more sensitive than we can detect clearly at the moment. So while it is important to continue to clarify the dynamical models and the empirical evidence, even more important is the application of reason when placed in the context of reasonably assessed impacts when synthesized with consideration of energy, economy and environment together.
There were a lot of things I heard this week and I might get some of it wrong due to mixed memory and relationships but here is a grossly inadequate overview:
The World Bank and Munich Re and a few others did talks on insurance and assessment of risk. It is going to become clearer, but there was much talk of reasonable projections and probability as well, and this is something human intelligence is not incapable of.
There was a lot of talk about refining the resolution. There was talk about getting better information to users. I talked a lot about communicating the message to the public and policy makers, others talked about processing capacity to avoid degrading the resolution in the ocean between the satellite and the argo floats.
But there were a few messages that I noticed that rang clear. We need to work together, this is a global problem and therefore global solutions are in order. We all live in this community so we all have a stake in it. We will not get out unscathed but the faster we act the less costly it will be. It will create jobs. I think Ban Ki Moon said something about a coal plant is 7 persons per megawatt while a solar farm is 23 persons per megawatt. Rapid action is required to avoid higher costs. That is just a taste of the messages I heard. I don’t know when the conference statements will all be online but they are in process. More info can be found here:
http://www.wmo.int/wcc3
I continue to examine from a human relevance perspective with regard to impact potentials.
The UK Met office did a poster presentation (last Wed.) showing that their recent models are showing that the heat wave of 2003 is projected to be happening every other year by 2040 (soon after to be replaced by every year, and rising…).
Let’s hope Copenhagen moves us more than a little bit forward.
Hank Roberts says
Nicholas, the goalpost moved.
The quote is from a speech:
“… there were some dimensions of the scientific reality of the 2°C target that were really not covered and which I think would have added a great deal if they had been addressed in this particular meeting. Two of them that I would like to mention are the fact that firstly even with the 2°C target the IPCC has assessed that on account of thermal expansion alone we could get sea level rise of 0.4 to 1.4 meters. This is due to thermal expansion alone….”
That’s, if the transcript posted above is correct, a bit of what he said.
First — check the transcript, rather than assume it’s correct.
Check the transcript for the context as well.
The IPCC’s last published report had a sea level rise number that was low — and known to be low — when published. They said so.
Nobody has come in with an argument for a lower bound than that known underestimate.
There’s good reason for the concern expressed, and for the concern he voices in his speech — that this issue wasn’t discussed at that particular meeting. That appears to be what he’s talking about — best information we have _now_ is that it’s in the range mentioned. Next IPCC Report will be some years from now. Those numbers will change.
Bets as to whether they’ll go down or up might be taken, but not here.
Anyone who doubts this level of risk is a concern can go back and read the opening post of this topic.
Finding the complete transcript and reading it would seem the best place to start. Else there’s only the strong wish this weren’t likely to happen.
We all share that wish. But wishing doesn’t stop the problem.
Hank Roberts says
> as I have pointed out
You mean here, where you received an inline response? Worth reading, if so:
2 September 2009 at 7:22
Nicolas Nierenberg says
Hank,
#154, I’m just interested in accuracy here. If he wants to make a new estimate that’s fine, but he shouldn’t ascribe it to the IPCC.
What reference can you give me for the IPCC saying that “they” knew the sea level rise number was low?
With regards to new lower estimates I believe that is what this thread was about in the first place, although I tend to agree that the original IPCC estimates are probably still the best.
With reference to the Rahmstorf paper I already followed up on the in line comment. Stefan referenced charts in his reply to comments. I pointed out that the coefficients are a much better statistical test than visual inspection of charts. I received no reply to that assertion.
Jim Eaton says
Re: 119 Thomas says:
4 September 2009 at 10:24 PM
” I live not too far from the California Delta. If I had to make a wild eyed guess as to what level of sea level rise to plan for, I would simply double the IPCC AR4 numbers.”
On August 31st, the Sacramento Bee published an article “Delta facing a sea change,” which stated that the new administration’s Army Corps of Engineers is now requiring that projects look at a sea level rise of 55 inches (140 cm) by 2100.
http://www.sacbee.com/1268/story/2149351.html
This is critically important to California as most of the fresh water delivered to Southern California comes from the rivers in the north. The pumps that now send the water south are nearly at current sea level, so any increase in sea level threatens to shut off the water supply to Los Angeles and farmlands in the San Joaquin Valley. And global warming projections also call for drought in this part of the country, so things are looking a little bleak for Southern California.
Lawrence Coleman says
From what I have read of the comments above..these sea level predictions are still based on linear business-as-usual approaches..what about the consequences of major tipping points being crossed?..what time frames are we looking at before things get uncomfortable? Ice albedo by its nature is exponential..it’s like compound interest, the less ice the warmer the oceans..the faster the remaining ice melts until it’s all annual ice in the winter and permanent and complete loss of arctic sea ice in the summer. This will undoubtedly have rapid and massive effects on the climate in the northern hemisphere as the huge expanses of dark ocean water keep heating up unrestricted. This will cause greatly accelerated melting of the Jakobshavn and other glaciers on the Greenland mainland. You can bet your bottom dollar that the release of methane hydrates in the northern temperate and even the arctic latitudes will be well off the scale. What effect will this localised addition of millions of tonnes of CH4 have on the climate of the northern hemisphere? Saw in Science Daily that it was presumed that most of the methane being released would dissolve back into the ocean water on the way up..but the latest research findings show that the methane is bubbling up too fast and escaping into the atmosphere without much diffusion back into the water. This is but one of the tipping points which are being crossed as we type and are making an absolute mockery of the IPCC predictions.
simon abingdon says
#153 John P. Reisman (OSS Foundation)
“It will create jobs. I think Ban Ki Moon said something about a coal plant is 7 persons per megawatt while a solar farm is 23 persons per megawatt.”
Hank Roberts says
Lawrence, there’s good coverage here for those issues — see for example
https://www.realclimate.org/index.php/archives/2008/09/how-much-will-sea-level-rise/ — following the cites there forward touches all your issues.
There’s plenty of hand-wringing without citation elsewhere too.
This topic — for those of us trying to follow the math — helps understand _how_ this area of science is done, as published.
John P. Reisman (OSS Foundation) says
#153 John P. Reisman
[correction]
It was Pachauri that mentioned the “7 persons per megawatt while a solar farm is 23 persons per megawatt”. Biomass energy has some good potential also in my opinion but I can not conjecture on quantitatives at this time.
I have gone over the speech a few times and believe the transcript to be reasonably accurate. Download here:
http://www.ossfoundation.us/projects/environment/global-warming/myths/images/2009-statements/High-Level-Segment-of-WCC3_3rdSeptember09_edit.doc/view
Nicholas, please try to understand that he is coming from a perspective of stronger understanding than myself, certainly and probably many of us, and possibly not all.
Videos:
http://www.wmo.int/wcc3/rec_videos_en.php
Thurs 03 (Opening and Plenaries) has Ban Ki-moon and Pachauri statements along with others. One of the Swiss presidents and a minister also made very powerful statements.
Remember, context is key.
cce says
#151,
Figure 10.34b shows global mean surface warming, and 10.34c shows the associated SLR due to thermal expansion.
Pachauri said that if temperatures rise 2 degrees, then we’d get SLR in those ranges (although Pachauri was talking about preindustrial, and 10.34 is relative to 2000). That is, do X, get y.
Regarding SLR from all causes, by my last count, there have been 5 papers published recently (including the 2 discussed in this post) expecting much higher values than what was in AR4.
Kinematic Constraints on Glacier Contributions to 21st-Century Sea-Level Rise
http://www.sciencemag.org/cgi/content/abstract/321/5894/1340
0.8 m “‘most likely’ starting point.” 2.0 m max by 2100.
A Semi-Empirical Approach to Projecting Future Sea-Level Rise
http://www.pik-potsdam.de/~stefan/Publications/Nature/rahmstorf_science_2007.pdf
0.5 – 1.4 m by 2100
Reconstructing sea level from paleo and projected temperatures 200 to 2100AD
http://www.glaciology.net/Home/PDFs/Announcements/gslprojection
0.9 to 1.3 m by 2100
High rates of sea-level rise during the last interglacial period
http://www.umces.edu/President/STWG/Rohlingetal2007.pdf
Average 1.6 m per century under sustained temperatures 2+ degrees above present.
Rapid early Holocene deglaciation of the Laurentide ice sheet
http://pubs.giss.nasa.gov/docs/2008/2008_Carlson_etal.pdf
Current projections should be considered “minimum” even without ice sheet dynamics.
Aaron Lewis says
RE#150 Martin,
We very much need to address the source of the problem – and that will involve massive and rapid changes in society. Getting society to accept changes that some will see as painful, requires a spokesman with more authority than science currently has.
Currently, anybody can say, “I am a scientist, and AGW is nonsense.” If climate scientists and weathermen were registered, only climate scientists that had passed exams and an internship could stand up and say that. : ) With fewer denialists, we would be more likely to get public support for good public policy.
We need to give climate science more status. We did it for medicine. We did it for law. We did it for engineering. Part of that status is that each of these groups carries professional insurance – members of these professions do post a performance bond.
If you were a registered climate professional, and I was not, you could just tell me that this model was your expert opinion, and I should just go seek another provider. However, if this is all “just science” then I have standing to criticize your model, as a “peer”. After all, I was a “Senior Scientist” for years and years. As a registered professional, you would be in a much higher status position vis-à-vis people who were not registered climate professionals.
[Response: The problem with your proposal in my view is that for medicine, law and engineering you’re talking about a registration for people practicing these professions following established codes of practice. You’re not talking about cutting-edge scientific research, for which the freedom of science holds. There are already established codes of conduct for scientists, of course, but requiring some kind of registration to be allowed to do scientific research would curtail the freedom of science. Science has its established professional training and status system – for example, a PhD is usually considered the level where you have acquired the credentials to do independent research; while working on the PhD you still need a supervisor. And a PhD (at least in New Zealand, where I obtained mine) comes with swearing to pursue the scientific truth. But climatology being a very interdisciplinary topic, it is still not so clear-cut who can call themselves a climatologist – my PhD is in physical oceanography. In the end, there is a much simpler criterion for who is a climate scientist worth listening to: someone who has a good publication and citation record on relevant topics. You can find out within minutes via Google Scholar. -stefan]
David B. Benson says
John P. Reisman (OSS Foundation) (160) — I’ve been through the numbers on biomass potential as an energy source. Here is a good place to start:
http://www.eoearth.org/article/Global_human_appropriation_of_net_primary_production_(HANPP)
Generally speaking, there is no possibility of meeting the entire human population’s energy “needs” from biomass alone. I do encourage the development of biomass solutions, especially those which can become carbon negative in that, at the end, the carbon dioxide is permanently sequestered away from the active carbon cycle. But alternatives must be found as well.
Donald Oats says
Re: Chris Dudley, you are of course correct; I had a doh! moment not long after I posted it.
Marion Delgado says
OT: David Archer’s Long Thaw cover is the best cover I’ve seen in years. My compliments to the designer.
Pierre-Andre Morin says
I have a fundamental question.
What is the impact of global warming on companies?
To state it otherwise: if global warming is real, what is the impact on the stock market?
Point of this: Companies have the greatest interest in a growing economy, but in the global warming scenario GDP would go down.
As a side question, can companies who are promoting deniers’ views be held accountable for their agenda?
Assuming that global warming can kill people, can CEOs be held legally responsible, in the future (5-15 years), for their actions?
wili says
Can anyone give an authoritative update on the methane situation that Lawrence mentioned?
John P. Reisman (OSS Foundation) says
#162 David B. Benson
My apologies for not providing context to my post. I was not suggesting biomass could meet the human population’s energy needs, at least not at these consumption levels, but rather was suggesting that, in the general context of the post, it could also provide a good amount of jobs. I had more than a few conversations at WCC3 about these potentials.
The solutions are still a big pie and biomass in only a slice. Generally speaking I still believe we need:
– energy consumption reduction (non utilitarian first, etc.)
– sustainable renewable
– potentially thorium as a bridge energy source
– new innovations, etc.
Primarily education remains the key to all solutions. Politicians don’t do things unless they think it will get them votes. So education remains the top priority if we are going to deliver a good pie.
FurryCatHerder says
JPR OSS @ 160:
Jobs per kWh will decline as scaling improves. Bad for the unemployed, good for buyers.
simon abingdon says
Nuclear power can easily provide ALL the world’s energy needs (excepting aircraft). We can cover the landscape with windfarms and artificial trees and gigantic arrays of solar panels or spend a little effort thinking about how to dispose of nuclear waste. E=mc2 you dimwits.
[Response: Perhaps it could, but do the sums. Currently (as of 1 August 2009) there are 435 nuclear power stations in the world (9 fewer than in 2002). They supply 2% of the global end energy use. So to supply ALL (assuming just the current level, no increase in demand) you’d need about 50 times the number of nuclear power stations – roughly 20,000. How fast can you build those – in time to solve the climate crisis? Where do you want to build those – say, Iran? How do you cope with the nuclear proliferation and waste problems, if you scale the current nuclear power industry up by a factor of 50 and spread it to a lot of less-than-well-governed countries? -stefan]
simon abingdon says
#171 “How do you cope with the nuclear proliferation and waste problems, if you scale the current nuclear power industry up by a factor of 50 and spread it to a lot of less-than-well-governed countries? -stefan”
Start by putting it top of the agenda at Copenhagen.
Mark says
“What is the impact of global warming on companies?”
Uhm, LSE and NYSE have to close.
Or we genetically modify stocktraders to breathe in water.
Mark says
Simon Monkton, #169, you’re forgetting the externalised cost of CO2 pollution from Coal and the 4Bn dollars the US alone spends on coal subsidies.
John P. Reisman (OSS Foundation) says
#170 FurryCatHerder
True, but neither am I expecting all innovation to die off, or redistribution of labor needs based on various market shifts, resource changes, technology changes, etc. Not everyone has to work in the energy sector to keep things running.
Ray Ladbury says
Simon Abingdon,
Every Uranium and Thorium atom in the Universe had its origin in a supernova–so there really isn’t a lot of the stuff out there. True, Earth’s crust is enriched in the actinides due to their lithophilic nature, but even so, the supply is finite. Independent of issues of waste disposal, safety and proliferation, this is not a renewable resource. And as Stefan points out–the plants take forever to build.
Climate change is a difficult problem. Beware of purported two-word solutions to it.
John P. Reisman (OSS Foundation) says
#171 simon monckton
How rude of you to call us all dimwits.
Energy may very well equal mass times the speed of light squared, but from what I can tell you still have not figured out that
2 + 2 = 4
Of course add in, as Stefan pointed out, the cost of waste storage and handling and security. So, add additional weapons grade plutonium to a world of dwindling resources, shifting geopolitical borders and ever more desperate people just trying to get by due to the latitudinal shift.
What’s that add up to?
Anyone can quote Einstein’s equation. I’m just not confident in your math ability when it comes to simple reasoning.
Ray Ladbury says
Pierre says, “Companies have the greatest interest in a growing economy, but in the global warming scenario GDP would go down.”
Who says? We are talking about creating an entirely new global energy infrastructure here–new transport, new energy sources, new grid… We are talking about a massive effort to mitigate climate change–new crops, carbon-capture and storage… Smart companies will see opportunity and benefit.
As to the question of liability for liars…well, we’ve already caught them in the lie. Once there are consequences, I would think that being a lawyer in class action against coal and petroleum giants might also be a lucrative profession.
Of course, this all assumes that we actually DO something to avert the worst consequences. If we do not, falling stock prices will be the least of your worries.
canbanjo says
Re 171 Stefan, why are you writing as if nuclear is an all-or-nothing decision?
For example Britain could easily generate much of its energy from nuclear.
Poor but sunny countries can be encouraged (paid) to harness the sun.
Where wind is most efficient, build turbines.
etc etc
If the CO2 risk is as great as we are led to believe, surely the risks associated with nuclear are tiny in comparison?
Dick Veldkamp says
Re #171 Nuclear can provide ALL energy?
There’s also the nasty little problem that there’s only uranium for 50-100 years at current use. Increase nuclear 50-fold, and you go down to 1-2 years.
Sekerob says
Reading the BraveNewWorld is worthwhile -Stefan. New generation plants barely produce waste. It’s almost all recyclable. We have to start building them now; every coal plant is worse in poisonous emissions than any nuke. The energy needs are not abating, and looking at the preposterous prices for e.g. LED lights, a 1 watts equivalent 7 watts asking 14.90 euros, manufacturers of course milking it to the last cent, I don’t see that coming down. LED flat screen, 3,000 euros. Starting price, for another example.
Chris Dudley says
John (#161),
The job numbers sound backwards to me since the Energy Return on Energy Invested is higher for solar. Perhaps these are construction jobs only since they are given per MW rather than per GWh? Solar and wind are especially good right now because the jobs are upfront when we need jobs. But their biggest (economic) benefit is in making energy easier, not harder, to get so that over all prosperity is boosted. One expects the effort per GWh to be lower than for coal, not higher.
One number that is easy to calculate is that given 104,626 coal mining fatalities between 1900 and 2007 in the US and 366,040 MW of nameplate coal generating capacity in 2007 in the US, there are about 0.3 coal mining fatalities in the US per MW of capacity. I do not know of any deaths related to solar farm construction but even if there were one heart attack upon looking at how impressive Nevada Solar One is, the deaths per MW would be less than 0.002 and shrinking. Since coal mining will become more difficult over the next twenty years when coal supply problems will kick in the number of deaths per MW capacity for coal mining is likely only going to increase.
If you come across the calculation that Pachauri was using for the jobs number, I’d be happy to hear about it. Obviously, domestic energy boosts domestic jobs which is a different consideration.
Hank Roberts says
It’s like Godwin’s Law, for climate–you can shut down any subject by starting in on your favorite hobbyhorse. Please don’t.
Look again at the question — have you anything on topic to say?
> the total 20th Century rise is not a useful data constraint on a,
> because one can get this right for any value of a as long as b is
> chosen accordingly. To constrain the value of a – which dominates
> the 21st Century projections — one needs to look at the “new rise”.
> How much has sea level rise accelerated over the 20th Century, in
> response to rising temperatures? That determines how much it will
> accelerate in future when warming continues.
Richard Steckis says
John P. Reisman (OSS Foundation) says:
6 September 2009 at 2:53 PM
#153 John P. Reisman
“[correction]
It was Pachauri that mentioned the “7 persons per megawatt while a solar farm is 23 persons per megawatt”. Biomass energy has some good potential also in my opinion but I can not conjecture on quantitatives at this time.
I have gone over the speech a few times and believe the transcript to be reasonably accurate. Download here:
http://www.ossfoundation.us/projects/environment/global-warming/myths/images/2009-statements/High-Level-Segment-of-WCC3_3rdSeptember09_edit.doc/view”
The litmus test is the experience in Spain which has had an aggressive foray into the so-called green jobs revolution. The outcome was reported by Alvarez et. al. 2009 (http://www.juandemariana.org/pdf/090327-employment-public-aid-renewable.pdf). In essence the outcome is that the US (or anyone else for that matter) can expect the loss of 9 jobs for every 4 jobs created by renewable energy projects that have public aid (as all of them need at this time). It is salient to note in the executive summary of their report:
“5. Despite its hyper-aggressive (expensive and extensive) “green jobs” policies it appears that Spain likely has created a surprisingly low number of jobs, two-thirds of which came in construction, fabrication and installation, one quarter in administrative positions, marketing and projects engineering, and just one out of ten jobs has been created at the more permanent level of actual operation and maintenance of the renewable sources of electricity.
7. The study calculates that since 2000 Spain spent €571,138 to create each
“green job”, including subsidies of more than €1 million per wind industry job. ”
The rest of the points are similarly pessimistic. This is the reality from a country that has tried it. This reality seems not to have caught the attention of Dr. Pachauri et al.
Alastair McDonald says
Re #171 where simon abingdon says:
“Nuclear power can easily provide ALL the world’s energy needs”
and Stefan responded
“Perhaps it could, but do the sums. Currently (as of 1 August 2009) there are 435 nuclear power stations in the world (9 fewer than in 2002). They supply 2% of the global end energy use.”
According to Uranium 2005: Resources, Production and Demand, based on the 2004 nuclear electricity generation rate of demand, the amount [of uranium] is sufficient for 85 years.
http://www.iaea.org/NewsCenter/News/2006/uranium_resources.html
Now, in 2009 there will be enough for 80 years at 2% of global energy consumption. If we could switch to using nuclear energy alone, the uranium would run out within two years!
Cheers, Alastair.
wili says
Let me restate my query to make it more on topic for this thread:
To what extent do the estimations of sea level rise take into account major feedbacks such as methane from the tundra and from sea beds? If they haven’t, what are the ranges of estimates for how much higher sea levels might rise with these feedbacks figured in?
Are there just too many unknowns to make even a stab at this?
Or is the possibility of significant methane release considered too remote?
(In spite of many recent studies to the contrary, for example
http://eprints.soton.ac.uk/64607/ “Warming of the northward-flowing West Spitsbergen current by 1°C over the last thirty years is likely to have increased the release of methane from the seabed by reducing the extent of the GHSZ, causing the liberation of methane from decomposing hydrate. If this process becomes widespread along Arctic continental margins, tens of Teragrams of methane per year could be released into the ocean.” )
Or are the consequences of such releases just too dire to contemplate?
simon abingdon says
The fast breeder reactor (FBR) is designed to breed fuel by producing more fissile material than it consumes. Depletion of uranium reserves therefore not a problem.
Martin Vermeer says
#181 Sekerob:
> New generation plants barely produce waste. It’s almost all recyclable
Actually, that’s not quite true (and I see this misconception all the time). These plants produce less long-lived transuranics by burning them on site — and these are the biggest headaches as they need hundreds of thousands of years of safe storage, not to mention the weapons potential of some of them. This is a good thing of course, but the bulk of the waste is fission products of intermediate atomic weight, mostly having relatively short half lives, like 90Sr with a half life of 29 years, requiring “only” three centuries or so of safe storage.
Yes, the problem can be managed in principle using sound engineering by competent people under good government. I have to side with Stefan here… nothing like a career in climatology to undermine one’s faith in the basic sanity of mankind :-(
#184 Alastair:
Do you actually read the links you post? Like,
“Fast reactor technology would lengthen this period to over 2500 years.”
“However, world uranium resources in total are considered to be much higher. Based on geological evidence and knowledge of uranium in phosphates the study considers more than 35 million tonnes is available for exploitation.”
And then there’s sea water… fuel shortage is never going to be a limiting factor for nuclear fission. That’s because the fuel cost, contrary to fossil fuels, is only a minute fraction of total operating cost. It could go up and up and up, and the economy of nuclear power would be totally unaffected.
simon abingdon says
#176 Ray Ladbury “Every Uranium and Thorium atom in the Universe had its origin in a supernova–so there really isn’t a lot of the stuff out there”
Ray, I can’t believe you said that.
Mark says
“Or is the possibility of significant methane release considered too remote?”
It’s more like saying when they’ll happen is an unanswerable question. When a wing of a jumbo jet develops a small internal crack, there’s no prediction of when it will fail, just that it will fail.
Same here.
It’s not as if the release of methane will REDUCE sea level rise by any measurable amount, is it.
It’s also very similar to Al Gore’s 10m SLR if 20% of the Antarctic Ice Sheets melt (or whatever the values were…). It is correct, and he even said “***IF*** it occurs by 2050, then…”. Yet somehow this got spun as a lie, as if he’d predicted that the ice sheet WILL melt, which isn’t supported by the science, and so it counted as one of the 9 “errors” in An Inconvenient Truth.
So official predictions leave it out since the denialosphere will jump at any conceivable way to misinterpret the statement of methane release so they can say “SEE! We TOLD you they were alarmists!!!!”.
Hank Roberts says
Wili, there’s good coverage here for your questions — see for example
https://www.realclimate.org/index.php/archives/2008/09/how-much-will-sea-level-rise/ — following the cites there forward touches all your issues.
Hank Roberts says
>> “How do you cope with the nuclear proliferation and waste problems
> Start by putting it top of the agenda at Copenhagen.
Original Copenhagen: How do we quit doing something stupid and expensive?
Simon’s Copenhagen: How can two expensive stupidities cost less than one?
John P. Reisman (OSS Foundation) says
#185 wili
https://www.realclimate.org/index.php/archives/2005/12/methane-hydrates-and-global-warming/
Nicolas Nierenberg says
#162, cce I found what Dr. Pachauri is referring to. In section 10.7.4.1 on page 829 there is a reference to a long term equilibrium value for thermal expansion of .2 to .6M per degree C. Multiplied by 2C this yields the range of .4 to 1.2M. This is a circa year 3000 steady state value assuming the ocean warms uniformly.
Figure 10.37 provides a shorter term (2300) view using the A1B scenario of about .2 to .8M from thermal expansion.
I still haven’t seen a reference from anyone showing that the IPCC “knew” that their sea level estimates were low. I suspect the obvious which is that there were some who thought they were low, and some who thought they were high.
John P. Reisman (OSS Foundation) says
#180 Dick Veldkamp
#184 Alastair McDonald
Do you really think it is fair that you point out how dim-witted simon monckton’s proposal is? Remember, it’s not fair to pick on those weaker than yourselves. Now run along and play…
…sorry, my ‘iron is wondering y’ among other things…
John P. Reisman (OSS Foundation) says
Current nuclear technology, or any nuclear technology that produces weapons-grade uranium and has storage issues, will most likely prove ultimately too expensive for a multitude of reasons.
It’s not just the availability of fuel and storage of waste, it’s the two combined with a risky and unpredictable future in a largely predictable resource-scarce environment and an uncertain economic structure, while those still interested in power will pay those that can steal or otherwise attain what they want in the nuclear realm to achieve their ends if so unconstrained by circumstance.
Alastair McDonald says
Martin,
I did read that “Fast reactor technology would lengthen this period to over 2500 years”, but that still applies to 2% of our current energy consumption, and so only gives us another 100 years if energy consumption continues to rise and all fossil fuel burning has to be ended.
But we do not have fast reactor technology yet! By the time it becomes universally available we will have had to burn all our fossil fuels just to maintain our increasing standard of living, and global warming will have passed well beyond dangerous levels.
How soon can we get fast reactor technology on line? How soon before global temperatures have passed the +2C and become dangerous?
In fact, how soon will it be before wildfires threaten Melbourne, Athens, and Los Angeles, and drought brings starvation to Kenya?
http://www.guardian.co.uk/environment/2009/sep/03/climate-change-kenya-10-10
Phil Scadden says
164 –
“electricity generation rate of demand the amount [of uranium] is sufficient for 85 years”
Except the electricity generation is only part of the problem – transport fuel is more so.
MacKay’s “Sustainable Energy Without the Hot Air” has good numbers on nuclear. FBR, he concludes, would deliver 33 kWh/d for everyone on the planet for 1000 years without looking at ocean uranium, and ignoring the risks associated with FBR. His conclusion was that it was part of the solution, especially for some countries. It is not a cheap solution any more than solar PV is cheap. Let’s not have an all-or-nothing approach. What you need is every country making appropriate commitments to cut CO2 – that’s what Copenhagen is for. How to achieve that is for each country to sort out on the basis of its resources and cost-sensitivity.
Ray Ladbury says
Simon:
READ!
http://en.wikipedia.org/wiki/Abundance_of_the_chemical_elements
There are about 3 trillion hydrogen atoms out there for every Uranium atom. Nuclear power is a finite resource. Period!
Hank Roberts says
Mark — please withdraw the statement attributed to Al Gore. You’re misremembering something, likely one of the common denial sites’ repetitions.
It’s a poor sort of memory we have. For everything else, there’s Google.
“There have certainly been incorrect assertions and headlines implying that 20 ft of sea level by 2100 was expected, but they are mostly based on a confusion of a transient rise with the eventual sea level rise which might take hundreds to thousands of years. And before someone gets up to say Al Gore, we’ll point out preemptively that he made no prediction for 2100 or any other timescale.”
https://www.realclimate.org/index.php/archives/2008/09/how-much-will-sea-level-rise/