Yep. I find it humorous to read scientists’ uncertainty “declarations”, as they are almost always blatantly wrong. Perhaps this is because scientists have the habit of not adding in the “unknowns not studied in this particular paper”. If you didn’t study it, then you’ve GOT to add it in with HUGE error bars, preferably at the end after you’ve already come to the “limited conclusion” derived from what WAS studied – that is, if you care a whit about what your paper is used for in the real world. In days past, this didn’t matter, but today there are huge real-world ramifications of spouting incomplete conclusions as if they were complete, or with the incompleteness not defined with error bars.
375 Chris D said, ” A degraded capability to make measurements increases uncertainty.”
Nope. We could stop taking measurements completely and our uncertainty would either remain constant or go down as our analysis of the pre-existing data gets better. Now, if you’re saying we trash all our old data, THEN you could be right.
Nope. Today’s inability to give good tornado warnings compared to yesterday’s better ability represents an increase in uncertainty owing to poorer measurement. Don’t twist meanings like that.
Changing variance in data, changing (abstract) detector noise, introduction of new systematics, model input error propagation (the original topic), and the phenomenon of learning more to understand less, Bohr’s favorite posture, since enlightenment may follow.
flxible says
“Today’s inability to give good tornado warnings compared to yesterday’s better ability represents an increase in uncertainty owing to poorer measurement.”
“yesterday’s better ability”??? “owing to poorer measurement”??? Need further elaboration on those please.
“… the NAO or North Atlantic Oscillation…. is in a highly negative phase right now and what that does is support blocking patterns in the north Atlantic. Hmmmm, we have a blocking pattern in the north Atlantic right now so this is starting to make sense. The Omega / REX block is exactly what we would expect with the negative phase of the NAO.
Right now we have a powerful low pressure over southern Canada with a strong cold front and Hurricane Sandy in the Caribbean…..”
We’re talking about different issues. The examples you bring up aren’t ones of science per se, but ones of tools. Clearly tools can help improve knowledge of how the world works, but I don’t see how the lack of tools subtracts knowledge. It’s just not very conducive to progress.
Without getting too philosophical about such things, I think it’s fair to say that we know much more about the climate than we did 10 or 20 years ago, and I have a hard time believing that technology will be anything but beneficial in keeping that trend going. This isn’t just true from a scientific community perspective, but with the exception of the WUWT’s and Curry’s of the internet, the nature of the questions that people are asking in the public-sphere has evolved tremendously as well.
381 Chris D said, “Today’s inability to give good tornado warnings compared to yesterday’s better ability represents an increase in uncertainty owing to poorer measurement. Don’t twist meanings like that.”
I was true to the subject of climate change. Methinks you’re twisting the meaning by changing the subject to weather. I agree that weather forecasting requires current data. I don’t agree that that even slightly transfers to climate change.
Hypothetical and unproven: in 1970 a top climate scientist could estimate how many tornados would occur in 2014 through 2023, and then make another estimate today using NO post 1970 data. My stance is that today’s 2014 through 2023 tornado estimates using 1970 data would have less uncertainty than the estimate made in 1970. Thus, even without any new data, climate uncertainty still declines.
“… but with the exception of the WUWT’s and Curry’s of the internet, the nature of the questions that people are asking in the public-sphere has evolved tremendously as well.”
I find it incredible that Google Scholar actually indexes the WUWT and Climate Etc blogs and their comment sections as citeable scholarly reference works. This really waters down the quality of searches.
CO2 isn’t the only pollutant, as we also have to face the prospect of information pollution.
——
… production of many fruit and seed crops that make diets interesting, such as tomatoes, coffee and watermelon, is limited because their flowers are not adequately pollinated,” says Harder. “We also show that adding more honey bees often does not fix this problem, but that increased service by wild insects would help.”
—— DOI: 10.1126/science.1230200
Should have said next week’s inability. People who predict severe weather will be furloughed as a result of the sequester. That makes the detection system less effective. So, the signal-to-noise ratio goes down.
It’s fatally flawed, but it’s a start. Our PUBLIC knowledge paid for by OUR tax dollars will actually be able to be seen by US! The fatal flaw? Journals still get to charge out the wazoo for a year after publication.
Even so, I still think that any scientist who does not post her work on a free site immediately upon publication is doing us all a GREAT disservice. Is there any legal reason why you guys don’t step up and do the right thing?
I’m with Bohr that measurement is fundamental to science. Change the way we measure things and the science changes. It can change for the worse. How solid is a result if it is no longer reproducible? We’d end up with arguments from authority. Our tools are our science in such a deep way that we can’t really speak of scientific knowledge without them.
I agree there has been progress in climate science over the last two decades. But some things, like the way the wide range of possible net aerosol climate effects seems almost to solidify rather than narrow, suggest that we are learning mostly that there is much more work to do.
flxible says
Hank @390 quotes an inaccurate article. Tomatoes do not require insect pollination; they are self-fertile, and the slightest breeze does fine – and watermelon can as easily be pollinated by crawling insects like ants or by hand, like the whole Cucurbitaceae family – in fact seedless varieties need to be protected from flying insect pollination.
Patrick says
Hank, thanks for the Aldo Leopold quotes. Where are they from?
One thing about carefully observing things: one thing leads to another.
This book review in the current EOS is worth a look; a book worth thinking on:
(may be paywalled)
Fields and Streams: Stream Restoration, Neoliberalism, and the Future of Environmental Science
Martin Doyle
Article first published online: 26 FEB 2013
DOI: 10.1002/2013EO090013 http://onlinelibrary.wiley.com/doi/10.1002/2013EO090013/pdf
“… Lave argues that Rosgen’s success [an ‘outsider’ whose methods became the standard for stream restoration quite rapidly] is symptomatic of a new era when claims to expertise are validated outside the halls of universities or the pages of peer‐reviewed journals, outside of the typical model of how science is done. She argues that as market‐like approaches—“neoliberalism”—are used in environmental conservation, we should expect to see frequent Rosgen‐like situations, when the market decides who is the expert, not an associate editor or a department head…..
“…
“… this book is a mirror for making geomorphologists look at their own practices and claims to expertise. It is rare that a social scientist understands bankfull discharge, sediment transport, and why Luna Leopold—the godfather of modern fluvial geomorphology—is important. But in addition to her understanding of these concepts, by being an outsider, Lave has the capability and luxury to see the weaknesses in geomorphology as well. She can see the inconsistencies and dogmas that exist and how these are created and sustained by geomorphologists’ presumed practices that are used to establish credentials as an expert.
…
“Lave may be emerging as the knuckleballer of geomorphology. She has great skill—her writing is clear and concise—and she throws a pitch of ideas that few natural scientists will have ever seen before yet need to see….”
“… the ambition to change the very nature of knowledge production about both the natural and social worlds. Analysts need to take neoliberal theorists like Hayek at their word when they state that the Market is the superior information processor par excellence. The theoretical impetus behind the rise of the natural science think tanks is the belief that science progresses when everyone can buy the type of science they like, dispensing with whatever the academic disciplines say is mainstream or discredited science ….”
———-
simon abingdon says
#385 Chris Colose
“I think it’s fair to say that we know much more about the climate than we did 10 or 20 years ago”.
I’m sure that must be true.
What’s more “I think it’s fair to say that we still need to know much more about the climate than we thought we did 10 or 20 years ago”.
The market for information can still be Freeped. How knowledge gets filtered is the important consideration. At some point, there will be enough semantic information available that knowledge sources will be automatically filtered like spam.
> Hank quoted “the Market ….”
(with trepidation, mind you, not approval)
I quoted the statement of the “theory”:
“The theoretical impetus behind the rise of the natural science think tanks is the belief that science progresses when everyone can buy the type of science they like, dispensing with whatever the academic disciplines say is mainstream or discredited science ….”
As a theory about science, that has failed every time: lead, asbestos, tobacco, antibiotic resistance, pesticide resistance
As an excuse for $$PROFIT$$ it’s succeeded, by externalizing and diluting costs to where they’re very hard to allocate.
But — Catch-22 — their theory says it’s right if it’s what they want.
Call it the Aleister Crowley approach to politics.
Hank Roberts says
> As the red trace in this (temporarily-linked) graphic shows
Please label that “temporarily linked” drawing here since it has no label or cite.
MARodger says
@351.
I’ve cleaned up the graph a tad.
I hope it’s obvious the data is NASA GISS monthly global temperature anomalies. If you do a least squares regression with this data 1980 to Date1, you will get an “end point” at Date1 on a line of slope X. These end points are traced out in black & the slope in red.
If adding more recent data causes the regression slope to increase, it will be due to the temperature rise accelerating during the period of that more recent data. Such acceleration is evident up to July 2007.
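The endpoint-and-slope construction described above can be sketched in a few lines. This is only a minimal illustration: the monthly series below is synthetic (an invented trend plus a seasonal cycle) standing in for the actual GISS anomaly file, and the function name is my own.

```python
import numpy as np

def endpoint_trace(dates, anoms, start=1980.0, min_points=24):
    """For each candidate end date Date1, fit a least-squares line to the
    anomalies from `start` to Date1, recording the fitted value at Date1
    (the "end point") and the slope, as in the graph described above."""
    keep = dates >= start
    d, a = dates[keep], anoms[keep]
    endpoints, slopes = [], []
    for i in range(min_points, len(d)):
        slope, intercept = np.polyfit(d[: i + 1], a[: i + 1], 1)
        endpoints.append(slope * d[i] + intercept)  # end point at Date1
        slopes.append(slope)
    return np.array(endpoints), np.array(slopes)

# synthetic stand-in for the real monthly anomaly series
dates = 1980.0 + np.arange(12 * 30) / 12.0
anoms = 0.017 * (dates - 1980.0) + 0.1 * np.sin(2 * np.pi * dates)
endpoints, slopes = endpoint_trace(dates, anoms)
```

With real data, a rising `slopes` trace over some window is what would indicate acceleration during that window.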
David B. Benson says
Extreme Weather Linked to Giant Waves in Atmosphere
http://www.livescience.com/27427-giant-air-waves-extreme-weather.html
Yup.
David B. Benson says
New Maps Depict Potential Worldwide Coral Bleaching by 2056
http://www.sciencedaily.com/releases/2013/02/130225122045.htm
Yup.
Jim Larsen says
“Scientists have long known of the existence of “Antarctic bottom water,” a dense, deep layer of water near the ocean floor that has a significant impact on the movement of the world’s oceans. Three areas where this water is formed were known of, and the existence of a fourth suspected for decades, but the area was far too inaccessible, until now, thanks to the seals.”
“seals foraged on the continental slope as far down as 1,800 meters (1.1 miles), punching through into a layer of this dense water cascading down the abyss”
http://www.newsdaily.com/stories/bre91p030-us-australia-antarctic-seals/
That’s 18 times as deep as the human record (101 meters), and the seals still have time to browse.
http://en.wikipedia.org/wiki/Constant_Weight_Apnea_Without_Fins
CL Pohlman says
Thank you everyone! I wasn’t sure whether this was a new denialist invention or not.
Chris Dudley says
David (#353),
Your link said the study was out yesterday, but I could not find it. This might be the eventual link. http://www.pnas.org/lookup/doi/10.1073/pnas.1222000110
From here: http://www.eurekalert.org/pub_releases/2013-02/pifc-we022513.php
Kevin McKinney says
#353–Thanks, DBB! As Stefan is a co-author, I hope we’ll get a post on this!
Chris Dudley says
Looks like January 2013 was the sixth warmest. http://data.giss.nasa.gov/gistemp/tabledata_v3/GLB.Ts+dSST.txt
Kevin McKinney says
#359–as compared with 9th-warmest for NCDC; ‘pretty toasty,’ as I called it when the NCDC analysis came out, but not a record.
John Batteen says
A question about the jet stream. At Rutgers it was learned that as the planet warms and the differential between the equator and the poles decreases, the jet stream slows and thus we have higher amplitude waves and troughs that persist longer. We can see a little bit of this already in our weather patterns.
My question is, what do the models predict for the far future regarding the jet stream? Does it remain recognizable or does it change into something different entirely, or perhaps cease to exist? What do we know about past warm climates and the nature or absence of the jet stream then?
Russell says
Those wishing to remind Larry Bell that his Forbes column is not immune to fisking will find him trying to terminate IPCC funding at:
http://www.forbes.com/sites/larrybell/2013/02/24/yes-we-should-defund-the-u-n-s-intergovernmental-panel-on-climate-change/
Hank Roberts says
rephrasing part of John Batteen’s question — what if any proxies do we know of that could give an idea where the jet stream was and how it behaved, in the paleo record? I’d guess maybe specific pollen layers might show up or not show up “downwind” of particular kinds of forests, depending on the prevailing wind during the season, if the layers are that clear. Anything like that known to anyone?
[Response: Very very difficult thing to assess. On the other hand, looking at record of lakes and pollen — where it was wet, where it wasn’t — has constrained jet stream position pretty well on glacial-interglacial timescales. Getting at details on shorter timescales would be much harder.–eric
]
Chris Colose says
John Batteen– Big question!!!
One of the key fundamental questions in atmospheric dynamics right now is to understand meridional shifts of the zonally-averaged circulation (and its variability), including the descending edge of the Hadley cell and the subtropical and eddy-driven jets. There’s also the emerging topic of how Arctic sea ice feeds back onto the mid-latitude dynamics…unfortunately it seems that only one idea about how that works has ended up hijacking all the popular discussion of the issue, when some people have shown for example that certain types of blocking events (driven by wave-breaking) might become less frequent…Elizabeth Barnes and Dennis Hartmann have some work on this. That’s still a big research topic though, and I quite frequently attend seminars with people saying different things.
One thing a lot of people forget is that the pole-to-equator temperature gradient really only decreases near the surface levels. That’s not true higher up. If you flew up to near the tropical tropopause in a jetpack and doubled CO2, you’d feel warming from increased latent heating (amplified relative to the surface). If you then stayed at that pressure level and flew poleward, then because the tropopause slopes downward you would eventually get to a level that hasn’t warmed as much, or has even cooled if you get into the stratosphere. A lot of idealized experiments show sensitivity to that structure of heating (the shift is different in global warming experiments vs. El Nino events, for example).
Regarding the large-scale circulation, a general rule is that stuff shifts poleward in a warming climate, including the descending branch of the Hadley cell, so you might expect to get drier if you sit right on the poleward edge. There are a lot of details, though. Paleoclimatically speaking, I agree with Eric, though even for the last millennium we have some constraints from speleothem (oxygen isotope) and lake sediment records. You have to be careful about the dynamics though and how you talk about it– in South America for example, you have to distinguish between the South American Summer (Austral, DJF) Monsoon and ITCZ displacements, which are geographically and dynamically distinct, but nonetheless the location of the ITCZ is of considerable importance for monsoon intensity.
Pete Dunkelberg says
Energy enthusiasts, you might want to engage in a discussion here:
http://blogs.law.harvard.edu/ellachou/2013/02/04/keyenergyissues/
I made a starting comment (actually third or so there) that is awaiting moderation. I mentioned this
http://blogs.law.harvard.edu/ellachou/2013/02/04/keyenergyissues/ first, and then other things.
Pete Dunkelberg says
I need more practice making simple comments. I meant that at the other place I mentioned this
http://thinkprogress.org/climate/2011/01/10/207320/the-full-global-warming-solution-how-the-world-can-stabilize-at-350-to-450-ppm/
first, and then went on.
Kevin McKinney says
#364–
Chris, can you say more about this? I did a quick search on Google Scholar, but didn’t really turn up much of substance. Got any bibliography you can point us to?
grypo says
Kevin,
http://barnes.atmos.colostate.edu/FILES/MANUSCRIPTS/Barnes_Hartmann_2012_JGR.pdf
Pat Cassen says
With regard to the possible connection between arctic sea ice and mid-latitude atmospheric blocking events, perhaps Stefan Rahmstorf could give us an update with a post on this recently accepted paper:
Petoukhov, V., Rahmstorf, S., Petri, S., Schellnhuber, H. J. (2013): Quasi-resonant amplification of planetary waves and recent Northern Hemisphere weather extremes. Proceedings of the National Academy of Sciences
Chris Colose says
Kevin- Also, here is a more recent discussion looking at the CMIP5 models
Kevin McKinney says
#368, 370–Thank you, gentlefolk!
David B. Benson says
Are climate change models becoming more accurate and less reliable?
http://blogs.scientificamerican.com/the-curious-wavefunction/2013/02/27/are-more-accurate-climate-change-models-worse/
Useful reminders about models in general.
Chris Colose says
David Benson (372)–
Someone like Gavin will have a better opinion than me on that article, but this is a pet peeve of mine. Almost by definition, the true uncertainties cannot grow with time. As science progresses, what we know grows, and thus the uncertainty goes down. The only thing that can change is the perception of uncertainty. Once you hit a threshold of discovering new physics (or chemistry, biological interactions, etc.) that are found to be important, then the perception of what was uncertain might grow, since you now need to be thinking more carefully about previous unknowns. But nature actually didn’t care 10 years ago what we perceived. The uncertainties were larger 10 years ago, period.
The Nature headline “Estimates of climate-change impacts will get less, rather than more, certain” therefore actually doesn’t make any sense. There are certain properties of the system (e.g., weather predictability months in advance) which will almost certainly remain impossible to pin down (for well-understood reasons), and thus there are things that have thresholds of understanding. At that point, you (or NSF) need to decide whether the ratio of prospective understanding to research demand and expense (“RUDE” as I’ll call it) is sufficiently high. I don’t think we are anywhere near that point on a plethora of topics (at least I hope so, otherwise I’d be venturing into a dead field), and advertising that to be the case does little but make young students shy away from the field or make room for political talking points.
MARodger says
David B. Benson @372.
I am also not impressed with the Ashutosh Jogalekar article you link to. Jogalekar says little more than ‘as a molecular modeller who knows nothing about climatology, I agree with what Maslin & Austin said.’
In providing this message, Jogalekar entirely (and dangerously) misses the main Maslin & Austin take-home message which was that we do know enough about AGW to know we need to act on it now and not tomorrow.
Having missed the point, perhaps it would have been better if Jogalekar had stuck with his explorations communicating with extraterrestrials using isotopes or his investigations into the chemistry of hedgehogs, somewhere where his ‘I’m an expert modeller’ commentary wouldn’t give such mixed and unhelpful messages.
Chris Dudley says
Chris (#373),
Uncertainty can grow with time. A mundane example would be uncertainty estimates on global average temperature during a period of warming as we are experiencing now and during a period of relative stability in the past, say the last 100 years and the 100 years before that. The variance will be higher in the more recent period.
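The variance point above can be illustrated with purely synthetic numbers (nothing here is real temperature data): take identical “weather” noise for two centuries and add a warming trend to the second one; the raw variance of the trending century comes out larger.

```python
import numpy as np

rng = np.random.default_rng(42)
noise = rng.normal(0.0, 0.1, size=200)   # identical weather noise, in degrees

stable = noise[:100]                            # flat century
warming = 0.01 * np.arange(100) + noise[100:]   # same noise plus 1 C/century

var_stable = stable.var()    # roughly the noise variance, ~0.01
var_warming = warming.var()  # trend adds ~0.08 of extra spread
```

Whether that extra spread should be counted as “uncertainty” or as signal is, of course, exactly the terminological point being argued here.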
That example is a physical change. We can also face reduced capability to measure. The sequester may hamper up-to-date weather forecasting owing to lack of staffing while a snafu trying to marry civilian and military satellite programs may lead to reduced remote sensing capability for climate rather soon. A degraded capability to make measurements increases uncertainty.
Changing satellites to do the same job can also introduce uncertainties. Ideally, one wants a period when both are operating to get a cross calibration, but that does not always happen. One may need to include an estimate of systematic uncertainty that was not required before in differential measurements.
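As a toy sketch of the cross-calibration just described (every number here is invented): with an overlap period, the inter-instrument offset can be estimated and removed before splicing the records; without an overlap, that offset would have to be carried as an extra systematic uncertainty.

```python
import numpy as np

rng = np.random.default_rng(7)
truth = np.linspace(0.2, 0.6, 120)                    # underlying signal, monthly
old_sat = truth + rng.normal(0.0, 0.02, 120)          # retiring instrument
new_sat = truth + 0.15 + rng.normal(0.0, 0.02, 120)   # successor with a 0.15 bias

overlap = slice(60, 120)                              # both instruments flying
offset = float(np.mean(new_sat[overlap] - old_sat[overlap]))

# splice the two records after removing the estimated offset
merged = np.concatenate([old_sat[:60], new_sat[60:] - offset])
```

With 60 months of overlap, the offset estimate here is good to a few thousandths; with no overlap at all, the 0.15 bias would be invisible in the merged series.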
The issue of error propagation when including more measurement based inputs in models can also lead to increased uncertainties in the output.
Sometimes you have to take a few steps back to start making progress again.
In terms of perception, it is a mark of wisdom to perceive that there is much more that you don’t know than you do know. So, even getting older increases uncertainty in that sense.
Ray Ladbury says
Two quotes by Richard Hamming:
“The purpose of computing is insight, not numbers.”
“In science if you know what you are doing you should not be doing it.
In engineering if you do not know what you are doing you should not be doing it.”
Anyone who is designing the infrastructure of tomorrow based solely on the output of a model–any model–is a fool. Models provide insight. They are not reality. Engineering is the art of turning the insight derived from models into robust but economical designs. Reality ain’t simple. Deal with it.
Hank Roberts says
Chris Dudley and Chris Colose are grappling with different aspects of the uncertainty elephant.
I’d guess finding the right words — used by those who study these things — would help. Can you clarify which kinds of uncertainty are talked about?
Dan H. says
With regards to uncertainty, Chris and Chris are discussing different aspects. In general, an increase in observations or an improvement in the measurement system will lead to decreased uncertainty (unless some outside force has caused the measurements to vary on a much greater basis). Also, adding another parameter to the equation will increase uncertainty. This can lead to greater accuracy, as more inputs have been included in the final product, but a greater uncertainty range. To some that may sound strange, but that is how statistics work.
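The point about added parameters can be sketched with standard quadrature propagation (illustrative numbers only): for a sum of independent inputs, each new term contributes its own variance, so the combined uncertainty can only grow even as the estimate itself becomes more complete.

```python
import math

def combined_sigma(sigmas):
    """Uncertainty of a sum of independent inputs: add sigmas in quadrature."""
    return math.sqrt(sum(s * s for s in sigmas))

print(combined_sigma([0.3]))            # one input: 0.3
print(combined_sigma([0.3, 0.4]))       # a second input raises it to 0.5
print(combined_sigma([0.3, 0.4, 0.2]))  # each added term can only increase it
```

This is only the simplest linear case; correlated inputs or nonlinear models need the full covariance treatment, but the monotonic growth with each added independent term is the same.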
Jim Larsen says
373 Chris C said, ” the perception of what was uncertain might grow since you now need to be thinking more carefully about a previous unknowns. But nature actually didn’t care 10 years ago what we perceived. The uncertainties were larger 10 years ago, period.”
Yep. I find it humorous to read scientists’ uncertainty “declarations”, as they are almost always blatantly wrong. Perhaps this is because scientists have the habit of not adding in the “unknowns not studied in this particular paper”. If you didn’t study it, then you’ve GOT to add it in with HUGE error bars, preferably at the end, after you’ve already come to the “limited conclusion” derived from what WAS studied – that is, if you care a whit about what your paper is used for in the real world. In days past this didn’t matter, but today there are huge real-world ramifications of spouting incomplete conclusions as if they were complete, or with the incompleteness not defined with error bars.
375 Chris D said, ” A degraded capability to make measurements increases uncertainty.”
Nope. We could stop taking measurements completely and our uncertainty would either remain constant or go down as our analysis of the pre-existing data gets better. Now, if you’re saying we trash all our old data, THEN you could be right.
David B. Benson says
Pleased with this discussion.
Chris Dudley says
Jim (#379),
Nope. Today’s inability to give good tornado warnings compared to yesterday’s better ability represents an increase in uncertainty owing to poorer measurement. Don’t twist meanings like that.
Chris Dudley says
Hank (#377),
Changing variance in data, changing (abstract) detector noise, introduction of new systematics, model input error propagation (the original topic), and the phenomenon of learning more to understand less (Bohr’s favorite posture, since enlightenment may follow).
flxible says
“Today’s inability to give good tornado warnings compared to yesterday’s better ability represents an increase in uncertainty owing to poorer measurement.”
“Yesterday’s better ability”??? “Owing to poorer measurement”??? Need further elaboration on those, please.
Hank Roberts says
> tornado warnings … poorer measurement
Same for large storms; the Europeans do it better than the USA nowadays:
http://fox41blogs.typepad.com/wdrb_weather/2012/10/update-to-potential-superstorm-hitting-us-impacts-to-kentuckiana.html
“… the NAO or North Atlantic Oscillation…. is in a highly negative phase right now and what that does is support blocking patterns in the north Atlantic. Hmmmm, we have a blocking pattern in the north Atlantic right now so this is starting to make sense. The Omega / REX block is exactly what we would expect with the negative phase of the NAO.
Right now we have a powerful low pressure over southern Canada with a strong cold front and Hurricane Sandy in the Caribbean…..”
Chris Colose says
Chris Dudley-
We’re talking about different issues. The examples you bring up aren’t ones of science per se, but ones of tools. Clearly tools can help improve knowledge of how the world works, but I don’t see how the lack of tools subtracts knowledge. It’s just not very conducive to progress.
Without getting too philosophical about such things, I think it’s fair to say that we know much more about the climate than we did 10 or 20 years ago, and I have a hard time believing that technology will be anything but beneficial in keeping that trend going. This isn’t just true from a scientific community perspective, but with the exception of the WUWT’s and Curry’s of the internet, the nature of the questions that people are asking in the public-sphere has evolved tremendously as well.
David B. Benson says
Historic Datasets Reveal Effects of Climate Change and Habitat Loss On Plant-Pollinator Networks
http://www.sciencedaily.com/releases/2013/02/130228155624.htm
Solitary bees in serious decline.
Jim Larsen says
381 Chris D said, “Today’s inability to give good tornado warnings compared to yesterday’s better ability represents an increase in uncertainty owing to poorer measurement. Don’t twist meanings like that.”
I was true to the subject of climate change. Methinks you’re twisting the meaning by changing the subject to weather. I agree that weather forecasting requires current data. I don’t agree that that even slightly transfers to climate change.
Hypothetical and unproven: in 1970 a top climate scientist could estimate how many tornados would occur in 2014 through 2023, and then make another estimate today using NO post 1970 data. My stance is that today’s 2014 through 2023 tornado estimates using 1970 data would have less uncertainty than the estimate made in 1970. Thus, even without any new data, climate uncertainty still declines.
WebHubTelescope says
CC said:
I find it incredible that Google Scholar actually indexes the WUWT and Climate Etc blogs and their comment sections as citeable scholarly reference works. This really waters down the quality of searches.
CO2 isn’t the only pollutant, as we also have to face the prospect of information pollution.
Hank Roberts says
Thank you David, for that bit of very bad news.
“The first rule of an intelligent tinkerer is to keep all of the pieces.”
— Aldo Leopold
Hank Roberts says
The study David cited above is:
DOI: 10.1126/science.1232728
More from the same issue:
http://www.sciencedaily.com/releases/2013/02/130228155622.htm
——
… production of many fruit and seed crops that make diets interesting, such as tomatoes, coffee and watermelon, is limited because their flowers are not adequately pollinated,” says Harder. “We also show that adding more honey bees often does not fix this problem, but that increased service by wild insects would help.”
—— DOI: 10.1126/science.1230200
Chris Dudley says
flxible (#383),
Should have said next week’s inability. People who predict severe weather will be furloughed as a result of the sequester. That makes the detection system less effective. So, the signal-to-noise ratio goes down.
Jim Larsen says
http://www.newsdaily.com/stories/bre91n01c-us-usa-whitehouse-information/
It’s fatally flawed, but it’s a start. Our PUBLIC knowledge paid for by OUR tax dollars will actually be able to be seen by US! The fatal flaw? Journals still get to charge out the wazoo for a year after publication.
Even so, I still think that any scientist who does not post her work on a free site immediately upon publication is doing us all a GREAT disservice. Is there any legal reason why you guys don’t step up and do the right thing?
Chris Dudley says
Chris (#385),
I’m with Bohr that measurement is fundamental to science. Change the way we measure things and the science changes. It can change for the worse. How solid is a result if it is no longer reproducible? We’d end up with arguments from authority. Our tools are our science in such a deep way that we can’t really speak of scientific knowledge without them.
I agree there has been progress in climate science over the last two decades. But some things, like the way the broad range of possible net aerosol climate effects seems almost to have solidified, suggest that mostly we are learning that there is much more work to do.
flxible says
Hank @390 quotes an inaccurate article. Tomatoes do not require insect pollination; they are self-fertile, and the slightest breeze does fine. Watermelon can as easily be pollinated by crawling insects like ants, or by hand, like the whole Cucurbitaceae family; in fact, seedless varieties need to be protected from flying-insect pollination.
Patrick says
Hank, thanks for the Aldo Leopold quotes. Where are they from?
One thing about carefully observing things: one thing leads to another.
http://www.news.wisc.edu/releases/17740
http://www.plosone.org/article/info%3Adoi/10.1371/journal.pone.0053788#s1
Hank Roberts says
This book review in the current EOS is worth a look; a book worth thinking on:
(may be paywalled)
Fields and Streams: Stream Restoration, Neoliberalism, and the Future of Environmental Science
Martin Doyle
Article first published online: 26 FEB 2013
DOI: 10.1002/2013EO090013
http://onlinelibrary.wiley.com/doi/10.1002/2013EO090013/pdf
“… Lave argues that Rosgen’s success [an ‘outsider’ whose methods became the standard for stream restoration quite rapidly] is symptomatic of a new era when claims to expertise are validated outside the halls of universities or the pages of peer‐reviewed journals, outside of the typical model of how science is done. She argues that as market‐like approaches—“neoliberalism”—are used in environmental conservation, we should expect to see frequent Rosgen‐like situations, when the market decides who is the expert, not an associate editor or a department head…..
“…
“… this book is a mirror for making geomorphologists look at their own practices and claims to expertise. It is rare that a social scientist understands bankfull discharge, sediment transport, and why Luna Leopold—the godfather of modern fluvial geomorphology—is important. But in addition to her understanding of these concepts, by being an outsider, Lave has the capability and luxury to see the weaknesses in geomorphology as well. She can see the inconsistencies and dogmas that exist and how these are created and sustained by geomorphologists’ presumed practices that are used to establish credentials as an expert.
…
“Lave may be emerging as the knuckleballer of geomorphology. She has great skill—her writing is clear and concise—and she throws a pitch of ideas that few natural scientists will have ever seen before yet need to see….”
——
Aside — I recall a foreshadowing article here:
THE SOCIAL SCIENCE RESEARCH COUNCIL, JULY 2008
Philip Mirowski
The Rise of the Dedicated Natural Science Think Tank
http://www.ssrc.org/workspace/images/crm/new_publication_3/%7Beee91c8f-ac35-de11-afac-001cc477ec70%7D.pdf
“… the ambition to change the very nature of knowledge production about both the natural and social worlds. Analysts need to take neoliberal theorists like Hayek at their word when they state that the Market is the superior information processor par excellence. The theoretical impetus behind the rise of the natural science think tanks is the belief that science progresses when everyone can buy the type of science they like, dispensing with whatever the academic disciplines say is mainstream or discredited science ….”
———-
simon abingdon says
#385 Chris Colose
“I think it’s fair to say that we know much more about the climate than we did 10 or 20 years ago”.
I’m sure that must be true.
What’s more “I think it’s fair to say that we still need to know much more about the climate than we thought we did 10 or 20 years ago”.
WebHubTelescope says
HR quoted this statement:
Yet, in today’s Guardian it was revealed that 13 of the top 17 blogs nominated for best science blog in the Bloggies were aligned with AGW skepticism:
http://www.guardian.co.uk/environment/blog/2013/mar/01/climate-sceptics-capture-bloggies-science
The market for information can still be Freeped. How knowledge gets filtered is the important consideration. At some point, there will be enough semantic information available that knowledge sources will be automatically filtered like spam.
Hank Roberts says
> Hank quoted “the Market ….”
(with trepidation, mind you, not approval)
I quoted the statement of the “theory”:
“The theoretical impetus behind the rise of the natural science think tanks is the belief that science progresses when everyone can buy the type of science they like, dispensing with whatever the academic disciplines say is mainstream or discredited science ….”
As a theory about science, that has failed every time: lead, asbestos, tobacco, antibiotic resistance, pesticide resistance.
As an excuse for $$PROFIT$$ it’s succeeded, by externalizing and diluting costs to where they’re very hard to allocate.
But — Catch-22 — their theory says it’s right if it’s what they want.
Call it the Aleister Crowley approach to politics.
Jeffrey Davis says
“the Market is the superior information processor par excellence.”
Which is why the market dumped trillions of dollars in the real estate derivative crash of 2008.