Meow says
@144: I'm pretty sure that most climate modelers understand their models' numerical shortcomings. A cursory search reveals active work to improve them. See, e.g., Evans et al., "A Fully Implicit Solution Method Capability in CAM-HOMME" (CCSM model), Dugas & Winger, "A CRCM5 Description", Weijer et al., "A fully-implicit model of the global ocean circulation", etc.
I expect that "writ[ing] a check to Heartland Institute" (@120) will not assist these efforts, since I have been unable to find any relevant research sponsored by that organization.
David Young says
Meow, your first reference is indeed on the right track, and Salinger is good. They know about all the good methods. However, if you read the fine print, they say that their preconditioner is actually slower than the baseline code and they are working on a better one. That's good, and we went through that too in the 1980s. The problem is that their baseline code is so complex, with some terms treated implicitly and others explicitly, that it's tough. Their spectral element method is used ONLY to test other methods that are used in the real codes. By the way, spectral element methods REQUIRE variable order, something which is often neglected.
However, the point is that rewriting at least one model is required so it is designed to use the best methods. I've done this myself. Tacking a modern accelerator onto a legacy code helps, but is orders of magnitude worse than really doing it fully implicitly. Certainly even the AR5 simulations are based on the legacy codes and, I'll wager, finite differences. The AR5 simulations are indeed sensitive to the period of calibration. See Judy Curry's material for an example.
At any rate, the point is that the evidence from multiple sources says that the uncertainties are much larger than one would gather from reading the 1000 papers based on running the models.
You know, sometimes the scientific process involves strong criticism. There are two responses. The first is to really understand it and realize that you will be embarrassed if you don't fix the problem. The second is to try to suppress the criticism, or even worse, get editors fired, etc. Perhaps even Heartland has a place in the debate if the World Wildlife Fund does. It is a free country.
David Young says
Meow, look at the outlined box in the upper left hand corner of your first reference. What you see is that climate models have gone BACKWARD recently and have “returned to fully explicit methods.”
The authors are trying to address the problems caused by this seeming insistence on repeating history. Maybe I do need to write that check after all.
Rob Nicholls says
Hank Roberts, thanks v much for your help
re: SO2 emissions from China. (posts 128 and 133).
It does seem that sulphur dioxide emissions from China have fallen since 2006 (see e.g. p.10 of http://www.atmos-chem-phys.net/11/9839/2011/acp-11-9839-2011.pdf ).
Patrick 027 says
Re David Young – Floating the idea of supporting the Heartland Institute (which isn't even all about climate (anti)science; you'll have 'collateral damage') raises a red flag, doesn't it? I don't see how increasing the errors in public opinion would reduce errors in computer modelling.
Hank Roberts says
> rewriting at least one model … so it is designed
> to use the best methods. I've done this myself.
If you do want to make a contribution, do consider contributing to these:
Clear Climate Code
Open Climate Code
ldavidcooke says
Hey Dr. Young,
First, thanks for taking the time to respond. Sorry for not prefacing my video codec spiel with the idea that video compression is not unlike creating a model of a real-world event. The digitized data is a representation of the light reflecting from a series of objects. The compression effort is similar to converting the measured values to a grid. The issue with data compression is the attempt to create a high-resolution data representation with 1% of the data. This would be similar to taking 7200 data collection stations, each having 10 variables with ranges that can span 15000 discrete values per minute, and trying to represent a trend in just one of the variables which is dependent on the others, either in a group or directly.
The point being, maintaining the smooth representation requires predictive analysis: knowing that a value is changing, the rate at which it is changing, and the ability to determine whether the change is an anomaly, noise, or valid data. (Hence part of the purpose of the FT filtering.)
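[A minimal sketch of the kind of Fourier-domain filtering mentioned above, separating a slow trend from high-frequency noise. The series, cutoff frequency, and sample rate are made-up illustrative values, not taken from any codec or climate data set.]

```python
import numpy as np

# Synthetic series: a slow trend plus high-frequency noise (illustrative values only).
rng = np.random.default_rng(0)
t = np.arange(1024)                      # one sample per "minute"
trend = 0.002 * t                        # slow drift we want to keep
noise = rng.normal(0.0, 0.5, t.size)     # fast fluctuations we want to suppress
x = trend + noise

# Low-pass filter in the Fourier domain: zero out coefficients above a cutoff frequency.
X = np.fft.rfft(x)
freqs = np.fft.rfftfreq(t.size, d=1.0)   # cycles per sample
cutoff = 0.01                            # arbitrary cutoff for this sketch
X[freqs > cutoff] = 0.0
smoothed = np.fft.irfft(X, n=t.size)

# The filtered series tracks the trend far better than the raw series does.
print("RMS error, raw vs trend:     ", np.sqrt(np.mean((x - trend) ** 2)))
print("RMS error, filtered vs trend:", np.sqrt(np.mean((smoothed - trend) ** 2)))
```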
The tools you had been talking about so far appear rather rudimentary, IMHO. The management of an evolving data set, growing out of a series of equations that feed the next calculation point, has to have advanced light years from a mere 20 years ago. I will review the references you have suggested to see where I have gone wrong.
(I concur that reducing grid size as well as increasing steps would be equivalent to increased resolution, hence error values would decrease. The instance I was referencing regarded the summation of a series of larger grids created from many discrete data points, while maintaining step size and encoding reversibility.)
As to being defensive, I believe you could characterize it that way. The problem is that if you leave a crack, some will try to wedge it open to create a door. The flip side is that confusion or a lack of clarity about what is known or unknown may partially explain why a defensive posture is required. (It is when there is a misunderstanding that doubt occurs; the hard part is trying to be open and at the same time not have to defend what you have no control over.)
As this takes things way off topic, I will quit here and attempt to review your points; not that it matters, but at least I will better understand the point you are attempting to make. So far it is proving a struggle…
Thanks,
Dave Cooke
dhogaza says
David Young:
The second is to try to suppress the criticism, or even worse, get editors fired, etc.
Yeah, and he wonders why people get “defensive” when all he’s doing is making objective statements meant to help poor, ignorant climate scientists improve the quality of their work …
dhogaza says
David Young:
Now if I double the number of grid points, Delta X is halved and thus truncation error goes down by a factor of 4. However, in climate modeling there are 3 space dimensions, so to get this factor of 4, I need twice as many points in each of 3 directions, resulting in 8 times as many points and at least 8 times as much computer time.
Climate models build their atmospheres out of pizza boxes, not cubes, as the atmosphere is thin. Decreasing the grid size doesn't necessarily imply that the pizza boxes become thinner. I think NASA GISS Model E has a couple of dozen layers in its atmosphere, but it's been a while since I've looked. If you care, you can go look at it yourself.
However, you somewhat surprisingly forgot to point out that the simulated time interval between steps needs to be reduced if the grid size is reduced, as that upper left box you like puts it:
“However, finer model grids require a superlinear reduction in the time step size to account for the smaller spatial scale and increased multiscale interactions ”
Which I think gets you your factor of 8 back.
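[For readers following the arithmetic, a minimal sketch of the two claims being traded above: halving the grid spacing cuts the truncation error of a second-order difference by roughly a factor of 4, while the operation count grows with the number of points times the number of time steps. Plain Python with a toy function, not any climate code.]

```python
import math

# Second-order central difference for d/dx of sin(x) at x0; truncation error ~ dx^2.
def central_diff_error(dx, x0=1.0):
    approx = (math.sin(x0 + dx) - math.sin(x0 - dx)) / (2.0 * dx)
    return abs(approx - math.cos(x0))

e_coarse = central_diff_error(0.1)
e_fine = central_diff_error(0.05)
print("error ratio when dx is halved:", e_coarse / e_fine)   # close to 4

# Rough cost bookkeeping for an explicit scheme on a box-shaped grid: cost scales with
# the number of points times the number of time steps, and a CFL-like limit ties the
# time step to the horizontal spacing, adding another factor of ~2 on refinement.
def relative_cost(refine_x=2, refine_y=2, refine_z=1, dt_factor=2):
    return refine_x * refine_y * refine_z * dt_factor

print("full 3D refinement plus smaller dt:     ", relative_cost(2, 2, 2, 2))  # 16x
print("pizza-box (2D) refinement plus smaller dt:", relative_cost(2, 2, 1, 2))  # 8x
```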
David Young says
Hank, these things take a team, trust me on this. Open source is fine, but only if Hansen turned over his team to me would I think about such an endeavor. First order of business is to get Gavin to go back to school (only kidding). If NCAR or DOE wants to start a new team, that's the best approach. They could let John Bell or David Keyes or even Phil Colella lead it. There are younger guys around too. They would know what to do and not be infected by the Hansen doctrine that despite large numerical errors my results look good (and support something I already believed).
[Response: Please leave the strawman personalisations at home. If you want to be taken seriously, imagining supposed 'doctrines' that have no reference, cite or antecedent, and are actually contradicted by the person you ascribe them to, is neither interesting nor clever. – gavin]
dhogaza says
ldavidcooke:
However, finer model grids require a superlinear reduction in the time step size to account for the smaller spatial scale and increased multiscale interactions
And climate modelers have been reducing model grid sizes as time has gone on.
So we have at least two ways to reduce errors – increasing resolution is one; using implicit methods is another.
Let’s address David Young’s accusation that climate modelers are ignorant of the methodologies he touts, and therefore incompetent. From the upper left hand box at the link he likes:
To maintain scalability, a number of climate models have been returning to fully explicit methods developed several decades ago.
So obviously the modelers involved are not only aware of implicit methods, but have implemented at least partially implicit solutions and have made a conscious decision to go back to explicit methods. Presumably because they believe that the benefits of greater resolution due to smaller grid sizes outweigh the benefits of the implicit methods they'd incorporated into their models previously.
David Young might want to find out which models are being discussed and talk to the implementors before insisting this is a step backwards, that the modelers are obviously ignorant of improved methods, incompetent, not up to David Young’s level of expertise, not fit to sit at his table, etc.
You might find out they know more than you claim they do, and aren’t as incompetent as you claim they are …
Ray Ladbury says
David Young, So, again, let me get this straight. You are going to write a check to the professional liars at the Heartland Institute because their model is so much better… Oh, wait. That’s right, they don’t have a model. They don’t have any researchers, or research or evidence. They just have…well, lies.
Dude, you sure you’ve thought this through?
David B. Benson says
I am quite unsure just what various commenters are concerned about regarding the solution of various PDEs. However, (numerical) dispersion and dissipation are treated (rapidly) in http://www.physics.arizona.edu/~restrepo/475B/Notes/source/node39.html
which has a link to the interesting table of contents.
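[A minimal sketch of the numerical dissipation that page discusses: advecting a sine wave with a first-order upwind scheme damps its amplitude, even though the exact solution just translates without losing any. Toy grid and CFL number chosen only for illustration.]

```python
import numpy as np

# First-order upwind advection of a sine wave at CFL = 0.5: stable but dissipative.
N = 100
x = np.linspace(0.0, 1.0, N, endpoint=False)
u = np.sin(2 * np.pi * x)
c, dx = 1.0, 1.0 / N
cfl = 0.5
dt = cfl * dx / c
steps = int(round(1.0 / dt))            # advect for one full transit of the periodic domain

for _ in range(steps):
    u = u - cfl * (u - np.roll(u, 1))   # upwind difference, periodic boundary

print("initial amplitude:          ", 1.0)
print("amplitude after one transit:", u.max())   # noticeably less than 1
```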
ldavidcooke says
RE: 161
Hey dhogaza,
Do not get me wrong, I am not saying one is right the other wrong. Nor do I suggest that the expertise of the current systems are inferior. (BTW, the quote you associate with me is not mine.)
What I am saying is, if we disregard the attitude, is there any validity to the criticism? So far from my limited view point probably not. The hard part is trying to understand before making a judgement.
When we consider that some of the base code of many of the current feeders is likely Fortran running on a mainframe in the basement, it may be possible that the climate team could use a bit of extra support. As to a reorg: nada. The folks who have been doing the heavy lifting have struggled with shortages in personnel and systems since the early '90s. They are well equipped to maximize output with minimal capacity. (They make up the difference by thinking smarter.)
The conversion from implicit to explicit modeling is likely more for the purpose of openly showing the engine and feeds. It's a bit like showing your work, which was a criticism prior to the UEA fiasco. An implicit differential would be more economical coding. However, the request for climate science to be open drives the need for un-integrated data engines that specialize in solving one function. From there you can build an over-arching system which manages the feeds and data tables.
This allows the distribution of processing, right-sizing of the resources, and very small steps of discrete functions. At the same time, the data filters in the data tables can be adjusted without propagating error to dependent functions.
As to opinions expressed either way, it is pretty difficult to understand if you have no knowledge of the motivation for change. Those of us who have had a horse in the race may not always agree on which horse to beat, but it is clear that those who are doing the job are the designated experts, else they would not be on the poll.
Cheers!
Dave Cooke
David Young says
I struggled with whether or not to respond. Really, I'm not saying anyone is incompetent. However, I do have my concerns about you, Dhogaza. The reason for the work in the reference you took out of context is to GO TO IMPLICIT methods because of their well known advantages. The difficulties are immense, but it's important. The comment about scalability is just about massive parallelism. Implicit methods can be made parallel too. In any case, for a stiff system, implicit is so much better, parallelism hardly matters.
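[To make the stiffness claim concrete, a minimal sketch of why implicit time stepping can take far larger steps than explicit stepping on a stiff problem. A toy linear ODE with arbitrary decay rate and step sizes, not anything from an actual climate code.]

```python
# Toy stiff ODE: dy/dt = -k*y with large k; exact solution y = exp(-k*t).
k = 1000.0
t_end = 0.1
y0 = 1.0

def forward_euler(dt):
    """Explicit step: y_{n+1} = y_n + dt * (-k * y_n). Unstable unless dt < 2/k."""
    y, t = y0, 0.0
    while t < t_end:
        y += dt * (-k * y)
        t += dt
    return y

def backward_euler(dt):
    """Implicit step: y_{n+1} = y_n / (1 + k*dt). Stable for any dt > 0."""
    y, t = y0, 0.0
    while t < t_end:
        y = y / (1.0 + k * dt)
        t += dt
    return y

dt = 0.01  # 5x larger than the explicit stability limit of 2/k = 0.002
print("forward Euler,  dt=0.01:", forward_euler(dt))   # blows up (magnitude ~1e9)
print("backward Euler, dt=0.01:", backward_euler(dt))  # stays bounded and decays toward zero
```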
I get the feeling that the regulars on this web site are not the scientists, but others. Either the scientists already know what I'm telling them (and that's fine), or they don't want to respond to someone outside their community (and that's fine too), or for them this is just a venue to keep the dhogazas of the world attacking their perceived opponents. Or maybe it's a venue to demonize legitimate players in the debate. I don't know, but perhaps I should have checked before naively assuming that the site was about the science. I've received no feedback here indicating that any of the scientists understood what the issues were. I'm sure the better ones know, like Paul Williams. The question is what is the next step.
The bottom line is that multiple sources of evidence that I've uncovered quite easily strongly suggest that the numerical errors in climate models are larger than is generally acknowledged by people in the field in their statements to the general public. It's in the literature, but is perhaps not emphasized here. Further, the idea that there is no legitimate scientific controversy about these things is probably not something even Gavin Schmidt would endorse. I would recommend Climate Etc. as a better place for a discussion of all aspects of this issue. It's a much friendlier place, without the lack of a sense of humor that is quite obvious here. And some of the guest posts there, like the one on chaos, ergodicity, and attractors, show obvious deep knowledge of the latest theory in the field.
sidd says
We are blessed, indeed, to have such an intellect as David
Young to tutor us. How best ought we avail of it ?
It has been suggested that he join the Clear Climate Code
initiative or the Open code initiative. We are fortunate that
he did not take more offense at the idea. The pedestrian
efforts at Clear/Open Climate code are far below his stature and
ability.
Another measure of his greatheartedness is that he kindly considers
taking over the “Hansen team”, before regretfully discarding them as
being “infected by the Hansen doctrine.” No, infected drudges such as
Dr. Schmidt must go back to school, one taught by such luminaries as
himself, where they may be cleansed, if that is possible, of heresy.
If not, he will, no doubt, have further suggestions in mind for their
reeducation.
What, then, are we to do? Let us drink from his font of wisdom again.
We must assemble a completely new, untainted group of acolytes who are
capable of fully absorbing the radiant effulgence of his genius. Then,
and only then, can climate science be delivered from the abyss of
ignorance and error.
This will, of course, be very expensive, but such a beneficial
project will surely be amply funded by the DOE, NSF, NCAR and
all the others. Mr. Young has modestly suggested others for the lead
role, but of course, he is the only one for the job. At the very least
I am sure that he would accept a senior advisory position, with, at
minimum, a seven figure salary. Any less would be an insult.
Certainly, he is even now finishing his proposal for the project and
when it arrives on the desks of the heads of the funding agencies,
a great singing of Hosannas will echo in the halls and also a deafening
scratching sound of pens upon checkbooks.
Unfortunately, I suspect the gray reality will be a letter asking him to
reapply after he has actually published something relevant to climate
science. Such is the fate of genius in its own time.
But all may not be lost. Mr. Young is a man of means, and has previously
offered, out of his innate generosity, to fund the research with a check
to the Heartland Institute. Why does he need the (possibly corrupt)
agencies of a smothering government ? He can do this himself, and such
a man of conviction will rise to the challenge.
I await news of his continuing success, and anticipate accolades and
laurels showered upon him and rose petals under his feet, and, perhaps
even, (dare I say it) an appointment in Sweden with the King.
Be still, my beating heart.
sidd
David B. Benson says
There are lots of papers with different approaches. Here is a quotation from "A low numerical dissipation immersed interface method for the compressible Navier–Stokes equations" by K. Karagiozis, R. Kamakoti & C. Pantano in J. Comput. Physics:
"In this paper, we partially address the stability problem of an immersed interface method for the compressible Navier–Stokes equations by utilizing the theory of summation-by-parts (SBP) operators [69] and [70] to derive a stable immersed interface approximation for the advection derivatives. Numerical experiments suggest that this approach prevents the appearance of spurious numerical instabilities, which otherwise create shock-like regions around complex boundaries. Moreover, different from IIM formulations, the new approach completely eliminates the need to deal with jumps at the object boundary. Finally, the method is combined with semi-implicit time integration to remove any stiffness present in the operators and the implicit equations are solved explicitly for the particular case of constant transport properties."
[The reCAPTCHA oracle proclaims escape orisatt.]
TFK says
The defensiveness about the Heartland Institute is interesting. They read the data in a way that many scientists outside the mainstream do. Sort of like Copernicus did. Yet they are branded ‘liars’?
If the WWF have a seat at the table, so should Heartland. And David Young's comment (I'll write a check) was meant to elicit a response. I'm surprised you took his bait.
The lack of serious response to the David Young comments is revealing.
Meow says
@165:
In any case, for a stiff system, implicit is so much better, parallelism hardly matters.
Got any cites to support this extraordinary proposition? And have you loaded up on Intel put options?
CAPTCHA: ingMEn perfectly
Brian Dodge says
I find that the scariest metric of model inaccuracies comes from the ocean – the Arctic ocean – http://www.woodfortrees.org/plot/nsidc-seaice-n/from:1979.6/every:12. The slope triples after 1995.
Perhaps the damping of the aperiodic oscillations of energy transport by what David Young believes to be shockingly unstudied large numerical errors in climate models underestimates summer Arctic sea ice loss (as well as glacier loss, ice shelf loss, and Greenland & Antarctic ice sheet loss) because of nonlinear mechanisms. E.g., if your model gives a correct average temperature of below zero over the Arctic, but damps the peak values of temperatures above zero in the actual data (presently much above, compared to the historic record), it will underestimate melt rates. The decrease in albedo from melting, and the corresponding increase in energy available for melt during the summer when the sun is shining, isn't balanced by the increase in albedo from freezing during the winter – because the sun is no longer shining. The nonlinearities of heat versus temperature from melting may share some similarities to evaporative nonlinearities, and underestimation due to overdamped response to aperiodic oscillations. There is observational evidence that the models underestimate the increase in rainfall[1] and extreme precipitation[2].
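[A minimal sketch of the nonlinearity being described: two temperature series with the same annual mean, one with damped peaks, imply very different amounts of melt when only degrees above freezing count. The numbers are made up for illustration, not taken from any model or observation.]

```python
import numpy as np

days = np.arange(365)
full   = -5.0 + 12.0 * np.sin(2 * np.pi * (days - 80) / 365)  # toy Arctic-ish cycle, deg C
damped = -5.0 +  6.0 * np.sin(2 * np.pi * (days - 80) / 365)  # same mean, half the amplitude

# Positive-degree-day proxy for melt: only temperatures above 0 C contribute.
def positive_degree_days(temps):
    return np.sum(np.clip(temps, 0.0, None))

print("mean temperature, full amplitude:  ", round(full.mean(), 2))
print("mean temperature, damped amplitude:", round(damped.mean(), 2))
print("melt proxy (PDD), full amplitude:  ", round(positive_degree_days(full), 1))
print("melt proxy (PDD), damped amplitude:", round(positive_degree_days(damped), 1))
# Same mean temperature, but the damped series implies far less melt.
```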
Model inaccuracies may lead some to be cheerily optimistic. A billion here, a billion there, and pretty soon you’re talking real money.[3]
Do dissipative economic models lead to probability distributions that falsely preclude Black Swans?
[1] Science 13 July 2007: Vol. 317 no. 5835, pp. 233-235, DOI: 10.1126/science.1140746, "How Much More Rain Will Global Warming Bring?", Frank J. Wentz, Lucrezia Ricciardulli, Kyle Hilburn and Carl Mears
[2] Science 12 September 2008: Vol. 321 no. 5895, pp. 1481-1484, DOI: 10.1126/science.1160787, "Atmospheric Warming and the Amplification of Precipitation Extremes", Richard P. Allan and Brian J. Soden
[3] http://www.ncdc.noaa.gov/img/reports/billion/timeseries2011prelim.pdf
Meow says
@165: Please write a paper describing your hypothesis and the evidence supporting it. Show how implementing it would improve a current model. Consider collaborating with the CAM-HOMME team. Or download CCSM4 and do it yourself. We eagerly await your well-researched and -reviewed insights.
Martin Vermeer says
TFK #168
Yet they are branded ‘liars’?
They are liars. You haven’t done your homework.
If the WWF have a seat at the table, so should Heartland.
To have a seat at the table where a problem is addressed, you must acknowledge the problem.
The lack of serious response to the David Young comments is revealing.
Why would baiting require a serious response? Any ‘scientist’ addressing a scientific issue with a political argument is not himself being serious. Such individuals are best ignored, as they are rarely any good at science, lousy at intellectual honesty, and high maintenance. This heuristic has served me well — life’s just too short.
Hank Roberts says
> the regulars on this web site are not the scientists
Often true; I’m not. Some of us try to take the obvious questions and be helpful. Some, well, I’ve posted this before as a comment when the fierceness level starts to bother me. http://www.cartoonstock.com/newscartoons/cartoonists/amc/lowres/amcn39l.jpg
Others have other reactions.
The sidebar link identifies the Contributors, all scientists.
Visitors who are scientists often mention their background; some link to their publications/website.
Patrick 027 says
Re TFK: "The defensiveness about the Heartland Institute is interesting. They read the data in a way that many scientists outside the mainstream do. Sort of like Copernicus did."
Did Copernicus cherry-pick his data or rely on misunderstandings? Was it harder to explain the observations using Copernicus’s idea than it was to use the idea he was replacing? Was there an entrenched industry that would be hurt by continued reliance on Ptolemy?
(Not everyone who disagrees with mainstream ideas has reasons to back up that decision; they don’t all turn out to be a Galileo, Newton, or Einstein. Just because an idea becomes accepted doesn’t make it wrong.)
David B. Benson says
Everybody attempts to write their numerical codes so as to satisfy the dictates of Emmy Noether’s (first) Theorem. I’ve previously posted a reference to books describing how climate GCMs are built; I’ve also provided two comments with links describing various means of avoiding numerical dissipation. I opine that quite a bit is known by everybody working on such codes in a wide selection of technical fields.
Ray Ladbury says
David Young, what utter complete horsecrap! Some of us, Sir, are real scientists–many of us, in fact, and we realize that for a complicated system, you must look at ALL the evidence.
The attribution of climate change to anthropogenic CO2 is not dependent on GCMs. As I said before, a relatively simple, two-box model and basic physics are sufficient for that. I would note that neither Arrhenius nor Tyndall had need of a GCM.
In fact, GCMs are among the most effective tools for limiting climate sensitivity on the high side. Don’t like the models? Well, you ought to be more worried rather than less.
In any case, you seem to be utterly uninformed about how climate scientists actually use their models. Like most scientific models, their utility is in providing understanding of physical systems rather than for “answers”. But, I’m sure you don’t care. You are more interested in speaking to your own denialist echo chamber at Heartland. Me, I’ll stick with the real scientists.
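[A minimal sketch of the kind of "relatively simple, two-box model" Ray refers to above: a fast surface/mixed-layer box coupled to a slow deep-ocean box. The heat capacities, feedback parameter, exchange coefficient, and forcing ramp are generic placeholder values chosen only to show the structure; this is not a calibrated model and not anything the commenters here are using.]

```python
import numpy as np

# Two-box energy balance model:
#   C_s * dT_s/dt = F - lam*T_s - gamma*(T_s - T_d)
#   C_d * dT_d/dt = gamma*(T_s - T_d)
C_s, C_d = 8.0, 100.0   # heat capacities, W yr m^-2 K^-1 (placeholder values)
lam = 1.2               # climate feedback parameter, W m^-2 K^-1 (placeholder)
gamma = 0.7             # surface-deep exchange coefficient, W m^-2 K^-1 (placeholder)

years = np.arange(0, 140)
forcing = np.minimum(years / 70.0, 2.0) * 3.7 / 2.0   # toy ramp toward ~2xCO2 forcing

T_s, T_d = 0.0, 0.0
dt = 1.0  # one-year explicit steps; this little system is not stiff at these parameters
for F in forcing:
    dTs = (F - lam * T_s - gamma * (T_s - T_d)) / C_s
    dTd = gamma * (T_s - T_d) / C_d
    T_s += dt * dTs
    T_d += dt * dTd

print("surface warming after 140 yr:", round(T_s, 2), "K")
print("deep-ocean warming:          ", round(T_d, 2), "K")
```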
Lawrence Coleman says
I'd just like to bring this governmental scam in Australia to your attention. In light of the terrible droughts we have had over the past decade, the government has built desalination plants in Queensland, Western Australia and New South Wales to supplement the water supply in case of another crippling drought. Now these plants were paid for wholly and solely by the Australian taxpayers... but now here's the rub. These plants were only designed to be used in the event of a drought, but they are still being used, actually taking precedence over natural water flowing in our now abundant river systems after the record deluge we suffered last year and up to January this year. In fact, water from a large river is being pumped out to sea (wasted) while copious amounts of fossil fuels are being used to keep these desalination plants running 24/7. Now wait for it: the residents in many of our major cities are being slugged with high water rates, rising at up to 26% a year, for the privilege of using this wonderful desalinated water. The amount of power these plants consume is horrendous and comes from coal/oil sources. All our scientific bodies have stated that these plants should only be used when there is insufficient water in our river systems, which is clearly not the case now. These plants and our populace are simply being used as cash cows despite the environmental damage being done.
This is an environmental travesty on a huge scale. How can we as a country possibly say that we are being environmentally responsible when this wholesale wastage of fossil fuels is taking place?
dhogaza says
David Young:
This:
I struggled with whether or not to respond. Really, I’m not saying anyone is incompetent.
Earlier, this:
I guess there is no comment from Gavin on the startling evidence (both empirical and theoretical) of large numerical errors in climate models.
Along with a bunch of crap suggesting that climate modelers are unaware of research dating back to the 1980s, and that if Gavin learned to be competent, he should view it as "job security".
Yes, you did, and you have continuously accused them of being incompetent, and climate science at large of being guilty of academic misconduct (your "get editors fired" comment).
You struggle with whether or not to respond because you’re going to get ripped open due to your inconsistency …
Like most other deniers.
Look, if you don’t like the heat, don’t enter the kitchen.
Meow says
FWIW, the possibility that earth’s climate system contains chaotic attractors is a good reason to refrain from excessively tampering with it, lest we unknowingly nudge the system onto a trajectory toward a particularly unfriendly one.
But I’m sure that the “researchers” at Heartland have considered this issue and have an airtight case for all the attractors being centered on temperate modern conditions. Or at least they will if we’re considerate enough to write sufficient checks to fully fund their “research”.
CAPTCHA: sight mosedli
dhogaza says
ldavidcooke:
Do not get me wrong, I am not saying one is right the other wrong. Nor do I suggest that the expertise of the current systems are inferior. (BTW, the quote you associate with me is not mine.)
Actually, I think the quote was the first one, but … I'm not disagreeing with you. You understand that David Young is, ummm, "pushing it", to put it mildly.
dhogaza says
ldavidcooke:
The conversion from implicit to explicit modeling is likely more for the purpose of openly showing the engine and feeds. It’s a bit like showing your work, which was a criticism prior to the UEA fiasco. An implicit differential would be more economic coding. However, the request for climate science to be open, drives the need for un-integrated data engines that specialize in solving one function. From there you can build an over-arching system which manages the feeds and data tables.
Well, my bet was on the modelers coming to realize that the problems that trip up some explicit models don't apply to their particular GCM, and that they have therefore chosen to drop the computational expense of implicit methods in favor of the greater resolution yielded by more efficient, finer-grained explicit models.
I mean, it’s clear that Young’s first claim, that climate modelers are unaware of the explicit vs. implicit dynamic is false. So there must’ve been some reason for these particular modelers to move “backwards” as Young puts it (and I’m sure I’m not the only one who thinks that they had good reasons to do so, and that “backwards” is not the proper label).
Anyway, I'm sure that David Young will contact the appropriate researchers, freely making available his incredible world-class expertise in modeling to them, so the models can be further improved …
dhogaza says
Either the scientists already know what I’m telling them
I previously pointed out that some climate modelers, at least, are aware of the power of implicit methods, and that the upper left box you like points out that some modelers have moved back to explicit methods.
This is not evidence of ignorance. Rather, I suspect it’s evidence of greater expertise than you hold.
In other words, you're all about "implicit methods are the only approach that works", while the modelers referenced in that link obviously are willing to change methodology depending on circumstance (which I suspect may hinge on available computing power, but I don't know).
Their understanding is nuanced.
Your position is absolutist.
I suspect that you’re the lightweight here … sorry, just judging by the evidence on the table.
dhogaza says
@165:
Young: In any case, for a stiff system, implicit is so much better, parallelism hardly matters.
Meow: Got any cites to support this extraordinary proposition? And have you loaded up on Intel put options?
Actually, from my 15 minute skimming of explicit as opposed to implicit methods … it’s not so extraordinary.
But, David Young has not mathematically proven that explicit modeling methods of climates must yield a stiff system …
Nor has he mathematically shown that if true, it actually causes climate models to fall apart as he insists they do.
He just pontificates that this is true, then accuses climate modelers of being ignorant of the existence of implicit methods.
I think he’s probably about an hour or so ahead of me in google time, we can all probably catch up to his bullshit if we’re willing to put in a little study time. Thus far he’s just smoke and mirrors …
David B. Benson says
I’m reading the paper by K. Karagiozis, R. Kamakoti & C. Pantano I cited in an earlier comment. I’m beginning to understand why researchers prefer explicit and semi-implicit methods for solving the compressible Navier–Stokes equations.
Possibly some commenters here should actually become familiar with the available literature before commenting? Otherwise it begins to look to me like stuff fit for The Bore Hole.
ldavidcooke says
RE: 181
Hey dhogaza,
There are many advantages related to explicit functions. As to issues of numerical errors or artifacts, many can be avoided by simply running the equation and using the shortest cycle time/step value to feed a model engine's data array/table.
It is a way to define a broad set of grid sizes and, depending on the fastest-changing variable for that grid size, step the model in a manner that creates an accurate digital representation of an analog process. In essence, you can run variables at different step sizes, sample the data at any two points, and get a very good replication. This allows for more accurate representation of changing or cyclical variables without injecting heterodyning.
Put another way, it allows functions to change at a "natural" rate and yet interact when they hit the edges of their range, without interacting when closer to their mean.
A large-scale example: you might not want to code a hard and fast rule into your model that when a negative PDO moves to a neutral state, the NH southern jet stream drifts north and a blocking high takes up residence centered at 104 deg. W and 32 deg. N. But if you could make the rule conditional on a volcano having belched 1 cu km of light gray aerosols, in the 165 um range, with a density of 1000 ppm/km^3, at an altitude of 8 km at 65 deg. N the year before, well, that might be a valuable modeling element.
The problem with many implicit models is that you are unlikely to be able to represent high-resolution (regional/local) events well when all cycles have the same start/stop points and there is no means of injecting long-cycle functions. You would have to take many small steps for all of your calculations, stop the run, modify a function, start the run, get to the next injection point…
In my experience, with regard to modeling multiple functions/events, implicit modeling is not very efficient in either CPU cycles or man-hours if the functions have different time domains or grid sizes.
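[A minimal sketch of the "different step sizes per variable" idea described above, often called subcycling or multirate time stepping: a fast variable is advanced with several small steps inside each large step of a slow variable. The equations and coefficients are arbitrary toy values, not anything from an actual climate engine.]

```python
# Toy coupled system: a rapidly relaxing "fast" variable and a slowly relaxing "slow" one.
# The fast variable is subcycled with small steps inside each large step of the slow one.

def step_fast(fast, slow, dt):
    # Fast dynamics: rapid relaxation, nudged by the slow variable (toy coefficients).
    return fast + dt * (-50.0 * fast + 5.0 * slow)

def step_slow(slow, fast, dt):
    # Slow dynamics: gentle relaxation toward the fast variable's current value.
    return slow + dt * (0.1 * (fast - slow))

fast, slow = 1.0, 0.0
dt_slow = 0.1                 # large step for the slow variable
substeps = 20                 # fast variable takes 20 small steps per slow step
dt_fast = dt_slow / substeps  # small enough to keep the fast piece stable

for _ in range(100):              # 100 slow steps = 10 time units
    for _ in range(substeps):     # subcycle the fast piece
        fast = step_fast(fast, slow, dt_fast)
    slow = step_slow(slow, fast, dt_slow)

print("fast:", round(fast, 4), "slow:", round(slow, 4))
```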
Cheers!
Dave Cooke
Harmen says
For something completely different…
I found this interesting lecture about Tyndall..
Tyndall: His work and scientific heritage
EPAIreland, 8 Oct 2011
Professor Richard Somerville, Distinguished Professor Emeritus and Research Professor at Scripps Institution of Oceanography at the University of California, San Diego. http://richardsomerville.com/
This public lecture celebrated the life of John Tyndall and served as an introduction to the 3 day Tyndall Conference held from September 28th – 30th 2011. For more information, see http://tyndallconference2011.org/
FWIW, the possibility that earth’s climate system contains chaotic attractors is a good reason to refrain from excessively tampering with it, lest we unknowingly nudge the system onto a trajectory toward a particularly unfriendly one.
This is a thought that always comes to me when I see one of the "skeptics'" pet arguments, that "climate is too complex to be simulated on a computer".
You put some force on a system you depend on, which you yourself claim is far too complicated to be understood… how can that lead to the position that it is not advisable to at least limit the force?
To assert that the system does not respond at all, or only a little, and that the new awkward behaviour it shows has nothing to do with the forcing, presumes that the system is quite understandable to you, or at least that you are quite confident you understand it.
In my opinion this is clearly a contradiction.
Marcus
wili says
If we can pause briefly from engaging with bore-hole worthy Heartland Institute supporters for a moment–Does anyone have links or references for recent papers on the expected effects of an increasingly ice-free Arctic on Northern Hemisphere weather patterns? Thanks ahead of time.
“… while the semi-arid forest can cool itself well enough to survive and take up carbon, it both absorbs more solar radiation energy (through the albedo effect) and retains more of this energy (by suppressing the emission of infrared radiation)….
…
… what happens when the opposite process – desertification – takes place? … desertification, instead of hastening global warming, as is commonly thought, has actually mitigated it, at least in the short term. By reflecting sunlight and releasing infrared radiation, desertification of semi-arid lands over the past 35 years has slowed down global warming by as much as 20%, compared with the expected effect of the CO2 rise over the same period. And in a world in which desertification is continuing at a rate of about six million hectares a year, that news might have a significant effect on how we estimate the rates and magnitude of climate change….”
CM says
On credibility:
David Young #165,
If you really want a discussion just about the science, uninterrupted by the peanut gallery, well, duh, don’t try to have it on a blog; and if you do, don’t pander to the peanut galleries of certain other blogs by telling the host his work is meaningless, he should go back to school, and you’d be happy to take over the team he works on. Given your deliberate choices to ignore these simple precautions, forgive me if I find your dismay at the reactions just a tad disingenuous. The exchanges between you and Gavin on the previous thread were interesting, but this posturing is just boring.
Dhogaza #183,
> he’s probably about an hour or so ahead of me in google time
Well, if he is who he says, he’s published work on computational fluid dynamics and probably knows quite a bit about the problems of modeling turbulent flows; those are good credentials to have for the issues he’s raised. From what has transpired here, though, he hasn’t done much homework about climate modeling before presuming to lecture Gavin.
Here are some instructive results from Google Scholar advanced search for the author “DP Young” plus a few pertinent terms, with “GA Schmidt” as a control:
“fluid dynamics”: DP Young 24 hits, GA Schmidt 17.
“numerical instability”: GA Schmidt 2, DP Young 0.
“climate models”: GA Schmidt 71, DP Young 0.
dhogaza says
CM:
From what has transpired here, though, he hasn’t done much homework about climate modeling before presuming to lecture Gavin.
That’s the point … it’s clear he has a good theoretical grounding in the subject but he’s done nothing to show that the *potential* problems that can arise from explicit methods actually apply to the climate models he claims are broken. The fact that at least some groups have moved from explicit to implicit to explicit methods makes it clear that Young’s claims that climate modelers are ignorant of such potential problems is bogus. It seems clear enough that they’re making informed decisions …
Hank, thanks, but I did actually already search Google Scholar with those words, and nothing of relevance comes up, at least not in the first few pages. So if you have an actual article, or better terms to search under, that would be helpful.
It would seem to me that helping the world better understand what is in store – as the planet changes from one with a mostly icy Arctic all year round to one where the Arctic is increasingly more open than icy for more and more of the year – would be a central function of a site such as this (rather than constantly bantering with obvious trolls).
Again, any real help toward finding such studies would be appreciated.
Wili, did you notice these, in the first page of results to your question?
http://www.sciencedirect.com/science/article/pii/S0921818111000397
“… periods of Arctic amplification are evident from analysis of both warm and cool periods over at least the past three million years. Arctic amplification being observed today is expected to become stronger in coming decades, invoking changes in atmospheric circulation, vegetation and the carbon cycle, with impacts both within and beyond the Arctic….”
Arctic Warming Ripples through Eurasia
JA Kelmelis – Eurasian Geography and Economics, 2011 – Bellwether Publishing
… One of the largest changes that can be expected is a drastic alteration of marine ecosystem composition …
Nature Geoscience | Letter
Solar forcing of winter climate variability in the Northern Hemisphere
doi:10.1038/ngeo1282
Received 18 April 2011
Accepted 07 September 2011
Published online 09 October 2011
More links in the article at New Scientist:
—excerpt follows—
“The authors emphasize that cooler temperatures in Northern Europe are accompanied by warmer ones further south, resulting in no net overall cooling. “It’s a jigsaw puzzle, and when you average it up over the globe, there is no effect on global temperatures,” Adam Scaife, head of the UK Met Office’s Seasonal to Decadal Prediction team, told BBC News.
The UV measurements could lead to better forecasting. “While UV levels won’t tell us what the day-to-day weather will do, they provide the exciting prospect of improved forecasts for winter conditions for months and even years ahead. These forecasts play an important role in long-term contingency planning,” Ineson told Reuters.
The scientists emphasised that several other factors, such as declining levels of sea ice and El Nino, may have played a role in the unusually chilly winters, reports The Independent, which quotes Ineson as saying: “There are a lot of different factors that affect our winter climate. However, the solar cycle would probably have been acting in a way that gave us those cold winters.”
Meow says
@144: I’m pretty sure that most climate modelers understand their models’ numerical shortcomings. A cursory search reveals active work to improve them. See, e.g., Evans et al, “A Fully Implicit Solution Method Capability in CAM-HOMME” (CCSM model), Dugas & Winger, “A CRCM5 Description”, Weijer et al, “A fully-implicit model of the global ocean circulation”, etc.
I expect that “writ[ing] a check to Heartland Institute” (@120) will not assist these efforts, since I have been unable to find any relevant research sponsored by that organization.
David Young says
Meow, Your first reference is indeed on the right track and Salinger is good. They know about all the good methods. However, if you read the fine print, they say that their preconditioner is actually slower than the baseline code and they are working on a better one. That’s good and we’ve been through that too in the 1980’s. The problem is that their baseline code is so complex, some terms are treated implicitly and other explicitly so its tough. Their spectral element method is used ONLY to test other methods that are used in the real codes. By the way, spectral element methods REQUIRE variable order, something which is often neglected.
However, the point is that rewriting at least one model is required so it is designed to use the best methods. I’ve done this myself. Tacking a modern accelerator onto a legacy code helps, but is orders of magnitude worse than really doing it fully implicitly. Certainly even the AR5 simulations are based on the legacy codes and I’ll wager you finite differences. The AR5 simulations are indeed sensitive to the period of calibration. See Judy’s Curry material for an example.
At any rate, the point is that the evidence from multiple sources says that the uncertainties are much larger than one would get the impression from reading the 1000 papers based on running the models.
You know, sometimes the scientific process involves strong criticism. There are 2 responses. The first is to really understand it and realize that you will be embarrased if you don’t fix the problem. The second is to try to surpress the criticism or even worse get editors fired, etc. Perhaps even Heartland has a place in the debate if the World Wildlife Fund does. It is a free country.
David Young says
Meow, look at the outlined box in the upper left hand corner of your first reference. What you see is that climate models have gone BACKWARD recently and have “returned to fully explicit methods.”
The authors are trying to address the problems caused by this seeming insistence on repeating history. Maybe I do need to write that check after all.
Rob Nicholls says
Hank Roberts, thanks v much for your help
re: SO2 emissions from China. (posts 128 and 133).
It does seem that sulphur dioxide emissions from China have fallen since 2006.
(see e.g. p.10 of http://www.atmos-chem-phys.net/11/9839/2011/acp-11-9839-2011.pdf )
Patrick 027 says
Re David Young – Floating the idea of supporting the Heartland Institute (which isn’t even all about climate (anti)science; you’ll have ‘collateral damage’) raises a red flag, doesn’t it? I don’t see how increasing the errors in public opinion would reduce errors in computer modelling.
Hank Roberts says
> rewriting at least one model … so it is designed
> to use the best methods. I’ve done this myself.
If you do want to make a contribution,
do consider contributing to these:
Clear Climate Code
Open Climate Code
ldavidcooke says
Hey Dr. Young,
First thanks for taking the time to respond. Sorry for not prefacing my video codec spiel, with the idea that Video Compression is not unlike a creating a model of real world event. The digitized data is a representation of the light reflecting from a series of objects. The compression effort is similar to converting the measured values to a grid. The issue with data compression is the attempt to create a high resolution data representation with 1% of the data. This would be similar to taking 7200 data collection stations having 10 variables with ranges that can span 15000 discrete values per min. and trying to represent a trend in just one of the varibles which is dependent on the others either in a group or directly.
The point being in order to maintain the smooth representation requires predictive analysis. Knowing that a value is changing, the rate that it is changing and the ability to determine if the change is an anomoly or noise or valid data. (Hence part of the purpose of the FT filtering.)
The tools you had been talking about so far appear rather rudimentry IMHO. The management of an evolving data set, growing out of a series of equations, that feed the next calculation point have to of advanced light years from a mere 20 years ago. I will review the references you have suggsted to see where I have gone wrong.
(I concur that reducing grid size as well as increasing steps, would be equivalent to increased resolution, hence error values would decrease. In the instance I was referencing regarded the sumation of a series of larger grids created from many discrete data points, while maintaining step size and encoding reversibility.
As to defensive, I believe you could characterize it that way. The problem is if you leave a crack some try to wedge it open to create a door. The flip side is that confusion or a lack of clarity of what is known or unknown may partially explain the reason that a defensive posture is required. (It is when there is a misunderstanding that doubt occurs, the hard part is trying to be open and at the same time not to have to defend what you have no control over.)
As this takes things way off topic I will quit here and attempt to review your points, not that it matters; but, so at least I better understand the point you are attempting to make. So far it is proving a struggle…
Thanks,
Dave Cooke
dhogaza says
David Young:
Yeah, and he wonders why people get “defensive” when all he’s doing is making objective statements meant to help poor, ignorant climate scientists improve the quality of their work …
dhogaza says
David Young:
Climate models build their atmosphere of pizza boxes, not cubes, as the atmosphere is thin. Decreasing the grid size doesn’t necessarily imply that the pizza boxes become thinner. I think NASA GISS Model E has a couple of dozen layers in its atmosphere but it’s been awhile since I’ve looked. If you care, you can go look at it yourself.
However, you somewhat surprisingly forgot to point out that the simulated time interval between steps needs to be reduced if the grid size is reduced, as that upper left box you like puts it:
“However, finer model grids require a superlinear reduction in the time step size to account for the smaller spatial scale and increased multiscale interactions ”
Which I think gets you your factor of 8 back.
David Young says
Hank, These things take a team, trust me on this. Open source is fine, but only if Hansen turned over his team to me would I think about such an endeavor. First order of business is get Gavin to go back to school (only kidding). If NCAR or DOE wants to start a new team, that’s the best approach. They could let John Bell or David Keyes or even Phil Collella lead it. There are younger guys around too. They would know what to do and not be infected by the Hansen doctrine that despite large numerical errors my results look good (and support something I already believed).
[Response: Please leave the strawman personalisations at home. If you want to taken seriously, imagining supposed ‘doctrines’ that have no reference, cite or antecedent and are actually contradicted by the person you ascribe them to, is neither interesting nor clever. – gavin]
dhogaza says
ldavidcooke:
And climate modelers have been reducing model grid sizes as time has gone on.
So we have at least two ways to reduce errors – increase resolution is one way. Using implicit methods another.
Let’s address David Young’s accusation that climate modelers are ignorant of the methodologies he touts, and therefore incompetent. From the upper left hand box at the link he likes:
So obviously the modelers involved are not only aware of implicit methods, but have implemented at least partially implicit solutions and have made a conscious decision to go back to explicit methods. Presumably because they believe that the benefits of greater resolution due to smaller grid sizes outweighs the benefits of the implicit methods they’d incorporated into their models previously.
David Young might want to find out which models are being discussed and talk to the implementors before insisting this is a step backwards, that the modelers are obviously ignorant of improved methods, incompetent, not up to David Young’s level of expertise, not fit to sit at his table, etc.
You might find out they know more than you claim they do, and aren’t as incompetent as you claim they are …
Ray Ladbury says
David Young, So, again, let me get this straight. You are going to write a check to the professional liars at the Heartland Institute because their model is so much better… Oh, wait. That’s right, they don’t have a model. They don’t have any researchers, or research or evidence. They just have…well, lies.
Dude, you sure you’ve thought this through?
David B. Benson says
I am quite unsure just what various commenters are concerned about regarding the solution of various PDEs. However, (numerical) dispersion and dissipation are treated (rapidly) in
http://www.physics.arizona.edu/~restrepo/475B/Notes/source/node39.html
which has a link to the interesting table of contents.
ldavidcooke says
RE: 161
Hey dhogaza,
Do not get me wrong, I am not saying one is right the other wrong. Nor do I suggest that the expertise of the current systems are inferior. (BTW, the quote you associate with me is not mine.)
What I am saying is, if we disregard the attitude, is there any validity to the criticism? So far from my limited view point probably not. The hard part is trying to understand before making a judgement.
When we consider that some of the base code of many of the current feeders are likely Fortran running on a mainframe in the basement, it may be possible that the climate team could use a bit of extra support. As to a reorg. nada, the folks who have been doing the heavy lifting have struggled with shortages in personal and systems since the early ’90s. They are well equipped to maximize output with minimal capacity. (They make up the difference by thinking smarter.)
The conversion from implicit to explicit modeling is likely more for the purpose of openly showing the engine and feeds. It’s a bit like showing your work, which was a criticism prior to the UEA fiasco. An implicit differential would be more economic coding. However, the request for climate science to be open, drives the need for un-integrated data engines that specialize in solving one function. From there you can build an over-arching system which manages the feeds and data tables.
This allows the distribution of processing, right sizing of the resources and allows very small steps of discrete functions. At the same time the data filters in the data tables can be adjusted without propogating error to dependent functions.
As to opinions expressed either way, it is pretty difficult to understand if you have no knowledge of the motivation for change. Though those of us who have had a horse in the race may not always agree on who is the horse to beat; but, it is clear that those who are doing the job are the designated experts, else they would not be on the poll.
Cheers!
Dave Cooke
David Young says
I struggled with whether or not to respond. Really, I’m not saying anyone is incompetent. However, I do have my concerns about you Dhogaza. The reason for the work in the reference you took out of context is to GO TO IMPLICIT methods because of their well known advantages. The difficulties are immense, but its important. The comment about scalability is just about massive parallelism. Implicit methods can be made parallel too. In any case, for a stiff system, implicit is so much better, parallelism hardly matters.
I get the feeling here that the regulars on this web site are not the scientists, but others. Either the scientists already know what I’m telling them (and that’s fine) or they don’t want to respond to someone outside their community (and that’s fine too) or for them this is just a venue to keep the dhogaza’s of the world attacking their perceived opponents. Or maybe its a venue to demonize legitimate players in the debate. I don’t know but perhaps I should have checked before naively assuming that the site was about the science. I’ve received no feedback here indicating that any of the scientists understood what the issues were. I’m sure the better ones know, like Paul Williams. The question is what is the next step.
The bottom line is that multiple sources of evidence that I’ve uncovered quite easily give strong evidence that the numerical errors in climate models are larger than I think is generally acknowledged by people in the field in their statements to the general public. It’s in the literature, but is perhaps not emphasized here. Further, the idea that there is no legitimate scientific controversy about these things is probably not something even Gavin Schmidt would endorse. I would recommend Climate Etc. as a better place for a discussion of all aspects of this issue. It’s a much friendlier place without the lack of a sense of humor that is quite obvious here. And some of the guest posts there like the one on chaos, ergodicity, and attractors show obvious deep knowledge of the latest theory in the field.
sidd says
We are blessed, indeed, to have such an intellect as David
Young to tutor us. How best ought we avail of it ?
It has been suggested that he join the Clear Climate Code
initiative or the Open code initiative. We are fortunate that
he did not take more offense at the idea. The pedestrian
efforts at Clear/Open Climate code are far below his stature and
ability.
Another measure of his greatheartedness is that he kindly considers
taking over the “Hansen team”, before regretfully discarding them as
being “infected by the Hansen doctrine.” No, infected drudges such as
Dr. Schmidt must go back to school, one taught by such luminaries as
himself, where they may be cleansed, if that is possible, of heresy.
If not, he will, no doubt, have further suggestions in mind for their
reeducation.
What, then, are we to to do ? Let us drink from his font of wisdom again.
We must assemble a completely new, untainted group of acolytes who are
capable of fully absorbing the radiant effulgence of his genius. Then,
and only then, can climate science be delivered from the abyss of
ignorance and error.
This will, of course, be very expensive, but such a beneficial
project will will surely be amply funded by the DOE, NSF, NCAR and
all the others. Mr. Young has modestly suggested others for the lead
role, but of course, he is the only one for the job. At the very least
I am sure that he would accept a senior advisory position, with, at
minimum, a seven figure salary. Any less would be an insult.
Certainly, he is even now finishing his proposal for the project and
when it arrives on the desks of the heads of the funding agencies,
a great singing of Hosannas will echo in the halls and also a deafening
scratching sound of pens upon checkbooks.
Unfortunately, I suspect the gray reality will be a letter asking him to
reapply after he has actually published something relevant to climate
science. Such is the fate of genius in its own time.
But all may not be lost. Mr. Young is a man of means, and has previously
offered, out of his innate generosity, to fund the research with a check
to the Heartland Institute. Why does he need the (possibly corrupt)
agencies of a smothering government ? He can do this himself, and such
a man of conviction will rise to the challenge.
I await news of his continuing success, and anticipate accolades and
laurels showered upon him and rose petals under his feet, and, perhaps
even, (dare I say it) an appointment in Sweden with the King.
Be still, my beating heart.
sidd
David B. Benson says
There are lots of papers with different approaches. Here is a quotation from
A low numerical dissipation immersed interface method for the compressible Navier–Stokes equations by K. Karagiozis, R. Kamakoti & C. Pantano in J. Comput. Physics:
In this paper, we partially address the stability problem of an immersed interface method for the compressible Navier–Stokes equations by utilizing the theory of summation-by-parts (SBP) operators [69] and [70] to derive a stable immersed interface approximation for the advection derivatives. Numerical experiments suggest that this approach prevents the appearance of spurious numerical instabilities, which otherwise create shock-like regions around complex boundaries. Moreover, different from IIM formulations, the new approach completely eliminates the need to deal with jumps at the object boundary. Finally, the method is combined with semi-implicit time integration to remove any stiffness present in the operators and the implicit equations are solved explicitly for the particular case of constant transport properties.
[The reCAPTCHA oracle proclaims escape orisatt.]
TFK says
The defensiveness about the Heartland Institute is interesting. They read the data in a way that many scientists outside the mainstream do. Sort of like Copernicus did. Yet they are branded ‘liars’?
If the WWF has a seat at the table, so should Heartland. And David Young's comment ("I'll write a check") was meant to elicit a response. I'm surprised you took his bait.
The lack of serious response to the David Young comments is revealing.
Meow says
@165:
Got any cites to support this extraordinary proposition? And have you loaded up on Intel put options?
CAPTCHA: ingMEn perfectly
Brian Dodge says
I find that the scariest metric of model inaccuracies comes from the ocean – the Arctic ocean – see http://www.woodfortrees.org/plot/nsidc-seaice-n/from:1979.6/every:12 – where the slope triples after 1995.
Perhaps the damping of the aperiodic oscillations of energy transport by what David Young believes to be shockingly unstudied large numerical errors in climate models underestimates summer Arctic sea ice loss (as well as glacier loss, ice shelf loss, and Greenland & Antarctic ice sheet loss) because of nonlinear mechanisms. E.g., if your model gives a correct average temperature of below zero over the Arctic, but damps the peak values of temperatures above zero in the actual data (presently much above, compared to the historic record), it will underestimate melt rates. The decrease in albedo from melting and corresponding increase in energy available for melt during the summer when the sun is shining isn’t balanced by the increase in albedo from freezing during the winter – because the sun is no longer shining. The nonlinearities of heat versus temperature from melting may share some similarities to evaporative nonlinearities, and underestimation due to overdamped response to aperiodic oscillations. There is observational evidence that the models underestimate the increase in rainfall[1] and extreme precipitation[2].
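To make the nonlinearity concrete, here is a minimal sketch with entirely synthetic numbers of my own (not output from any climate model): two daily temperature series with identical means, one with its peaks damped, fed to a positive-degree-day melt proxy. The damped series systematically underestimates melt even though the average is unchanged.

import numpy as np

rng = np.random.default_rng(0)
anomalies = rng.normal(0.0, 4.0, 120)   # hypothetical daily variability (deg C)
anomalies -= anomalies.mean()           # force the two series to share a mean
mean_temp = -1.0                        # average stays below freezing

observed = mean_temp + anomalies        # "real" series with full peaks
damped = mean_temp + 0.5 * anomalies    # same mean, peaks damped by half

def positive_degree_days(temps):
    # melt proxy: only above-freezing temperatures contribute
    return np.clip(temps, 0.0, None).sum()

print("means       :", observed.mean(), damped.mean())        # identical
print("PDD observed:", positive_degree_days(observed))        # larger
print("PDD damped  :", positive_degree_days(damped))          # smaller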
Model inaccuracies may lead some to be cheerily optimistic. A billion here, a billion there, and pretty soon you’re talking real money.[3]
Do dissipative economic models lead to probability distributions that falsely preclude Black Swans?
[1] Science 13 July 2007: Vol. 317 no. 5835 pp. 233-235 DOI: 10.1126/science.1140746 “How Much More Rain Will Global Warming Bring?”, Frank J. Wentz*, Lucrezia Ricciardulli, Kyle Hilburn and Carl Mears
[2]Science 12 September 2008: Vol. 321 no. 5895 pp. 1481-1484 DOI: 10.1126/science.1160787 “Atmospheric Warming and the Amplification of Precipitation Extremes”, Richard P. Allan and Brian J. Soden
[3] http://www.ncdc.noaa.gov/img/reports/billion/timeseries2011prelim.pdf
Meow says
@165: Please write a paper describing your hypothesis and the evidence supporting it. Show how implementing it would improve a current model. Consider collaborating with the CAM-HOMME team. Or download CCSM4 and do it yourself. We eagerly await your well-researched and -reviewed insights.
Martin Vermeer says
TFK #168
They are liars. You haven’t done your homework.
To have a seat at the table where a problem is addressed, you must acknowledge the problem.
Why would baiting require a serious response? Any ‘scientist’ addressing a scientific issue with a political argument is not himself being serious. Such individuals are best ignored, as they are rarely any good at science, lousy at intellectual honesty, and high maintenance. This heuristic has served me well — life’s just too short.
Hank Roberts says
> the regulars on this web site are not the scientists
Often true; I’m not. Some of us try to take the obvious questions and be helpful. Some, well, I’ve posted this before as a comment when the fierceness level starts to bother me.
http://www.cartoonstock.com/newscartoons/cartoonists/amc/lowres/amcn39l.jpg
Others have other reactions.
The sidebar link identifies the Contributors, all scientists:
Contributors
Visitors who are scientists often mention their background; some link to their publications/website.
Patrick 027 says
Re TFK: “The defensiveness about the Heartland Institute is interesting. They read the data in a way that many scientists outside the mainstream do. Sort of like Copernicus did.”
Did Copernicus cherry-pick his data or rely on misunderstandings? Was it harder to explain the observations using Copernicus’s idea than it was to use the idea he was replacing? Was there an entrenched industry that would be hurt by continued reliance on Ptolemy?
(Not everyone who disagrees with mainstream ideas has reasons to back up that decision; they don’t all turn out to be a Galileo, Newton, or Einstein. Just because an idea becomes accepted doesn’t make it wrong.)
David B. Benson says
Everybody attempts to write their numerical codes so as to satisfy the dictates of Emmy Noether’s (first) Theorem. I’ve previously posted a reference to books describing how climate GCMs are built; I’ve also provided two comments with links describing various means of avoiding numerical dissipation. I opine that quite a bit is known by everybody working on such codes in a wide selection of technical fields.
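To make “numerical dissipation” concrete in the simplest possible setting, here is a toy sketch of my own (nothing like a real GCM): a frictionless oscillator should conserve energy exactly; forward Euler spuriously changes the energy every step, while the semi-implicit (symplectic) variant keeps it bounded, which is the kind of structure-preserving behaviour conservation-respecting schemes aim for.

def energy(x, v):
    return 0.5 * (x * x + v * v)

def forward_euler(x, v, dt, steps):
    # standard explicit step: spuriously gains energy for an oscillator
    for _ in range(steps):
        x, v = x + dt * v, v - dt * x
    return energy(x, v)

def symplectic_euler(x, v, dt, steps):
    # update v first, then x with the new v: conserves a nearby energy
    for _ in range(steps):
        v = v - dt * x
        x = x + dt * v
    return energy(x, v)

dt, steps = 0.01, 100000
print("initial energy   :", energy(1.0, 0.0))                        # 0.5
print("forward Euler    :", forward_euler(1.0, 0.0, dt, steps))      # drifts far above 0.5
print("symplectic Euler :", symplectic_euler(1.0, 0.0, dt, steps))   # stays near 0.5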
Ray Ladbury says
David Young, what utter complete horsecrap! Some of us, Sir, are real scientists–many of us, in fact, and we realize that for a complicated system, you must look at ALL the evidence.
The attribution of climate change to anthropogenic CO2 is not dependent on GCMs. As I said before, a relatively simple, two-box model and basic physics are sufficient for that. I would note that neither Arrhenius nor Tyndall had need of a GCM.
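To illustrate what such a simple model looks like, here is a minimal two-box energy-balance sketch, with toy parameter values of my own choosing (not anyone's published calibration): a surface/mixed-layer box coupled to a deep-ocean box, subjected to an abrupt CO2-doubling-style forcing.

F2x = 3.7            # W/m^2 forcing for doubled CO2 (standard value)
lam = 1.2            # W/m^2/K climate feedback parameter (assumed)
gamma = 0.7          # W/m^2/K surface-deep exchange coefficient (assumed)
C_s, C_d = 8.0, 100.0  # heat capacities in W yr m^-2 K^-1 (assumed)

T_s = T_d = 0.0      # temperature anomalies (K)
dt = 0.1             # years
for _ in range(int(200 / dt)):          # 200 years after an abrupt doubling
    dT_s = (F2x - lam * T_s - gamma * (T_s - T_d)) / C_s
    dT_d = gamma * (T_s - T_d) / C_d
    T_s += dt * dT_s
    T_d += dt * dT_d

print("surface warming after 200 yr: %.2f K" % T_s)
print("equilibrium (F2x / lambda)  : %.2f K" % (F2x / lam))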
In fact, GCMs are among the most effective tools for limiting climate sensitivity on the high side. Don’t like the models? Well, you ought to be more worried rather than less.
In any case, you seem to be utterly uninformed about how climate scientists actually use their models. Like most scientific models, their utility is in providing understanding of physical systems rather than for “answers”. But, I’m sure you don’t care. You are more interested in speaking to your own denialist echo chamber at Heartland. Me, I’ll stick with the real scientists.
Lawrence Coleman says
I’d just like to bring this governmental scam in Australia to your attention. In light of the recent terrible droughts we have had over the past decade the gov’ has built desalination plants in Queensland, Western Australia and New South Wales plants to supplement the water supply in case of another crippling drought. Now these plants were paid for wholly and solely by the australian tax payers..but now here’s the rub… These plants were only designed to be used in the advent of a drought but they are still being used, actually taking precedence over natural water flowing in our now abundant river systems after the record deluge we suffered last year and up to january this year. If fact water from a large river is being pumped out to sea (wasted) while copious amounts of fossil fuels are being used to keep these desalination plants running 24/7..now wait for it…the residents in many of our major cities are being slugged high water rates and raising at up to 26%/y for the privilage of using this wonderful desalinated water. The amount of power these plants consume is horrendous and comes from coal/oil sources. All our scientific bodies have stated that these plants only be used when there is insufficent water in our river systems which is clearly not the case now. These plants and our populace are simply being used as cash cows despite the environmental damage being done.
This is an environmental travesty on a huge scale. How can we as a country possibly say that are being environmentally responsible when this wholesale wastage of fossil fuels in taking place.
dhogaza says
David Young:
This:
Earlier this
Along with a bunch of crap suggesting that climate modelers are unaware of research dating back to the 1980s. And that if Gavin learned to be competent, he should view it as “job security”.
Yes, you did, and you have continuously accused them of being incompetent, and climate science, at large, of being guilty of academic misconduct (your “get editors fired” comment).
You struggle with whether or not to respond because you’re going to get ripped open due to your inconsistency …
Like most other deniers.
Look, if you don’t like the heat, don’t enter the kitchen.
Meow says
FWIW, the possibility that earth’s climate system contains chaotic attractors is a good reason to refrain from excessively tampering with it, lest we unknowingly nudge the system onto a trajectory toward a particularly unfriendly one.
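The standard toy illustration of that sensitivity is the Lorenz 1963 system (nothing to do with any actual GCM): two trajectories started a billionth apart end up in completely different states.

import numpy as np

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    # one forward-Euler step of the Lorenz-63 equations (toy accuracy is fine here)
    x, y, z = state
    return state + dt * np.array([sigma * (y - x),
                                  x * (rho - z) - y,
                                  x * y - beta * z])

a = np.array([1.0, 1.0, 1.0])
b = a + np.array([1e-9, 0.0, 0.0])          # a "nudge" of one part in a billion
for step in range(1, 4001):
    a, b = lorenz_step(a), lorenz_step(b)
    if step % 1000 == 0:
        print("t = %4.1f  separation = %.3e" % (step * 0.01, np.linalg.norm(a - b)))
# By t ~ 30-40 the separation is as large as the attractor itself.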
But I’m sure that the “researchers” at Heartland have considered this issue and have an airtight case for all the attractors being centered on temperate modern conditions. Or at least they will if we’re considerate enough to write sufficient checks to fully fund their “research”.
CAPTCHA: sight mosedli
dhogaza says
ldavidcooke:
Actually, I think the quote was the first one, but … I’m not disagreeing with you. You understand that David Young is, ummm, “pushing it”, to put it mildly.
dhogaza says
ldavidcooke:
Well, my bet was on the modelers coming to realize that the problems that trip up some explicit models don’t apply to their particular GCM, and that they have therefore chosen to drop the computational expense of implicit methods in favor of the greater resolution yielded by more efficient, finer-grained explicit models.
I mean, it’s clear that Young’s first claim, that climate modelers are unaware of the explicit vs. implicit dynamic is false. So there must’ve been some reason for these particular modelers to move “backwards” as Young puts it (and I’m sure I’m not the only one who thinks that they had good reasons to do so, and that “backwards” is not the proper label).
Anyway, I’m sure that David Young will contact the appropriate researchers, freely making available his incredible world-class expertise in modeling to them, so the models can be further approved …
dhogaza says
I previously pointed out that some climate modelers, at least, are aware of the power of implicit methods, and that the upper left box you like points out that some modelers have moved back to explicit methods.
This is not evidence of ignorance. Rather, I suspect it’s evidence of greater expertise than you hold.
In other words, you’re all about “implicit methods are the only approach that works”, while the modelers referenced in that link obviously are willing to change methodology depending on circumstance (which I suspect may hinge on available computing power, but I don’t know)
Their understanding is nuanced.
Your position is absolutist.
I suspect that you’re the lightweight here … sorry, just judging by the evidence on the table.
dhogaza says
Actually, from my 15 minute skimming of explicit as opposed to implicit methods … it’s not so extraordinary.
But, David Young has not mathematically proven that explicit modeling methods of climates must yield a stiff system …
Nor has he mathematically shown that if true, it actually causes climate models to fall apart as he insists they do.
He just pontificates that this is true, then accuses climate modelers of being ignorant of the existence of implicit methods.
I think he’s probably about an hour or so ahead of me in google time, we can all probably catch up to his bullshit if we’re willing to put in a little study time. Thus far he’s just smoke and mirrors …
David B. Benson says
I’m reading the paper by K. Karagiozis, R. Kamakoti & C. Pantano I cited in an earlier comment. I’m beginning to understand why researchers prefer explicit and semi-implicit methods for solving the compressible Navier–Stokes equations.
Possibly some commenters here should actually become familiar with the available literature before commenting? Otherwise it begins to look to me like stuff fit for The Bore Hole.
ldavidcooke says
RE:181
Hey dhogaza,
There are many advantages related to explicit functions. As to issues of numerical errors or artifacts, many can be avoided by simply running the equation and using the shortest cycle time/step value to feed a model engine’s data array/table.
It is a way to define a broad set of grid sizes and, depending on the fastest-changing variable for each grid size, step the model in a manner that creates an accurate digital representation of an analog process. In essence, you can run variables at different step sizes, sample the data at any two points, and get a very good replication. This allows for more accurate representation of changing or cyclical variables without injecting heterodyning.
Put another way, it allows for functions to change at a “natural” rate and yet interact when they hit the edges of their range, without interacting when closer to their mean.
A large-scale example: you might not want to code a hard-and-fast rule into your model that, when a negative PDO moved to a neutral state, the NH southern jet stream drifted north and a blocking high took up residence centered at 104 deg. W and 32 deg. N. If you could instead make the rule conditional on, say, a volcano having belched 1 cu. km of light gray aerosols, in the 165 um range, with a density of 1000 ppm/km^3, at an altitude of 8 km at 65 deg. N the year before, well, that might be a valuable modeling element.
The problem with many implicit models is that you are unlikely to be able to have high-resolution (regional/local) events represented well when all cycles have the same start/stop points and there is no means of injecting long-cycle functions. You would have to take many small steps for all of your calculations, stop the run, modify a function, start the run, get to the next injection point….
In my experience, wrt modeling multiple functions/events, implicit modeling is not very efficient in either CPU cycles or man-hours if the functions have different time domains or grid sizes.
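A rough sketch of that sub-cycling idea, with made-up variables and rates of my own (this is not Dave’s code or any GCM’s): the fast variable takes many small explicit steps inside each large step of the slow variable, and the two exchange information only at the large-step boundaries.

def advance_slow(slow, fast, dt_slow):
    # hypothetical slow process, nudged gently toward the fast variable's state
    return slow + dt_slow * 0.02 * (fast - slow)

def advance_fast(fast, dt_fast):
    # hypothetical fast process relaxing quickly toward a forcing value of 2.0
    return fast + dt_fast * (-5.0) * (fast - 2.0)

slow, fast = 1.0, 0.0
dt_slow, substeps = 1.0, 50                   # fast variable gets 50 small steps per large step
for _ in range(100):                          # 100 large steps
    for _ in range(substeps):                 # sub-cycle the fast variable
        fast = advance_fast(fast, dt_slow / substeps)
    slow = advance_slow(slow, fast, dt_slow)  # slow variable sees the sub-cycled result

print("slow:", round(slow, 3), "fast:", round(fast, 3))
# fast ends up at ~2.0; slow has drifted most of the way toward it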
Cheers!
Dave Cooke
Harmen says
For something completely different…
I found this interesting lecture about Tyndall:
Tyndall: His work and scientific heritage
EPAIreland, 8 Oct 2011
Professor Richard Somerville, Distinguished Professor Emeritus and Research Professor at Scripps Institution of Oceanography at the University of California, San Diego.
http://richardsomerville.com/
This public lecture celebrated the life of John Tyndall and served as an introduction to the 3 day Tyndall Conference held from September 28th – 30th 2011. For more information, see http://tyndallconference2011.org/
http://www.youtube.com/watch?v=5maT2CunT08
Marcus says
#179 Meow
FWIW, the possibility that earth’s climate system contains chaotic attractors is a good reason to refrain from excessively tampering with it, lest we unknowingly nudge the system onto a trajectory toward a particularly unfriendly one.
This is a thought that always comes to me when I see one of the “skeptics’” pet arguments, that “climate is too complex to be simulated on a computer”.
You put some force on a system you depend on, which you yourself claim is far too complicated to be understood… how can that lead to the position that it is not advisable to at least limit the force?
To assert that the system responds little or not at all, and that any newly awkward behaviour it shows has nothing to do with the forcing, presumes that the system is quite understandable to you, or at least that you are quite confident you understand it.
In my opinion this is clearly a contradiction.
Marcus
wili says
If we can pause briefly from engaging with bore-hole-worthy Heartland Institute supporters: does anyone have links or references for recent papers on the expected effects of an increasingly ice-free Arctic on Northern Hemisphere weather patterns? Thanks ahead of time.
Hank Roberts says
Yes. Published thus far in 2011:
http://scholar.google.com/scholar?hl=en&q=expected+effects+of+an+increasingly+ice-free+Arctic+on+Northern+Hemisphere+weather+patterns%3F+&as_sdt=0%2C5&as_ylo=2011&as_vis=0
Hank Roberts says
http://wis-wander.weizmann.ac.il/semi-arid-forests-absorb-heat
“… while the semi-arid forest can cool itself well enough to survive and take up carbon, it both absorbs more solar radiation energy (through the albedo effect) and retains more of this energy (by suppressing the emission of infrared radiation)….
…
… what happens when the opposite process – desertification – takes place? … desertification, instead of hastening global warming, as is commonly thought, has actually mitigated it, at least in the short term. By reflecting sunlight and releasing infrared radiation, desertification of semi-arid lands over the past 35 years has slowed down global warming by as much as 20%, compared with the expected effect of the CO2 rise over the same period. And in a world in which desertification is continuing at a rate of about six million hectares a year, that news might have a significant effect on how we estimate the rates and magnitude of climate change….”
CM says
On credibility:
David Young #165,
If you really want a discussion just about the science, uninterrupted by the peanut gallery, well, duh, don’t try to have it on a blog; and if you do, don’t pander to the peanut galleries of certain other blogs by telling the host his work is meaningless, he should go back to school, and you’d be happy to take over the team he works on. Given your deliberate choices to ignore these simple precautions, forgive me if I find your dismay at the reactions just a tad disingenuous. The exchanges between you and Gavin on the previous thread were interesting, but this posturing is just boring.
Dhogaza #183,
> he’s probably about an hour or so ahead of me in google time
Well, if he is who he says, he’s published work on computational fluid dynamics and probably knows quite a bit about the problems of modeling turbulent flows; those are good credentials to have for the issues he’s raised. From what has transpired here, though, he hasn’t done much homework about climate modeling before presuming to lecture Gavin.
Here are some instructive results from Google Scholar advanced search for the author “DP Young” plus a few pertinent terms, with “GA Schmidt” as a control:
“fluid dynamics”: DP Young 24 hits, GA Schmidt 17.
“numerical instability”: GA Schmidt 2, DP Young 0.
“climate models”: GA Schmidt 71, DP Young 0.
dhogaza says
CM:
That’s the point … it’s clear he has a good theoretical grounding in the subject but he’s done nothing to show that the *potential* problems that can arise from explicit methods actually apply to the climate models he claims are broken. The fact that at least some groups have moved from explicit to implicit to explicit methods makes it clear that Young’s claims that climate modelers are ignorant of such potential problems is bogus. It seems clear enough that they’re making informed decisions …
Hank Roberts says
Well, his more nuanced contributions are at JC’s place, e.g.: http://judithcurry.com/2011/10/02/wedges-reaffirmed/#comment-117764
wili says
Hank, thanks, but I did actually already search Google Scholar with those words, and nothing of relevance comes up, at least not in the first few pages. So if you have an actual article, or better terms to search under, that would be helpful.
It would seem to me that helping the world better understand what is in store – as the planet changes from one with a mostly icy Arctic all year round to one where the Arctic is increasingly more open than icy for more and more of the year – would be a central function of a site such as this (rather than constantly bantering with obvious trolls).
Again, any real help toward finding such studies would be appreciated.
wili says
Sorry if that last post came off as snarky. I did find this pdf: http://www.arctic.noaa.gov/future/docs/ArcticAND_Globe.pdf
and I can probably find some more recent work by searching under the authors’ names that are cited there.
I would still be interested in any suggestions.
Hank Roberts says
Wili, did you notice these, in the first page of results to your question?
http://www.sciencedirect.com/science/article/pii/S0921818111000397
“… periods of Arctic amplification are evident from analysis of both warm and cool periods over at least the past three million years. Arctic amplification being observed today is expected to become stronger in coming decades, invoking changes in atmospheric circulation, vegetation and the carbon cycle, with impacts both within and beyond the Arctic….”
Arctic Warming Ripples through Eurasia
JA Kelmelis – Eurasian Geography and Economics, 2011 – Bellwether Publishing
… One of the largest changes that can be expected is a drastic alteration of marine ecosystem composition …
Aren’t those along the lines you’re asking about?
Pete Dunkelberg says
Hank @ 190:
http://scholar.google.com/scholar?q=Yakir+cooling+forest&hl=en&btnG=Search&as_sdt=1%2C10&as_sdtp=on
https://www.weizmann.ac.il/ESER/People/Yakir/YATIR/publications/15.pdf
Interesting.
Hunt Janin says
Re sea level rise:
For my book on this subject, I’d like to know when and where the first scientific measurements of sea level rise were made. Any ideas?
harvey says
@198
http://sealevel.colorado.edu/content/tide-gauge-sea-level
Hank Roberts says
http://www.newscientist.com/blogs/shortsharpscience/2011/10/solar-lows-cause-extreme-europ.html
reports on http://www.nature.com/ngeo/journal/vaop/ncurrent/full/ngeo1282.html
Nature Geoscience | Letter
Solar forcing of winter climate variability in the Northern Hemisphere
doi:10.1038/ngeo1282
Received 18 April 2011
Accepted 07 September 2011
Published online 09 October 2011
More links in the article at New Scientist:
—excerpt follows—
“The authors emphasize that cooler temperatures in Northern Europe are accompanied by warmer ones further south, resulting in no net overall cooling. “It’s a jigsaw puzzle, and when you average it up over the globe, there is no effect on global temperatures,” Adam Scaife, head of the UK Met Office’s Seasonal to Decadal Prediction team, told BBC News.
The UV measurements could lead to better forecasting. “While UV levels won’t tell us what the day-to-day weather will do, they provide the exciting prospect of improved forecasts for winter conditions for months and even years ahead. These forecasts play an important role in long-term contingency planning,” Ineson told Reuters.
The scientists emphasised that several other factors, such as declining levels of sea ice and El Nino, may have played a role in the unusually chilly winters, reports The Independent, which quotes Ineson as saying: “There are a lot of different factors that affect our winter climate. However, the solar cycle would probably have been acting in a way that gave us those cold winters.”