Translated by Angela Carosio
A few weeks ago I was at a meeting in Cambridge that discussed how, or whether, paleoclimate information can reduce the known uncertainties in future climate simulations.
The uncertainties in the impact of rising greenhouse gases on multiple systems are significant: the potential response of the El Niño Southern Oscillation (ENSO), or of the North Atlantic overturning circulation; the likely feedbacks on atmospheric composition (CO2, CH4, N2O, aerosols); the predictability of climate change on decadal timescales; the sensitivity of the global climate itself; and, perhaps most importantly, what will happen to the ice sheets and to regional rainfall in a warmer climate.
The reason why paleoclimate information may be key in these cases is that all of these climate components have changed in the past. If we can understand how and why they changed then, we can constrain our projections of how they will change in the future. Unfortunately, simply using the record directly, that is, going back to a time with conditions similar to those we expect for the future, does not work very well, because there are no good analogues for the perturbation we are causing. A rise in greenhouse gases this fast, with the present configuration of the continents and with vast amounts of polar ice, has never been seen before. More sophisticated approaches must therefore be developed, and this meeting was devoted to examining them.
The first observation is very simple: if something happened in the past, then it is possible! Thus past changes in ENSO, in the ice sheets and in the carbon cycle, for example, demonstrate clearly that these systems really are sensitive to external changes. Assuming that they cannot change in the future would therefore be naive. This is basic, but not particularly useful in practice.
Every future projection relies on a model of some sort. Dominant in the climate field are the large-scale coupled ocean-atmosphere global climate models (GCMs) that were discussed extensively in the latest report of the Intergovernmental Panel on Climate Change (IPCC), but simpler, more specialized or more conceptual models can also be used. The reason those other models are still useful is that the GCMs are not complete: they do not contain all the interactions that we know, from the paleo record and from modern observations, can occur. Second, interactions seen in the records, for instance between carbon dioxide levels or dust amounts and Milankovitch forcing, imply that some mechanism connects them. These mechanisms may be only imperfectly known, but the paleo record highlights the need to quantify them so that the models can be made more complete.
Third, and probably most important, the paleo record is useful for evaluating models. Every episode in climate history should, in principle, let us quantify how good our models are and how appropriate our hypotheses about past climate change are. One detail is vital to note, however: the models embody a great deal of data and many assumptions about how climate works, but for the modelled climate to change you need a hypothesis, such as a change in the Earth's orbit, volcanic activity, changes in the sun, and so on. Comparing model simulations with the observed data therefore becomes an evaluation of the two things together. Even when the hypothesis is that the changes are intrinsic, a model run that looks for the magnitude of intrinsic change (perhaps due to multiple steady states or the like) is still a test of both the model and the hypothesis. If the test fails, it shows that one element or the other (or both) is deficient, or that the data are incomplete or misinterpreted. If it succeeds, we have a consistent explanation of the observed changes, which may nevertheless not be unique, but it is a good starting point.
But what is the relevance of these tests? What can a model's success at simulating the impacts of a change in the North Atlantic overturning circulation, or in the Earth's orbit, really offer for future projections? This is where most of the attention is directed. The key unknown is whether a model's skill on a paleo question is correlated with the magnitude of its response in a future scenario. If it is not correlated, if, for instance, the projections of the models that did well span the same range as those of the models that did badly, then not much has been gained. However, if it could be shown that models that succeed at, say, simulating mid-Holocene rainfall changes systematically give different projections, for example larger changes in the Indian monsoon under rising greenhouse gas (GHG) concentrations, then one would have reason to weight the different model projections to arrive at a revised assessment. Similarly, if a model cannot match the rapid melting of the ice sheets during a deglaciation, its projections of future melt rates should be given less credibility.
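To make the weighting idea concrete, here is a minimal sketch of how paleo skill scores might be turned into weights on future projections. It is not any group's actual method: the exponential weighting, the skill metric and all the numbers are illustrative placeholders.

```python
import numpy as np

def skill_weights(paleo_errors, scale=1.0):
    """Turn each model's paleo mismatch (e.g. RMSE against a mid-Holocene
    rainfall synthesis) into a normalized weight: smaller error, larger
    weight. Both the exponential form and 'scale' are illustrative."""
    w = np.exp(-np.asarray(paleo_errors, dtype=float) / scale)
    return w / w.sum()

# Hypothetical numbers: paleo RMSEs and projected monsoon changes (%) per model
paleo_rmse    = [0.8, 1.5, 0.6, 2.4]
future_change = [12.0, 5.0, 14.0, 3.0]

w = skill_weights(paleo_rmse)
print("weights:", np.round(w, 2))
print("skill-weighted projection: %.1f%%" % np.dot(w, future_change))
print("unweighted mean:           %.1f%%" % np.mean(future_change))
```

If the weighted and unweighted numbers come out essentially the same, the paleo test has added little; if they differ systematically, the weighting is telling us something about which projections to trust.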
Unfortunately, apart from a few coordinated experiments for the last glacial period and the mid-Holocene (for example PMIP, the Paleoclimate Modelling Intercomparison Project), with models that do not necessarily overlap with those in the AR4 (Fourth Assessment Report) archive, no such database of results and evaluations exists. Various paleo events have been looked at with individual models, from the Little Ice Age to the Cretaceous, but this amounts to a scouting party blazing a single trail rather than drawing the full road map. So we face two problems: we do not yet know which paleo events would be the most useful (although everyone has their own ideas), and we do not have the database that would allow the paleo simulations to be matched up with the future projections.
When mining the paleo record for useful model tests, there are two kinds of targets: what happened in a specific period, and what the response to a specific forcing or event is. The first requires a complete description of the different forcings at that time; the second, a compilation of data spanning many periods associated with a single forcing. An example of the first approach is the Last Glacial Maximum, where at a minimum the changes in the Earth's orbit, greenhouse gases, dust, ice sheets and vegetation should be included. The second class is typified by the search for the response to volcanic eruptions by compositing all the years following large eruptions. Similar approaches could be developed, for the first class, for the mid-Pliocene, the 8.2 kyr event, the Eemian (also known as the Riss-Würm, the last interglacial), the early Holocene, the deglaciation, the early Eocene, the PETM (Paleocene-Eocene Thermal Maximum), the Little Ice Age, etc., and, for the second class, for orbital forcing, solar forcing, Dansgaard-Oeschger events, Heinrich events, etc.
But one element is still missing. In most cases our knowledge of the changes during these periods is fragmentary, spread across dozens to hundreds of papers, and subject to multiple interpretations. In short, it's a mess. The missing element is to pull all of that information together and produce a synthesis that can be easily compared with the models. The fact that such syntheses are only rarely produced underlines how difficult this is. There are good examples: CLIMAP (Climate: Long-range Investigation, Mapping and Prediction) and its recent update MARGO (Multiproxy Approach for the Reconstruction of the Glacial Ocean surface) for Last Glacial Maximum (LGM) ocean temperatures, the PMIP databases of precipitation and vegetation for the mid-Holocene, and notably the reconstructions of the temperature patterns of the last few hundred years from multiple proxy records. Each of these has been used very successfully in model-data comparisons and has been very influential inside and outside the paleo community.
It may seem strange that this kind of work is not done more often, but there are reasons. Fundamentally, it is because the tools and techniques required for doing good synthesis work are not the same as those for making measurements or for developing models. It could in fact be described as a new kind of science (though in essence it is not new at all), requiring, perhaps, a new kind of scientist: one who is at ease in dealing with the disparate sources of paleo-data and aware of the problems, and yet conscious of what is needed (and why) by modellers. Or, additionally, modellers who understand what the proxy data depend on and who can build that into the models themselves, thereby making the model-data comparison more direct.
Should the paleo community, then, put more emphasis on synthesis and allocate funding and positions accordingly? This is often a contentious question, because whenever the need for people to integrate existing information is discussed, some wonder whether the primacy of collecting new data is being threatened. This meeting was no exception. I am convinced, however, that this is not the zero-sum game implied by that argument; quite the opposite. Summarizing the information from a highly technical field and making it useful to those outside it is fundamental to raising the standing of that field and thus to increasing the research funding available to it in the long run. The lack of trained people who could both earn the respect of the data gatherers and deliver a value-added product to the modellers is, however, a serious obstacle.
Despite the problems and the undoubted challenges of taking paleo data-model comparisons to the next level, it was encouraging to see these issues tackled head on. The eagerness to start sketching out lines of grant proposals to do the real science was quite inspiring, so much so that perhaps I should stop writing blog posts and get on with it.
This condensed account of the meeting is heavily influenced by conversations and talks there, particularly with Peter Huybers, Paul Valdes, Eric Wolff and Sandy Harrison, among others.
Charles Raguse says
Is there an ultimate limit that arises from human intellectual capability, irrespective of model simulations and databases? After all, even with the aid of meetings and model simulations and conversations there is a limit. Unless one subscribes to the notion of “artificial” intelligence, this would come down to an analog of a “world” chessmaster, whose intellect is (?) superior to all others, and (?) all of the parameters of the contest (in this case “prediction” of a winner) are known. A derivative of this kind of pondering leads to the “flap of a butterfly’s wing” analogy, i.e., how do climatologists reconcile themselves with chaos theory?
Eric (skeptic) says
Are there any proxies for clouds (e.g. average cloudiness or thickness, rainfall patterns, mountaintop rime ice, etc.) that can be picked up on an annual or finer basis? It seems like that would provide validation for, or inputs to, a model.
Jon says
Gavin,
Has anyone in the modeling community gone about compiling a “wishlist” of sorts, of studies (other than syntheses) they’d like to see come out of the paleo community, along the lines of what you mention in your ninth paragraph?
Cheers.
[Response: Wishlists compiled without appreciation of the reality of paleo studies tend to be a little pointless. For instance, if asked, modellers often request proxies for clouds, or the thickness of the ozone layer – both would be great – but this is just wishful thinking. Instead, modellers really need help with more forward modelling (including the proxy system in the models directly) and on downscaling larger patterns to specific proxy records (i.e. for a specific proxy record, how does the local environment influence what gets recorded). These issues require input from the data-gathering community (since they know their proxies and sites better than anyone) and would go some way to making the comparisons more direct. – gavin]
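To illustrate the kind of “forward modelling” the response above is asking for, here is a minimal sketch of a purely hypothetical proxy that records a noisy, season-weighted mix of local temperature and precipitation. The function, coefficients and weighting are invented for illustration and do not correspond to any real proxy system model.

```python
import numpy as np

def forward_proxy(temp, precip, season_weight, a=0.6, b=0.4, noise_sd=0.2, seed=0):
    """Toy forward model: convert simulated local climate into what the
    proxy would record. 'temp' and 'precip' are monthly anomalies with
    shape (12, n_years); 'season_weight' emphasizes the months the proxy
    responds to. All coefficients are illustrative placeholders."""
    rng = np.random.default_rng(seed)
    t_eff = np.average(temp, axis=0, weights=season_weight)    # seasonal temperature
    p_eff = np.average(precip, axis=0, weights=season_weight)  # seasonal precipitation
    signal = a * t_eff + b * p_eff
    return signal + rng.normal(0.0, noise_sd, size=signal.shape)  # non-climatic noise

# Usage with pseudo-output from one model grid cell: 12 months x 50 years
rng = np.random.default_rng(1)
temp, precip = rng.normal(0, 1, (12, 50)), rng.normal(0, 1, (12, 50))
summer_only = np.array([0, 0, 0, 0, 0, 1, 1, 1, 0, 0, 0, 0], dtype=float)  # a JJA-sensitive proxy
pseudo_record = forward_proxy(temp, precip, summer_only)
```

The comparison then happens in “proxy space”, pseudo-record against measured record, rather than forcing the proxy to be read as a pure temperature series.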
Jim Galasyn says
Assuming we keep beating Moore’s law, computation should soon be “too cheap to meter”. Given essentially limitless computational power, how will GCMs be affected? Will our abilities to model and predict weather and climate become limited only by the power to collect data? Will we be able to push our predictions to the very edge of the complexity horizon?
Ray Ladbury says
Jim Galasyn,
Technically, Moore’s law derived from scaling of one particular type of semiconductor technology–Complementary Metal Oxide Semiconductor, or CMOS. Each generation could be designed from the previous one using fairly similar models–departing from the recipes only when technical roadblocks were encountered. That worked well down to about 0.5 microns minimum feature size. Every generation since then has been a struggle, but we are still more or less on track with Moore’s Law. The law has taken on a life of its own–driven more by competition and economic necessity. The result is that I have sitting on my desk DDR3 SDRAMs for testing, and we will soon be testing 45 nm feature size parts.
If you want to peer into the crystal ball a bit, the International Technology Roadmap for Semiconductors (ITRS) makes interesting reading. What you find is that there are lots of roadblocks, but also lots of interesting alternative technologies that could take their place.
Bottom line, I don’t think we’re done as far as increasing computing power for GCMs and other applications for at least a couple of decades yet. However, if economic hard times hit, progress will be slower.
Mark J. Fiore says
Excellent post.
However, the current changes are so quick that prior changes in the Earth’s climate, all created by nature, do not adequately model the true extent of the feedback mechanisms currently under way, created by humans. Take the pine beetle crisis in Canadian forests, for example. The trees are dying because the yearly low frost temperature does not fall below the level needed to keep the beetles in check. As the trees die they release carbon. There is no precedent for the quickness and severity of the current rise in CO2: 280 ppm to 380 ppm in 150 years? And current estimates show 450 ppm in 50 years or so. Such a quick climate change has never occurred, even with the huge volcanic events of the past, or with the meteor strike 65 million years ago.
Also, from a moral standpoint, those events were caused by nature and were unavoidable; mankind has, through negligence and disrespect, caused the current changes. And if you look carefully at the current data, the shutdown of the North Atlantic current has already begun. Upwelling and salinity levels are already negatively affected by the melting of the ice at the poles. I am no scientist, but I know what I read, and I read a lot.
Mark J. Fiore
Kevin says
re: Charles Raguse
It’s interesting to think about. I think that by definition climate is less dependent on initial conditions than is weather. The reason is that the term “climate” is not meaningful for any one point in time–it is by definition a more generalized state of the system over a period of time. So I think in a way your question about chaos is more for meteorologists than climatologists, i.e., will we ever have the understanding of enough variables, and enough computing power, to accurately predict weather many years into the future (or even months?)? Never say never, but that level of understanding of the system seems mighty distant right now.
Lou Grinzo says
Ray and Jim: Another key point is that we’re still in the early stages of learning how to really leverage mesh or cloud computing, e.g. using a very large number of networked computers of various speeds to work together on a single problem. There have been some notable projects with people “donating” CPU cycles over the ’net for protein folding and SETI research, so I would expect that a similar thing could be done with climate modeling, assuming the code could be written in a reasonably compartmentalized form.
[Response: It’s already been done. – gavin]
Jim Steele says
Excellent topic. Using paleoclimate data is exactly what made me a skeptic.
The change in radiative forcing from CO2 can be calculated as dF = 6.3 ln(C/C0) (perhaps you may use a different coefficient, but the principle remains). Using a baseline of, say, 200 ppm for C0, we can see that at 280 ppm we get an increase of 2.119 W/m^2 relative to the periods of glaciation.
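As a quick check of the arithmetic above, taking the quoted 6.3 coefficient at face value and also showing, for comparison, the more commonly cited 5.35 W/m^2 coefficient (Myhre et al. 1998):

```python
import math

C0, C = 200.0, 280.0
for k in (6.3, 5.35):   # commenter's coefficient, then the more standard one
    print(f"k = {k}: dF = {k * math.log(C / C0):.3f} W/m^2")
# k = 6.3  -> 2.120 W/m^2 (matches the 2.119 quoted above)
# k = 5.35 -> 1.800 W/m^2
```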
Now if we look at the paleoclimatic record graphed in the Nature article, we see that at 280 ppm the temperature sometimes rises and other times falls, even though the atmospheric concentration is the same 280 ppm.
http://www.nature.com/nature/journal/v407/n6806/images/407859aa.2.jpg
This suggests there is a driver much more powerful than CO2 that can overcome the forcing at 280 ppm. Most likely the sun. Assuming the sun’s effects are typically cyclical, we are faced with a problem: as temperatures rise during an interglacial, the effect of the sun appears less powerful because it is being assisted by the increased forcing from rising CO2 concentrations. However, as temperatures fall approaching the next glaciation, the sun’s output must decrease much more rapidly than it increased, in order to offset the upward pressure on temperature that the CO2 forcing at 280 ppm would create.
But such a conclusion seems at odds with known solar behavior. So it suggests that either 1) the sun is much more variable than we are aware, or 2) if the sun’s output is cyclical, the effect of CO2 is negligible as temperatures rise and fall. It has been argued that the sun starts the temperature rise and that the increased forcing from rising CO2 then acts as a feedback to raise temperatures further. However, that same story cannot hold true when temperatures are falling: if CO2 were the driver, the feedback would keep temperatures up, yet instead we fall back into a period of glaciation.
[Response: Ahem. If this is all that convinced you to be a skeptic, you can breathe a sigh of relief and let the scales drop from your eyes. CO2 is an important part of what is going on during the glacial/interglacial cycles, but it is not the only part. Can you say “Milankovitch”? Read up on that in Wikipedia, then come back. Needless to say, Milankovitch forcing is too slowly varying to be playing any significant role in the warming we are seeing now. –raypierre]
David B. Benson says
Mark J. Fiore (6) — Using the GISP2 Central Greenland temperature data averaged over 30-year intervals, the temperature there went up 0.72 K from 8170 to 8140 years before present. This was during the recovery from the 8.2 kya event:
http://en.wikipedia.org/wiki/8.2_kiloyear_event
From data there and also the HadCRUTv3 global temperatures since 1850 CE, this is close to the warming experienced during the last thirty years. The temperature recovery at the end of Younger Dryas was even more dramatic, at least in Central Greenland.
That said, this current rate of warming is certainly hard on already over-stressed organisms.
Phil Scadden says
#4 – limitless computer power? I doubt it. My models (for basin thermal evolution) have benefited enormously from increased computing power, but what you can practically do is still limited heavily by available power and your idea of a “reasonable” run time. An 8-day run time limits the number of model iterations you can do in a year somewhat! With access to 1000 CPUs, we can now do Monte Carlo runs to look at sensitivities, but can realistically look at only a few variables and a limited number of samples. Increasing computer power would allow more runs, more variables, and ultimately finer meshes, but it’s hard to imagine having enough computer power in my lifetime, even assuming it doubles every year, to feel the model is not compromised by computing limitations. I suspect this is true in spades for GCMs.
Hank Roberts says
David (10), Mark Fiore’s comment (6) refers to the rate of change of CO2 in the atmosphere, not the rate of temperature change.
Tamino blogged on the rate here:
http://tamino.wordpress.com/2007/03/06/fast-co2/
“… the fastest rise or fall in the entire Vostok ice core record amounts to a rate of increase of 0.06 ppmv/year.”
Vern Johnson says
It’s a “mess” alright! What is unknown far far exceeds what IS known or just probable given our extreme lack of perspective inherent in our chaotic history as a species.
As the sun burns its hydrogen, what is happening to its total mass and therefore its gravitational attraction? If the earth is gradually increasing its mass through dust and meteorite deposition, what does this do to our orbital radius from the sun? What do radar-distance studies tell us vis-a-vis Venus, Mars, Mercury inside the “snow-line”?
Gavin, please tell us what can be measured with some accuracy and then tell us what you think those measurements mean.
pete best says
http://www.telegraph.co.uk/earth/main.jhtml?xml=/earth/2008/04/30/eaclimate130.xml#form
Appalling journalism, quite frankly. Typical right-wing media. A decade-long lull is not a consensus view, only a preliminary one.
Arctic sea ice melt is looking slightly worrying: http://nsidc.org/arcticseaicenews/
Ike Solem says
Ray and Jim, I think you have to set Moore’s Law side by side with the computational demand increase for every stepwise increase in model resolution. The European Centre weather model uses a grid scale about 25 km per side, horizontally, and it divides the atmosphere into 91 layers:
http://www.ecmwf.int/products/forecasts/guide/The_numerical_formulation.html
What climate modelers do is try and make their code as efficient as possible while preserving the basic physical processes – I’m not sure what law that follows. I believe that the current view is that it is better to spend any extra computing power on running more ensembles of models, and possibly on improving simulation of specific physical processes, rather than on increasing the grid resolution of current models.
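As a rough illustration of why resolution is so expensive (assuming, conventionally, that cost scales with the number of grid cells and that the time step must shrink along with the grid spacing to stay numerically stable), halving the horizontal grid spacing costs roughly a factor of eight; the numbers below are only a back-of-envelope sketch.

```python
def relative_cost(refinement, refine_vertical=False):
    """Crude scaling: cost ~ (number of grid cells) x (number of time steps).
    Halving the grid spacing (refinement=2) quadruples the horizontal cell
    count and roughly halves the stable time step (CFL condition), so cost
    grows like refinement**3; refining the vertical adds another factor."""
    cost = refinement ** 2      # more columns in each horizontal direction
    cost *= refinement          # shorter time step to keep the CFL number fixed
    if refine_vertical:
        cost *= refinement      # more vertical layers as well
    return cost

print(relative_cost(2))        # 25 km -> 12.5 km grid, same layers: ~8x the work
print(relative_cost(2, True))  # refining the vertical by the same factor too: ~16x
```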
Right now it is hard to say what will happen to the North Atlantic thermohaline circulation, which now appears to vary by large amounts on a monthly to yearly basis (not the Gulf Stream, but the other components). However, the multi-model ensemble approach is being applied here as well:
Collins et al. (2006), Interannual to Decadal Climate Predictability in the North Atlantic: A Multimodel-Ensemble Study, J. Climate
If one wanted to do that with paleoclimate, one of the well-studied periods is the Younger Dryas, the sudden cold reversal about 12,900-11,500 years ago, which is thought to have set in rapidly, within a decade, with an associated cooling of the North Atlantic – though I don’t know if it really has been tied down to a halt in the thermohaline circulation or not – and, as usual, this is a topic that many people have tackled:
Tarasov & Peltier (2005), Arctic freshwater forcing of the Younger Dryas cold reversal, Nature
For those who are interested in doing a little research, you can look at all the papers that have cited that paper, courtesy of Google Scholar.
Greg van Paassen says
@ #11. Back in the 70s computer scientists realised that the increasing speed of hardware made the design of algorithms, code tuning, etc, *more* important, not less so. A famous example of paying attention to these matters reduced the run time of a simulation from a year to a day.
It would be great to get some of the best CS people helping out with climate modelling! (Perhaps some are already, but they’re keeping their heads down.)
Steve L says
It’s sometimes hard to go back and get all of the previously recorded information sorted, cleaned, verified, put into the format you want, and then used in a synthesis. My limited experience with meta-analyses leads me to recommend asking the researchers who do the primary data collection to try and use the same measures, formats, etc. in their future reporting, so that useful analyses can then be easily done. I’ll give one example with which I am familiar: in the Province of British Columbia, Canada, part of the requirements for obtaining a fish-sampling permit was to put the results into a Provincial database on the web. The standardization of methods (gear used, linear and area estimates of habitat surveyed, etc.) and of the details to be recorded (GPS coordinates, species classification) was a benefit in mapping distributions of species. Because negative results were also reported, it removed some biases that would show up if only published work was surveyed for the mapping.
Perhaps you can request similar standardizations in paleoclimate?
John Mashey says
re: Moore’s Law and such
OR: Amdahl’s Law is becoming much more relevant
OR: don’t expect magic for every kind of simulation
0) Moore said nothing (in 1965) about specific technology (Bipolar, NMOS, PMOS, CMOS, BiCMOS), and in fact, it wasn’t even until ~1982/1983 that Intel shipped its first CMOS DRAM’s and micros (80286).
1) Moore’s law *really* was the 2X # transistors/chip every 2 years, or equivalent, gotten by shrinking
a) Shrinking transistors
b) Shrinking wires
c) Adding more layers [which also costs more mask steps, i.e., more cost]
2) In “the good old days”, this tended to give you more performance as well [not necessarily 2X, as there were memory delays as well], but transistor-switching speed was the limitation, and transistors switch ~ in proportion to size, so we not only got 2X transistors, but got a speed boost “for free”.
In the really old days, one could do an optical shrink from one generation to the next (or maybe at least to a half-generation) and the same design would work, just use smaller chips and run faster, and the same software would run, just faster.
3) But for various reasons, wires don’t shrink as fast as the transistors, and a lot of the time is “in the wires”, and people have already used most of the clever well-known architectural tricks for making a single CPU go faster … which is why you may have noticed:
a) the end of massive touting of big, easy GHz improvements … because those are over (at least for CMOS).
b) there are lots of 2-core or 4-core chips around, because it is way easier just to replicate simpler CPU cores.
Unfortunately, while 2 CPU cores might give you ~2X (or somewhat less) throughput of independent tasks, they don’t automagically make any given program run 2X faster.
4) Some algorithms parallelize well, some don’t, but Amdahl’s Law says that the unparallelizable part matters. For example, if one CPU runs a program in X time, and:
90% is parallelizable onto multiple CPUs
10% is serial code
Then, no matter how many CPUs you have, the best you’d ever do is approach a 10X speedup. Actually, in practice, most codes get faster with each additional CPU, but each CPU adds less performance, and after a while adding a CPU actually makes the program run slower. Fortunately, some important classes of simulations can use *many* CPUs, unlike, say, Microsoft Word.
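A minimal illustration of Amdahl’s Law as just described; the 90%/10% split and the core counts are only example numbers.

```python
def amdahl_speedup(parallel_fraction, n_cores):
    """Ideal speedup when only 'parallel_fraction' of the work can be spread
    over 'n_cores'; the serial remainder always runs on a single core."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_cores)

for n in (2, 4, 16, 256, 100000):
    print(f"{n:>6} cores: {amdahl_speedup(0.9, n):5.2f}x")
# Approaches but never reaches 10x, because 10% of the program stays serial.
```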
5) Bandwidth, latency, and programmability matter as well: some algorithms work OK on distributed PCs, others really want lots of CPUs connected with low-latency, high-bandwidth shared-memory. Even with multi-core CPUs, some algorithms bottleneck on access to external memory, in any of several different ways.
Good parallel programming is not easy, and the free ride we got for many years on single-CPU performance is over until some post-CMOS technology happens. We will still get more transistors/die for a while (someone mentioned ITRS). However, at least multi-core chips (including nVidia chips being used for parallel programming) are so widespread that maybe more students will learn.
Lately, a huge amount of effort has been going into power reduction, not just for mobile devices, but for server chips, albeit for different reasons.
re: #16 there are some pretty good CS people out here already, i.e., I suspect the low-hanging fruit has been pretty well picked over.
David B. Benson says
Hank Roberts (12) — Thank you for the clarification.
To a man with a hammer, every problem looks like …
Greg van Paassen (16) — Some computer scientists are interested in designing machines capable of considerable parallelism. Others are then interested in the question of how one ought to program in order to best utilize such machines. This gives rise to better programming languages for parallel computation, and some of these features can even be stuffed into the Fortran straight-jacket. :-)
Anyway, the computer science assistance with climate modeling is largely under the surface, so to speak.
gusbobb says
#9 Jim Steele, CO2 is a result of solar changes that increase ocean temperatures, not the driver of changes. Raypierre mentions the Milankovitch cycles, but their periodicity is too long to explain the current warming or things like the Dansgaard-Oeschger events that Svensmark connects to cosmic rays. RC writers will allow for some solar warming but then attribute the rest to increased CO2 feedback. But as you suggest, that makes no sense, because how do temperatures decrease back to the low baseline with that CO2 still in the atmosphere, unless something else is the driver, reducing temperatures and allowing the CO2 to dissolve back into the oceans?
Raypierre failed to mention short-term solar variability. Sunspot activity is way lower than the consensus predicted, very similar to a Maunder Minimum, and several researchers like Khabibullo Abdussamatov have been saying the sun will undergo decreased activity and create a cooling trend. I too have predicted lower solar output and a cooling ocean. Now there is a paper that will be published in Nature telling us that, due to ocean currents, we can expect a ten-year cooling trend (hardly what the GCMs have predicted), and once again evidence that natural variations overwhelm CO2. Here is a snip and a link.
“Those natural climate variations could be stronger than the global-warming trend over the next 10-year period,” Wood said in an interview.
http://www.bloomberg.com/apps/news?pid=20601124&sid=aU.evtnk6DPo
According to earlier discussions with Gavin on the Galactic Glitch, he said that the ocean will respond to the atmospheric forcing to achieve equilibrium with the radiative input; so if this cooling trend is further supported, by his own logic he would have to admit that there is less input.
[Response: Not really. The Nature study is talking about changes associated with ocean circulation even while CO2, and the global imbalance, and global temperature, is increasing. It is exactly what we’ve been trying to explain. – gavin]
Timothy says
The reason those other models are still useful is that the GCMs are not complete.
I don’t think that’s really fair; it’s not the only reason. One of the main reasons for using simplified models is that they are cheaper, and so can be run for longer.
If I’m correct, what counts as a really long control run for a typical GCM is a few thousand years, while the standard centennial-timescale experiments for the IPCC reports each run from 1860 to 2100, i.e. 240 years.
I think it’s correct to say that the GCMs are still too expensive, because they’ve got so much in them, to use them for glacial cycle experiments. You just can’t run a GCM for a hundred thousand years of model time.
This brings me to the discussion about computer power. My impression was that, with even desktop PCs now having to go the route of parallel processing to improve performance, the modellers now face the problem of re-writing their code to get it to work across not just dozens of processors, but hundreds, (or thousands?), if they want to see the increases in performance that they’ve achieved in the past. This will be hard because of the problem of exchanging data between the processors, and reducing that data flow to a minimum will be a key challenge.
[Response: Tim, you make a good point and I should have expanded that line. In relation to using massively parallel computers, the scaling is ok for relatively high resolution models and for intelligent domain decomposition. It does take work to get the models configured appropriately though. But it always pays to remember that multiple ensembles (initial conditions and perturbed physics) are likely to play an ever larger role in the modelling, and these scale perfectly on MPP machines! – gavin]
gusbobb says
Gavin said “The Nature study is talking about changes associated with ocean circulation even while CO2, and the global imbalance, and global temperature, is increasing. It is exactly what we’ve been trying to explain.”
I guess I need to fix my glasses, because nowhere did I read that “global temperature, is increasing”. Could you provide a quote?
This is what I read:
“Average temperatures in areas such as California and France may drop over the next 10 years, influenced by colder flows in the North Atlantic, said a report today by the institution based in Kiel, Germany. Temperatures worldwide may stabilize in the period.”
And when you say that is “exactly what we’ve been trying to explain”
can you link me to the quote where you or other RC writers predicted that temperatures will stabilize worldwide over the next 10 years and that California and France will get colder?
And the article contradicts predictions made by the NASA folks in GISS Surface Temperature Analysis
Global Temperature Trends: 2007 Summation
“Based on these considerations, it is unlikely that 2008 will be a year with truly exceptional global mean temperature. These considerations also suggest that, barring the unlikely event of a large volcanic eruption, a record global temperature clearly exceeding that of 2005 can be expected within the next 2-3 years.”
http://data.giss.nasa.gov/gistemp/2007/
[Response: The global temperature in the models used in this study are increasing (see figure 4) – the changes are initially less in their SST-restored run, but they are still increasing and after a decade or two are at the same level that the standard run has. These are however 5 year trends, and so don’t include variations related to ENSO and the like. My guess is that the next record breaking year will coincide with the next moderate-to-high El Nino, a position that is not contradicted by these new results – which focus on the predictability of the N. Atlantic, not the Pacific. Since I have never presented any initialised climate forecasts, you will find that I have never predicted any short term temperature trends at the regional scale. This paper (and a previous study by Smith et al earlier this year) are some of the first people to do this and so it’s not yet clear how good it is as a method. I would recommend suspending judgement for a while. – gavin]
gusbobb says
I have suspended judgment and only speculated that “if the trend is supported”. I figure in 5-10 years we will be able to decide who had better predictions.
Gavin said “My guess is that the next record breaking year will coincide with the next moderate-to-high El Nino, a position that is not contradicted by these new results ”
But their expectation of stabilized and decreasing temperatures doesn’t exactly support the prediction of new global highs in the near future, El Nino or not. Are the equations and data for calculating global temperatures available to the public? I understand they are not.
In an earlier conversation you adamantly said heat does not leave the ocean. So where does the heat come from so that an El Nino will cause a record-breaking year?
Zeke Hausfather says
Gavin,
Given all the media attention the new Nature article has been getting (see tomorrow’s NYT, for example, or National Geographic, the CSM, etc.), it may merit a full RC post. It’s also interesting to compare this prediction with that of the Hadley Centre last year, which also suggested a stagnation of temperatures (till 2009-2010 in their case, if I recall).
[Response: We’ll try and have something up at the weekend. The NYT story is ok, but the Telegraph is appalling. – gavin]
tharanga says
The number of enthusiastic posts about Moore’s law gives me some hints as to the professions of the other lay readers of the site.
Back to paleoclimate: is it possible to pick out large volcanic events, of the Krakatoa or Laki scale, in the ice core CO2 data? I’m having a look at some data from Taylor Dome, and it looks like the temporal resolution is only about 100 years or so, which would make it tough.
I realise that currently, anthropogenic sources far outweigh volcanic; just wondering how much CO2 is released by the absolute biggest events, and if that can then be seen in the record.
[Response: Volcanos affect climate almost exclusively through the aerosols they pump into the atmosphere, which reflect sunlight. The amount of CO2 pumped out by even a big eruption is pretty trivial, and would hardly make a blip in the observed CO2 record. To put things in perspective, the anthropogenic source of CO2 at present is something like 20 times the estimate of the long term mean flux of CO2 out of the interior of the earth by all volcanos and all undersea vents. –raypierre]
Thomas says
I put in about two decades worth of tuning, mostly of engineering code, for modern computing applications. We have to recognize that modern computing cores will run at much higher throughput when low-level parallelism, what computer scientists would call Instruction Level Parallelism, is exposed to the hardware/software combination. Even with all data local, the rate of operation can vary by as much as an order of magnitude between well designed single-thread code and otherwise poorly designed code. During the past few years CPU clock rates have stagnated, mainly because of power consumption/heat dissipation issues, but the peak computation rate per core of well designed code has continued to increase. At the same time the number of cores per chip has been rising, with six to eight coming within the next couple of years. Unfortunately off-chip data bandwidth is not keeping up with on-chip computation rates. This means that unless your program makes very efficient use of the cache memory hierarchy, then even without Amdahl’s law the parallel efficiency of the chip can be a lot less than anticipated. This trend towards an increasing ratio of on-chip capability to off-chip bandwidth is expected to continue.
Personally, I’m not as sanguine about the prospects for as rapid an increase beyond the next couple of chip shrinks. We are already at 45 nm, and two more shrinks bring us to 22 nm, at which point quantum effects kick in and the number of atoms per feature gets small enough that part-to-part variation starts getting pretty severe. I wouldn’t be at all surprised if after another five years or so progress slows (but doesn’t stop).
Edward Greisch says
ITRS is practically a conspiracy to make sure Moore’s law keeps happening. The semiconductor companies have parceled out research tasks so that each company takes a particular technology to research. They have signed cross-licensing agreements. If any technology makes a breakthrough, the whole world makes the breakthrough. Some semiconductors other than silicon are up to 1000 times faster than silicon. There were about 20 semiconductor materials under consideration last time I checked. Raymond Kurzweil says some time in the near future [say 2020 or 2025] computer IQ will exceed human IQ. See “The Age of Spiritual Machines.” Once that happens, computer IQ will fly way past human IQ and soon exceed the thinking power of all of the people. We use computers to design computers. A better computer can better design its own successor. A computer smarter than a human can design a computer 10 times smarter than a human or itself. Keep on iterating.
So don’t count computers out as far as better climate predictions go. Future computers will be programmed by computers. If Kurzweil is right, computers will take on all of the tasks presently done by human climatologists. That includes the synthesis of fragmentary data task. The new kind of scientist who is at ease in dealing with the disparate sources of paleo-data and aware of the problems, and yet conscious of what is needed (and why) by modellers could be a machine. It would be a 2025 machine, not a 2008 machine. The question is, will the 2025 computer be smart enough to figure out how to undo the climate screw-up that will have happened by then? Will the humans be savable or will only computers survive? The computers will survive the extinction of humans if they become conscious/self aware because computers don’t need food or breathable air. Raymond Kurzweil is one of those super talented software people, not a science fiction writer, so I don’t count his ideas out.
Ike Solem says
Gusbob,
The ocean contains far more stored heat than the atmosphere does, and is also continually exchanging heat with the atmosphere – unless covered by sea ice.
It’s very hard to predict what the net ocean-atmosphere heat exchange will be over any given year. Much depends on the strength of the wind-driven upwelling, which brings cold deep water to the surface.
At this point, model forecasts of ocean circulation at the decadal scale should probably be viewed cautiously, since we cannot even predict an El Nino more than 9 months in advance or so.
Predictions of the Pacific Decadal Oscillation are even more doubtful – claims that it has entered its “cool phase” are basically unsupported. No mechanism for the PDO is known, and whether it has regular phases that will continue into the future is also unknown. El Nino is far better understood.
I would wait for the paper to come out before leaping to promote it.
Barton Paul Levenson says
Holland’s old (1978) figure for CO2 from volcanism and metamorphism was 330 million tons per year. According to T.M. Gehrels at USGS, volcanoes contribute about 200 MT of that. Last year, human technology added about 30 billion tons of CO2 to the air. So human output dwarfs natural by a factor of 91 and volcanoes by a factor of 150.
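For anyone checking the arithmetic, using only the emission figures quoted above:

```python
# All figures in tons of CO2 per year, as quoted in the comment above
human     = 30e9    # human technology, last year
volc_meta = 330e6   # volcanism plus metamorphism (Holland 1978)
volcanoes = 200e6   # volcanic share of that
print(round(human / volc_meta))   # ~91
print(round(human / volcanoes))   # 150
```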
Harold Pierce Jr says
RE: #6 Mountain Pine Beetle Outbreak
Global warming has little to do with the massive mountain pine beetle outbreak in BC. A cold snap of -35 to -40 deg C for at least 2-3 days is required to kill the larvae and eggs under the thin bark of a lodgepole pine in winter, and -25 deg C in early fall. Cold snaps of this magnitude don’t happen all that often.
The other natural control of pine beetle infestations is wildfire started by lightning strikes. Lodgepole pine is the only common pine that has serotinous, or heat-sensitive, cones, which can hang on the branches for long periods of time. The scales on these cones only open and drop their seeds when exposed to the intense heat of fire, which melts the resin. This occurs just before the cone is consumed by fire. After the forest burns down, the seeds germinate in soil enriched by the ash of the burned trees, and the cycle starts over again with a period of about 100-150 years. This is also why there is a monoculture of lodgepole pine that goes on forever and ever.
Pine forests in northwestern NA just don’t last that long, compared to the coastal rainforest with its giant Sitka spruce, Douglas fir, and western red cedar. Old-growth western red cedar is more valuable than platinum, and the BC forest service is in a constant, ongoing cat-and-mouse struggle with tree rustlers.
Vigorous suppression of forest fires, beginning with the Smokey the Bear era, has led to large mature lodgepole pine forests and lots of fuel on the forest floor. This is the reason the Yellowstone pine forest burned down in 1988. I recall the cover of Newsweek magazine carrying a phrase like “Is global warming the cause?”, or something like that.
There is one other curious weather pattern that has contributed to the current outbreak. Starting in the late ’90s and early ’00s, late spring and early summer were unusually cool and rainy. This led to a low incidence of forest fires. Many fires that start in remote areas with beetle-infested trees are left to burn out because there is no easy access for fire-fighting crews and equipment.
Mountain pine beetles are very picky critters and only mass-attack mature lodgepole pines that are about 40 cm or larger in diameter at breast height. Prof. John Borden (Bio. Sci., SFU) and I worked for about 25 years trying to find a hypothetical attractant host chemical emitted only by mature trees. We never found the “magic bullet”.
For all the interesting info on NA’s No. 1 tree-killing beetle, GO: http://www.for.gov.bc.ca/hfp/mountain_pine_beetle
And checkout this awesome video that shows the growth of the infestation in the BC LP forest, GO:
http://cfs.nrcan.gc.ca/images/3252.
Note how the cold snap in 1985 wiped out the beetles in the Chilcotin.
Harold Pierce Jr says
ATTN: ALL
When posting comments, please limit paragraphs to less than ca. 10 lines. Otherwise the text is too hard to read, especially for us old guys with vision problems.
Ray Ladbury says
John Mashey–I should have been more specific. True, Moore’s original formulation was not for CMOS. However, most of the period over which the law has been valid relied on scaling of CMOS to achieve the increased density and speed. I agree that the age of CMOS scaling is coming to an end–and it has been for the past 3 or 4 generations. I think we’ll get to 22 nm with great difficulty, but beyond that I think we’ll need to rely on new technologies–to wit:
http://www.nature.com/nature/journal/v453/n7191/full/453042a.html
The thing that is interesting here is not just that the technology could provide a large boost in memory density, but that it could provide an entirely new type of circuit that could “learn”. Coupled with genetic algorithms and other programming advances, this could very well revolutionize programming for complex models such as GCM.
Now, if we can only fly it in a satellite, I’ve got job security until I retire at age 80.
Scott Reid says
Re: #30
“Lodgepole pine is the only common pine that has serotinous, or heat-sensitive, cones, which can hang on the branches for long periods of time.”
Just to set things straight, jack pine (Pinus banksiana) is a very common pine that has serotinous cones. Jack pine is also under threat from the spread of the mountain pine beetle.
mg says
#27: It is not at all obvious that the world can afford much further development of ICT. It has been clear for nearly a decade (e.g. see Paul Strassmann’s article here http://www.strassmann.com/pubs/ft/ft_guru.html ) that the corporate world would hit the critical point of being unable to afford the further expense of scrap-and-build cycles. The critical point is now, in terms of cycle confluence, and it is no coincidence that world markets are edging towards fundamental restructuring.
The IT world, in addition to confronting an increasingly cash-strapped client base, also has to come to terms with its contributions to climate change. Further, it is worth doing some supply chain analyses for the semiconductor industry and watching out for that wild card: sea level rise. The conditions are right for a fundamental revolution to sweep through the ICT industries, driven by financial evaporation.
gusbobb says
Ike Solem Says:
At this point, model forecasts of ocean circulation at the decadal scale should probably be viewed cautiously, since we cannot even predict an El Nino more than 9 months in advance or so.
Ike I agree with you. I think the ocean is a wild card regards heat distribution and climate predictions. I have advocated that since my first post.
I was not predicting an El Nino or its effects. It was Gavin who predicted that the next record high will come in a few years with an El Nino. My question is: where does the heat come from to generate the spike in temperatures during an El Nino?
I also find it amusing that cool spells due to ocean circulation and oscillations are not seen as “climate change”, as the reports on the new cooling predictions suggest, but as road bumps on the way to inevitable warming. The distinctions seem arbitrary and biased. If models of past climates were done correctly, then these oscillations were accounted for. Attribution of increased temperatures to warm phases of the various oscillations was supposedly already accounted for. The IPCC predicted a consensus 0.3 degree rise in temperature in the next decade. Now the predicted decade of cooling is reported as an anomaly that was not part of the models?
And what drives these oscillations? Before, it was suggested that AGW was increasing the probability of warm phases. Then, by the same logic, a cold phase must reflect less radiative input. I eagerly await new papers.
BBP says
Vern (13)
The atomic mass of hydrogen is 1.00794 and helium is 4.002602, so in fusing 4H -> 1He you only convert about 0.029 atomic mass units into energy, which is only about 0.7%. During its 10 billion year lifetime the sun will fuse on the order of 10% of its hydrogen, so over any timescale of interest to climate science the mass change is negligible. The sun is slowly increasing in luminosity, but this is easy to take into account.
BBP
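A quick check of the mass-defect arithmetic in the comment above:

```python
m_H, m_He = 1.00794, 4.002602            # atomic masses (u), as quoted above
defect = 4 * m_H - m_He                  # mass converted to energy per helium nucleus
print(f"{defect:.6f} u -> {100 * defect / (4 * m_H):.2f}% of the input mass")
# 0.029158 u -> 0.72% of the input mass
```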
Martin Vermeer says
gusbobb:
The global mean temperature is computed by areal averaging of the weather station data on land and of the sea surface temperature data observed by satellite. The temperature variation of an El Nino doesn’t require heat so much as it requires a very large area of the Pacific to be a little bit warmer at the surface.
The software for computing global temperature (one version of it) is freely downloadable from the GIStemp website.
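As a minimal sketch of what “areal averaging” means in practice: a toy latitude-band weighting, not the actual GIStemp algorithm, which grids and combines stations in a more elaborate way.

```python
import numpy as np

def global_mean(anomalies, latitudes):
    """Area-weighted mean of temperature anomalies: each latitude band is
    weighted by cos(latitude), since high-latitude bands cover much less
    surface area than equatorial ones."""
    w = np.cos(np.deg2rad(latitudes))
    return np.sum(w * anomalies) / np.sum(w)

# Toy example: the same anomaly near the equator counts for more than at 60 degrees
lats = np.array([-60.0, 0.0, 60.0])
anom = np.array([0.1, 0.5, 0.1])
print(round(global_mean(anom, lats), 2))   # 0.3, versus a plain mean of ~0.23
```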
About GCMs including El Nino etc.: no, they don’t. But they do produce their own El Nino-like natural variations (IOW, physical realism!), which however are not synchronized with the real world. IIUC, the computations described in the article aim precisely at achieving this sync by setting realistic initial conditions.
tamino says
Re: #31 (Harold Pierce)
I too would urge all commenters to help us old guys whose vision isn’t what it used to be. Very long paragraphs can make your comment harder to read.
Ray Ladbury says
Gusbob, the classification of oceanic variability as weather rather than climate is not at all arbitrary. Oceanic variability typically persists on timescales of less than a decade. Since the solar cycle is 11 years, anything shorter than this is weather. And you seem to be missing the point entirely. OK, so let’s say there is a flux of cold water to the surface to suck up heat for a decade. All that means is that you won’t see the type of warming we’ve had for the last 20 years for a decade. At the end of that time, the CO2 is still there and warming kicks off again with a vengeance. What is so hard to understand about this? Climate consists of long-term trends, neglecting short-term variability.
Lynn Vincentnathan says
RE #20 & Gavin’s “The Nature study is talking about changes associated with ocean circulation even while CO2, and the global imbalance, and global temperature, is increasing.”
I just read a news article about this, and the impression I got was that the team was using a new approach (something like between the climate & weather timescales???), and that the ocean circulation (a slowdown??) and some natural variability would work in sync to make it a bit cooler in the N. Atlantic region, while the natural (downturn) variability working against the GW would keep it about constant in the tropical zones.
Then after this next decade GW might kick in with even more ferocity than ever.
I’m just paraphrasing what might be a lot of mistakes via a news article. However, I remember in high school physics being in charge of the ripple tank, and how upswing waves can create bigger waves when they converge, while downswing and upswing waves crossing each other cancel each other out.
And what worries me is that the denialists will spring forth from the woodwork to denounce GW, and this may impact public perception, and our GHG emissions will continue to soar, and we’ll really be in hot water after that decade of slight cooling (or stable average temps) and beyond into ??? hysteresis.
From what I understand our GHG emissions are already higher than the worst-case IPCC scenario.
And there are other considerations, aside from the warming — like the oceans becoming acidic from CO2, and the ocean conveyor slowdown reducing the upwelling of nourishment for sea life, etc.
Chris says
Re #35 gusbobb
You ask, with respect to El Ninos, “where does the heat come from to generate the spike in temperatures during an El Nino?”
It comes from the sun. Within a world completely in energy balance, with a stable equilibrium temperature, there are still El Ninos and La Ninas. These correspond to periods in which heat energy is redistributed. In an El Nino event, warm surface water (warmed by the sun!) spreads across the central tropical Pacific, and cold-water upwelling off the west coast of S. America is suppressed. So the Earth’s surface temperature warms for a short spell until the circulation patterns return to their “normal” state.
Presumably all of these ocean circulations act to take solar energy into the oceans and, if the Earth is in positive heat imbalance (as it seems to be under the influence of an enhanced greenhouse effect), will contribute to ocean warming, even if occasionally (due to fluctuations such as La Nina events) the ocean surface temperature is overall rather cooler for a short while.
So these truly are fluctuations or oscillations that act to modulate either the equilibrium surface temperature (if the Earth is in “heat balance”) or the transition to a new equilibrium temperature if the Earth is out of balance with respect to temperature forcings (as in the case of an enhanced greenhouse effect).
That’s all very straightforward, isn’t it? It seems rather likely that, in a world warming under the influence of enhanced greenhouse forcing, many of the “record years” will coincide with positive modulations from the various internal “oscillations” (e.g. El Ninos in this case) that transiently redistribute heat to or from the Earth’s surface.
So I don’t see why cool spells due to ocean circulation and oscillations should be seen as “climate change” (after all they’re small and short lived), and if these are a consequence of generally “heat-neutral” and short lived redistributions of ocean surface warmth, it’s not surprising if these are considered to be “bumps” on the year on year variation of the surface temperature anomaly as it responds to enhanced greenhouse forcing.
Why should we expect otherwise? That’s how things have been during the warming of the last 30-odd years. Why should they be any different now? And many of the “bumps” can be accounted for in the past record (El Ninos, La Ninas, volcanic contributions, the solar cycle; i.e. all of the factors that result in transient modulations of the long-term trend). What’s “arbitrary” or “biased” about that??
catman306 says
I’m probably over my head here, but I’d like to point out that all of these advances in computers and in writing faster-running code won’t solve a basic problem with modeling: the assumption that non-included variables will stay linear and can be safely ignored. Anything that can be measured might be one of these suddenly exponential variables. Among the thousands, millions, or more ignored variables there are probably a few that can unexpectedly go exponential and ruin the model when compared to the real world. The effect on our climate of being struck by an asteroid is an obvious example. But so too might be some undiscovered chemical reaction that only takes place at the extreme pressures found at the bottom of the deepest oceans. This variable could quantify anything imaginable (or unimaginable!) wherever scientists are not currently looking.
Nick Gotts says
RE #27 Edward Greisch “Raymond Kurzweil is one of those super talented software people, not a science fiction writer, so I don’t count his ideas out.”
[edit] Compare actual technical progress over the past half century with what techno-optimists were predicting in the 1950s, in areas such as space travel, medicine, nuclear power, transportation, and above all, artificial intelligence.
[Response: This is getting way off topic. Bring it back to climate models or drop it. – gavin]
Hank Roberts says
> 31, 38, “old guys with vision problems”
I’m another, same plea for line breaks, same reason.
aTdHvAaNnKcSe (THANKS in advance)
Contributors writing inline replies too, please?
Return — twice for a blank line — between thoughts.
Pekka Kostamo says
Re #35: “I also find it amusing that cool spells due to ocean circulation and oscillations are not seen as “climate change”, as the reports on the new cooling predictions suggest, but as road bumps on the way to inevitable warming. The distinctions seem arbitrary and biased.”
This is just a standard strawman claim. It is the deniers who consistently choose the “bump” year of 1998 as their reference in all kinds of calculations. I have not seen one of the climate scientists claim that year as proof of global warming.
Any record of 20th century global temperature measurements shows the warming trend as well as the short term noise, both up and down.
Hank Roberts says
From the first post:
> the tools and techniques required for doing good
> synthesis work are not the same as those for making
> measurements or for developing models. It could in
> fact be described as a new kind of science (though
> in essence it is not new at all) requiring, perhaps,
> a new kind of scientist. One who is at ease in
> dealing with the disparate sources of paleo-data
> and aware of the problems, and yet conscious of
> what is needed (and why) by modellers. Or
> additionally modellers who understand what the
> proxy data depends on and who can build that into
> the models themselves
This seems like a chance to create a new academic course, to start with. Would you all consider starting a Wiki or something in which possible study guides, coursework, references and such could be collected?
Possibly there’s a department chair or academic dean watching or you could attract the attention of one.
This sort of thing is being done for flu pandemics:
http://www.fluwikie2.com/index.php?n=Forum.WhoIsQualifiedToAssessTheConsequencesOfAPandemic
Aaron Lewis says
Gavin,
This was one of your very best posts. The only thing missing from your post was a paragraph or two that we could copy and send to our congressman to get you (plural) some money for a paleo-climate information data warehouse.
(Aside to US readers: Gavin has other things to do. Do not wait. Re-read the post and go to http://www.visi.com/juan/congress/ and send your congressional representative a missive. Then, have all of your family members, friends, and coworkers send similar missives.
Aside to other readers: Tell your representatives that the people that emit (or plan to emit) greenhouse gases should take full responsibility for their actions, including the research necessary to determine the extent of the problem.)
Aaron Lewis says
Re #46
Hank,
In the old days we called this “Interdisciplinary Environmental Studies”.
While climate modeling with paleo data is important, climate modeling with an understanding of infrastructure engineering and economics is just as important. I would say that the fact that the climate models fail to account for land ice sheet dynamics speaks volumes about how much broader and deeper our climate modeling effort needs to be.
When the IPCC was formed, there were 2 questions that needed to be answered. The IPCC has danced around these questions 4 times, and never offered an answer to either question. The questions are: “How fast can heat be transferred to the major ice sheets?” and, “How will the ice sheets respond?” Paleo data can help answer these questions. Then, we will need infrastructure engineering and economics to decide what, if anything, we should or can do to adapt or mitigate.
The ice sheets represent a global threat at a risk level above VP Chevy’s 2% rule. Therefore, asset allocation to climate threat analysis (climate modeling) should be similar to asset allocation to assessing and fighting global terrorism – call it US $100 billion per year. Gavin should be able to run a paleo data center on some slice of that.
Lawrence Brown says
If, as some of the above references to the article in the May 1 issue of Nature point out, there is a temporary cooling in the near term due to natural variability, this will reinforce the need to take immediate action to reduce GHGs. It is an opportunity to take advantage of the fact that less glacier ice, sea ice and (at least) northern hemisphere continental ice will melt. This will mitigate the positive feedback effects of the loss of albedo (I almost said libido; that ship has almost sailed. I’m another one of those old guys, who prefers shorter paragraphs).
A cool phase, coupled with accelerated steps, starting very soon, to reduce CO2, CH4, N2O and other heat-trapping gases, can benefit the planet in this regard.
Hank Roberts says
Aaron, I have big hopes for the International Polar Year science — once we start seeing it.
I’ve seen only one journal article from the ANDRILL cores. The first of these were only shipped to Florida late last year, then sliced up, with bits of them passed out to scientists all around the world. They must be writing papers by the dozen.
I think those two questions really have been waiting on a whole lot of research that’s well under way now.