By John B. Drake
Few scientific topics evoke such general interest and public discussion as climate change. It is a subject that has been highly politicized. New results enter the environmental debate as evidence supporting one position or another, usually with the qualifiers, background, and perspective needed to understand them stripped away to form a convenient sound bite. The attention is understandable given the importance of climate to agriculture and energy use. Fear of global warming and the greenhouse effect has been used as justification for reducing the use of fossil fuels and increasing the use of nuclear energy and alternative energy sources. It has even been suggested that, to avoid climate change, a return to preindustrial levels of emissions is necessary.
|JOHN B. DRAKE, shown here studying a computer simulation of future climate under a global warming scenario, is a mathematician in the Mathematical Sciences Section of ORNL's Computer Science and Mathematics Division. He leads a team that has been developing parallel algorithms for climate modeling and implementing the Community Climate Model-2 of the National Center for Atmospheric Research in Boulder, Colorado, on the Intel Paragon supercomputer at ORNL. ORNL is one of several institutions attempting to predict the impacts on climate of greenhouse gases emitted by fossil-fuel combustion and other human activities.|
As a result of the Rio Conference of 1992, representatives of many nations of the world agreed on emission caps that would allow economic growth while still curbing undesirable increases in atmospheric carbon dioxide. As ecological awareness has grown, people want to know the predicted impact of human activities, including fossil-fuel combustion and forest burning, on future climate. Thus, climate prediction through computer modeling has been thrust into the limelight of policy debate.
The subject of this article is not the policy implications of greenhouse warming, or even the validity of the premise that global warming caused by the greenhouse effect is occurring. The subject is the current array of concepts and tools available to understand and predict the earth's climate based on mathematical models of physical processes. These tools for climate simulation include some of the world's most powerful computers, including the Intel Paragon XP/S 150 at ORNL. With these tools, we are attempting to predict the climate changes that may occur 100 years from now under the higher surface temperatures likely to result from rising levels of carbon dioxide in the atmosphere.
|A parallel version of the Community Climate Model-2 was developed for ORNL's Intel Paragon XP/S 150 supercomputer to provide answers to energy-related climate questions. For example, what are the effects on climate of 100 years of burning fossil fuels at a rate high enough to double atmospheric CO2 concentrations and raise the earth's surface temperature? The parallel climate model, as modified at ORNL, can be used to predict climate states resulting from global warming and allow researchers to examine particular quantities such as the precipitable water available in the atmosphere. These fields can be compared with observed data, providing a better understanding of the type and severity of climate impacts. Scientific visualizations of the computed patterns, shown as clouds of varying thicknesses moving over the oceans and continents for each day of a year, have been developed and put on videotape at ORNL.|
First, consider some of the observational data available about climate. The data help researchers frame the right questions and acquire the best understanding of climate and climate change.
Buried in the layers of Antarctica's ice is a history of the earth as revealing of the earth's climate as tree rings are of the environmental conditions during the growth of an oak. As ice accumulated from the yearly snowfall, oxygen-18 and deuterium from the atmosphere were trapped and preserved in their proportions to other atmospheric gases and elements. The isotopic fractions of these tracers are strongly correlated with the annual mean temperature of the earth's surface. By drilling 2083 meters into the ice at Vostok, Antarctica, in 1987, a collaborating group of scientists from France and the former Soviet Union uncovered the first full glacial-interglacial cycle of earth's temperature.
|The Vostok ice core temperature variation in degrees Celsius as a difference from the modern surface-temperature value (top) and solar insolation as a function of time (bottom).|
Although the climate of the past several centuries has been nearly constant, the longer time scales of the ice core data show that natural cycles may play out over thousands of years. A look at the pattern of temperature oscillations over time invites climate projections, much as fluctuations in the stock market invite some speculators to invest. The temperature record is analogous to the heart's signal in an electrocardiogram; in both cases, even the most regular of patterns is punctuated by irregular fluctuations.
One cycle of the earth's surface temperature is related to the change in solar input induced by the earth's orbital precession. The precession is the slow change in the orientation of the earth's axis of rotation over a period of about 20,000 years; it is analogous to the "wobble" of a spinning top. The change in temperature induced by orbital precession also has a period of approximately 20,000 years (20 kyr).
The data present only a small sample. Only one or two periods of a 100-kyr cycle are apparent, and the current climate is not at a very dependable point for prediction. A longer record might offer the internal consistency required to analyze the structure of the climate time series, but lacking such self-consistent data we must turn to the physical relationships between variables to understand processes that affect climate.
|The Vostok ice core temperature variation in degrees Celsius plotted against the atmospheric CO2 concentration (ppmv).|
Modern historical data coincide with the beginning of modern weather forecasting, which started with atmospheric pressure measurements following the invention of the barometer in 1644. Temperature, pressure, wind speed and direction, and precipitation measurements have accumulated from an increasing number of sites. The historical temperature record shows an increase in global average temperature by about 0.5°C over the past 100 years. Atmospheric concentrations of CO2 measured and recorded at the Siple Station in Antarctica over the same period show an increase from 285 parts per million by volume (ppmv) in 1850 to 312 ppmv in 1953. Measurements made at Mauna Loa (a large volcanic mountain in Hawaii) show that atmospheric concentrations of CO2 have increased from 315 ppmv in 1958 to 360 ppmv in 1993, the highest atmospheric CO2 value in the 200-kyr climate record (see asterisk in figure above).
If a strict linear relationship held between the amount of CO2 in the atmosphere and the temperature, we should be experiencing a much warmer earth than exists now. However, because the current data point does not fall neatly on the trend of the time-series data, predicting our climate requires a more penetrating look at the climate system.
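The mismatch can be made concrete with a back-of-envelope calculation. The CO2 values below come from the measurements cited above; the roughly 0.1°C-per-ppmv slope is an assumed round number suggested by the glacial-interglacial swing in the Vostok record, used here only for illustration, not a fitted value.

```python
# Back-of-envelope check of a strict linear CO2-temperature relationship.
# The slope is an illustrative assumption, not a fitted parameter.
slope = 0.1          # assumed degrees C per ppmv, from the ice-core correlation
co2_1850 = 285.0     # ppmv, Siple Station (from the text)
co2_1993 = 360.0     # ppmv, Mauna Loa (from the text)

predicted = slope * (co2_1993 - co2_1850)
observed = 0.5       # degrees C warming over the past century (from the text)

print(predicted, observed)  # the linear rule predicts far more warming than observed
```

The order-of-magnitude gap between the linear prediction and the observed half-degree rise is exactly why the time-series correlation alone cannot serve as a predictive model.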
A significant amount of energy is also transferred from the earth's surface in the form of a latent heat flux induced by evaporation or by precipitation. The interaction of the land surface with the atmosphere is strongly influenced by the surface's moisture content. Finally, the energy variation over the earth's surface gives rise to winds and ocean currents that transport the energy.
The energy that accumulates in the warm tropical areas north and south of the equator moves toward the earth's frozen poles. This movement defines the main structure of the atmospheric flow. Together with the salinity of the ocean this poleward flux of energy also drives much of the oceans' circulation. The atmospheric and ocean circulations that control the earth's climate are perhaps the most challenging of fluid flow problems. Even though such a fluid flow problem can be formulated by classical equations, many mathematical and physical questions arise.
John von Neumann, the brilliant mathematician, physicist, and pioneer of the computer age (as well as a classmate and best friend of ORNL's former research director Eugene Wigner), was fascinated by the climate problem. In 1955, while addressing a conference of early weather modelers, he outlined an approach to climate research. "The approach," he said, "is to try first short-range forecasts, then long-range forecasts of those properties of the circulation that can perpetuate themselves over arbitrarily long periods, and only finally to attempt to forecast medium-long time periods. . . ." Von Neumann separates the time scales of weather and climate, stressing the dependence of the weather on its initial data. However, climate does not depend on particular initial data; rather, it is an inherent behavior of the system that manifests itself only over long time periods, much as the asymptote of a function is approached. He indicates that the hardest problem is in the region between the weather time scale of a few days and the climate scale of years or centuries. This is the region of the "butterfly effect," in which the initial conditions (e.g., the fluttering of a butterfly's wings somewhere in Latin America) are seen to influence the specific progression of atmospheric flow.
That there is something worth calling a climate is taken for granted. That the system has asymptotic states has still not been proved mathematically. Von Neumann understood the basic concepts of nonlinear dynamical systems and chaos theory as applied to the atmospheric circulation long before these became subjects in their own right. The current jargon refers to sets in the space of possible weather states, known as attractors. This understanding is evident from his statement, "One generally believes that the various possible initial states which the atmosphere passes through fall somehow into groups, such that each group leads in the long run to the same statistical average. . ." These attractors do just what their name suggests: they attract nearby climate states to ever closer proximity and, if they were really known, would completely characterize climate. Knowing the attractors would be similar to having a topographic map of possible climate states. We would know the dimensions of the valleys into which the states would settle and also how easily the system might pass to another valley.
Although the general fluid flow equations have been known since the time of Leonhard Euler (1707-83), the mathematical theory of the existence and uniqueness of their solutions is still developing. Edward Lorenz, a meteorologist at the Massachusetts Institute of Technology, brought the study of chaos to maturity with simple dynamical systems derived from atmospheric dynamics. The discovery of analyzable "deterministic chaotic" systems and the development of nonlinear dynamical systems theory have expanded the notions of what is meant by a solution and by climate itself and have provided new tools for approaching the equations. These advances stem not from the application of a new mathematical theory to the climate problem but from an ongoing interaction among climate scientists, physicists, and mathematicians.
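Lorenz's famous three-variable convection system makes the butterfly effect concrete. The sketch below, a standard Lorenz '63 integration and not code from the climate models discussed here, follows two trajectories whose initial conditions differ by one part in 10^8; after a few tens of model time units they bear no resemblance to each other, yet both remain on the same bounded attractor.

```python
import numpy as np

# Lorenz '63: a three-variable caricature of atmospheric convection
def lorenz(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def integrate(state, dt=0.01, steps=3000):
    # classical fourth-order Runge-Kutta time stepping
    for _ in range(steps):
        k1 = lorenz(state)
        k2 = lorenz(state + 0.5 * dt * k1)
        k3 = lorenz(state + 0.5 * dt * k2)
        k4 = lorenz(state + dt * k3)
        state = state + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
    return state

a = integrate(np.array([1.0, 1.0, 1.0]))
b = integrate(np.array([1.0, 1.0, 1.0 + 1e-8]))  # perturbed by one part in 10^8

print(np.linalg.norm(a - b))  # the tiny perturbation has grown to a macroscopic separation
```

Forecasting the exact state (weather) is hopeless beyond a short horizon, but both trajectories stay on the same attractor, whose statistics (climate) are insensitive to the initial data, which is precisely von Neumann's distinction.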
For 20 years, climate-modeling researchers have been using some version of the Community Climate Model (CCM) of the National Center for Atmospheric Research (NCAR). CCM1, which was produced in 1987, was operated on large serial supercomputers. Now, many of these researchers are using CCM2, a step forward that has been characterized as moving from modeling some other planet to modeling the earth. This step roughly corresponds with the advent of large, shared-memory, parallel, vector computers such as the Cray YMP. Parallel computers allow more detailed modeling of climate. As the modeling of details increases, the balance of physical processes in the models moves closer to the observed state, building confidence that the physics is being captured.
Current atmospheric climate models capture very well the qualitative structure of the global circulation. The transport of energy from the warm equatorial regions to the cold poles and the split of the associated winds into cells are reproduced in simulations both qualitatively and quantitatively. The tropical Hadley cell and the mid-latitude Ferrel cells and jet streams are in good agreement with observations. These are the basic structures of the atmospheric circulation felt on the earth's surface as the doldrums, trade winds, mid-latitude westerlies, and polar highs.
The ability of the models to reproduce the current climate builds confidence in their physical validity. This validation, however, is not license to use the models for future climate predictions. Another important justification for use of the models has been their application to past climatic regimes. The NCAR CCM has been used to simulate climate effects resulting from increases in solar radiation in the northern summer because of changes in the earth's orbit. One of the effects was warmer land temperatures that gave rise to more intense monsoons. Increases or decreases in solar radiation resulting from changes in the earth's orbit are believed to be responsible for conditions that produced climates of past ages. According to Stephen Schneider of NCAR, "The ability of computer models to reproduce regional climatic responses to the changes in solar radiation brought about by variations in the earth's orbit lends a major element of confidence to the reliability of these models as forecasting tools for the future climate resulting from increasing greenhouse gases."
CCM2, the most recent code in a series of climate models developed by NCAR, captures the intricate interactions of the physical processes outlined here. This climate model, which is available to academic and industrial research users, simulates the time-dependent response of the climate system to the daily and seasonal variation of the solar input and of sea surface temperatures. For the past 10 years these models have formed, and for the foreseeable future will continue to form, the basis of a broad range of climate research and scenario testing used in support of decision makers who formulate national energy and environmental policies.
The CCM2 is used almost exclusively on parallel supercomputers. The large computational requirements and the heavy volume of output generated by the model exclude its effective use on workstation-class systems. The heart of the dynamics algorithm in the CCM2 is based on spherical harmonics, the favorite functions of mathematicians and physicists who must represent functions with values on the surface of a sphere. The method transforms data on the sphere into a compact, accurate representation. Data for a 128 × 64 point grid on the earth's surface could be represented with only 882 numbers (coefficients) instead of 8192. The spherical harmonic representation has had a very long reign as the method of choice for weather and climate models because of its accuracy and the efficiency of the methods used to compute the transform. The transform is a "global" method in the sense that it requires data from the entire globe to compute a single harmonic coefficient. On distributed-memory parallel computers, these calculations require communication among all the processors. Because communication is expensive on a parallel computer, many thought that the transform method had seen its day.
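The analysis-synthesis cycle at the heart of such a spectral method can be sketched in a few lines. The grid sizes and truncation below are illustrative small values, not the CCM2's 128 × 64 grid, and the brute-force loops stand in for the fast Legendre and Fourier transforms a production model would use; scipy's `sph_harm` supplies the basis functions.

```python
import numpy as np
from scipy.special import sph_harm

# Illustrative small spectral grid: Gauss-Legendre nodes in
# x = cos(colatitude), uniform points in longitude.
nlat, nlon, T = 16, 32, 6                     # T = triangular truncation limit
x, w = np.polynomial.legendre.leggauss(nlat)
colat = np.arccos(x)
lon = np.linspace(0.0, 2.0 * np.pi, nlon, endpoint=False)
LON, COLAT = np.meshgrid(lon, colat)          # shape (nlat, nlon)

# A band-limited test field built from a few harmonics (degrees l <= 5 <= T).
field = (sph_harm(0, 2, LON, COLAT) + 0.5 * sph_harm(3, 5, LON, COLAT)).real

def analyze(f):
    """Grid values -> spectral coefficients a_{l,m} by quadrature."""
    coeffs = {}
    for l in range(T + 1):
        for m in range(-l, l + 1):
            Y = sph_harm(m, l, LON, COLAT)    # scipy order: sph_harm(m, l, az, polar)
            # integral of f * conj(Y) over the sphere: Gauss weights in
            # latitude, uniform (trapezoidal) weights in longitude
            coeffs[(l, m)] = np.sum(w[:, None] * f * np.conj(Y)) * (2.0 * np.pi / nlon)
    return coeffs

def synthesize(coeffs):
    """Spectral coefficients -> grid values."""
    f = np.zeros((nlat, nlon), dtype=complex)
    for (l, m), a in coeffs.items():
        f += a * sph_harm(m, l, LON, COLAT)
    return f.real

err = np.max(np.abs(synthesize(analyze(field)) - field))
print(err)   # round-trip error near machine precision for a band-limited field
```

Here 49 coefficients reproduce all 16 × 32 = 512 grid values of a band-limited field, the same kind of compression the article cites (882 coefficients for 8192 grid values). Note also that `analyze` needs every latitude and longitude to compute each coefficient, the "global" data dependence that makes the transform communication-intensive on a distributed-memory machine.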
Our research identified several parallel algorithms that keep the transform method competitive, even when using large numbers of processors as on the Intel Paragon XP/S 150 at ORNL. This powerful machine has 1024 node boards, each having two computational processors and a communication processor. The full CCM2 climate model was implemented for this parallel computer by a collaboration of researchers from ORNL, Argonne National Laboratory, and NCAR. It is currently being used by ORNL's Computer Science and Mathematics Division as the basis for the development of a coupled ocean-atmosphere climate model under the sponsorship of the Department of Energy's Office of Health and Environmental Research.
Our models could be used to predict the overall impact on climate of counteracting atmospheric effects from both manmade and natural emissions: the warming effects of greenhouse gases and the cooling effects of sulfate aerosols. By using the increased computing power of the Intel Paragon, the IBM SP2, or the Cray Research T3D, researchers should advance one step further in understanding the complex interrelations among natural processes, human activities such as fossil fuel combustion, and the climate of our terrestrial home.