The largest climate modeling experiment ever devised is running on borrowed time, literally. The model is taking computing time on loan from more than 47,000 personal computers worldwide, with the full knowledge and consent of their owners. Since its official launch last September, the climateprediction.net project has simulated more than one million years of the evolution of Earth's climate, generating a flood of data that scientists are just beginning to try to understand.
As of March 10, climateprediction.net had more than 47,000 users logged into its system worldwide. The project aims to harness the computing power of personal computers to run large-scale climate modeling experiments, involving more than 1.5 million calculations per time-step. Shown here is temperature at Earth's surface. Courtesy of climateprediction.net.
"This is more powerful than the Japanese Earth Simulator supercomputer," says Myles Allen, a climatologist at Oxford University and the principal investigator of the project. The public is keeping the scientists busy.
Japan's Earth Simulator Center in Yokohama has been in operation since February 2002, using a supercomputer with 640 parallel processors to run one iteration at a time of a very high-resolution model of Earth's evolving climate. In contrast, Allen's virtual supercomputer simultaneously sifts through many possible climate scenarios at lower resolution to determine which starting conditions lead to stable climates. These complementary supercomputing approaches both have the same goal: to unravel the complexities of Earth's climate in order to better plan for future change.
The project had its beginnings in a commentary Allen wrote for Nature in October 1999 entitled "Do-it-yourself climate prediction." In it, he noted the success of the SETI@home (Search for Extraterrestrial Intelligence) experiment in convincing more than a million people to donate microprocessor time on their home computers to analyze radio telescope data for signs of extraterrestrial communications. "Could a similar number be recruited for the more practical, albeit more demanding, task of forecasting the climate of the twenty-first century?" he asked his readers.
It took four years of hard work before he received an answer. The greatest challenge was developing a PC-friendly version of the climate modeling software. Starting with the model the United Kingdom Meteorological Office uses to make its climate forecasts, the software team needed those four years to get approximately 750,000 lines of Fortran code in shape for PC use. The most demanding task was preventing the program from interfering with other software, so that participants could type a document, surf the Internet or do any other common PC activity while running the model. Finally, on Sept. 12 of last year, it was ready to go. In the first weekend of operation, about 25,000 participants worldwide downloaded the model.
Along with the climate simulation software, each participant downloaded a block of data, or "key," containing values for 20 critical climate parameters, which form the core of the whole experiment. The climate modeling software performs its calculations the same way every time; only the starting parameters vary.
Scientists on the team chose the top 20 parameters after much discussion. The parameters include such factors as the reflectivity of sea ice, the transfer of energy between the tropical oceans and the air above them, the rate of turbulent air mixing close to the surface of Earth, and the diffusion of heat from a warm area to a colder one. One less obvious factor is the depth to which plant roots have to grow to reach water in the ground, which is important in determining the extent to which plants recycle moisture locally, Allen says.
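To make the idea of a parameter "key" concrete, here is a minimal sketch, not the project's actual data format: it simply shows how one fixed set of perturbed starting values could be drawn for a single ensemble member while the model code itself never changes. The parameter names and ranges below are invented for illustration.

```python
# A hypothetical sketch of a participant's parameter "key" -- NOT the real
# climateprediction.net format. Names and ranges are invented for illustration.
import random

PERTURBED_PARAMETERS = {
    "sea_ice_albedo":        (0.50, 0.65),  # reflectivity of sea ice
    "ocean_air_heat_flux":   (0.8, 1.2),    # tropical ocean-to-air energy transfer (scale factor)
    "boundary_layer_mixing": (0.6, 1.6),    # turbulent mixing near Earth's surface (scale factor)
    "horizontal_diffusion":  (0.5, 2.0),    # heat diffusion from warm to cold regions (scale factor)
    "root_depth_m":          (0.5, 3.0),    # rooting depth that controls local moisture recycling
    # ... plus 15 more, with the cloud-formation parameters being the most critical
}

def make_key(seed: int) -> dict:
    """Draw one fixed set of starting parameters -- one ensemble member's 'key'."""
    rng = random.Random(seed)
    return {name: rng.uniform(lo, hi) for name, (lo, hi) in PERTURBED_PARAMETERS.items()}

key = make_key(seed=42)
# The unchanged model code would then be run with this key held fixed,
# from pre-industrial times to the present and on into the future.
```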
The most critical parameters, however, involve cloud formation, because the reflection of incoming sunlight has a large impact on the climate, Allen says. "If a model turns into a snowball," he says, "chances are it became so cloudy that sunlight was reflected away and the Earth got very cold."
The general circulation model the researchers are using is a slight variation of the one used by the United Kingdom Meteorological Office. The main difference is in the grid pattern covering Earth's surface for calculation purposes, Allen says. Predicting the local weather for a few days throughout the United Kingdom requires a fine mesh of data points over a relatively small area. But long-range climate modeling across the entire planet can afford only a more limited number of data points, so the grid scale is larger in the climateprediction.net model.
Earth's surface is divided into rectangles 2.5 degrees in latitude by 3.75 degrees in longitude; six of these rectangles cover the entire United Kingdom. The atmosphere is vertically sectioned into 19 levels above each rectangle; a simplified "slab" model of the ocean, which treats the top layers of the ocean as a single region, brings the total number of vertical levels in the model to 20. So there are 140,160 total 3-D grid boxes in the model. Climatic variables in each box, including temperature, pressure, humidity, and wind speed and direction, are recalculated at each 30-minute program time-step, involving more than 1.5 million calculations per time-step, according to Allen.
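The grid arithmetic behind those figures can be checked in a few lines. The sketch below assumes the 140,160 total counts a row of grid boxes at each pole (73 latitude rows rather than 72); the time-step count per simulated year is a back-of-the-envelope figure, not project documentation.

```python
# Back-of-the-envelope check of the grid described above (assumptions noted inline).
lat_rows = int(180 / 2.5) + 1   # 73 rows, assuming a row at each pole
lon_cols = int(360 / 3.75)      # 96 columns around each latitude circle
levels   = 19 + 1               # 19 atmospheric levels plus the single "slab" ocean layer

print(lat_rows * lon_cols * levels)   # 140160 grid boxes, the total quoted above

steps_per_year = 365 * 24 * 2         # 30-minute time-steps in one simulated year
print(steps_per_year)                 # 17520 steps, each with >1.5 million calculations
```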
By running the model millions of times with slight variations in the critical parameters, from pre-industrial times to the present and into the future, Allen and his team hope to identify a promising subset of models, perhaps 1,000 to 10,000, with interesting and plausible outcomes. They will then feed the parameters for plausible models into a supercomputer capable of running the model with a much higher resolution, or finer grid pattern, to obtain more detail. Eventually, the researchers will identify a model that most accurately simulates climate evolution.
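That winnowing step could look something like the sketch below, which is only an illustration under assumed names and thresholds: given summary results reported back from completed runs, keep the parameter keys whose simulated climate stays within a plausible range, and carry those forward to the high-resolution re-runs.

```python
# Illustrative only: select the "interesting and plausible" subset of completed runs.
# Field names, the plausibility window, and the example numbers are all assumptions.

def plausible(run: dict) -> bool:
    """Reject runs that drifted into implausible states, such as the 'snowball' cases."""
    return 12.0 <= run["global_mean_temp_c"] <= 16.0

def select_candidates(runs: list, limit: int = 10_000) -> list:
    """Keep at most `limit` plausible runs -- the 1,000 to 10,000 promising members."""
    return [run for run in runs if plausible(run)][:limit]

runs = [
    {"key_id": 1, "global_mean_temp_c": 14.2},
    {"key_id": 2, "global_mean_temp_c": -5.1},   # a "snowball" run, discarded
    {"key_id": 3, "global_mean_temp_c": 15.8},
]
print([run["key_id"] for run in select_candidates(runs)])   # [1, 3]
```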
Although climateprediction.net will likely not provide new climate results, it will help climate modelers quantify uncertainty, which is especially important for policy-makers, says Michael Wehner of Lawrence Berkeley Laboratory in Berkeley, Calif. "With a reasonable amount of statistical certainty, we can give you an error bar," he says. "This is all about quantifying uncertainty, though it's limited in that we are quantifying uncertainty with one model."
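The error bar Wehner describes is essentially the spread of an outcome across the ensemble of plausible runs. A toy illustration, with invented numbers:

```python
# Toy example of an ensemble error bar; the warming values are invented.
import statistics

warming_by_2100_c = [2.1, 2.8, 3.4, 2.5, 3.0]    # one value per plausible run
mean   = statistics.mean(warming_by_2100_c)
spread = statistics.stdev(warming_by_2100_c)
print(f"{mean:.1f} +/- {spread:.1f} degrees C")  # the mean with its ensemble spread
```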
Indeed, Curtis Covey of the Program for Climate Model Diagnosis and Intercomparison at Lawrence Livermore National Laboratory in Livermore, Calif., says that the climateprediction.net project is worthwhile because it does something "that's not generally done with these big climate models." Distributed computing, he says, allows the model to be run over a full range of outcomes, "with all plausible values."
Beyond testing their climate parameters, the researchers also want to provide an interesting learning experience to the people who generously donate their computing time to the project. "The public education component is extremely important, same as SETI@home and these other public computing enterprises. This is an opportunity to get into people's homes with science," Wehner says.
To that end, the software has an interface that allows participants to observe their model as it evolves. They can view Earth's color-coded grid boxes changing to reflect updated calculations of temperature, precipitation and cloud cover, for instance. They can rotate the globe in all directions to see what their model is predicting for their particular corner of the world. A discussion board where participants can compare their results with others is also a popular feature.
The climateprediction.net team has also distributed a special software version to student groups in the United Kingdom so they can do term projects with the data collected on their run. "They're doing something that only a year ago would have been half a Ph.D.," Allen says. In May 2004, the Open University in the United Kingdom will offer a college course based on this experiment.
Allen's impulse to apply the SETI@home experience to climate modeling is beginning to pay off. Tim Barnett, a marine research physicist at the Scripps Institution of Oceanography in San Diego, Calif., says that "in the climate business, it is still true that we don't have the computer capacity we need to run simulations." By constructing a virtual super-supercomputer using borrowed PC time, Barnett says, Allen and the climateprediction.net scientists are "leapfrogging the computational community by 10 to 20 years."