NEWS NOTES | February 1999
Surprises from Cape Roberts
Debating New Madrid
A role for carbon sinks
Managing park resources
Glassy-eyed over escoria
Cape Roberts sits 125 kilometers north-northwest of the U.S. research base at McMurdo Station and New Zealand’s Scott Base. About 250 kilometers north is Terra Nova Bay, site of the German and Italian stations. To take advantage of these resources, a joint scientific venture of the national Antarctic programs of Australia, New Zealand, Italy, Germany, the United Kingdom, and the United States was formed. The Cape Roberts Project was designed to recover sea-floor cores containing sedimentary strata that represent nearly 100 million years of geologic history. Its goal is to determine the record of glaciation, continental rifting, and sea-level changes in the Antarctic region — knowledge that will contribute to an international understanding of the causes of global sea-level change in the past, and a working model for sea-level fluctuation in the future.
“The Cape Roberts Project is a fantastic example of an international scientific collaboration,” says Scott Borg, head of the geology and geophysics program for the Antarctic Science Section of the National Science Foundation (NSF). Other projects, such as the Ocean Drilling Program and the Antarctic Offshore Stratigraphy project, can benefit from the Cape Roberts work as well, he adds.
Drilling through the ice
The Cape Roberts Project was conceived when seismic surveys indicated that a number of key strata lay deep beneath the sea floor under the Ross Sea, tilting upwards and approaching the surface of the sea floor not far from Cape Roberts. A team of scientists from six nations decided to collaborate on the project, designed to study the climatic and geologic history of Antarctica during the last 100 million years. The main questions the project seeks to answer from the drilling are glaciological (whether ice sheets existed prior to about 34 million years ago and how they have waxed and waned on Antarctica in younger times) and tectonic (at what time the continent began to rift to form the Ross Sea and the Transantarctic Mountains).
Each year in Antarctica’s early spring, the support teams head down to ready the camp. Fifteen kilometers off the coast, the drill rig is set up on sea ice between one and two meters thick, above water between 150 and 400 meters deep. The precariousness of the ice-top situation was made clear during the first drilling season a year ago, when a number of storms and a “break-out” of sea ice forced the teams to evacuate after just one week of drilling — but not before they had retrieved nearly 150 meters of core. That section contained layers dating back about 21 million years, filled with numerous new microfossil species, diverse macrofossils, and providing evidence of glaciation.
The 624 meters recovered this season (October and November 1998), covering a time span from approximately 33 million years to 20 million years ago (Oligocene to Early Miocene), weren’t completely analyzed at press time, but several surprising findings had been recorded. “The drilling program,” says Borg, “was very successful this year.” The scientists are excited about finding several new species of diatoms and palynomorphs, and hope that this can become an important reference section. And a layer of volcanic debris 110 meters below the sea floor provides the first evidence that large volcanic eruptions occurred in the area around 21.5 million years ago.
“The discovery of the volcanic material is really quite exciting,” says Borg. “It is clearly evidence of a major eruption, several times larger than Mount St. Helens and possibly comparable with the eruption [Vesuvius, A.D. 79] that destroyed Pompeii.” The thickest distinct layer of volcanic debris is 1.2 meters thick with pumice up to 1 centimeter in size. The science team, including researchers from several American universities, speculates that the volcano was within 50 to 100 kilometers of the drilling site. The discovery of the volcanic layers demonstrates a far more spectacular history of volcanism than was previously suspected for the Ross Sea region — but it is also important because it provides material for dating the core strata much more accurately.
A variety of electric logs are being run the full length of the core hole to fully characterize the strata. Peter Barrett of Victoria University, the chief scientist on the Cape Roberts Project, said researchers were delighted with both the depth they had reached and with the core itself. “It’s a credit to the 11 drillers and nine support staff who worked round-the-clock in 12-hour shifts since mid-October.” Once drilling is complete after next season, the project will have recovered core through as much as 1,500 meters of strata off Cape Roberts, coring at three separate locations to depths of up to 700 meters below the sea floor.
Recently, earthquake researchers have been using the Global Positioning System (GPS) to measure present-day strain accumulation in the New Madrid Seismic Zone. Research published in a 1992 issue of Science offered the surprising suggestion that recurrence rates in the area could be much higher than previously thought. As GPS technology has improved, so has the accuracy of the numbers — and differences between their interpretations. Two groups presented GPS research during the American Geophysical Union meeting in December that showed smaller strain accumulation rates than those published in 1992. But a lingering question remains: Could a large earthquake strike the area again? If so, when?
Not any time soon, says Seth Stein, a professor of geological sciences at Northwestern University. Stein and his team of researchers used GPS to monitor the motion of geodetic benchmarks over time in and around the New Madrid Seismic Zone. Their most recent measurements, comparing 1997 GPS data with 1993 data, show less than 2 millimeters of motion a year across the seismic zone. The movement of the 24 geodetic benchmarks they had planted in and around the zone is almost zero, Stein says. “We’re not even sure they’re moving at all.”
Their findings suggest that the seismic hazard has been overestimated, Stein says. If the strain required for the 1811 and 1812 earthquakes was 5 to 10 meters, it would take 5,000 to 10,000 years to accumulate enough strain for another such earthquake. “At plate boundaries, the plate motion over a few years is the same as for geologic time,” Stein says. He applies the same trend to the intraplate motions at New Madrid. “It’s hard to see how we could have a magnitude-7 or greater quake here any time soon. I think we should be rethinking a lot of our policy there.”
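Stein’s back-of-the-envelope can be reproduced directly from the figures quoted above: the time to reaccumulate the slip released in 1811 and 1812 is simply the slip divided by the measured strain rate. The sketch below uses the article’s ranges; the ~1 millimeter-per-year rate (within the “less than 2 millimeters” GPS bound) is an illustrative assumption, not a reported measurement.

```python
# Rough recurrence estimate from Stein's figures: years needed to
# reaccumulate the 5-10 m of slip released in 1811-1812 at the
# currently measured strain rate. Arithmetic is illustrative only.

def recurrence_years(slip_m, rate_mm_per_yr):
    """Years to accumulate `slip_m` meters of slip at `rate_mm_per_yr`."""
    return slip_m * 1000.0 / rate_mm_per_yr

# At ~1 mm/yr (GPS shows less than 2 mm/yr across the zone):
low = recurrence_years(5, 1.0)    # 5,000 years
high = recurrence_years(10, 1.0)  # 10,000 years
print(f"Recurrence: {low:,.0f} to {high:,.0f} years")
```

At the full 2-millimeter-per-year bound the same arithmetic gives 2,500 to 5,000 years, still far longer than the recurrence intervals suggested by the paleoseismic record.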
But such a change in attitude is exactly what should not happen, says Mark Zoback, a geophysicist with Stanford University who also measures strain accumulation in the New Madrid Seismic Zone. Zoback was one of the scientists who published work in the Sept. 18, 1992, issue of Science that suggested the recurrence interval in the New Madrid Seismic Zone could be as low as 500 to 1,000 years.
Recent GPS measurements taken by Zoback, fellow geophysicist Paul Segall, and colleagues show less strain accumulation than their 1992 Science paper suggested. But both researchers say that because scientists still don’t know what drives the seismic activity in the New Madrid Seismic Zone, it’s too soon to interpret the data as concretely as Stein has.
Zoback started measuring strain accumulation because paleoseismic studies showed past earthquake recurrence intervals ranging from a few hundred to a thousand years. Like the GPS measurements taken by Stein’s group, the recent GPS measurements of Zoback’s group suggest different recurrence intervals than the paleoseismic record. “It’s a complicated process we don’t fully understand,” Zoback says. “We don’t want to give the people of the mid-continent the idea that there’s no risk in the future. It’s taken a very long time to convince [them] that they should think about earthquakes. That’s what is at stake here.”
In 1991, Zoback, Segall, and Lanbo Liu, now at the University of Connecticut, compared triangulation data gathered in the 1950s to GPS data. They found a signal that implied a recurrence rate matching the paleoseismic rates. But, Zoback adds, the early numbers were error-prone, because the triangulation data were not accurate and GPS technology was in its early stages of development. They have since added more measurements and repeated earlier ones. Their findings now match those made by Stein’s group. “We agree that if there’s a signal, it’s a very small one,” Zoback says. “The signal we see now is not statistically significant. But perhaps as we look longer, we’ll see a signal emerge from the noise.”
Interpreting the numbers
Segall says that their recent numbers are low enough that, even with the smaller margins of error, a zero strain accumulation can’t be ruled out. But he shares Zoback’s caution, using a rubber band analogy. It could be that the rubber band is almost stretched to a breaking point, and the rate of strain accumulation they’ve measured since 1991 is small because the rate of strain accumulation decreases during the earthquake cycle. “When we measure strain, all we can detect is how much strain accumulated during the time we made measurements,” he says. What they don’t know is how much strain occurred between 1811 and 1950. “The amount of additional stretch may be small — it’s close to zero — but we don’t know how close to breaking that rubber band is.”
Stein is skeptical, noting that 200 years is the average time between great earthquakes on sections of the San Andreas fault. He says it’s difficult to see how a fault in the middle of a plate could be loaded as quickly as the San Andreas. Stein suggests that the 1811 and 1812 quakes were “a sort of one-time event in geologic history.” Excessive strain accumulated, maybe lasting 1,000 years and causing large earthquakes uncharacteristic of the area. “Ever since then, the thing has sort of slowed down.” The only obvious alternative, Stein says, is that the 1811 and 1812 quakes were smaller than has been assumed. “Either way, the hazard from large earthquakes is a lot less than we used to think.”
All three researchers agree that the current GPS data must be resolved with the high recurrence rates suggested by paleoseismic data. Those high paleo-rates are what led Zoback to measure present strain accumulation in the central United States. “We don’t know what’s driving intraplate earthquakes,” Segall says. “Because of that, it makes it very difficult to interpret this area.” Stein agrees that the differences between the paleoseismic record and current GPS measurements raise questions, but they don’t make him uncertain about the danger of future earthquakes. “I think we should think of New Madrid as a fascinating science question, not a major earthquake hazard.”
Two days later in Buenos Aires, Argentina, more than 160 nations, including the United States, negotiated a follow-up agreement to the protocol. The Buenos Aires agreement sets a deadline of late 2000 for establishing rules and guidelines for the protocol. Although most of the debate has centered on carbon sources, predominantly fossil-fuel emissions, both the protocol and follow-up treaty recognize that sinks storing carbon dioxide will play a pivotal role in achieving the goals of the treaty. The ocean is a major sink, and scientists are beginning to understand the benefits of forests as carbon sinks. To address the technical aspects of this issue, the U.S. Global Change Research Program held a seminar on Dec. 8, entitled “Changes in Carbon Sources and Sinks: The Outlook for Climate Change and Managing Carbon in the Future.”
Carbon sources and sinks
At the briefing, Dr. Pieter Tans, chief scientist at the National Oceanic and Atmospheric Administration’s Climate Monitoring and Diagnostics Laboratory in Boulder, Colo., said that increased carbon-dioxide emissions will cause the concentration of atmospheric carbon dioxide to stay elevated for many years. He added that increasing emissions quickly, as the world currently is on track to do, or slowing emissions to a level such as that specified in the Kyoto Protocol, will not make a difference in the final amount of atmospheric carbon dioxide. Slowing the rate of emissions will give scientists more time to study different carbon sequestration methods and develop solutions to the challenges of a world with increased carbon concentrations.
Princeton University Professor of Geological and Geophysical Sciences Jorge Sarmiento spoke about the capacity of oceans to absorb carbon dioxide from the atmosphere. He said that over the long term (centuries to millennia), the ocean will take up close to 85 percent of the total carbon emissions. This process occurs slowly, however, and only 36 percent of fossil fuel emissions are absorbed by the ocean each year. Sarmiento estimated that changing ocean temperatures, circulation, and biology caused by global warming could slow this process by 10 percent to 30 percent, but admitted that current scientific understanding of this process is minimal.
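Sarmiento’s figures can be combined in a small sketch: the ocean’s annual uptake fraction, reduced by the warming-driven slowdown he estimated. The numbers are the ones quoted above; applying the slowdown multiplicatively to the annual fraction is an illustrative assumption.

```python
# Illustrative arithmetic from Sarmiento's figures: ~36% of fossil-fuel
# emissions absorbed by the ocean in a given year, with a possible
# 10-30% slowdown from warming-driven changes in temperature,
# circulation, and biology.

annual_uptake = 0.36              # fraction absorbed per year
slowdown_range = (0.10, 0.30)     # estimated warming-induced reduction

for s in slowdown_range:
    reduced = annual_uptake * (1 - s)
    print(f"With a {s:.0%} slowdown, annual ocean uptake falls to {reduced:.1%}")
```

Even at the low end of the slowdown range, the annual uptake fraction drops noticeably below the roughly one-third absorbed today, which is part of why the speakers looked to land-based sinks as well.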
Tans and Sarmiento agreed that increasing carbon-dioxide emissions will not be fully absorbed by the oceans and therefore other, land-based alternatives must be explored. Tans explained his recent research that found a sink of carbon dioxide over the United States (see Geotimes, December 1998) and surmised that most of the carbon was taken up by trees growing on recently abandoned agricultural land. These trees successfully took in 1 to 2 billion tons of carbon annually.
Dr. William Schlesinger, from Duke University’s Botany Department, added his analysis of an additional way that plants respond to excess carbon dioxide. Through an experiment known as Free Air CO2 Enrichment (FACE), which exposes large plots of forests to high levels of carbon dioxide, Schlesinger found that in the first year, trees exposed to excess carbon grew approximately 12 percent larger than those not exposed. Trees appeared to acclimate to the higher levels of carbon, however, and did not show as much of a difference in subsequent years. Using this information, he calculated that approximately 20 percent of the fossil fuel emitted in 2050 could be stored in plants and soils. Schlesinger emphasized that plants can aid in sequestering carbon but are not a panacea.
The ability of forests to sequester carbon has been championed by a number of lawmakers, including Sen. Ron Wyden (D-Ore.), who wrote President Clinton a letter in late 1997 urging the administration to undertake additional research in this area. Wyden noted that forests have been effectively regulating carbon since their development more than 300 million years ago. In addition, forest management is much less expensive than advanced technologies or other carbon reduction methods and provides additional benefits of clean water, species habitats, and wood products. The House of Representatives passed a resolution introduced by Rep. Don Young (R-Alaska), expressing the sense of the Congress that the United States should manage its national forests to maximize the reduction of carbon dioxide in the atmosphere.
These congressional actions suggest that carbon sequestration may be one area of agreement in the often contentious debate on climate change. Before carbon sinks can be included in legislation or future treaties, however, more research is needed to quantify the amount of carbon that forests and other sinks can store.
— Kasey Shewey White, AGI Government
Their action raises an important question for individual parks and the National Park Service as a whole. What should the role of the park service be in protecting the natural resources under its management? National parks are created to preserve their resources, beauty, and wildness. NPS’s goal, says Chief of Public Affairs David Barna, is to maintain the parks as best as they can — untouched. To do so, park managers try to stay out of the way of the ecosystem, to allow for evolution and change without human influence. According to Barna, when a profound change such as fire or drought occurs within a park’s ecosystem, “NPS doesn’t tend to react at all.”
“The parks are what we call baseline real estate,” Barna says. “We’re trying to preserve them as they were before the United States began to develop. [But] what — and when — do we preserve it as? We don’t even know what we have.”
Barna says that many of the parks in the national system are now full of nonnative, exotic species. Weeds are an especially extensive problem. In the southeastern United States, kudzu is overrunning the native plant populations, and foreign plants in Florida’s Everglades are literally strangling the swamps. Native animals, too, present difficulties. Yellowstone is plagued by too many elk and bison, and white-tailed deer overpopulate and overgraze the northeastern United States. The park service constantly needs to define the natural resources that belong and those that don’t. But, admits Barna, tourism comes first.
“We’ve got the resource side of the parks, and then we’ve got the visitation side. And money goes to the needs of the public. We had 279 million visitors last year, and we have to prepare for 10 million more next year.” Without their own funds for scientists and research centers, Barna explains, the park system has to rely on outside studies. “We now recognize that we need to make decisions based on natural resources instead of tourism, but we can’t make decisions without data.”
About one-third of the parks don’t have natural-resource managers, but many of them are still trying to get a handle on their natural resources. A project at Great Smoky Mountains National Park is starting to catalog the estimated 100,000 different species of plants and animals within its borders because, says Barna, “you lose resources if you don’t know what you have.”
The national parks can — and do — serve as laboratories. To support park science, NPS has established ongoing cooperative partnerships with universities and laboratories across the United States. One study, a collaborative effort between scientists at the U.S. Geological Survey (USGS) field station at Bandelier National Monument in New Mexico and Los Alamos National Laboratory, has produced data that show for the first time how rapidly an ecosystem responds to a drastic change in climate.
For the past several years, Craig Allen of the USGS and David Breshears of Los Alamos have been studying landscape change within Bandelier. What was initially a local story about the impact of a regional drought in the 1950s has become an ideal example of how a landscape can change — and how quickly. “The drought of the ’50s had a big impact on Bandelier’s ecosystem, but it was not well documented. We now have field verification with this study,” says Allen.
Allen and Breshears pored over detailed aerial photographs taken from the 1930s to the 1970s. They also conducted a variety of field studies and reviewed historical information to verify their findings. Their data showed that within five years of the drought, trees had died and the boundary between the two ecosystems had shifted. Ponderosa pines on Bandelier’s Frijolito Mesa had died in the “fringe” areas between the ponderosa pine and piñon-juniper regions, moving the boundary, or ecotone, upslope by two kilometers.
“Previous studies have documented shifts that take place over decades or even centuries, and they focus on birth and growth of vegetation,” says Allen. “Our research shows that more attention should be paid to mortality, because the rapidity of the shift resulted from the death of ponderosas as a result of the drought in the 1950s.”
Allen stresses that he and Breshears looked at the land-use history as well. They say that the ecotone shift, while driven by the drought, was also affected by decades of overgrazing and the active suppression of fires. Piñon-junipers, smaller trees that require less moisture, had begun to increase and were already competing with the larger pines and the grasses when the drought occurred.
“This research shows how rapidly vegetation can respond to climate,” says Breshears. “It has significant implications for modeling how climate can decimate vegetation in droughts like the one we studied and for assessing the impacts of global climate change.”
“This work also has site-specific significance,” Breshears adds. “For the national monument, it’s important in helping to understand how to protect archaeological sites. For [Los Alamos] laboratory, it’s related to our studies of water resources and the movement of contaminants through ecosystems.”
At Bandelier National Monument, an understanding of the change in vegetation is crucial. One result of the change has been a sharp increase in erosion, apparently due to the loss of herbaceous ground cover during the drought. “We’re watching this hill slope fall apart before our eyes,” says Allen, describing related studies on soil erosion and erosion-control treatments. “Soil erosion is of great interest to the park management. The very old soils are eroding. This is a problem for a park that includes around 3,500 archaeological sites and countless associated artifacts — at least 70 percent of recorded archaeological sites are being damaged by erosion.”
David Barna agrees that they must pay attention to the soil erosion at Bandelier. “If you’re losing topsoil, you’re losing everything,” he says. Watching out for both the cultural artifacts and the erosion, Barna says, is a great NPS opportunity. “This is resource management at its best.”
Schultz and Argentinean scientists Marcelo Zarate and Cecilia Camilion took a fresh look at the stratigraphy of the Chapadmalal Formation. “They were well studied in terms of fossils, but the precise ages of the layers were poorly constrained,” Schultz says. One of those layers is composed of escoria. The researchers found the pieces of glass scattered across about 30 kilometers in an isolated stratigraphic layer at the top of the formation. Theories of its origin have ranged from lightning strikes to human-made hearth fires. Volcanic eruptions didn’t create the glass, Schultz says, because no volcanoes of comparable age and composition are near the area, and wind could not have transported the large pieces of glass (some 2 meters long) as it did the abundant loess in the layers.
Schultz was visiting Argentina five years ago to study a known impact crater at Rio Cuarto when the interpreter showed him the pieces of escoria. A year later, he returned to the Chapadmalal Formation, he says, “because I became obsessed with trying to see if the glass was in the layer.” Working with Zarate, who had been studying the area’s stratigraphy in detail, Schultz determined that the glass could only have been formed from the energy produced by an impact. But these glasses were much older than the crater Schultz had been studying. Pieces are rolled into the loess, indicating that they had landed in a molten form and then hardened. Surrounding some of the glass are red, bricklike materials called tierras cocidas, which probably formed when the molten material hit, cooled, and oxidized the surrounding loess. The high transient temperatures (1720°C to 1900°C) created by the impact caused zircon to break down into monoclinic ZrO2 and SiO2, leaving behind baddeleyite clusters as clues. Also, interstitial glass in the escoria contains schlieren — streaky flow bands — that are characteristic of rapidly quenched impact glass. The textures of the escoria are similar to textures seen in impact glasses from known craters.
Schultz asked Auburn University’s Willis Hames, a specialist in geochronology and metamorphic petrology, to join the research team. Using laser fusion to determine the ratio of argon-40 to argon-39 in the glass, the scientists determined that the glass is 3.3 million years old, give or take 100,000 years.
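The argon dating works through the standard 40Ar/39Ar age equation, t = (1/λ) ln(1 + J·R), where R is the measured ratio of radiogenic argon-40 to argon-39 and J is an irradiation parameter calibrated against a standard of known age. The sketch below uses the conventional total decay constant for potassium-40; the values of J and R are hypothetical illustration values chosen to reproduce an age near 3.3 million years, not the team’s actual measurements.

```python
import math

# Standard 40Ar/39Ar age equation: t = (1/lambda) * ln(1 + J * R).
# LAMBDA_40K is the conventional total decay constant of 40K (per year).
# J and R below are hypothetical values for illustration only.
LAMBDA_40K = 5.543e-10

def ar_ar_age(R, J):
    """Age in years from the 40Ar*/39Ar ratio R and irradiation parameter J."""
    return math.log(1.0 + J * R) / LAMBDA_40K

# With an assumed J = 0.01, a measured ratio near 0.183 yields ~3.3 Ma.
age = ar_ar_age(0.1831, 0.01)
print(f"{age / 1e6:.2f} Ma")
```

In practice the uncertainty in J, propagated through this equation, is part of what sets the quoted error of about 100,000 years.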
Soon, John King, an associate professor of oceanography at the University of Rhode Island and a specialist in paleomagnetism, got involved. The team tested their date against paleomagnetic dates. “It was right on the money,” Schultz says. The paleomagnetic record, which uses Earth’s past magnetic field reversals as a time scale, set the glass layer at around 3.35 million years old. The age of the glass also placed it just after layers abundant in fossils and just before fossil layers revealing the disappearance of 36 types of mammals.
Researchers from Ocean Drilling Program cruises had measured the ratio of heavy-to-light oxygen isotopes in sediment cores from the nearby ocean floor in both the Atlantic and the Pacific. Those isotopic analyses show a sudden drop in atmospheric and water temperatures almost 3.3 million years ago — indicating that the climate changed just after the glass appeared and just before many animals disappeared.
“Was this an impact that occurred nearby that changed the landscape and caused a regional extinction?” Schultz asks. “Is there a cause and effect?” Was Earth already experiencing changes that caused the deaths of so many mammals in the area, or did an impact spur the changes that caused their deaths?
Slow or sudden
The researchers don’t have the answers yet, Schultz says, because they haven’t determined whether the extinctions occurred suddenly or over a longer period of time. “I can’t tell if [the extinction] happened the next day or if it happened over 50,000 years,” he says. The team is still hunting for the actual impact crater, which may sit offshore. The escoria samples change in size and composition with distance, and these changes could act as a trail leading to the crater. The problem requires a more systematic study of the layers, Schultz says. Researchers must look for glass fragments above and below the escoria layer or search for them farther inland and on the ocean floor to make a broader connection. Investigating the cause and duration of climate change in the area also demands tighter constraints on chemical and isotopic analyses made of ocean cores.
The Chapadmalal Formation could also help Schultz study martian impacts. “This type of environment that we see in Argentina is probably very similar to the deposits we see on Mars,” he says. “Dark deposits found across Mars, in fact, may represent strewn fields of similar impact glass derived from a dusty martian surface.”