NEWS NOTES | May 1999
Volcanism on Kerguelen
Common ground on climate change?
Do machines have the right stuff?
From a safe distance
Seattle pilots earthquake map
Volcanism on Kerguelen
Led by co-chief scientists Millard F. “Mike” Coffin of the Institute for Geophysics at the University of Texas at Austin and Fred Frey of the Massachusetts Institute of Technology, an international team of 45 scientists collected cores across the Kerguelen Plateau in the southern Indian Ocean and is continuing to study the recovered samples. In addition to determining the age of each part of the plateau, the researchers have turned up some surprising results about the nature of the volcanism that formed Kerguelen.
Previous Ocean Drilling Program (ODP) expeditions recovered material that dated the Southern Kerguelen Plateau to 110-115 million years ago (Ma) and the Central Kerguelen Plateau to approximately 85 Ma. The recent leg refined those dates to 110 Ma and 85-95 Ma, respectively, and also brought in samples from the Northern Kerguelen Plateau, which ODP had not previously drilled. The dates for the northern samples show that this part of the plateau formed less than 35 million years ago, indicating that the large igneous province (LIP) formed through several long-lasting, temporally separated volcanic episodes.
Volcanism over time
Between 110 and 85 million years ago, prolonged and effusive volcanism formed most of the LIP, including what are now the southern and central portions. Scientists compare the volcanism to that forming Iceland and Hawaii and view it as “environmentally benign.” But the scale of the volcanism that created the Kerguelen LIP was far greater than any in recorded history. Laki, a volcano in Iceland, provides a contrasting example. Its eruption, which lasted six months in 1783-84, altered Europe’s climate for years; 75 percent of Iceland’s livestock and about 25 percent of its population died. Yet the lava field Laki produced was just 1 percent of the size of a typical flood basalt flow in a LIP such as Kerguelen. The environmental effects of the volcanism over the millions of years it took to create the plateau must therefore have been substantial, if not catastrophic.
Above the sea
Through time, as the tectonic plates of the Indian Ocean moved, they carried the landmass of Kerguelen away from its volcanic source. The LIP crust cooled, contracted, and gradually subsided to depths of 1 to 2.5 kilometers below sea level. Millions of years later, the Northern Kerguelen Plateau began to form. A surprising finding from three of the drill sites showed that large-scale explosive volcanism, in contrast to the prolonged effusive volcanism that formed the bulk of the LIP, was a common phenomenon at Kerguelen near the end of its volcanic construction.
The tectonic and volcanic evidence shows that much of Kerguelen formed above sea level. The ODP team discovered fossils and terrestrial plant remains as well. “We found abundant evidence that much of the Kerguelen LIP formed above sea level,” says Coffin. “Wood fragments, a seed, spores, and pollen recovered in 90-million-year-old sediment from the Central Kerguelen Plateau, just southeast of Heard Island, unambiguously indicate that this region was above sea level.”
The scientists also discovered continental matter within the plateau. “A spectacular result was finding uniquely continental rocks in a conglomerate that was probably deposited in a river on Elan Bank, a western salient of the Central and Southern Kerguelen Plateau,” says Frey. “Understanding how pieces of an ancient continent were incorporated into the oceanic environment of Elan Bank will have significant impact on our understanding of the approximately 130-million-year-old breakup among Australia, India, and Antarctica.”
Common ground on climate change?
Credit for Voluntary Reductions Act
On March 4, Sen. John H. Chafee (R-R.I.), chair of the Senate Environment and Public Works Committee, introduced S. 547, the Credit for Voluntary Reductions Act of 1999. In his introductory remarks, Chafee said, “There is growing certainty in the international scientific community, and indeed within our own business community, that human actions may eventually cause harmful disturbances to our global climate system.” He also acknowledged that a great deal of uncertainty exists about how the United States will address this issue. Because of this uncertainty, he explained, business leaders are reluctant to take action to reduce emissions because they fear they would not receive credit under a future greenhouse-gas reduction program. “It is this uncertainty, this regulatory and financial risk, that our legislation intends to diminish,” he stated.
Chafee’s bill would allow the president to enter “a legally binding early action agreement” with businesses. These businesses would be given credits for their greenhouse-gas reductions made over a baseline during the voluntary period. These credits can be for activities to reduce emissions and sequester carbon, including actions taken overseas. The bill proposes using emission levels from 1996 through 1998 as a baseline. Chafee stated, “Because we do not know when, if ever, the United States will impose emissions reductions, we do not know the duration of the actual voluntary period. The bill does, however, establish a 10-year sunset on the voluntary crediting period.” Businesses would be responsible for measuring, tracking, and reporting their emissions.
Chafee emphasized that he in no way supports the U.S. ratification of the Kyoto Protocol, the 1997 United Nations treaty to reduce greenhouse-gas emissions. Some of the 11 bipartisan original cosponsors of the Chafee bill made the point even more clearly in their remarks on the Senate floor. Sen. John Warner (R-Va.) said, “I continue to feel strongly that the [Kyoto] protocol is fatally flawed. This bill is about protecting United States companies that have or are interested in taking voluntary steps to lower their output of carbon dioxide and other greenhouse gases.” Other senators, who believe the United States has a role in international emission reductions, support the bill as well. Sen. Joe Lieberman (D-Conn.) cited the American Geophysical Union’s climate-change statement (Geotimes, April 1999, p. 7) that says, “present understanding of the Earth climate system provides a compelling basis for legitimate public concern over future global and regional scale changes” and called the bill “an insurance policy to protect us from the dangers of climate change.”
Reaction to the bill
Outside Congress, the bill has some unlikely allies. Fred Krupp, executive director of the Environmental Defense Fund, called it “a bipartisan breakthrough which can jump-start emissions reductions.” Similarly, Kevin Fay, executive director of the International Climate Change Partnership, a business group whose membership includes British Petroleum-Amoco, General Motors, and General Electric, said that his group has “consistently stressed the need to provide legally binding assurances that voluntary actions to reduce greenhouse-gas emissions will be credited in any future mandatory scheme adopted by the government.”
Although the Chafee bill appears to have more support than previous climate-change initiatives, it is likely to have a difficult journey through Congress. Several senators believe it will increase support for the Kyoto Protocol and oppose it on those grounds. Senate Energy and Natural Resources Committee Chair Frank Murkowski (R-Alaska) said the bill is “clearly a case of putting the cart before the horse.” He said lawmakers “should not rush to implement a treaty that has not been ratified or even submitted to the Senate.” In addition, several environmental groups, including the Sierra Club and Greenpeace, criticized the bill for having too many loopholes. They joined with other environmental groups to issue a statement that the bill “could only move forward with major alterations.” Chafee did not appear fazed by the criticism, stating, “Have we satisfied all the environmental groups with this bill? No, but that is the story of my life.”
— Kasey Shewey White
AGI Government Affairs Program
Do machines have the right stuff?
While the importance of manned exploration has been debated for decades, recent advances in unmanned exploration have enlivened the debate. The new technology has led more scientists to question the added cost of sending humans beyond low-Earth orbit. Other scientists counter that a human presence is vital for exploring extraterrestrial terrains, because the human mind brings a complex yet efficient perspective to the discovery of a new environment. With the successes of NASA’s Pathfinder project and other “faster, cheaper, better” discovery-class satellites and probes, the value of a human presence now leads many discussions of the future of space exploration.
Citing the advancing technology of unmanned missions and the greater cost and complexity of human exploration, some scientists suggest that NASA should devote more of its limited resources to remotely operated spacecraft, landers, and rovers. “There are pressing scientific reasons for people to be able to go beyond low-Earth orbit, but I’m personally not convinced that geologic field work is one of them,” says Harry Y. (Hap) McSween Jr. of the University of Tennessee-Knoxville. “The real purpose of a rover on Mars is to be a geologist.”
The much less expensive rover, however, has capabilities that a human geologist does not. For example, the Athena instrument package that will be carried by the rover on the Mars Surveyor mission in 2003 will be able to study mineralogy, elemental chemistry, and even organic molecular components in situ. In addition, orbiting instruments, such as the Thermal Emission Spectrometer on the Mars Global Surveyor, have the capability to map surface mineralogy across the entire planet, on a scale as fine as 3 kilometers.
While not necessarily cost-effective, human explorers do provide missions with certain advantages, says McSween. He doubts, for example, that a machine will ever perform the difficult sampling required to hunt for fossils on other planets. “You want to do paleontology on Mars? Look how hard it is on Earth. I don’t think that you’ll ever get a machine capable … [of doing] the difficult sampling that may be required.”
Paul Spudis of the Lunar and Planetary Institute in Houston agrees. “A natural setting is tens of billions of bits of data, but only a small set is relevant to the task at hand,” he says. Spudis emphasizes that humans can focus on the small dataset and place it within a “regional context.” A rover is merely a tool for a human many millions of miles away, and is incapable of making such decisions. Also, the time needed for a communications signal to travel from Earth to a martian rover and back ranges from 6 minutes to around 40 minutes. A human on site could decide where to explore in seconds, and is less likely to get stuck on a rock a few meters from the landing site.
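The quoted delays correspond roughly to the round-trip light time between Earth and Mars. A short sketch makes the point; the distances used here are approximate orbital extremes, not figures from the article.

```python
# Illustrative only: one-way and round-trip signal delays between Earth
# and Mars at the rough extremes of their separation (~55 million km at
# closest approach, ~400 million km near conjunction).
C = 299_792_458  # speed of light, m/s

def light_time_minutes(distance_km: float) -> float:
    """One-way signal delay in minutes over a given distance."""
    return distance_km * 1_000 / C / 60

for label, d_km in [("closest approach", 55e6), ("near conjunction", 400e6)]:
    one_way = light_time_minutes(d_km)
    print(f"{label}: one-way {one_way:.1f} min, round trip {2 * one_way:.1f} min")
```

The round trip, roughly 6 to 44 minutes, is the delay that matters for a command-and-response cycle with a rover.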
To point out the advantages of human exploration, Spudis and McSween both compared the NASA Apollo missions to the Soviet Union’s unmanned Luna missions of the 1970s. “Look at cost effectiveness,” says Spudis. “Manned [exploration] is more expensive, but gives far more data.” The Luna missions retrieved small samples adjacent to the landing site and returned them to Earth for study. In contrast, the Apollo astronauts were able to sample from an area that was orders of magnitude greater, draw conclusions based upon observations on site, and set up complex equipment. Observations made from lunar orbit by Al Worden during Apollo 15 actually led to the selection of the landing site for Apollo 17.
Within the past few months, the National Research Council’s Space Studies Board has reconvened the Committee on Human Exploration (CHEX), first formed in the late 1980s, to address a number of issues, including the manned vs. unmanned exploration debate. In the last decade, increased understanding of global issues, such as the threat of near-Earth objects striking Earth, has prompted the committee to revisit the subject.
The first American to fly on Mir, Norman Thagard of Florida State University, chairs the committee of eight experts in space medicine, biology, engineering, and space policy. University of Tennessee’s Hap McSween is the only planetary scientist on the committee. According to Space Studies Board Director Joseph Alexander, the new CHEX has already met in a low-level planning capacity to prepare for formal meetings later this year. Their conclusions may help NASA determine the status of manned exploration missions, including one of the most controversial proposals: the colonization of Mars.
— Joshua Chamot
Department of Geology, University of Tennessee, Knoxville
From a safe distance
Volcanologists have devised a variety of volcano-monitoring techniques. Using the network of Global Positioning System (GPS) satellites, for example, researchers can monitor ground swelling on and around active volcanoes. GPS receivers stationed on a volcano can radio position measurements to computers in the lab, making it possible for researchers to continuously monitor the volcano’s changes from a safe distance. Magma movement within a volcano produces subtle, and sometimes dramatic, movements at Earth’s surface, and dense networks of receivers can record these movements, including those that may precede an eruption. But such networks have been unrealistic because a single sophisticated GPS receiver can cost up to $20,000 to install.
“The cost for installing dense monitoring systems becomes prohibitive for most volcanoes around the world,” says Charles Meertens, a researcher with UNAVCO, the University NAVSTAR (Navigation Satellite Timing and Ranging) Consortium, which advances the use of GPS for geoscience research. “If you’re only measuring two to three points, you may miss part of the signal that enables you to measure where the magma is and how it’s changing with time.”
Meertens recently designed a new system for deploying cheaper, single-frequency GPS receivers — costing about $4,400 per unit — in dense networks that can still gather accurate measurements. A handful of the world’s approximately 500 active volcanoes are already dotted with the more costly dual-frequency receivers (which correct for radio signal delays in Earth’s ionosphere). Meertens’ team will deploy less expensive single-frequency receivers that aren’t as accurate individually but, in large numbers, can give accurate measurements of relative ground motions. If volcanic activity destroys one of the receivers, plenty of others remain to monitor the volcano.
Meertens and other researchers from UNAVCO and several universities are deploying dense arrays of single-frequency GPS receivers around volcanoes like Popocatépetl in Mexico, Mauna Loa in Hawaii, and Taal in the Philippines. In 1998, the team installed a dense array around the active Long Valley Caldera in California.
These arrays measure localized movements over distances of 1 to 10 kilometers rather than the large-scale, regional movements measured in tectonic studies. Instead of deploying one or two expensive receivers that process data before sending them, the researchers are deploying 10 to 30 less-expensive receivers that send raw data to a base station (a system of telemeters and recording computers) for processing in near real time.
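The per-unit costs quoted in the article ($20,000 for a dual-frequency installation, about $4,400 for a single-frequency receiver) show why dense arrays only become affordable with the cheaper units. A back-of-envelope sketch, using array sizes from the 10-to-30 range mentioned above:

```python
# Rough cost comparison using the per-unit figures quoted in the article.
DUAL_COST = 20_000    # dollars, dual-frequency receiver installation
SINGLE_COST = 4_400   # dollars, single-frequency receiver

for n in (10, 20, 30):
    dual_total = n * DUAL_COST
    single_total = n * SINGLE_COST
    print(f"{n:2d} receivers: dual ${dual_total:,} vs single ${single_total:,}")

# The ratio is fixed at roughly 4.5x regardless of array size:
print(f"cost ratio: {DUAL_COST / SINGLE_COST:.1f}x")
```

A 30-receiver single-frequency array thus costs less than seven dual-frequency installations.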
“For local movements on a volcano, the single-frequency receivers provide adequate data about their relative movements,” says Michael Hamburger, an associate professor of geological sciences at Indiana University. A few dual-frequency receivers in the hybrid arrays help correct for ionospheric delays.
Hamburger leads a team that is working with the Philippine Institute of Volcanology and Seismology to install a network of single-frequency receivers on Taal. The volcano, one of the most active and deadly in the Philippines, has erupted at least 10 times in this century and has been in a period of unrest since 1992. The new receivers will join three dual-frequency receivers, and this hybrid network will complement two dual-frequency receivers stationed about 60 kilometers north in Manila. The volcano network monitors local magma movements, and the larger network formed by the volcano array and the Manila stations monitors regional tectonic movements, helping researchers study possible relationships between tectonic and volcanic movements. “It’s an exciting new application of GPS technology to another earth-science application,” Hamburger says. His team will start installing the single-frequency receivers later this year.
A team led by Meertens and Tim Dixon of the University of Miami worked with the Geophysics Institute of the Universidad Nacional Autonoma de Mexico (UNAM) to install single-frequency receivers on Popocatépetl in March. The volcano was active as recently as February. As with the Taal network, the single-frequency receivers add to an existing base of dual-frequency receivers.
Dixon, the UNAVCO team, other U.S. universities, and the U.S. Geological Survey hope to deploy permanent networks of GPS receivers around 10 Pacific Rim volcanoes considered hazardous to populated areas. The networks would form a real-time warning system. To date, the deployments of dual-frequency and single-frequency receivers have been funded by the National Science Foundation, the National Aeronautics and Space Administration, and other organizations. Seth Stein, scientific director for UNAVCO and a professor of geological sciences at Northwestern University, says the group is working to secure funding to deploy networks on more volcanoes. The group will also develop software for transferring and analyzing the GPS data and presenting them in a form useful for hazard assessment. Eventually, he adds, the data would be available on the World Wide Web.
Seattle pilots earthquake map
Recent research has suggested that the Puget Sound area, which is lined with deep faults and sits atop the Cascadia subduction zone, could suffer a magnitude-9 subduction-zone earthquake, like the one that apparently struck about 300 years ago. The region could also suffer large earthquakes within the shallow part of the crust. The research further suggests that a major earthquake shook the Seattle area about 1,100 years ago, causing landslides in Lake Washington and generating a tsunami in Puget Sound. Four magnitude-7 quakes have hit the Puget Sound and Georgia Strait areas in this century.
Geoscientists have completely rewritten the hazard assessment for western Washington in the last decade, says Craig Weaver, a geophysicist with the U.S. Geological Survey and professor at the University of Washington. Weaver is also a researcher with SHIPS (Seismic Hazards Investigation in Puget Sound), one of the most complex studies ever conducted to assess earthquake hazard in an urban area.
The SHIPS study is part of Project Impact, an initiative of the Federal Emergency Management Agency to make U.S. cities more resistant to the destruction caused by natural hazards. It is also part of the National Seismic Hazard Mapping Project of the USGS. When Weaver discovered that Seattle was one of seven pilot cities for Project Impact, he worked to incorporate development of a seismic hazard map into the city’s overall hazard evaluation scheme. “There’s an awful lot of interest in using these maps in nontraditional ways,” Weaver says. He and other geologists hold monthly meetings with city planners and Seattle business owners to develop the hazard map.
The SHIPS survey focuses specifically on urban centers, says Mike Fisher, a chief scientist for SHIPS and member of the USGS Coastal and Marine Geology Team. The SHIPS team will produce a three-dimensional map of the geologic structure from Puget Sound up through the Georgia Strait, beneath the cities of Olympia, Tacoma, Seattle, Everett, and Bellingham in Washington state as well as Victoria and Vancouver, British Columbia. The map will help pinpoint where the amplitudes of earthquake waves will be highest and where their durations will be longest.
Researchers from USGS, the Geological Survey of Canada, and five universities started SHIPS in March 1998. The logistics of surveying rugged terrain and winding waterways, and especially of deploying and maintaining hundreds of temporary seismometers, required the cooperation of five different science teams led by Anne Trehu of Oregon State University; Robert Crosson and Kenneth Creager of the University of Washington; Thomas Brocher, Thomas Parsons, and Uri ten Brink of USGS; George Spence of the University of Victoria; and Barry Zelt and Philip Hammer of the University of British Columbia. A team of 12 biologists, led by John Calambokidis of the Cascadia Research Collective, ensured that marine mammals weren’t harmed by strong sound waves. “For future success in regional endeavors of this kind, we must pool people and resources from a large number of institutions,” Fisher says.
Using the University of Washington’s Thomas Thompson research vessel and the Canadian survey’s Tully (with Canada’s Roy Hyndman as chief scientist), the SHIPS team conducted air-gun surveys over hundreds of miles of waterways in Puget Sound, Georgia Strait, and the Strait of Juan de Fuca. For two weeks, they set off thousands of shots from air guns to send pressure waves through Earth, while a network of off-shore and on-shore seismometers (some in back yards) measured the arrival times of reflected and refracted waves. The survey focused on the main urban centers and measured the transit times over a finer grid than was previously possible using the region’s permanent seismometers, Fisher says.
They found that the Seattle basin is about 1 kilometer deeper than originally believed (reaching down about 10 kilometers) and that the upper crustal rock is weaker than they thought. These differences can significantly increase the expected amplitudes of earthquake waves, especially within the basin. Several researchers will present additional reports from the SHIPS survey during the Seismological Society of America annual meeting, May 3-5, in Seattle.
The 1998 marine surveys provided a view of the Seattle basin along north-south seismic lines. To gain an east-west perspective of the basin structure, and thus assemble a 3-D image of the basin’s geometry, a team led by USGS geophysicist Tom Brocher will conduct an on-shore seismic survey, called “Dry SHIPS,” this month.
“The sedimentary rock fill of the basin tends to amplify the ground motions,” Brocher says. Together, the March 1998 marine survey and the new Dry SHIPS survey will yield a 3-D velocity model of the Seattle basin and the surrounding region’s geology. A network of about 1,000 seismometers will measure sound waves from the approximately 50 underground explosions the team will set off in the area.
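The amplification Brocher describes can be estimated, to first order, from the seismic impedance contrast between basin sediments and bedrock. The sketch below uses that standard approximation with illustrative material properties; none of the numbers are SHIPS results.

```python
import math

# First-order site-amplification estimate: for a vertically incident
# wave, conservation of energy flux gives an amplitude amplification of
# roughly the square root of the impedance ratio (density x velocity)
# between bedrock and the softer sediments above it.
def amplification(rho_rock, v_rock, rho_sed, v_sed):
    """Approximate amplitude amplification from impedance contrast."""
    return math.sqrt((rho_rock * v_rock) / (rho_sed * v_sed))

# Illustrative values: bedrock ~2700 kg/m^3 at 3000 m/s shear velocity,
# basin sediments ~2000 kg/m^3 at 600 m/s.
print(f"amplification ~ {amplification(2700, 3000, 2000, 600):.1f}x")
```

Even this crude estimate shows amplitudes a few times larger inside the basin, which is why mapping the basin's depth and fill matters for the hazard model.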
Eventually, the 3-D image of the basin’s geometry will feed into computer models that simulate and predict ground shaking during quakes and, more importantly, locate the areas of strongest shaking.
“The Seattle fault extends to great depth into the crust,” Fisher says, “and we are looking for other deep faults that threaten Seattle.” Glacial deposits, which Weaver says cover surface faults, make the Seattle area harder to study. Also, he adds, seismologists don’t understand the general ground deformation of the Seattle basin as well as the ground deformation around faults such as the Puente Hills fault, recently found beneath Los Angeles.
Will the seismic hazard map actually result in new building designs or building retrofits? “That’s part of the experiment,” Weaver says. Either way, the city will get its first updated geologic map since the 1950s.