NEWS NOTES | December 1998
Closing in on extinction rates
Saving geoscience data
Our carbon sink
Diversity in the science work force
Transportation’s response to quakes
NASA officially opened on Oct. 1, 1958. Fueled by post-war prosperity and the desire to outrun the Soviet space program, the new agency was warmly welcomed by the citizens of the United States. Every launch seemed to open a new door to the future.
Within the first week, NASA’s manned-satellite program, Project Mercury, had been proposed. In less than a year, NASA had chosen its first seven astronauts for the Mercury program, including John Glenn and Alan Shepard, and had launched the Mercury capsule model. In 1962, Glenn became the first American to orbit Earth. The 1960s was the decade of the Apollo program, culminating on July 20, 1969, as millions of people around the world watched Neil Armstrong step onto the moon’s surface.
The following years brought both tragedies and triumphs. Schoolchildren of the ’80s remember the day the Challenger exploded. Today’s schoolchildren remember the day that Pathfinder landed on Mars and Sojourner rolled out onto its dusty surface.
The Mars mission signaled a new era for NASA — a push for unmanned space vehicles such as Sojourner and a new credo for the agency: “faster, better, cheaper.”
With money tight, the need to develop research objectives that fit leaner budgets was turning out to be as challenging as earlier technological hurdles. But while the space exploration program has trimmed back, NASA’s education programs have grown and prospered. The Internet now provides access to all that NASA has to offer: information on its programs as well as projects for the classroom. Teachers can go to NASA’s Education Program page (<http://www.hq.nasa.gov/office/codef/education/index.html>) and link to interactive educational shows, instructional programs, curriculum support materials, and educational guides.
The “Teacher/Faculty Preparation and Enhancement” web page lists a number of goals and objectives, such as providing teachers with the experience of lesson design using scientific inquiry, and promoting the use of NASA-related materials and resources. In October, when a teacher called in to a C-Span program for more information, NASA Administrator Daniel S. Goldin immediately responded by improving teachers’ access to slides, photographs, and educational materials on the NASA web site. All of Goldin’s speeches and presentations, including his remarks to NASA about its future, are available on-line.
In addition to the information on its web site, NASA reaches out to the public through astronaut appearances, a variety of publications, and an active media relations program. Readers of The Washington Post, for example, were recently treated to a history of the space suit, when Post staff writer Kathy Sawyer covered it from head to toe in “Suited for Space” (Oct. 14, 1998, p. H01).
Last January, Goldin spoke about the changes that have occurred in NASA engineering. “As we move into the future,” he said, “the days of 100 to 1,000 people in the back room will be something of the past. That’s what I mean when I talk about a ‘faster, better, cheaper’ NASA. Just think of the impact [that] advanced information technologies and other breakthroughs will have on power plant operations ... on package delivery businesses ... and on the automotive industry. These are the tools we need.”
NASA is aware that its programs require the public’s support if the government is going to continue to fund them. Former agency administrators have struggled to convince American citizens that the space station is needed. And in the past few years, without the dazzling spectacle of men on the moon, many Americans lost interest in NASA’s projects.
The Mars landing seemed to change that. Web browsers clogged the Internet in July 1997 as users sought glimpses of the red planet. And the Hubble Space Telescope, having overcome its initial difficulties, is beaming back spectacular images from deep space, firing up both scientists and the public.
It is this new NASA, with its strong emphasis on public education, that appears to have won the public’s support. The schoolchildren of today are the decision-makers of tomorrow, and NASA’s outreach programs represent not only a commitment to education but a pledge to the future.
“People have dabbled in this [dating technique], but this is the first time we’ve gone after the time of a big extinction,” says Samuel Bowring, a principal research scientist in the Department of Earth, Atmospheric, and Planetary Sciences at the Massachusetts Institute of Technology. Bowring and Douglas Erwin, a paleontologist with the Department of Paleobiology at the Smithsonian’s National Museum of Natural History, presented research during the Geological Society of America meeting in Toronto that constrained the duration of the final stage of the end-Permian extinction to less than 1 million years. “What we found was that extinction occurred in a very short amount of time,” Bowring says. Such a rapid extinction points to catastrophic events as the cause.
Erwin, Bowring, and colleagues published their findings in the May 15, 1998 Science. An article by Erwin and Bowring in the September GSA Today summarizes the research. They analyzed Late Permian and Early Triassic rocks — specifically from the Changhsingian stage of the Late Permian — in south China, taking advantage of volcanic ash beds layered with fossils. The zircon in the volcanic ash provided the clock — the radioactive decay of uranium into lead. “The continued refinement of chemical separations of U and Pb from zircon and the improvement of mass spectrometry allow high-precision analyses of single grains of zircon containing as little as 10 picograms of radiogenic Pb,” Bowring and Erwin write in GSA Today.
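The uranium-lead clock Bowring and Erwin describe rests on a standard decay relation: the age follows from the measured ratio of radiogenic lead to remaining uranium. A minimal sketch in Python (the decay constants are the conventional published values; the sample ratio below is hypothetical, chosen to give an end-Permian-scale age):

```python
import math

# Decay constants (per year) for the two uranium-lead clocks,
# the conventional values used in geochronology.
LAMBDA_238 = 1.55125e-10  # 238U -> 206Pb
LAMBDA_235 = 9.8485e-10   # 235U -> 207Pb

def u_pb_age(radiogenic_pb_per_u: float, decay_constant: float) -> float:
    """Return an age in years from a measured radiogenic-Pb/U atomic ratio.

    Solves N_Pb / N_U = exp(lambda * t) - 1 for t.
    """
    return math.log(1.0 + radiogenic_pb_per_u) / decay_constant

# Hypothetical ratio for a roughly end-Permian-age zircon:
age = u_pb_age(0.0397, LAMBDA_238)
print(round(age / 1e6, 1), "million years")  # -> 251.0 million years
```

In practice, high-precision work measures both the 238U/206Pb and 235U/207Pb clocks on the same grain and checks that the two ages agree, which is part of what makes single-zircon dates so robust.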
The extinction beds the researchers studied are bracketed by the volcanic ash beds. By dating the zircon in the ash beds, the researchers could narrow and bracket the time frames of the extinction beds. The thicknesses of the sediment layers, Bowring says, are less reliable indications of time durations, especially for times when catastrophic events disrupted the environment. The south China sites also helped them test their zircon dating technique, because their calculated ages agreed with the sedimentary order.
This dating method has been around since the early 1980s, Erwin says, but the researchers took advantage of a progressive refinement of the technique and applied it to a specific research question. “Paleontologists never figured you could find ash beds in the same place you find fossils,” Erwin says. “What’s new is the resolution Sam and his group are able to achieve — we realized that with the level of precision we can get, we can answer all kinds of questions.”
Causes of extinction
The researchers also bracketed carbon isotope excursions between ash beds. They found that an extreme negative excursion in the amount of carbon-13 lasted 165,000 years or less. Previous studies had constrained the excursion only to within a broad 3 to 10 million years, Erwin says. Homing in so closely on the duration of such a dramatic drop in carbon-13 helped the researchers eliminate possible causes, such as continental drift, that would act over millions of years.
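Carbon-isotope excursions like this one are conventionally reported in delta notation: per mil deviation of a sample’s carbon-13/carbon-12 ratio from a reference standard. A minimal sketch (the standard ratio below is the approximate VPDB value; the sample ratios are hypothetical):

```python
# delta-13C ("per mil") expresses a sample's 13C/12C ratio relative to a
# reference standard (conventionally VPDB, R_std ~ 0.011237).
R_STD = 0.011237  # approximate 13C/12C of the VPDB standard

def delta_13c(r_sample: float, r_std: float = R_STD) -> float:
    """Return delta-13C in per mil: 1000 * (R_sample / R_std - 1)."""
    return 1000.0 * (r_sample / r_std - 1.0)

# A sample matching the standard sits at 0 per mil; ratios below the
# standard give negative values -- the "negative excursion."
print(round(delta_13c(R_STD), 2))          # -> 0.0
print(round(delta_13c(R_STD * 0.995), 2))  # -> -5.0
```

A negative excursion therefore means only that the sample’s ratio dropped below the standard; interpreting *why* it dropped is the geological question.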
At the GSA meeting, Erwin and Bowring discussed different models for
causes of the Permian extinction. “Now that we have some idea of what the
rates are, we can debate the different models,” Erwin says. They suggest
that the sudden carbon-13 excursion could coincide with the Permian extinction,
when at least 85 percent of the planet’s marine species disappeared. The
sudden decrease of carbon-13 in less than 200,000 years could be evidence
of an oceanic overturn, when a temperature change might have altered the
circulation of ocean waters and caused surface waters to sink and mix with
bottom waters. In 1995, a group of researchers led by Paul Renne of the
University of California at Berkeley dated the Siberian Traps, flood basalt
lavas created by ancient eruptions. Renne used a laser-heating argon/argon
isotope method to find that the eruptions probably occurred at the same
time as the Permian extinctions. These eruptions, Erwin and Bowring suggest,
could have induced sudden warming after global cooling and released methane.
The carbon excursion could also point to the impact of an icy object, such
as a comet. Bowring and Erwin say that the new geochronologic data call
for more detailed paleontologic investigations to explain the short extinction event.
It can cost millions of dollars to drill deep cores, and destroying a core just because it didn’t show evidence of oil is wasteful, says Kimm Harty, deputy director of the Utah survey. “To throw all that core away is basically to get rid of data,” she says. “Scientists in the future may need the cores, maybe even for a technology we don’t know about.”
The challenge of developing a “library” for core samples is obtaining the outside funding that will make it self-sufficient. The Utah library is funded by a $1-million, interest-free loan from the Utah Department of Natural Resources, to be repaid in 20 years — the length of time Allison and Harty think it should take to fill the repository.
The Texas Bureau of Economic Geology (BEG) runs two repositories, one in Austin and one in Midland. The 93,000-square-foot Austin repository is one of the nation's largest and archives cuttings from 70,000 wells and cores from 14,500 wells.
The repository is already at 90 percent of its capacity, says Database Manager James Donnelly. The Austin repository was built in 1984 with funding from the University of Texas at Austin. The repository charges $3 for pulling a core or cuttings for viewing, but those service fees barely help the repository meet its daily operational costs, Donnelly says. A few years ago, the holdings expanded significantly when Shell Oil Co. donated the Midland repository to BEG (Department of Energy funding also contributed to the Midland repository). “What seems to be happening is that the major companies are getting out of the core storage business,” Donnelly says. “The cost of the buildings alone are prohibitive in most cases.” Often, companies let Earth reclaim the cores. “There’s probably a lot of drill core laying out there on the ground,” he adds.
Before its new building was completed in October, the Utah survey stored its samples in an 11,000-square-foot basement with 8-foot-high shelves. Offices were tucked between the shelves, Allison says. Now the survey has 10,500 square feet of shelves that are 18 feet high, and 1,500 square feet for office, lab, and classroom space.
Like many states, Utah depends on voluntary contributions of samples from petroleum and mining companies. The library also holds oil and coal samples. “We have tremendous voluntary compliance with these requests,” Allison says. He adds that most of the library’s clients are petroleum companies that use the cores and lab space to train new exploration geologists and also to map out the state’s available resources. “We’re trying to promote resource development in Utah and we want to support companies based in Utah,” he says. Other frequent users are universities, mining and engineering companies, and state agencies. Deep cores are also helpful for paleoclimate studies. Any analysis performed on core, oil, or coal samples is available for public use a year after it’s completed.
A national repository
The U.S. Geological Survey runs a federal core repository in Lakewood, Colo. The 80,000-square-foot repository holds cores from 8,000 drill holes and cuttings from 53,000 holes. Tom Michalski, curator of the Core Research Center, says the number of samples in the center is stable. When the USGS suffered budget cuts three years ago, he says, the repository lost seven staff members and 40 percent of its samples. Even with that loss, the repository is 85 percent full. Tom Fouch, regional geologist for the USGS Central Region, says the center donates core to states with viable storage facilities.
The American Geological Institute (AGI) is working to create a National Geoscience Data Repository and Research Center in Denver. The center will be a part of the National Geoscience Data Repository System (NGDRS). The Denver repository would house any data that a state or regional repository can’t accept.
In August, AGI signed a letter of intent with Stapleton Development Corp. to convert a Continental Airlines hangar complex at Denver's former Stapleton Airport into a national repository. The facility will be able to hold more than 3 million boxes of cores and cuttings.
The need for such a repository, or at least a funding source for state
repositories, is very great, Allison says. National storage capacity is
not expanding, but the amount of core and cuttings at risk of disposal
is increasing. “That’s a lot of core to dump on a system that hasn’t expanded,”
he says. A national repository would house samples from states that choose
not to create their own repositories or that no longer have the capacity
to store additional core.
In an article in the Oct. 16, 1998, issue of Science, a team of researchers from NOAA, Princeton University, and Columbia University (led by S.M. Fan) say that they have tentatively identified the long-sought terrestrial carbon sink as lying mostly in North America. Tans, an atmospheric chemist at NOAA’s Climate Monitoring and Diagnostics Laboratory (CMDL) and also a co-author of the second paper, says that “the North American land surface appears to be absorbing possibly as much as between 1 and 2 billion tons of carbon annually, or a sizeable fraction of global emissions of carbon dioxide from fossil-fuel burning.”
The research team obtained its data from 63 atmospheric sampling stations of the GLOBALVIEW database. GLOBALVIEW-CO2 is a compilation of high-quality atmospheric measurements of gases made by different laboratories, and is a product of the Cooperative Atmospheric Data Integration Project, coordinated by NOAA-CMDL.
The researchers developed a three-dimensional grid of Earth to model the flow of carbon dioxide and applied the GLOBALVIEW data to it. They expected to see atmospheric CO2 increase over North America because of fossil-fuel emissions. Instead, the model showed that, for the period studied, carbon dioxide declined across North America as the model’s winds moved from west to east. The decline indicates that the gas is being absorbed by the land mass.
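The logic of inferring a sink from a west-to-east decline can be illustrated with a toy one-box calculation; this is not the Fan team’s actual three-dimensional inversion, and every number below is hypothetical, chosen only to land in the 1-to-2-billion-ton range the article cites:

```python
# Toy one-box illustration: if winds carry air across a continent and the
# CO2 mixing ratio drops between the upwind and downwind boundaries, the
# deficit implies surface uptake. All numbers are hypothetical.

def implied_sink_gtc_per_year(delta_ppm: float,
                              air_mass_flow_kg_per_s: float) -> float:
    """Convert an upwind-to-downwind CO2 drop into a carbon uptake rate.

    delta_ppm: CO2 decline across the region (ppm by mole).
    air_mass_flow_kg_per_s: mass of air flowing through the box per second.
    Returns billions of metric tons of carbon per year (Gt C/yr).
    """
    M_AIR, M_C = 0.02897, 0.012                    # kg/mol: air, carbon
    mol_air_per_s = air_mass_flow_kg_per_s / M_AIR
    mol_c_per_s = mol_air_per_s * delta_ppm * 1e-6  # one C atom per CO2
    kg_c_per_year = mol_c_per_s * M_C * 3.15576e7   # seconds per year
    return kg_c_per_year / 1e12                     # 1 Gt = 1e12 kg

# Hypothetical: a 0.3-ppm drawdown in an airflow of 3e11 kg/s
print(round(implied_sink_gtc_per_year(0.3, 3e11), 2))  # -> 1.18
```

The real analysis instead fits surface fluxes so that a transport model reproduces the 63-station GLOBALVIEW observations, but the bookkeeping above is the underlying idea: a concentration deficit in a known airflow translates into a mass of carbon removed per year.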
The scientists are not sure what is causing the decline of carbon dioxide. They theorize that it is partly due to the regrowth of plants and vegetation on abandoned farmland and previously logged forests in North America. It may even be enhanced by human-induced nitrogen deposition, a diluted form of acid rain. Although the actual cause is unknown at the moment, the researchers believe that plants and soils are a major factor in CO2 absorption and will continue to exert considerable influence on atmospheric carbon dioxide in the future.
Tans emphasizes that even when researchers can account for most of the carbon dioxide that has been emitted globally, the future remains uncertain. “The current uptake of carbon by terrestrial ecosystems is helping to slow down the rise of CO2 in the atmosphere, but we need to know why this is happening,” he says. “Only then may we be able to project for how long into the future this process may continue.”
In other words, it’s unclear if this is good news or bad. The absorption of CO2 into the North American land mass may alleviate some of the adverse effects caused by burning fossil fuels, but it doesn’t mean we can drop all concerns for global warming. Carbon is so closely associated with global warming that data on how it is distributed throughout the world and how it is absorbed and released are of major significance.
“This finding will assist us in better understanding the global fate
of carbon dioxide,” Tans says. Jerry Mahlman, director of NOAA’s Geophysical
Fluid Dynamics Laboratory and a co-author of the Oct. 16 paper, says that
the North American sink may prove important in worldwide management of
atmospheric carbon absorption. Its value, he adds, will be at a global,
rather than a regional level.
In particular, the commission will identify the number of women, minorities, and individuals with disabilities who are pursuing careers in science, technology, and engineering in order to determine the specific fields in which they are under-represented. The study will include research on the practices of employers regarding the recruitment, retention, and advancement of these groups. In addition, the commission will examine incentives and barriers to the advancement of these groups in both undergraduate and graduate education programs. They will gather information on programs and policies that have already proven successful and compile a list of studies that have been done. Finally, the commission will issue recommendations for the government, academia, and the private sector to follow as they create a more diverse scientific work force.
The law requires the commission to release a report on these issues within one year of its appointment. Representation on the commission will consist of science, engineering, or technology businesspeople and educators appointed by the president, congressional leadership, and the National Governors’ Association. As the bill was signed into law, Rep. Constance Morella (R-Md.) remarked, “Today, we have taken a positive step forward to strengthen and expand our high-tech work force by ensuring that women, minorities, and persons with disabilities have the skills necessary to compete in the information age. By addressing this problem now, we can help to ensure that our technology-driven economy continues to flourish well into the 21st century.”
The bill is based on the results of the 1996 National Science Foundation report Women, Minorities, and Persons with Disabilities in Science and Engineering and the 1994 National Research Council report Women Scientists and Engineers Employed in Industry: Why So Few? It originally focused solely on women in science, but an amendment by Rep. Donald Payne (D-N.J.) added minorities and persons with disabilities, two other under-represented groups in the sciences, to the study. The bill authorizes $400,000 for fiscal year 1999 and $400,000 for fiscal year 2000 for the study.
National Science and Technology Council study
The Clinton administration has launched a similar effort, designed to increase diversity in the federal government's science and engineering work force. At a Sept. 10 ceremony honoring the recipients of the 1998 Presidential Award for Excellence in Science, Mathematics, and Engineering Mentoring, President Clinton directed the National Science and Technology Council (NSTC) to develop recommendations within six months. During the ceremony, President Clinton noted that “in science, engineering, and mathematics, minorities, women, and people with disabilities are still grossly under-represented, even though we are becoming an ever more diverse society. If we are serious about having the finest scientists, mathematicians, and engineers in the world, we can't leave anybody behind.” He stressed the importance of mentoring to achieve this goal, adding that the federal government supports the work of tens of thousands of scientists, who could be a tremendous source for mentors.
The NSTC study aims to increase mentoring by the federal government in scientific and technical fields by recommending linkages and improvements to its existing higher education programs. It also seeks to expand federal participation with the private sector and the academic community to strengthen mentoring in higher education.
Both projects will produce recommendations on how to increase diversity
in the nation's work force, but the real challenge will come in implementing them.
— Kasey Shewey White, AGI Government Affairs Program
Researchers from Stanford University and the University of Southern California are leading the effort to examine how emergency workers can get where they are needed in the immediate aftermath of a quake. Their research also seeks to determine how a region will recover economically when key transportation links are knocked out of service for days, months, or even years. The goal is to develop better software tools and analytical models for transportation planning and response in all earthquake-prone urban areas.
On July 10 and 11, experts from academia, private industry, and government met at Stanford’s Blume Earthquake Engineering Center for a workshop co-sponsored by the Pacific Earthquake Engineering Research (PEER) Center. They selected the Bay Area for PEER’s first demonstration project — a detailed, three-year study of its roads, bridges, ports, railways, and airports — because San Francisco’s transportation network is more complex than that of Los Angeles. “The group decided it would be a better testbed,” says Anne Kiremidjian, professor of civil engineering at Stanford and director of the Blume Center. “There is not as much redundancy in the Bay Area system, and the reconstruction of the Bay Bridge over the next two years also might provide valuable information on time delays and traffic re-routing.”
The demonstration project also focuses on developing methods for analyzing the indirect economic impacts of earthquake damage to transportation systems. Past studies just looked at the direct costs of replacing and repairing the facilities. Kiremidjian says that the new information will aid decision-makers in setting priorities for seismic retrofitting and post-quake construction projects. She adds, “The workshop group recommended we look at the impact on [the] Silicon Valley electronic business, because it is such a major component of the economy of the Bay Area.”
Although many tools have already been developed for assessing earthquake
risk, the PEER participants believe there is a need for an analysis tool
that is more specifically focused on transportation. Some researchers said
that predicting how people and cargo will move around after an earthquake
is even harder than predicting where damage will occur. Samer Madanat,
associate professor of civil and environmental engineering at the University
of California-Berkeley, says that existing traffic modeling systems don’t
take into account the altered mindsets of people after a disaster. He adds,
“The news media can be a great help
in getting people to minimize the usage of essential infrastructure in the first few hours after a disaster.”
In their efforts to predict traffic patterns, planners and emergency personnel are further hampered because citizens frequently change their own travel routes in the days following a disaster. It’s also hard to foretell which roads and bridges might fail; past models have predicted more failures than have actually occurred. The university researchers who participated in the PEER workshop are beginning to tackle these problems.
Kiremidjian worked with Nesrin Basöz, a senior staff engineer at K2 Technologies in San Jose, to develop a bridge classification system. They put together a multiple origin-and-destination network analysis method for deciding how to allocate resources in situations where multiple roadway links have failed but where several hospitals, police stations, and fire stations are available to respond to an emergency.
Basöz also analyzed the bridge damage data reports from the 1994 and 1989 earthquakes in Northridge and Loma Prieta, and found inconsistencies in the way inspectors evaluated the damage, says Kiremidjian. “Her research suggests that better standards for assessing damage should be set to provide more robust data. Software tools to guide inspectors in the data collection would be very helpful.”
The demonstration project is funded by PEER, which received a grant
from the National Science Foundation and matching funds from California
when it was established in October 1997.
The theory that Earth was a giant ice planet, or “snowball Earth,” at least once during the Late Proterozoic was first proposed by Joseph L. Kirschvink of the California Institute of Technology in 1992. He based his conclusion on paleomagnetic data for continental positions that suggested glaciation reached the equator. Since then, many researchers have proposed a variety of theories for how Earth’s climate might have experienced an abrupt transformation to a planet-wide glaciation. An unsolved question is how warm-water carbonates and glacial diamictite — poorly sorted rocks formed at the base of glaciers — came to lie close together, often directly on top of one another, in the Neoproterozoic. In GSA Today, a team led by Paul Hoffman of Harvard University writes that high orbital obliquity, true polar wander, reduced solar luminosity, snowball albedo, carbon-dioxide drawdown, stagnant ocean overturn, and reinterpretation of the diamictite as mega-impact ejecta have all been offered as explanations for the dichotomy. Their recent research, they write in Science, shows that the snowball-Earth hypothesis best fits the geological and geochemical observations, including the mystery of why carbonates appear immediately above the diamictite.
Hoffman’s team analyzed changing carbon-13 values in about 800 samples of glacial deposits from the Otavi Group of the Congo Craton in Namibia. During the Neoproterozoic, the craton was a low-lying platform the size of the conterminous United States, blanketed by carbonates and shale containing glacial diamictite. The Otavi Group contains the Chuos and Ghaub formations, two discrete glacial units dating from 760 to 700 million years ago. The scientists observed sudden changes in the amount of carbon-13 present in the layers below and above the diamictite. Marine organisms preferentially take up carbon-12; when sediments washing into the oceans buried these organisms, that carbon-12 was buried with them, leaving the ocean water enriched in carbon-13. Preglacial sediments carry carbon-13 values of 5 to 9 parts per mil, the researchers report, while the sequences directly beneath the subglacial unconformity drop as low as -5 parts per mil. Immediate postglacial values are -3 parts per mil, rising to 0 parts per mil above the cap carbonate.
Interpreting the changes in carbon-13 values as carbon burial fluxes, the researchers found that the proportion of organic carbon to total carbon burial changes from almost 0.5 before the glacial deposits to almost zero immediately after the glacial deposits. “The isotopic pattern ... is consistent with the hypothesis of a snowball Earth, in which oceanic photosynthesis would be severely reduced for millions of years because the ice cover would block out sunlight,” they write.
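The step from carbon-13 values to an organic burial fraction uses the standard steady-state carbon-isotope mass balance. A sketch of that interpretation (the input value and fractionation below are typical textbook figures, not necessarily the exact parameters Hoffman’s team used):

```python
# Steady-state mass balance: input carbon (delta_in) leaves the ocean either
# as carbonate (delta_carb) or as organic matter, offset from carbonate by
# the photosynthetic fractionation eps. Solving
#   delta_in = f_org * (delta_carb - eps) + (1 - f_org) * delta_carb
# for the organic fraction gives f_org = (delta_carb - delta_in) / eps.
DELTA_IN = -6.0  # per mil, approximate mantle/input carbon (assumed)
EPS = 25.0       # per mil, typical carbonate-organic offset (assumed)

def organic_burial_fraction(delta_carb: float,
                            delta_in: float = DELTA_IN,
                            eps: float = EPS) -> float:
    """Fraction of buried carbon that is organic, from carbonate delta-13C."""
    return (delta_carb - delta_in) / eps

# Preglacial carbonates near +6 per mil imply roughly half the carbon was
# buried as organic matter; postglacial values near -5 imply almost none.
print(round(organic_burial_fraction(6.0), 2))   # -> 0.48
print(round(organic_burial_fraction(-5.0), 2))  # -> 0.04
```

With these assumed parameters, the measured swing from +6 to -5 per mil reproduces the paper’s contrast between a burial fraction of almost 0.5 and almost zero, consistent with photosynthesis nearly shutting down under an ice-covered ocean.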
Hoffman and his colleagues propose that when life was abundant in the
oceans, organisms pulled carbon dioxide out of the air. Over time, the decrease
in carbon dioxide cooled the planet and the oceans froze. Eventually, carbon
dioxide from volcanoes warmed Earth’s climate again. The Otavi Group sequences
suggest that a drop in carbon dioxide in the atmosphere caused a planet-wide
freeze that lasted about 10 million years. Earth oscillated between a glacial
and greenhouse climate at least four times between about 760 million
years ago and the Cambrian explosion of biodiversity. The sudden
warming after the planet-wide glaciation quite possibly spurred the Cambrian
explosion, the researchers add.
That was part of the message that Barbara Tewksbury, a professor of geology at Hamilton College in New York, delivered in October at the Geological Society of America’s annual meeting in Toronto.
The key word for geoscience departments, she says, is recruitment: “Recruit students to major in geology and to have a rigorous training in geology, even if they don’t pursue geology careers.” But she is not calling for a watering-down or broadening of undergraduate geoscience curricula, she stresses. Rather, geoscience faculty need to change their attitudes about the kinds of students they recruit. “This should be an appropriate major for people who are not planning careers in geology — departments should recruit those very bright people, who are interested in Earth, regardless of their career paths.”
Graduates can take their geology degrees not only into geoscience careers, but to law school, business school, or other programs. “Those kinds of people are increasingly in demand in business,” Tewksbury points out. People well-trained in science who also pursue degrees in law or business administration take the scientific inquiry method into business and policy. Businesses are demanding “gold-collar workers,” people with skills in many disciplines. And the undergraduate geology departments in the nation’s universities can help train the kind of 21st-century work force that a global economy demands, Tewksbury says.
Scott Rayder, for example, earned his geology degree from Hamilton and now works on oversight issues for the House Committee on Science in Washington, D.C. He says his geology background has made him a more informed writer, given him a better appreciation of Earth, and provided a huge career advantage. “In geology we talk about ‘hard rockers’ versus ‘soft rockers,’ yet physics and biology have subcultures in their disciplines as well,” Rayder says. “When you can understand these different cultures and can communicate them, it gives you an advantage when talking about concepts that are foreign to scientists — especially politics!” After he graduated from Hamilton in 1990, Rayder earned his master’s degree in public administration, with a concentration in science and technology policy, from Syracuse University.
The value of a science-business combination has already been “officially” recognized by at least two business schools: the Johnson Graduate School of Management at Cornell University and the MBA program at Pennsylvania State University. Cornell started admitting applicants holding advanced degrees in science to its program in 1995. That same year, Penn State admitted freshmen to its new science B.S./MBA program, which now graduates students with both a bachelor’s degree in general science and an MBA. Other schools are considering similar multidisciplinary programs.
Tewksbury, who has taught at Hamilton College for 22 years, says her department has evolved into one that recruits students who possess a variety of career goals. All of the students are prepared to pursue geoscience careers. But, she adds, “we don’t count as failures the people who go out to law school or business school. More than anything else, it requires an attitude change on the part of the faculty.” The change in her department’s way of thinking happened gradually. Geoscience professors have been trained to produce people who will be geoscientists. “Of course we want to do that,” she says. “But it also has put blinders on us.”
During her GSA presentation, Tewksbury outlined the benefits of this
recruitment strategy for geoscience departments: a high number of students
every year, a collection of bright students, and the freedom for a teacher,
with good conscience, to recruit a student even during times when geoscience
jobs are scarce.