Top natural hazards news stories of 2006
Looking into landslides
Coming on the heels of already devastating earthquakes or volcanic eruptions, landslides are often overlooked as a source of danger, some scientists say. In October 2005, landslides triggered by the magnitude-7.6 earthquake in Kashmir — a border region between Pakistan and India — funneled through canyons, destroying homes and displacing more than 400,000 people. The earthquake’s death toll was more than 80,000 people (see Geotimes, January 2006).
Rescue workers search for survivors of a devastating landslide that buried a Philippine city in February. Photograph is courtesy of U.S. Marine Corps/Cpl. Justin Park.
On a smaller scale, heavy rainstorms can also trigger deadly landslides. In January, storms in Indonesia set off a landslide that engulfed a village, burying 200 people under the mud; in August, a rain-induced mudslide in Nepal swallowed a village of nearly 500 people. And a February landslide triggered by storms on the Philippine island of Leyte killed more than 1,000 people (see Geotimes, April 2006).
“Landslides tend to be pretty catastrophic for relatively few people in a relatively small area,” says Raymond Wilson of the U.S. Geological Survey (USGS) in Menlo Park, Calif. “But there are events that can happen that can trigger thousands of landslides over regions of hundreds of square miles,” he says. If the trigger event is big enough, like a very large storm or an earthquake, a local problem can quickly become a regional or even a national one, he says.
The public is still largely unaware of the dangers, but a January report by scientists at the United Nations University aimed to raise awareness among diplomats and international organizations and identified priorities for researchers and governments (see Geotimes, March 2006). More than 10 million people were displaced, killed or otherwise impacted by nearly 500 large landslides from 1903 to 2004, according to the report.
How best to reduce the risks associated with landslides is still being researched, yet funding for landslide research is scant, Wilson says. “There are so few people and it takes several years to get new efforts going,” he says.
Researchers are investigating new technologies that can keep track of warning signs, such as changes in water pressure and vibrations. For example, USGS geoscientists are looking into tiny GPS-equipped electronic sensors known as “smart dust” that can be sprinkled onto a hillside to monitor those parameters, Wilson says. Another promising technology is radar interferometry, which can detect movement on very large, slow-moving landslides, he says.
These projects are still “in the formative stages,” however, Wilson says. “It’s a very challenging technical problem to try to predict.”
Getting ready for the rumble
Galvanized by the magnitude-9.1 Sumatra earthquake that triggered the devastating Indian Ocean tsunami in December 2004, seismologists continued this year to seek ways to provide faster, more accurate warnings that a giant earthquake could be imminent. This year, which also marked the 100th anniversary of the devastating 1906 San Francisco quake (magnitude 7.8) (see Geotimes, April 2006), the focus has not only been on developing new warning systems, but also on unlocking the complexities of such “mega-earthquakes.” Scientists have delved into how stresses build, how ruptures develop and even how large temblors can trigger destructive aftershocks.
The very first rumblings of an earthquake can contain clues to its ultimate size, according to geophysicist Erik Olson of the University of Wisconsin-Madison and seismologist Richard Allen of the University of California-Berkeley (see Geotimes, January 2006). Analyzing data from 71 large earthquakes over the past few decades, Olson and Allen found a correlation between the frequencies of the emitted seismic energy during the first seconds of rupture and the temblor’s final magnitude. Although the relationship between that initial energy and the final magnitude is still uncertain, the find could eventually provide emergency responders with some crucial extra time to prepare for large-scale damage, the scientists said.
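The value of sizing up a quake in its first seconds comes from the physics of wave propagation: the fast but weak P wave outruns the slower, more damaging S wave, so an alert issued on P-wave arrival can beat the strong shaking to distant sites. The sketch below illustrates that arithmetic using assumed typical crustal velocities and a hypothetical processing delay; it is not a calculation from the Olson and Allen study.

```python
# Illustrative warning-time arithmetic for earthquake early warning.
# The velocities below are assumed typical crustal values, not figures
# from the Olson and Allen study.

P_WAVE_KM_S = 6.0   # assumed average P-wave speed
S_WAVE_KM_S = 3.5   # assumed average S-wave speed

def warning_time_s(distance_km, processing_delay_s=5.0):
    """Seconds between an alert (issued a processing delay after the
    P wave arrives) and the damaging S wave reaching a site."""
    p_arrival = distance_km / P_WAVE_KM_S
    s_arrival = distance_km / S_WAVE_KM_S
    return max(0.0, s_arrival - (p_arrival + processing_delay_s))

# A city 200 km from the epicenter gets roughly 19 seconds of warning:
print(round(warning_time_s(200), 1))
```

Sites close to the epicenter sit inside a "blind zone": the S wave arrives before the alert can be processed, so the warning time is zero.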
“Silent” earthquakes — slow-moving events that cause no ground shaking — can also foreshadow larger temblors, according to a study by geophysicist Paul Segall and colleagues at Stanford University. Their research, published in the July 6 Nature, found a link between such silent quakes and subsequent swarms of small temblors, suggesting that the silent quakes actually add stress to the fault zone. Therefore, they wrote, the “harmless” silent quakes may also help trigger larger, more devastating earthquakes.
Other studies in 2006 continued the search to better understand the forces that lead to large earthquakes. Reporting in the February Geology, seismologist Christopher Fuller of the University of Washington in Seattle and his colleagues modeled the effect of sediments collecting at subduction zones, where powerful, deep earthquakes often occur. Those sediments can help the plates “lock,” building up stresses that can lead to large earthquakes when they do slip, Fuller and his colleagues found.
Meanwhile, seismologists continue to puzzle over the motions that set off the Sumatra earthquake. Studying data from GPS stations, tidewater lines and coral elevations in the region, geologist Jean-Philippe Avouac of Caltech and his colleagues found that the Sunda thrust fault, where the Australian Plate dives beneath the Eurasian Plate, slipped 20 to 30 meters within the first few minutes of the earthquake (see Geotimes, May 2006). Surprisingly, the fault continued to slip in the following weeks, adding another 30 percent to that displacement. Their measurements can help scientists understand the fault’s movement patterns, including where it “stuck” and its dynamics deep in the crust, and could help indicate whether more earthquakes might be in store along the still-unruptured southern section of the fault, the researchers said.
A large earthquake’s aftershocks — such as the magnitude-8.7 earthquake that shook Indonesia in March 2005, only three months after the initial Sumatra quake — can also pose significant danger to already-ravaged regions. Geologists Emily Brodsky of the University of California-Santa Cruz and Karen Felzer of the U.S. Geological Survey (USGS) in Pasadena, Calif., studied how aftershocks relate to the main event (see Geotimes, August 2006). The researchers found that rapid ground-shaking pulses during the main earthquake, rather than steady stress as a result of shifting rocks along a fault, can trigger these follow-up events.
Finally, whether an earthquake’s ultimate power is predictable remains uncertain, but the release this year of classified documents from the Chinese government demonstrates that scientists have, at least once, managed to get ahead of the game (see Geotimes online, Web Extra, June 26, 2006). In 1975, Chinese officials claimed to have predicted a magnitude-7.3 earthquake that struck Haicheng in northern China, but the suppression of relevant documents by the Chinese government and the failure of Chinese scientists to predict a magnitude-7.8 quake the following year cast that claim into doubt. Piecing together the documents and reports of key witnesses, however, geologist Kelin Wang of the Geological Survey of Canada confirmed that the Chinese scientists predicted an “imminent” event within 24 hours of the main shock — and their warnings did indeed save thousands of lives, he reported in the June Bulletin of the Seismological Society of America.
Although the study suggests that earthquake prediction may not be impossible, the Chinese study remains the only case where a large earthquake has been predicted, seismologist Susan Hough of USGS in Pasadena, Calif., told Geotimes in June. Currently, she said, scientists do not have a “road map” for reliable earthquake predictions.
Levee concerns abound
When Hurricane Katrina struck the Gulf Coast in August 2005, people around the world began to take note of the levees that protected their homes from rising waters. Concern heightened in 2006, when several old levees in California threatened to breach during winter and spring storms, and a report suggested that a levee in Florida could go at any time. Meanwhile, engineers in Louisiana were scrambling to get New Orleans’ levees at least back up to pre-Katrina levels by the start of the Atlantic hurricane season on June 1.
Engineer Bob Bea and colleagues discuss repair operations at one of the rebuilt levees at the Mississippi River Gulf Outlet in New Orleans. Photograph is by Rune Storesund.
In May and June, two groups released studies of the New Orleans levee failures, looking at how and why they failed, and how to avoid a repeat of the breaches and resulting devastation. Both reached similar conclusions: The engineering failed in several ways, and this was a disaster that could have been prevented (see Geotimes, August 2006).
Since Katrina struck, the U.S. Army Corps of Engineers has been working to fix the breaches and shore up the levees first to the pre-Katrina condition, and then to a “100-year protection level.” This means protecting the city from the kind of storm surge that, statistically, comes along roughly every 100 years, says Robert Bea, an engineer at the University of California, Berkeley, who co-led one of the investigations. And although New Orleans’ levees were supposed to be at greater than a 100-year level prior to the storm, the Corps has since determined that they were “more in the range of a 50-year protection,” Bea says. So now, he says, they are working to rebuild the levees and other parts of the flood protection system to be more capable and reliable than before Katrina.
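A “100-year protection level” is a statement about annual odds, not a once-a-century guarantee: the design storm has a 1-in-100 chance of striking in any given year, and those odds compound over time. The sketch below shows the standard return-period arithmetic; it is an illustration of the concept, not a calculation from the Corps’ assessments.

```python
# Standard return-period arithmetic: "N-year protection" means a 1/N
# chance of exceedance in any given year. Over a multi-year horizon
# those annual odds compound (illustrative only, not Corps figures).

def exceedance_probability(return_period_years, horizon_years):
    """Chance of at least one event exceeding the design level over
    the horizon, assuming independent years."""
    annual_p = 1.0 / return_period_years
    return 1.0 - (1.0 - annual_p) ** horizon_years

# Over a 30-year span, a 100-year levee is exceeded with probability:
print(round(exceedance_probability(100, 30), 2))  # about 0.26
# At the roughly 50-year level the Corps found, the odds are worse:
print(round(exceedance_probability(50, 30), 2))   # about 0.45
```

The compounding is why engineers like Bea argue for designing to much rarer events, such as the 1,000-year or 10,000-year storm.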
Rebuilding New Orleans’ levees has been “kind of like rebuilding a house,” in that the primary parts of reconstruction were in place by the June 1 deadline, Bea says, but many of the details were not, and “have been really troublesome.” For example, he says, the pumps to remove floodwaters from the canals weren’t working correctly and were undersized, and the gates to block off the canals from the rising seas had to be lowered manually instead of automatically, as designed.
Although much progress has been made, Bea says, there is still a long way to go, with perhaps the most important part being the development of a long-term vision and strategy for a flood protection system. Fortunately, 2006 brought a light hurricane season, so the levees were not tested this year. Still, everyone involved in rebuilding New Orleans needs to be thinking more long term, he says, perhaps considering the 1,000-year storm, or the 10,000-year storm, “following the experience and leadership of countries like the Netherlands.” And they need to consider that storms may worsen and sea levels may rise as global temperatures continue to rise, he says.
California is taking possible hydrologic changes due to climate change seriously, Bea says, analyzing its thousands of miles of levees, which protect the state capital and cities in the Central Valley, as well as much of the nation’s agriculture and the water supply of 23 million Californians. After heavy rainfall and flooding last winter and spring, the Corps identified dozens of sites of “critical” concern — levees that could fail in the next storm — and began working on fixing those before the Nov. 1 start of the California winter storm season. In the meantime, California voted on Nov. 7 to approve a $4.1 billion plan for flood control.
The question of levee safety has reached beyond California and Louisiana, however. In May, a report came out that said the giant dike surrounding Florida’s Lake Okeechobee, which protects some 40,000 people from the lake’s waters, could fail in a strong hurricane. And Houston residents are increasingly worried that the levees between Houston and Galveston face the same engineering threats that New Orleans’ levees faced before Katrina, Bea says.
Fired up over wildfires
Dry conditions throughout the United States fueled wildfires that raged out of control in 2006, sparking more discussion about the causes and management of the blazes. From January through March, drought and high winds across the southern United States created prime conditions for grass fires, which burned more than a half-million acres in Texas, Oklahoma and New Mexico. And on Labor Day, a trash fire in Southern California sparked one of the biggest wildfires in the state’s history, blazing through the dry Los Padres and Angeles national forests.
Sparked by lightning, the Sawtooth Fire blazed near Southern California’s Yucca Valley in July 2006, charring at least 25,000 hectares of desert ecosystem. Photograph is by Wes Schultz, courtesy of CDF.
The forests of the western United States have experienced an increase in the frequency and intensity of wildfires since the mid-1980s, with wildfire activity from 1987 to 2003 nearly four times the activity seen from 1970 to 1986, according to a July 6 Science Express study by climate researcher Anthony Westerling of Scripps Institution of Oceanography and his colleagues. Although this increase had been attributed to changes in land use in the past decades, Westerling’s team found that warmer summer temperatures and earlier melting of the snowpacks in the high mountains were paving the way for more wildfires to bloom in the forests — and allowing them to reach higher in the mountains than in previous years.
Climate models project continued increases in temperatures and in the length and intensity of summer droughts in the western United States as well, suggesting that wildfire activity will continue to increase in the region, the team said. Because forests help store 20 to 40 percent of the carbon sequestered in the United States, an increase in biomass burning will also increase carbon dioxide inputs to the atmosphere, at least in the short term, they wrote.
An increase of wildfires in wetlands is also releasing mercury to the atmosphere, according to a study by ecosystem ecologist Merritt Turetsky of Michigan State University and colleagues (see Geotimes, November 2006). The study found that mercury, long sequestered in organic-rich peatlands, is now being unleashed to the atmosphere as the wetlands burn.
Scientists are also still considering how fire management can impact the spread of wildfires. A Jan. 20 Science paper on fire management practices by Dan Donato, a forestry science master’s student at Oregon State University (OSU), and colleagues ignited a raging debate. In their study, Donato’s team suggested that “salvage” logging of dead trees after wildfires hinders, rather than helps, the forests’ recovery.
Critics of the study, among them another team of forest scientists from OSU and U.S. Rep. Brian Baird (D-Wash.), responded with separate articles in the Aug. 4 Science, contending that the original study was too narrowly focused and overlooked key environmental factors that may have contributed to how the forest recovered. Donato’s team responded in the same issue of Science, providing additional data and standing by their conclusions.
With increasing attention being paid to the fire-climate link, scientists are continuing to examine the problem. In November, fire scientists from around the world gathered at the Third International Fire Ecology and Management Congress to consider how global and regional climate change, as well as ecological conditions, affect fires and fire management.