Geotimes
Feature 
Hubbard Brook: Making Watershed Links
Greg Peterson



Sidebar:
Taking the long view


On a crisp fall day in 1999, a helicopter flew low over a hardwood forest in central New Hampshire. A large funnel dangling from the chopper brushed against treetops as it dropped a cloud of brown pellets onto the forest floor. The pellets, made of the mineral wollastonite, soon began to dissolve, releasing calcium into the soil and replenishing it.

Over the past 50 years, acid rain has accelerated the leaching of calcium and other important nutrients out of the soil at the Hubbard Brook Experimental Forest, as it has in forests throughout New England. Soil scientists estimate that the pool of calcium available to plants has declined by more than 50 percent during that time. And while the 1970 Clean Air Act and its 1990 amendments have led to important strides, sulfur and nitrogen emissions — the leading causes of acid rain — remain at more than twice their historic levels.

Biogeochemists at Hubbard Brook suspect that the calcium depletion may be stunting the growth of the forest, hitting sugar maples particularly hard. In recent years, sugar maple twigs and branches have died back, reflecting a larger, sometimes more severe, trend seen in patches throughout New England. The wollastonite addition, 50 tons in all, will restore soil calcium throughout a 29-acre watershed to the levels that prevailed before acid rain, allowing scientists to discern what impact calcium depletion has had on the forest.

The wollastonite addition at Hubbard Brook is the latest chapter in a rich history of large-scale manipulations aimed at understanding how human disturbances affect forests. Since its inception, research at the forest has been long-term, whether monitoring chronic disturbances like acid rain or the slow recovery of forests from acute disturbances like clear-cutting. The research has brought together soil scientists, ecologists, hydrologists and geologists — all with an eye toward informing sound environmental policies.

The first major project in the forest began in 1965. Hydrologists and ecologists at the U.S. Forest Service and several universities in New England clear-cut a 39-acre watershed, leaving not a single tree or shrub standing. Forest managers wanted to explore the links between forest cover and the quantity and quality of water draining New England watersheds — especially in relation to springtime flooding and the cleanliness of drinking water. “There was only a very rudimentary understanding of forest hydrology in New England, and in particular upland forest hydrology,” says Jim Hornbeck, a hydrologist, now retired, who worked for the Forest Service for 42 years and participated in the original clear-cutting experiment.

In this initial experiment, the scientists were not trying to mimic real forestry practices. Rather, they wanted to hit the system hard — completely removing the vegetation to determine the total amount of control that the biota exerts over the draining water. Herbicides prevented any plants from re-growing during the first three years after the cut.

In 1983, researchers cut and removed trees from a 55-acre experimental watershed at Hubbard Brook to test how clear-cutting affects the quantity and quality of water draining the watershed. Pin cherries and raspberry plants grew rapidly after the cutting, sprouting from seeds that had remained viable in the soil for decades. The rapid growth captured nutrients and water that would otherwise have washed off the site. Photo courtesy of USDA Forest Service.

Hubbard Brook was a particularly good place for this study because it has many small watersheds right next to one another that have similar vegetation, soils and geology. The scientists manipulated one watershed while leaving the others as controls. Additionally, the quartz-mica-schist bedrock that underlies the forest is relatively impermeable to water, which means that water not escaping into the atmosphere has to leave through streams. Weirs set up at the base of the streams measured exactly how much water drained from the watershed before and after the cut.
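
The impermeable bedrock effectively closes the watershed's water budget. As a rough sketch (the equation is illustrative, not taken from the Hubbard Brook team's publications), over a year,

P = Q + ET + ΔS,

where P is precipitation falling on the watershed, Q is streamflow measured at the weir, ET is water returned to the atmosphere by evaporation and transpiration, and ΔS is the change in water stored in soil and snowpack, which averages out to roughly zero from one year to the next. With no leakage through the bedrock, measuring precipitation and streamflow pins down evapotranspiration by subtraction, so any change in streamflow after the cut can be attributed to the loss of vegetation.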

The amount of draining water jumped up, especially during the summer, says Gene Likens, then an ecologist at Dartmouth College. Because trees no longer pulled water out of the soil and transpired it into the atmosphere, the majority of precipitation simply washed down the streams. More surprisingly, the chemistry of the draining water also changed dramatically, says Likens, who is now president of the Institute of Ecosystem Studies in Millbrook, N.Y. “The nitrate concentrations shot up like a rocket, and, in the second year, they peaked at values that were roughly two times the limit for drinking water,” he says. The trees were no longer locking up nitrogen in their tissues.

The clear-cutting also stimulated a microbial community that converted soil nitrogen into a highly mobile form, allowing more nitrogen to flush out of the forest during rains. In New England forests, nitrogen is typically in short supply, limiting how much the forest can grow. The loss of nitrogen therefore weakened the fertility of the soil.

The experiments that followed the initial clear-cutting homed in on specific management practices. One looked at the impacts of strip-cutting, a harvesting technique that removes strips of forest over a series of years. Another examined the effects of block-cutting, which removes all merchantable timber but leaves standing trees smaller than 2 centimeters in diameter.

“All of the studies pointed out that forest cover is the optimum cover on the landscape in terms of protecting water yield and water quality,” Hornbeck says. But, he adds, the experiments also showed that cutting forests responsibly can protect the forest while still allowing it to be used for recreation and harvest. In particular, buffer strips of uncut trees 20 to 30 meters wide on either side of streams can protect the water from degradation through sedimentation or increases in temperature. Also, leaving the crowns of harvested trees in the woods to decompose, rather than hauling them off to the mill, keeps a large fraction of the trees’ nutrient capital on-site. And limiting the cutting of a particular forest tract to about once per century gives the region’s forests time to sequester nitrogen from the atmosphere, compensating for losses from cutting.

Many of the research conclusions made their way into forest policy, Hornbeck says. In the late 1970s, the Forest Service shifted the minimum harvest interval for forests in the White Mountains and Green Mountains from 70 to 80 years up to 100 to 125 years. Also, the Forest Service developed best-management-practice handbooks that incorporated buffer strips. Those handbooks go out to all private foresters in New England, he says, and while the foresters are not obligated to follow the guidelines, most do. The key, Hornbeck says, is that the best-management practices make both economic and ecological sense. For example, limiting nitrogen loss helps maximize timber yield while accelerating forest recovery from clear-cutting.

The Hubbard Brook research program has grown steadily since these initial experiments. The National Science Foundation began funding the work in the 1960s, and gave the program an additional boost in 1987 when it included Hubbard Brook in its Long Term Ecological Research (LTER) network (see sidebar).

Today, over a dozen universities, government agencies and other research organizations work at Hubbard Brook. At any given time, about 15 students live at the forest, working on master’s or Ph.D. theses, says Tim Fahey, a biogeochemist at Cornell University and co-director of the Hubbard Brook LTER. “Once you learn a certain amount about a system, it leads to questions that are on the cutting edge of science.”

The acid rain problem drives much of the research at Hubbard Brook today. Stream gages at the base of the watersheds have registered large effluxes of calcium and other important cations from the forest over the past half century. The greatest losses occurred from the mid-1950s to the late 1980s, the time of highest sulfur emissions. And while acid rain certainly contributed to that efflux, researchers do not know exactly where within the forest ecosystem all the calcium came from, says Joel Blum, a geochemist at the University of Michigan who has been working at Hubbard Brook for the past decade.

Hydrogen and aluminum ions mobilized by acid rain may be kicking the calcium off surface exchange sites on minerals and organic matter in the soil, allowing calcium that would otherwise be available to trees to flow out of the forest in groundwater, Blum says. Or the acidity could be accelerating the weathering of minerals that contain calcium, such as plagioclase. If this second path dominates, then mineral weathering could supply calcium to the soil solution, offsetting some of the leaching losses.

A weir at the base of one of the experimental watersheds measures the amount of water leaving the watershed. Periodic sampling of the chemistry of the water flowing through this and other weirs in Hubbard Brook over the past several decades has built a rich data set that chronicles a gradual rise in stream water pH since passage of the Clean Air Act in 1970. Photo courtesy of USDA Forest Service.

Blum is currently working on a model that uses natural abundances of strontium isotopes and the ratios of strontium to calcium to infer the amount of calcium that comes from exchange sites versus enhanced weathering. In the process, he has discovered that some trees may partially bypass the soil exchange sites, using symbiotic associations with fungi to directly obtain calcium from the dissolution of the soil mineral apatite.
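
In rough outline (the article does not spell out Blum's actual formulation), such source apportionment rests on two-endmember mixing. If calcium from exchange sites and calcium from weathering minerals carry distinct strontium isotope signatures, the ratio measured in stream water falls between the two:

(87Sr/86Sr)stream = f × (87Sr/86Sr)exchange + (1 − f) × (87Sr/86Sr)weathering,

where f is the fraction of strontium derived from exchange sites. Solving for f, and then using the measured strontium-to-calcium ratios of each source, translates the strontium budget into an estimate of how much of the exported calcium came from exchange sites versus weathering.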

“The kinds of expertise that I bring in terms of mineralogy, mineral weathering, and the isotopic techniques that we are so familiar with in geology have had a real synergy with ecology,” Blum says.

The large-scale wollastonite addition is the next major step in understanding the role of calcium in shaping the forest ecosystem. The total biomass of trees in Hubbard Brook reached a plateau in 1983, says Scott Bailey, a geologist at the Forest Service’s research station at Hubbard Brook. Since then, tree mortality has balanced tree growth. Forests often stop accumulating biomass at some point as they age, Bailey says. But, he adds, some ecologists believe that calcium limitation has caused the forest to reach a lower plateau than it would otherwise. The wollastonite addition may shift the biomass to a higher plateau, but that could take years, Bailey explains. As with most of the experiments at Hubbard Brook, only time — and diligent sampling by a wide variety of scientists — will tell.

Taking the long view

In 1980, the National Science Foundation began a new era in ecological understanding by establishing the Long Term Ecological Research (LTER) program. The program’s central mission is to document and analyze ecological processes that vary over long periods of time and large spatial scales. The program has grown from an initial five sites with an annual budget of $1.2 million to a network of 24 ecologically diverse sites throughout the country with a budget of about $20 million. More than 1,000 participating scientists and students pull in an additional $40 million in LTER-related research grants.

In the first decade of LTER research, the sites focused on the same core ecological processes. The overarching goal was to explain how energy and nutrients move through plants, soils, groundwater and other parts of the ecosystem. The research brought together scientists from traditionally distinct disciplines, such as hydrology, geology and wildlife biology, as well as ecology.

Phillip Smith, a research technician, uses GPS to precisely measure the height of a marsh on Hog Island, part of the Virginia Coast Reserve LTER. Photo by John Porter.

The legacy of multidisciplinary research continues today. However, the importance of the core areas has faded, replaced by a research agenda focused on the strengths of individual sites and geared toward informing sound environmental policies, says Tim Fahey, co-director of the Hubbard Brook LTER.

Today, global change is the hot topic. Research at many sites aims to learn how changes in global temperature, sea level and atmospheric chemistry are affecting, and will affect, local ecology. For instance, the Virginia Coast Reserve LTER, managed by the University of Virginia, is investigating whether sea-level rise will drown marshes along the state’s coast. Marshes connected to barrier islands seem sound: the islands provide sand, transported by winds and waves, that settles out in the marshes, raising the marsh plateau faster than sea level rises. For marshes in the nearby lagoons, however, weaker sediment sources and losses of sediment from the marsh during heavy rains add up to a precariously slow accretion rate. Results from this study are helping wetland researchers at other sites along the East Coast home in on the amounts of sediment that wash off marshes — a key component to understanding how those marshes will respond to global change.

Other LTER sites explore how regional ecology affects global atmospheric conditions. Northern mid-latitude forests appear to be important sinks for up to 25 percent of the carbon dioxide added to the atmosphere globally from burning fossil fuels. These forests may provide a buffer against carbon dioxide-induced climate warming. A tower at the Harvard Forest LTER in Petersham, Mass., continuously measures the exchange of carbon dioxide between the forest and the atmosphere. The data show that weather regulates seasonal variations in how much carbon the forest absorbs. But the age and composition of the forest largely determine the long-term carbon budget. Forests that regrow after deforestation take up much more carbon than mature forests — which may help to explain why forests in New England, which were largely deforested at the turn of the century to make way for farming, are currently a carbon sink.

Over the next decade, LTER research should shift toward synthesizing much of the information gathered at the sites over the past 20 years, according to an NSF-commissioned panel charged with reviewing past successes and charting future directions for the program. The goal will be to develop computer models that simulate the wide breadth of processes studied at the LTER sites with enough accuracy that they can reliably forecast future ecological conditions, especially under different management scenarios.

GGP



Peterson is a staff writer for Geotimes.

