“EarthScope has changed the game,” says Emily Brodsky, a geophysicist at the University of California, Santa Cruz. A decade into its 15-year lifetime, the multipronged project to elucidate the structure, evolution, and dynamics of the North American continent has generated more than 115 terabytes of data, most of it freely and immediately available; cumulatively, users have retrieved the equivalent of the entire data set nearly three times over, and in just the past three years, more than 330 publications have resulted.

Pulling together what began as three separate ideas with intertwined scientific goals, NSF funded the EarthScope project under its Major Research Equipment and Facilities Construction account (see Physics Today, December 2003, page 32); so far it has cost about $400 million, including operations. One component of EarthScope is the USArray, of which the largest part is the Transportable Array. Its 400 seismometer stations are spaced 70 km apart in a grid formation. After about 2 years at a given location, stations are moved from the trailing edge to the leading edge to systematically sweep the country from west to east. In addition, USArray makes some 2400 seismic recorders available to researchers on a competitive basis for short-term projects; a separate suite of magnetotelluric sensors measures magnetic and electric fields from about 12 km into the crust down to a depth of 350 km.

Another component of EarthScope is the Plate Boundary Observatory (PBO), a geodetic network concentrated in western North America that measures deformation in Earth’s crust. The network includes 1100 GPS sensors, plus instruments in 78 boreholes to measure strain, tilt, pore pressure, and seismic activity. The deformation data from the PBO are complemented by images obtained by satellite radar and airborne lidar (see Physics Today, December 2007, page 41).

The third component is the San Andreas Fault Observatory at Depth (SAFOD), an inclined borehole drilled across the fault at a depth of 2.7 km to study the physical and chemical processes that control faulting and earthquake generation.

A section of core from the San Andreas Fault is examined by three principal investigators on the SAFOD project: from left, Mark Zoback, Stephen Hickman, and William Ellsworth.

At SAFOD, “the outstanding accomplishment was to penetrate the creeping portion of the San Andreas Fault and obtain core across two actively deforming strands,” says Stephen Hickman of the US Geological Survey (USGS) in Menlo Park, California. “It was the first time people have drilled to seismogenic depth in an active plate-boundary fault.” The SAFOD site is at Parkfield, a section of the fault that creeps at about 2.5 cm/yr, unlike the surrounding sections that are “locked” until they produce large earthquakes. Deformation in the metal casing installed in the borehole showed where the creep occurs.

“When we got the core out and saw there was [the mineral] serpentine only within the deforming strands, that was a eureka moment,” Hickman says. Chemical reactions between the serpentine—which was extruded along the fault from greater depth—and the adjacent rock yield unusual hydrous clay minerals that make the fault slide steadily at low shearing stresses. “The fault is weak and creeps because of the minerals contained within it,” says Hickman. That discovery “resolved a longstanding mystery of why the San Andreas in central California is so much weaker than the surrounding crust.” The SAFOD cores are in high demand for scientific studies and are “treated like Moon rocks,” he says.

Two SAFOD goals have yet to be fully realized. The hope was for the borehole to serve as a multisensor observatory. But the initial suite of instruments—designed to measure seismic signals, deformation, and electromagnetic fields—failed due to the harsh conditions deep underground. Instead, a single seismometer has been recording data from the bottom of the borehole for the past five years. In addition, money dried up before the team could drill through one of the fault’s microseismic patches, which produce earthquakes up to magnitude 2.

The PBO GPS sensors record the distance to satellites, from which surface deformation can be tracked with millimeter precision. Those data “reveal how faults slip or recharge for earthquakes, how volcanoes respond to magmas in motion at depth, and even how the Earth’s surface swells or subsides in response to groundwater use,” says Meghan Miller, president of UNAVCO, the university consortium that oversees the PBO. By observing changes in deformation rates in the years and even decades following a large earthquake, the PBO can study the rheology of Earth’s crust and mantle. Some 400 GPS sensors transmit data in real time and form a key part of an earthquake early warning system that is getting under way in the western US.
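
For readers who want a concrete picture of what “observing changes in deformation rates” means in practice, here is a minimal Python sketch using synthetic data; the station motion, coseismic offset, and logarithmic decay model are illustrative assumptions, not PBO output.

```python
import numpy as np

# Synthetic daily GPS east-component positions (mm): a steady tectonic rate,
# a coseismic offset, and a logarithmic postseismic transient of the kind PBO
# stations record for years after a large earthquake. All numbers here are
# illustrative assumptions, not PBO data.
rng = np.random.default_rng(0)
t = np.arange(-2000.0, 4000.0)                 # days relative to the earthquake
pos = 0.02 * t                                 # secular motion, ~7 mm/yr
after = t > 0
pos[after] += 80.0 + 15.0 * np.log1p(t[after] / 50.0)   # offset + slow decay
pos += rng.normal(0.0, 2.0, t.size)            # ~2 mm daily scatter

# Deformation rate in successive one-year windows after the event: the rate
# relaxes back toward the secular value, and how fast it does so is the
# signal used to probe crust and mantle rheology.
for year in range(10):
    win = (t > 365 * year) & (t <= 365 * (year + 1))
    rate = np.polyfit(t[win], pos[win], 1)[0] * 365.25    # mm/yr
    print(f"year {year + 1}: {rate:5.1f} mm/yr")
```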

Data from the PBO and USArray show that in the Pacific Northwest, episodic tremor and slip—a phenomenon discovered in Japan and Canada prior to EarthScope—recurs about every 14 months and lasts days or weeks (see the article by John Vidale and Heidi Houston, Physics Today, January 2012, page 38). The slippage, which is caused by the Juan de Fuca plate thrusting beneath the North American plate, is equivalent to an earthquake of up to magnitude 7, but cannot be felt because of its slow pace—about 1 cm/week, or as much as 10⁸ times slower than an earthquake. “The relationship between real earthquakes and tremors is totally unknown,” says Brodsky. Historical records indicate that the last magnitude-9 earthquake in the area occurred 314 years ago this month. “It’s off the public’s radar that this is an area of concern,” she says. “The onus is on us to get it together before the next big earthquake in Seattle.”
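
The slip-rate comparison is easy to check with back-of-envelope arithmetic, assuming a typical coseismic slip rate of order 1 m/s (an assumption, not a figure from the article):

```python
# Back-of-envelope check of the slip-rate comparison. The ~1 m/s coseismic
# slip rate is a typical order-of-magnitude assumption, not a figure from
# the article.
slow_slip = 0.01 / (7 * 24 * 3600)     # 1 cm per week in m/s, about 1.7e-8
coseismic = 1.0                        # ordinary earthquake rupture, ~1 m/s
print(f"slow slip is ~{coseismic / slow_slip:.0e} times slower")   # ~6e+07
```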

The Transportable Array was in the right place at the right time to study a huge increase in earthquakes—nearly 40-fold for magnitude 3—in Oklahoma that began in 2009. Many of the quakes are induced by oil and gas producers injecting fluids more than 2000 m underground. “You can see the earthquakes migrating away from wells in the direction of the hydraulic gradient,” says geophysicist Katie Keranen, who recently moved from the University of Oklahoma to Cornell University.
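
One common first-order way to interpret such migration, offered here only as an illustrative assumption and not necessarily the analysis Keranen’s group used, is to compare earthquake distances from a well with a pore-pressure diffusion front:

```python
import numpy as np

# A common first-order way to interpret migrating induced seismicity (an
# assumed model for illustration, not necessarily the analysis described
# here): earthquakes track the pore-pressure triggering front
# r(t) = sqrt(4*pi*D*t), with D the hydraulic diffusivity.
D = 0.5                                       # m^2/s, an assumed diffusivity
t_days = np.array([10.0, 30.0, 100.0, 300.0, 1000.0])
r_km = np.sqrt(4.0 * np.pi * D * t_days * 86400.0) / 1000.0
for td, r in zip(t_days, r_km):
    print(f"after {td:6.0f} days: front ~ {r:4.1f} km from the injection well")
```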

Typically, when scientists want to know about Earth’s crust, “they put down seismometers and set off explosions,” says Columbia University’s Göran Ekström. “When we designed the Transportable Array, we were focused on the mantle and deeper. But it turns out that we can get good images of the shallow portions of the Earth that are surprising a lot of people.”

Getting those images involves exploiting noise that scientists otherwise try to suppress. One source of noise is the ocean, which is “bathing us constantly,” says Michael Ritzwoller of the University of Colorado Boulder. “These waves propagate coherently.” Correlating signals from every pair of stations in the Transportable Array “gives us a completely new window to the Earth,” he says. The method, called ambient noise tomography, yields static images of wave velocities through Earth that can be interpreted to obtain temperature, composition, fluid content, and other parameters (see the article by Roel Snieder and Kees Wapenaar, Physics Today, September 2010, page 44).
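
In schematic form, the method boils down to cross-correlating long noise records from pairs of stations. The toy Python sketch below uses synthetic data, the Transportable Array’s nominal 70 km spacing, and an assumed 3 km/s surface-wave speed to show how the correlation peak recovers an inter-station travel time:

```python
import numpy as np

# Toy ambient-noise cross-correlation with synthetic data. The station spacing
# comes from the Transportable Array grid; the 3 km/s surface-wave speed and
# everything else are assumptions for the sketch.
rng = np.random.default_rng(1)
fs = 5.0                                   # samples per second
n = int(600 * fs)                          # ten minutes of "noise"
dist_km = 70.0                             # nominal station spacing
v_kms = 3.0                                # assumed surface-wave speed
delay = int(round(dist_km / v_kms * fs))   # propagation delay in samples

src = rng.normal(size=n + delay)           # ocean-generated noise wavefield
sta_b = src[delay:]                        # the wavefront passes station B first...
sta_a = src[:n]                            # ...and reaches station A `delay` samples later

# Cross-correlating the two records recovers the inter-station travel time:
# the lag of the correlation peak is the time the noise takes to travel from
# one station to the other, just as in ambient noise tomography.
xcorr = np.correlate(sta_a, sta_b, mode="full")
lag_s = (np.argmax(xcorr) - (n - 1)) / fs
print(f"recovered travel time ~ {lag_s:.1f} s -> {dist_km / lag_s:.2f} km/s")
```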

The thickness of Earth’s crust beneath the continental US as mapped using ambient noise tomography. The thickest portion, in the high Rocky Mountains and northern Great Plains, exceeds 50 km. The crust is less than 35 km thick in the geologically active western US and in the southern parts of the Great Plains. Data are not yet available for the eastern US.

“When I learned about the Transportable Array going across the US, I thought, ‘Let’s see if we can see signals that are not seismic but atmospheric,’” says Michael Hedlin of the Scripps Institution of Oceanography at the University of California, San Diego. It worked, and he and colleagues were awarded funds from NSF to outfit the stations with pressure sensors. Hedlin says they record hundreds of atmospheric events each year—everything from mine blasts to rocket launches, sonic booms, and military explosions. They even recorded the Chelyabinsk meteor that exploded over Russia last year.

And Kristine Larson, a geodesist at the University of Colorado Boulder, uses PBO data to map vegetation, soil moisture, and snowfall. “I was trying to use GPS to measure seismic motion,” she says. “Signals reflected from the ground were making my data noisy. I changed gears and decided to use the reflected signal to do something useful.” Now she downloads PBO data every night and posts the results in the morning. Surface soil moisture is indicated by the phase of an interference pattern derived from the direct and reflected beams from a satellite; the depth of snow and vegetation water content can be extracted from the frequencies and amplitudes of the interference pattern. Says Larson, “My aim is to make the data available to hydrologists, climate scientists, and water managers.”
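
A minimal synthetic sketch of that interference-pattern idea follows; the antenna height, snow depth, elevation angles, and the simple grid-search estimator are assumptions for illustration, not Larson’s actual processing chain:

```python
import numpy as np

# Synthetic sketch of the interference-pattern idea. Detrended SNR from one
# satellite arc oscillates roughly as
#   SNR(e) ~ A * cos(4*pi*H/lambda * sin(e) + phi),
# where H is the antenna height above the reflecting surface: the oscillation
# frequency gives H (and hence snow depth), the phase tracks soil moisture.
# All numbers below are assumptions for illustration.
rng = np.random.default_rng(2)
lam = 0.244                                  # GPS L2 carrier wavelength, m
antenna_above_soil = 1.6                     # assumed mounting height, m
H_true = 1.2                                 # 0.4 m of snow raises the surface
elev = np.radians(np.linspace(5.0, 25.0, 400))          # one low-elevation arc
snr = np.cos(4.0 * np.pi * H_true / lam * np.sin(elev) + 0.7)
snr += rng.normal(0.0, 0.3, snr.size)                   # receiver noise

# Grid search over trial reflector heights: the trial height whose oscillation
# frequency best matches the data marks the reflecting surface.
trial_H = np.linspace(0.5, 3.0, 2000)
power = [abs(np.sum(snr * np.exp(-4j * np.pi * h / lam * np.sin(elev))))
         for h in trial_H]
H_est = trial_H[int(np.argmax(power))]
print(f"reflector height ~ {H_est:.2f} m -> "
      f"snow depth ~ {antenna_above_soil - H_est:.2f} m")
```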

The Transportable Array is now moving to Alaska, the most seismically active state in the US. The stations will be more widely spaced because, as array manager Bob Busby says, “it can be difficult to carry around digging machines.” And smaller instruments are needed to cope with the frozen soil. The conversion to new seismometers dovetails with improvements in sensors, says Busby.

A test station for EarthScope’s Transportable Array installed near Toolik Field Station in north central Alaska.

About half of the original Transportable Array stations are staying behind—some 50 have been adopted by state and local groups, and 160 are part of a project of the NSF, the USGS, the Department of Energy, and the Nuclear Regulatory Commission to monitor background earthquake rates, human-induced quakes, nuclear sites, and other hazards in the central and eastern US.

“Many in the Earth science community have grown to rely on the valuable infrastructure of EarthScope’s observatories,” says Arizona State University’s Ramón Arrowsmith, director of the EarthScope National Office. Some of that infrastructure will become part of more permanent observational systems, he adds. Indeed, there’s no shortage of ideas for how to proceed beyond EarthScope’s slated 15-year lifetime.

One possibility is to build new networks and link existing ones globally to expand coverage—since EarthScope’s inception, Europe and China have created seismometer networks, for example. A popular idea is to deploy a subduction-zone observatory along the western coast of the Americas; that would require international collaboration and development of techniques to combine ocean-bottom seismic and sea-surface GPS instruments with land-based observations. SAFOD scientists hope to outfit the borehole with more instruments and to drill into the nearby micro-earthquake region. And US Earth scientists’ wish list still includes a radar satellite devoted to high-resolution deformation measurements, an unrealized fourth pillar of the original EarthScope proposal.

Beyond the science that continues to come out of EarthScope, “the project has revitalized the whole seismology community,” says David Simpson, president of Incorporated Research Institutions for Seismology, which oversees USArray. “It’s a whole different environment than previously. The younger generation is uninhibited and doing wonderful things with the data. This has been a spectacular success.”