Operating a global seismic network - perspectives from the USGS GSN
Abstract
The Global Seismographic Network (GSN) is a permanent digital network of state-of-the-art seismological and geophysical sensors connected by a global telecommunications network, serving as a multi-use scientific facility for seismic monitoring in support of response applications, basic and applied research in solid earth geophysics, and earth science education. A joint program of the U.S. Geological Survey (USGS), the National Science Foundation, and the Incorporated Research Institutions for Seismology (IRIS), the GSN provides near-uniform, worldwide monitoring of the Earth through 144 modern, globally distributed seismic stations. The USGS currently operates 90 GSN or GSN-affiliate stations. As a US government program, the USGS GSN is evaluated on several performance measures, including data availability, data latency, and cost effectiveness. The USGS component of the GSN, like the GSN as a whole, is in transition from a period of rapid growth to steady-state operations. The program faces challenges of aging equipment and increased operating costs at the same time that national and international earthquake and tsunami monitoring agencies place increased reliance on GSN data. Data acquisition for the USGS GSN is based on the Quanterra Q680 datalogger, a workhorse system that is approaching twenty years in the field, often in harsh environments. An IRIS instrumentation committee recently selected the Quanterra Q330HR as the "next generation" GSN data acquisition system, and the USGS will begin deploying the new equipment in the middle of 2007. These new systems will address many of the issues associated with the aging Q680 while providing a platform for interoperability across the GSN. To address the challenge of increasing operational costs, the USGS employs several tools. First, the USGS benefits from the contributions of local host institutions.
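The availability and latency measures mentioned above can be computed directly from records of received data segments and packet arrival times. The following is a minimal Python sketch under assumed data structures (lists of datetime pairs); the function names are illustrative and not part of any USGS tooling:

```python
from datetime import datetime, timedelta

def availability_pct(segments, window_start, window_end):
    """Percent of a time window covered by received data segments.

    segments: list of (start, end) datetime pairs, assumed non-overlapping.
    """
    window = (window_end - window_start).total_seconds()
    covered = sum(
        max(0.0, (min(end, window_end) - max(start, window_start)).total_seconds())
        for start, end in segments
    )
    return 100.0 * covered / window

def median_latency(arrival_pairs):
    """Median delay, in seconds, between the end time of a data packet
    and its arrival at the collection center.

    arrival_pairs: list of (packet_end_time, arrival_time) datetime pairs.
    """
    delays = sorted((arrival - end).total_seconds() for end, arrival in arrival_pairs)
    n = len(delays)
    mid = n // 2
    return delays[mid] if n % 2 else 0.5 * (delays[mid - 1] + delays[mid])
```

For example, a day in which two six-hour segments were received would score 50% availability; real implementations must additionally merge overlapping segments before summing coverage.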
Station operators are the first line of defense when a station experiences problems, changing boards, swapping cables, and re-centering sensors. To facilitate this effort, the USGS maintains supplies of on-site spares at a number of stations, primarily those with difficult shipping or travel logistics. In addition, the USGS is moving toward the GSN standard of installing a secondary broadband sensor at each site, to serve as a backup in case the primary broadband sensor fails. The recent transition to real-time telemetry has been an enormous boon for station operations as well as for earthquake and tsunami monitoring. For example, the USGS examines waveforms daily for data dropouts (gaps), out-of-nominal-range data values, and overall noise levels. Higher-level quality control focuses on problems in sensitivity, timing, polarity, orientation, and general instrument behavior. These quality control operations are essential for quickly identifying problems with stations, allowing remedial or preventive maintenance that preserves data continuity and quality and minimizes catastrophic station failure or significant data loss. The USGS tracks network performance using a variety of tools. Web pages with plots of waveforms (heliplots), data latency, and data availability provide quick views of station status. The USGS has recently implemented other monitoring tools, such as SeisNetWatch, for evaluating station state of health.
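The daily waveform checks described above (gaps, out-of-nominal-range values, overall noise levels) can be sketched as simple tests on a digitized trace. This is an illustrative example only, not the actual USGS quality-control software; `expected_dt` and `clip_level` stand in for station-specific nominal values:

```python
import numpy as np

def qc_report(t, x, expected_dt, clip_level):
    """Flag gaps and out-of-range samples, and report an overall noise level.

    t: sample times in seconds; x: raw samples in counts.
    expected_dt and clip_level are assumed station-specific constants.
    """
    report = {}
    # Gap check: inter-sample spacing noticeably larger than nominal.
    dt = np.diff(t)
    report["gaps"] = int(np.sum(dt > 1.5 * expected_dt))
    # Range check: samples at or beyond the assumed digitizer clip level.
    report["clipped"] = int(np.sum(np.abs(x) >= clip_level))
    # Noise level: RMS of the demeaned trace as a crude overall proxy.
    report["rms"] = float(np.sqrt(np.mean((x - x.mean()) ** 2)))
    return report
```

In practice the noise-level check would be frequency-dependent (e.g. comparing power spectra against low- and high-noise models) rather than a single RMS value, but the structure of the daily screening is the same.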
- Publication: AGU Spring Meeting Abstracts
- Pub Date: May 2007
- Bibcode: 2007AGUSM.S32A..04G
- Keywords: 7294 Seismic instruments and networks (0935; 3025)