Elsevier

Tectonophysics

Volumes 522–523, 5 February 2012, Pages 89-121
Review Article
Seismicity-based earthquake forecasting techniques: Ten years of progress

https://doi.org/10.1016/j.tecto.2011.08.019

Abstract

Earthquake fault systems interact over a broad spectrum of spatial and temporal scales and, in recent years, studies of the regional seismicity in a variety of regions have produced a number of new techniques for seismicity-based earthquake forecasting. While a wide variety of physical assumptions and statistical approaches are incorporated into the various methodologies, they all endeavor to accurately replicate the statistics and properties of both the historic and instrumental seismic records. As a result, the last ten years have seen significant progress in the field of intermediate- and short-term seismicity-based earthquake forecasting. This progress includes general agreement on the need for prospective testing and successful attempts to standardize both evaluation methods and the appropriate null hypotheses. Here we differentiate the predominant approaches into models based upon techniques for identifying particular physical processes and those that filter, or smooth, the seismicity. Comparison of the methods suggests that while smoothed seismicity models provide improved forecast capability over longer time periods, higher probability gain over shorter time periods is achieved with methods that integrate statistical techniques with our knowledge of the physical process, such as the epidemic-type aftershock sequence (ETAS) model or those related to changes in the b-value. In general, while both classes of seismicity-based forecasts are limited by the relatively short time period available for the instrumental catalog, significant advances have been made in our understanding of both the limitations and potential of seismicity-based earthquake forecasting. There is general agreement that both short-term forecasting, on the order of days to weeks, and longer-term forecasting over five-to-ten year periods, are within reach.
This recent progress serves to illuminate both the critical nature of the different temporal scales intrinsic to the earthquake process and the importance of high quality seismic data for the accurate quantification of time-dependent earthquake hazard.

Introduction

The impact to life and property from large earthquakes is potentially catastrophic. In 2010, the M7.0 earthquake in Haiti became the fifth most deadly earthquake on record, killing more than 200,000 people and resulting in ~$8 billion (USD) in direct damages (Cavallo et al., 2010). Direct economic damage from the M8.8 earthquake that struck Chile in February of 2010 could reach US$30 billion, or 18% of Chile's annual economic output (Kovacs, 2010). As a result of the potential regional and national impact of large earthquakes, research into their prediction has been ongoing on some level for almost 100 years, with intervals marked by optimism, skepticism, and realism (Geller et al., 1997, Jordan, 2006, Kanamori, 1981, Wyss, 1997).

More than ten years ago, this controversy erupted in what has become known in the community as the Nature debates (Nature Debates, Debate on earthquake forecasting, http://www.nature.com/nature/debates/earthquake, 1999; Main, 1999b). Prompted in large part by the apparent lack of success of the Parkfield prediction experiment (see, e.g., Bakun et al., 2005), it ultimately focused on the nature of earthquakes themselves and whether they might be intrinsically unpredictable. While this question has yet to be decided, it marked a turning point in the field of earthquake science, such that earthquake forecasting, or the assessment of time-dependent earthquake hazard, complete with associated probabilities and errors, is now the standard in earthquake predictability research.

At the same time, a wealth of seismicity data at progressively smaller magnitude levels has been collected over the past forty years, in part related to the original goals of efforts such as the Parkfield experiment and in part out of recognition that there is much still to learn about the underlying process, particularly after the Parkfield prediction window passed without an earthquake (Bakun et al., 2005). While it has long been recognized that temporal and spatial clustering is evident in seismicity data, much of the research associated with these patterns in the early years focused on a relatively small fraction of the events, primarily at the larger magnitudes (Kanamori, 1981). Examples include, but are not limited to, characteristic earthquakes and seismic gaps (Bakun et al., 1986, Ellsworth and Cole, 1997, Haberman, 1981, Swan et al., 1980), Mogi donuts and precursory quiescence (Mogi, 1969, Wyss et al., 1996, Yamashita and Knopoff, 1989), temporal clustering (Dodge et al., 1996, Eneva and Ben-Zion, 1997, Frohlich, 1987, Jones and Hauksson, 1997, Press and Allen, 1995), aftershock sequences (Gross and Kisslinger, 1994, Nanjo et al., 1998), stress transfer and earthquake triggering over large distances (Brodsky, 2006, Deng and Sykes, 1996, Gomberg, 1996, King et al., 1994, Pollitz and Sacks, 1997, Stein, 1999), scaling relations (Pacheco et al., 1992, Romanowicz and Rundle, 1993, Rundle, 1989, Saleur et al., 1995), pattern recognition (Keilis-Borok and Kossobokov, 1990, Kossobokov et al., 1999), and time-to-failure analyses (Bowman et al., 1998, Brehm and Braile, 1998, Bufe and Varnes, 1993, Jaumé and Sykes, 1999). Although this body of research represents important attempts to describe these characteristic patterns using empirical probability density functions, it was hampered by the poor statistics associated with the small numbers of moderate-to-large events either available or considered for analysis.

The availability of new, larger data sets, together with the computational advances that facilitate complex time series analysis, including simulations, rigorous statistical tests, and innovative filtering techniques, provided new impetus for earthquake forecasting at a time when the field was apparently polarized on the issue (Nature Debates, Debate on earthquake forecasting, http://www.nature.com/nature/debates/earthquake, Main, 1999b, Jordan, 2006). In 2002, the first prospective forecast using small magnitude earthquake data was published (Rundle et al., 2002), and it was followed by renewed interest in seismicity-based methodologies and a concerted effort aimed at better defining and testing these techniques. Landmark initiatives in the earthquake forecasting validation and testing arena include the working group on Regional Earthquake Likelihood Models (RELM) as well as the Collaboratory for the Study of Earthquake Predictability (CSEP), both founded after 2000 (Field, 2007, Gerstenberger and Rhoades, 2010, Zechar et al., 2010).

Although a suite of potential precursory phenomena exists in addition to those associated with changes in seismicity, including tilt and strain precursors, electromagnetic signals, hydrologic phenomena, and chemical emissions (Scholz, 2002, Turcotte, 1991), we limit our discussion to the predominant seismicity-based forecasting techniques actively researched over the past ten years. Other methods not discussed here include forecasting techniques associated with earthquake interactions, such as precursory seismic velocity changes (e.g., Crampin and Gao, 2010) or stress transfer studies (see King et al., 1994, Stein, 1999; and others).

Here we review the current status of seismicity-based forecasting methodologies and the progress made in that field since the 1999 Nature debate. In the interest of space, we limit the discussion to methodologies that rely on the instrumental catalog for their data source and that attempt to produce forecasts limited in both space and time in some quantifiable manner. As a result, these methods primarily produce intermediate-term forecasts, on the order of years, although we do include a small subset that relies on aftershock statistics to generate short-term forecasts on the order of days. Important discussion exists elsewhere on the appropriate standard for furnishing a testable earthquake forecast (Jackson and Kagan, 2006, Jordan, 2006), as well as the efficacy of various forecast testing methodologies and their evaluation (e.g. Field, 2007, Gerstenberger and Rhoades, 2010, Schorlemmer et al., 2007, Vere-Jones, 1995, Zechar et al., 2010). While there is no attempt here to test the reliability of these forecasting techniques against each other or against a particular null hypothesis with rigorous statistics, in some cases attempts are made to compare to either a Poisson null hypothesis or to a null hypothesis that includes spatial and temporal clustering, as in the case of the relative intensity (RI) forecast model (Holliday et al., 2005) (see Section 3.4) or the ETAS model (see Section 3.3) (e.g. Vere-Jones, 1995). We will discuss briefly any such efforts, or the lack thereof, particularly in those cases where the method has not been formally submitted for independent evaluation.
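To make the comparison against a Poisson null hypothesis concrete, the sketch below scores a gridded forecast against a spatially uniform Poisson model using binned Poisson log-likelihoods and reports a per-earthquake probability gain. This is a generic, minimal illustration of that style of test, not the specific procedures used by RELM or CSEP; the cell rates, counts, and function names are invented for the example.

```python
import math

def poisson_loglik(rates, counts):
    """Log-likelihood of observed event counts per space-time bin,
    assuming each bin is an independent Poisson variable with the
    given expected rate."""
    return sum(n * math.log(r) - r - math.lgamma(n + 1)
               for r, n in zip(rates, counts))

def probability_gain(model_rates, null_rates, counts):
    """Probability gain per earthquake of a forecast over a null
    hypothesis: exp of the log-likelihood difference divided by the
    number of observed events."""
    n_events = sum(counts)
    dll = (poisson_loglik(model_rates, counts)
           - poisson_loglik(null_rates, counts))
    return math.exp(dll / n_events)

# toy example: 4 cells; the model concentrates rate where events occurred
counts = [3, 0, 1, 0]              # observed earthquakes per cell
model = [2.5, 0.5, 0.8, 0.2]       # forecast expected counts per cell
null = [1.0, 1.0, 1.0, 1.0]        # spatially uniform Poisson null
gain = probability_gain(model, null, counts)  # > 1 favors the model
```

A gain above 1 indicates the forecast assigns higher probability to the observed catalog than the uniform null does, on a per-event basis.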

We have separated the methods discussed here into two different categories, although there is some unavoidable overlap. This paper begins with a review of a suite of seismicity-based forecasting methodologies, each of which assumes that a particular physical mechanism is associated with the generation of large earthquakes and their precursors and performs a detailed analysis of the instrumental and/or historic catalog in order to isolate those precursors. We designate these “physical process models” (Section 2). In this subset we also include two techniques which fall slightly outside the parameters outlined above: the characteristic earthquake hypothesis and the accelerated moment release (AMR) hypothesis. While both use a relatively small subset of larger events and are not optimally formulated to produce time- and space-limited forecasts, their undeniable impact on the earthquake forecasting community mandates their inclusion here.

In Section 3 we detail the evolution and current state of smoothed seismicity models. These models primarily apply a series of filtering techniques, often based on knowledge or assumptions about earthquake statistics, to seismicity catalog data in order to forecast on both short and intermediate time scales. We conclude with a short discussion of the limitations and future outlook of seismicity-based forecasting tools.
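The ETAS model is the archetypal statistical clustering model of this kind, and its core object is a conditional intensity: a background rate plus Omori–Utsu aftershock contributions from all earlier events. A minimal, purely temporal version (after Ogata, 1988) can be sketched as follows; the parameter values are arbitrary illustrative choices, not calibrated to any catalog discussed in this review.

```python
import math

def etas_intensity(t, history, mu=0.05, K=0.02, alpha=1.0,
                   c=0.01, p=1.1, m_c=3.0):
    """Conditional intensity of a purely temporal ETAS model:
    a constant background rate mu plus an Omori-Utsu aftershock term
    for every earlier event, scaled exponentially by the event's
    magnitude above the reference magnitude m_c."""
    return mu + sum(K * 10.0 ** (alpha * (m - m_c)) / (t - t_i + c) ** p
                    for t_i, m in history if t_i < t)

# expected rate shortly after a M5.5 event vs. one week later (times in days)
history = [(0.0, 5.5)]                      # (origin time, magnitude)
rate_soon = etas_intensity(0.1, history)    # a few hours after the mainshock
rate_later = etas_intensity(7.0, history)   # one week later
```

The power-law decay in time, modulated by magnitude, is what lets ETAS-type models produce the short-term (days-to-weeks) probability gains discussed in the abstract.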

Section snippets

Physical process models

Physical process models are those in which the precursory process relies on one or more physical mechanisms or phenomena associated with the generation of large events. A detailed analysis, often but not always statistical, is performed on the instrumental and/or historic seismicity in order to isolate those precursors. These techniques are based on the assumption that the seismicity is acting as a sensor for the underlying physical process and can provide information about the spatial and…
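One quantity commonly tracked in this class of methods is the b-value of the Gutenberg–Richter relation, whose temporal or spatial changes are treated as a possible precursory signal (as noted in the abstract). A minimal sketch of Aki's (1965) maximum-likelihood estimator, applied to a synthetic catalog, is shown below; the completeness magnitude and the catalog itself are illustrative assumptions, not data from the studies reviewed here.

```python
import math
import random

def b_value_mle(magnitudes, m_c, bin_width=0.1):
    """Aki (1965) maximum-likelihood b-value for events at or above the
    completeness magnitude m_c, with Utsu's correction for magnitude
    binning (use bin_width = 0 for continuous magnitudes)."""
    mags = [m for m in magnitudes if m >= m_c]
    mean_m = sum(mags) / len(mags)
    return math.log10(math.e) / (mean_m - (m_c - bin_width / 2.0))

# synthetic catalog from a Gutenberg-Richter law with b = 1.0:
# magnitudes above m_c are exponentially distributed with rate b * ln(10)
random.seed(0)
catalog = [2.0 + random.expovariate(1.0 * math.log(10))
           for _ in range(5000)]
b_hat = b_value_mle(catalog, m_c=2.0, bin_width=0.0)  # should be near 1.0
```

In precursor studies, this estimator would be applied in sliding space-time windows, with significance of any b-value change judged against its sampling uncertainty.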

Smoothed seismicity models

Smoothed seismicity models are a more general class of seismicity-based forecasting models which define the important physical spatio-temporal features of earthquake processes, characterize these features in a mathematical and/or probabilistic manner, and then calibrate the model based on data available from seismic catalogs for particular tectonic regions. Although the classic versions did not include specific geologic or tectonic information, a few of the newer methods discussed below attempt…
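The smoothing step common to this class of models can be illustrated with a minimal sketch: past epicenters are convolved with a fixed-bandwidth Gaussian kernel to produce a relative rate map, in the spirit of Frankel-type smoothed seismicity. The coordinates, bandwidth, and planar-distance simplification below are illustrative assumptions, not taken from any specific model reviewed here.

```python
import math

def smoothed_rate_map(epicenters, grid_nodes, sigma_km=15.0):
    """Smooth past epicenters onto grid nodes with a fixed-bandwidth
    Gaussian kernel, returning an unnormalized relative rate per node.
    Coordinates are treated as planar kilometres for simplicity."""
    rates = []
    for gx, gy in grid_nodes:
        r = sum(math.exp(-((gx - ex) ** 2 + (gy - ey) ** 2)
                         / (2.0 * sigma_km ** 2))
                for ex, ey in epicenters)
        rates.append(r)
    return rates

# toy example: a small cluster of epicenters near the origin
quakes = [(0.0, 0.0), (5.0, -3.0), (-2.0, 4.0)]
nodes = [(0.0, 0.0), (50.0, 50.0)]   # one node inside, one far away
rates = smoothed_rate_map(quakes, nodes)
# the node inside the cluster receives a much higher relative rate
```

In a full forecast, the smoothed map would be normalized to a total expected rate and combined with a magnitude-frequency distribution for each cell.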

Conclusions

Recent developments in the fields of statistical seismology, in conjunction with the availability of large quantities of seismic data at smaller scales and computational advances, have improved significantly our understanding of time-dependent earthquake processes. As a result, the last ten years have seen significant progress in the field of intermediate- and short-term seismicity-based earthquake forecasting. These seismicity-based forecasting techniques can be differentiated into models…

Acknowledgments

The work of K. F. Tiampo was supported by the NSERC and Aon Benfield/ICLR Industrial Research Chair in Earthquake Hazard Assessment. The work of R. Shcherbakov was supported by an NSERC Discovery Grant 355632–2008. Several images were plotted with the help of GMT software developed and supported by Paul Wessel and Walter H.F. Smith.

References (360)

  • C.R. Allen, The tectonic environments of seismically active and inactive areas along the San Andreas fault system.
  • E.I. Alves, Earthquake forecasting using neural nets, Nonlinear Dynamics (2006).
  • J.G. Anderson et al., Past and possible future earthquakes of significance to the San Diego region, Earthquake Spectra (1989).
  • R. Arrowsmith et al., Potential for earthquake rupture and M 7 earthquakes along the Parkfield, Cholame, and Carrizo segments of the San Andreas Fault, Seismological Research Letters (1997).
  • W.H. Bakun et al., The Parkfield, California, earthquake prediction experiment, Science (1985).
  • W.H. Bakun et al., Recurrence models and Parkfield, California, earthquakes, Journal of Geophysical Research (1984).
  • W.H. Bakun et al., Seismic slip, aseismic slip, and the mechanics of repeating earthquakes on the Calaveras fault, California, in Earthquake Source Mechanics.
  • W.H. Bakun et al., Implications for prediction and hazard assessment from the 2004 Parkfield earthquake, Nature (2005).
  • Y. Ben-Zion et al., Accelerated seismic release and related aspects of seismicity patterns on earthquake faults, Pure and Applied Geophysics (2002).
  • G. Biasi et al., Estimating surface rupture length and magnitude of paleoearthquakes from point measurements of rupture displacement, Bulletin of the Seismological Society of America (2006).
  • G.P. Biasi et al., Paleoseismic event dating and the conditional probability of large earthquakes on the southern San Andreas fault, California, Bulletin of the Seismological Society of America (2002).
  • P. Bird et al., Plate-tectonic analysis of shallow seismicity: apparent boundary width, beta, corner magnitude, coupled lithosphere thickness, and coupling in seven tectonic settings, Bulletin of the Seismological Society of America (2004).
  • D.M. Boore et al., Equations for estimating horizontal response spectra and peak acceleration from western North America earthquakes: a summary of recent work, Seismological Research Letters (1997).
  • D.D. Bowman et al., Accelerating seismicity and stress accumulation before large earthquakes, Geophysical Research Letters (2001).
  • D.D. Bowman et al., Intermittent criticality and the Gutenberg–Richter distribution, Pure and Applied Geophysics (2004).
  • D.D. Bowman et al., An observational test of the critical earthquake concept, Journal of Geophysical Research (1998).
  • D.J. Brehm et al., Intermediate-term earthquake prediction using precursory events in the New Madrid Seismic Zone, Bulletin of the Seismological Society of America (1998).
  • E.E. Brodsky, Long-range triggered earthquakes that continue after the wave train passes, Geophysical Research Letters (2006).
  • C.G. Bufe et al., Predictive modeling of the seismic cycle of the greater San Francisco Bay region, Journal of Geophysical Research (1993).
  • K.P. Burnham et al., Model Selection and Multimodel Inference: A Practical Information-Theoretic Approach (2002).
  • L. Cao et al., Forecasting b-values for seismic events, International Journal of Bifurcation and Chaos (1996).
  • T.Q. Cao et al., The Revised 2002 California Probabilistic Seismic Hazard Maps.
  • T.Q. Cao et al., Model uncertainties of the 2002 update of California seismic hazard maps, Bulletin of the Seismological Society of America (2005).
  • E. Cavallo et al., Estimating the direct economic damage of the earthquake in Haiti.
  • CEPEC Report, Report to the Director, Governor's Office of Emergency Services by the California Earthquake Prediction Evaluation Council, March 2, 2004.
  • CEPEC Report, Report to the Director, Governor's Office of Emergency Services by the California Earthquake Prediction Evaluation Council, December 9, 2004.
  • W.L. Chang et al., Integrated seismic-hazard analysis of the Wasatch Front, Utah, Bulletin of the Seismological Society of America (2002).
  • S.F.M. Chastin et al., Statistical analysis of daily seismic event rate as a precursor to volcanic eruptions, Geophysical Research Letters (2003).
  • C.C. Chen et al., An improved region–time–length algorithm applied to the 1999 Chi-Chi, Taiwan earthquake, Geophysical Journal International (2006).
  • C. Chen et al., The 1999 Chi-Chi, Taiwan, earthquake as a typical example of seismic activation and quiescence, Geophysical Research Letters (2005).
  • A. Christophersen et al., Foreshock rates from aftershock abundance, Bulletin of the Seismological Society of America (2008).
  • F.R. Cinti et al., Probability map of the next M ≥ 5.5 earthquakes in Italy, Geochemistry, Geophysics, Geosystems (2004).
  • E.S. Cochran et al., Earth tides can trigger shallow thrust fault earthquakes, Science (2004).
  • R. Console et al., A simple and testable model for earthquake clustering, Journal of Geophysical Research (2001).
  • R. Console et al., Refining earthquake clustering models, Journal of Geophysical Research (2003).
  • R. Console et al., Comparative performance of time-invariant, long-range and short-range forecasting models on the earthquake catalogue of Greece, Journal of Geophysical Research (2006).
  • R. Console et al., Clustering model constrained by the rate-and-state constitutive law: comparison with a purely stochastic ETAS model, Seismological Research Letters (2007).
  • R. Console et al., Probability gains of an epidemic-type aftershock sequence model in retrospective forecasting of M ≥ 5 earthquakes in Italy, Journal of Seismology (2010).
  • D.R. Cox, Regression models and life-tables (with discussion), Journal of the Royal Statistical Society, Series B (1972).
  • S. Crampin et al., Earthquakes can be stress-forecast, Geophysical Journal International (2010).