Review Article
Seismicity-based earthquake forecasting techniques: Ten years of progress
Introduction
The impact on life and property from large earthquakes is potentially catastrophic. In 2010, the M7.0 earthquake in Haiti became the fifth most deadly earthquake on record, killing more than 200,000 people and resulting in approximately US$8 billion in direct damages (Cavallo et al., 2010). Direct economic damage from the M8.8 earthquake that struck Chile in February of 2010 could reach US$30 billion, or 18% of Chile's annual economic output (Kovacs, 2010). As a result of the potential regional and national impact of large earthquakes, research into their prediction has been ongoing on some level for almost 100 years, with intervals marked by optimism, skepticism, and realism (Geller et al., 1997, Jordan, 2006, Kanamori, 1981, Wyss, 1997).
More than ten years ago, this controversy erupted in what has become known in the community as the Nature debates (Nature Debates, Debate on earthquake forecasting, http://www.nature.com/nature/debates/earthquake, 1999; Main, 1999b). Prompted in large part by the apparent lack of success of the Parkfield prediction experiment (see, e.g., Bakun et al., 2005), the debate ultimately focused on the nature of earthquakes themselves and whether they might be intrinsically unpredictable. While this question has yet to be decided, it marked a turning point in earthquake science: today, earthquake forecasting, or the assessment of time-dependent earthquake hazard complete with associated probabilities and errors, is the standard in earthquake predictability research.
At the same time, a wealth of seismicity data at progressively smaller magnitude levels has been collected over the past forty years, in part related to the original goals of efforts such as the Parkfield experiment and in part out of recognition that there is much still to learn about the underlying process, particularly after the Parkfield prediction window passed without an earthquake (Bakun et al., 2005). While it has long been recognized that temporal and spatial clustering is evident in seismicity data, much of the research associated with these patterns in the early years focused on a relatively small fraction of the events, primarily at the larger magnitudes (Kanamori, 1981). Examples include, but are not limited to, characteristic earthquakes and seismic gaps (Bakun et al., 1986, Ellsworth and Cole, 1997, Haberman, 1981, Swan et al., 1980), Mogi donuts and precursory quiescence (Mogi, 1969, Wyss et al., 1996, Yamashita and Knopoff, 1989), temporal clustering (Dodge et al., 1996, Eneva and Ben-Zion, 1997, Frohlich, 1987, Jones and Hauksson, 1997, Press and Allen, 1995), aftershock sequences (Gross and Kisslinger, 1994, Nanjo et al., 1998), stress transfer and earthquake triggering over large distances (Brodsky, 2006, Deng and Sykes, 1996, Gomberg, 1996, King et al., 1994, Pollitz and Sacks, 1997, Stein, 1999), scaling relations (Pacheco et al., 1992, Romanowicz and Rundle, 1993, Rundle, 1989, Saleur et al., 1995), pattern recognition (Keilis-Borok and Kossobokov, 1990, Kossobokov et al., 1999), and time-to-failure analyses (Bowman et al., 1998, Brehm and Braile, 1998, Bufe and Varnes, 1993, Jaumé and Sykes, 1999). Although this body of research represents important attempts to describe these characteristic patterns using empirical probability density functions, it was hampered by the poor statistics associated with the small numbers of moderate-to-large events either available or considered for analysis.
The availability of new, larger data sets and the computational advancements that facilitate complex time series analysis, including simulations, rigorous statistical tests, and innovative filtering techniques, provided new impetus for earthquake forecasting at a time when the field was apparently polarized on the issue (Nature Debates, Debate on earthquake forecasting, http://www.nature.com/nature/debates/earthquake, 1999; Main, 1999b; Jordan, 2006). In 2002, the first prospective forecast using small-magnitude earthquake data was published (Rundle et al., 2002), and it was followed by renewed interest in seismicity-based methodologies, engendering a concerted effort to better define and test these techniques. Landmark initiatives in the earthquake forecasting validation and testing arena include the working group on Regional Earthquake Likelihood Models (RELM) as well as the Collaboratory for the Study of Earthquake Predictability (CSEP), both founded after 2000 (Field, 2007, Gerstenberger and Rhoades, 2010, Zechar et al., 2010).
Although a suite of potential precursory phenomena exists in addition to those associated with changes in seismicity, including tilt and strain precursors, electromagnetic signals, hydrologic phenomena, and chemical emissions (Scholz, 2002, Turcotte, 1991), we limit our discussion to the predominant seismicity-based forecasting techniques actively researched over the past ten years. Other methods not discussed here include precursory seismic velocity changes (e.g., Crampin and Gao, 2010) and forecasting techniques associated with earthquake interactions, such as stress transfer studies (see King et al., 1994, Stein, 1999; and others).
Here we review the current status of seismicity-based forecasting methodologies and the progress made in the field since the 1999 Nature debates. In the interest of space, we limit the discussion to methodologies that rely on the instrumental catalog for their data source and that attempt to produce forecasts bounded in both space and time in some quantifiable manner. As a result, these methods primarily produce intermediate-term forecasts, on the order of years, although we do include a small subset that relies on aftershock statistics to generate short-term forecasts on the order of days. Important discussion exists elsewhere on the appropriate standard for furnishing a testable earthquake forecast (Jackson and Kagan, 2006, Jordan, 2006), as well as on the efficacy of various forecast testing methodologies and their evaluation (e.g., Field, 2007, Gerstenberger and Rhoades, 2010, Schorlemmer et al., 2007, Vere-Jones, 1995, Zechar et al., 2010). While we make no attempt here to test the reliability of these forecasting techniques against each other or against a particular null hypothesis with rigorous statistics, in some cases comparisons have been made either to a Poisson null hypothesis or to a null hypothesis that includes spatial and temporal clustering, as in the case of the relative intensity (RI) forecast model (Holliday et al., 2005) (see Section 3.4) or the ETAS model (see Section 3.3) (e.g., Vere-Jones, 1995). We briefly discuss any such efforts, or the lack thereof, particularly in those cases where a method has not been formally submitted for independent evaluation.
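To make the notion of a Poisson null hypothesis concrete, the following is a minimal sketch, assuming a hypothetical gridded rate forecast with purely illustrative numbers. It computes a joint Poisson log-likelihood in the spirit of the RELM-style likelihood tests (e.g., Schorlemmer et al., 2007); it is not a reproduction of any specific published test, and the grid, rates, and observed counts are placeholders.

```python
import numpy as np
from scipy.stats import poisson

# Hypothetical gridded forecast: expected number of target events per
# cell over the forecast interval (values are illustrative only).
forecast_rates = np.array([0.02, 0.15, 0.07, 0.30, 0.01])

# Observed event counts in the same cells over the same interval.
observed = np.array([0, 1, 0, 2, 0])

# Joint log-likelihood of the observations under the forecast,
# treating each cell as an independent Poisson process.
ll_forecast = poisson.logpmf(observed, forecast_rates).sum()

# Spatially uniform Poisson null hypothesis with the same total rate.
null_rates = np.full_like(forecast_rates, forecast_rates.sum() / forecast_rates.size)
ll_null = poisson.logpmf(observed, null_rates).sum()

# A positive difference means the forecast explains the observed
# seismicity better than the spatially uniform null.
print(f"log-likelihood gain over uniform Poisson null: {ll_forecast - ll_null:.3f}")
```

A clustering null hypothesis, such as ETAS, replaces the uniform rates with rates conditioned on past seismicity, but the likelihood comparison proceeds in the same way.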
We have separated the methods discussed here into two categories, although there is some unavoidable overlap. This paper begins with a review of a suite of seismicity-based forecasting methodologies, each of which assumes that a particular physical mechanism is associated with the generation of large earthquakes and their precursors, and performs a detailed analysis of the instrumental and/or historic catalog in order to isolate those precursors. We designate these “physical process models” (Section 2). In this subset we also include two techniques that fall slightly outside the parameters outlined above: the characteristic earthquake hypothesis and the accelerated moment release (AMR) hypothesis. While both use a relatively small subset of larger events and are not optimally formulated to produce time- and space-limited forecasts, their undeniable impact on the earthquake forecasting community mandates their inclusion here.
In Section 3 we detail the evolution and current state of smoothed seismicity models. These models primarily apply a series of filtering techniques, often based on knowledge or assumptions about earthquake statistics, to seismicity catalog data in order to forecast on both short and intermediate time scales. We conclude with a short discussion of the limitations and future outlook of seismicity-based forecasting tools.
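As a concrete illustration of the filtering step just described, the sketch below applies an isotropic Gaussian kernel to a handful of epicenters to produce a gridded relative rate map. This is one common ingredient of smoothed seismicity models, not the method of any particular paper; the catalog, grid extent, and bandwidth are assumed placeholders.

```python
import numpy as np

# Hypothetical catalog epicenters (longitude, latitude in degrees); a
# real application would use an instrumental catalog above its
# magnitude of completeness.
epicenters = np.array([[-120.1, 35.8], [-120.3, 35.9], [-119.9, 36.1]])

# Regular forecast grid (0.1-degree spacing, illustrative extent).
lons = np.arange(-121.0, -119.0, 0.1)
lats = np.arange(35.0, 37.0, 0.1)
grid_lon, grid_lat = np.meshgrid(lons, lats)

# Isotropic Gaussian kernel; the bandwidth (in degrees) is an assumed
# tuning parameter, typically calibrated for each tectonic region.
sigma = 0.2

rate = np.zeros_like(grid_lon)
for lon, lat in epicenters:
    d2 = (grid_lon - lon) ** 2 + (grid_lat - lat) ** 2
    rate += np.exp(-d2 / (2.0 * sigma ** 2))

# Normalize so the map sums to the observed event count, giving a
# relative spatial density that can be scaled by a regional rate.
rate *= len(epicenters) / rate.sum()
```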
Section snippets
Physical process models
Physical process models are those in which the precursory process relies on one or more physical mechanisms or phenomena associated with the generation of large events. A detailed analysis, often but not always statistical, is performed on the instrumental and/or historic seismicity in order to isolate those precursors. These techniques are based on the assumption that the seismicity is acting as a sensor for the underlying physical process and can provide information about the spatial and …
Smoothed seismicity models
Smoothed seismicity models are a more general class of seismicity-based forecasting models which define the important physical spatio-temporal features of earthquake processes, characterize these features in a mathematical and/or probabilistic manner, and then calibrate the model based on data available from seismic catalogs for particular tectonic regions. Although the classic versions did not include specific geologic or tectonic information, a few of the newer methods discussed below attempt …
Conclusions
Recent developments in the fields of statistical seismology, in conjunction with the availability of large quantities of seismic data at smaller scales and computational advances, have improved significantly our understanding of time-dependent earthquake processes. As a result, the last ten years have seen significant progress in the field of intermediate- and short-term seismicity-based earthquake forecasting. These seismicity-based forecasting techniques can be differentiated into models …
Acknowledgments
The work of K. F. Tiampo was supported by the NSERC and Aon Benfield/ICLR Industrial Research Chair in Earthquake Hazard Assessment. The work of R. Shcherbakov was supported by an NSERC Discovery Grant 355632–2008. Several images were plotted with the help of GMT software developed and supported by Paul Wessel and Walter H.F. Smith.
References (360)
- A probabilistic neural network for earthquake magnitude prediction. Neural Networks (2009)
- Lateral inhomogeneities in the upper mantle. Tectonophysics (1965)
- Physical and stochastic models of earthquake clustering. Tectonophysics (2006)
- Seismicity patterns before the M=5.8 2002, Palermo, Italy earthquake: seismic quiescence and accelerating seismicity. Tectonophysics (2004)
- Statistical occurrence analysis and spatio-temporal distribution of earthquakes in the Apennines, Italy. Tectonophysics (2007)
- On the spatio-temporal distribution of M7.0+ worldwide seismicity. Tectonophysics (2008)
- Statistical analysis of the central-Europe seismicity. Tectonophysics (2009)
- A new look at the statistical model identification. IEEE Transactions on Automatic Control (1974)
- Maximum likelihood estimate of b in the formula log N = a − bM and its confidence limits. Bulletin of the Earthquake Research Institute (1965)
- HAZGRIDX: earthquake forecasting model for ML ≥ 5.0 earthquakes in Italy based on spatially smoothed seismicity. Annals of Geophysics (2010)
- The tectonic environments of seismically active and inactive areas along the San Andreas fault system
- Earthquake forecasting using neural nets. Nonlinear Dynamics
- Past and possible future earthquakes of significance to the San Diego region. Earthquake Spectra
- Potential for earthquake rupture and M 7 earthquakes along the Parkfield, Cholame, and Carrizo segments of the San Andreas Fault. Seismological Research Letters
- The Parkfield, California, earthquake prediction experiment. Science
- Recurrence models and Parkfield, California, earthquakes. Journal of Geophysical Research
- Seismic slip, aseismic slip, and the mechanics of repeating earthquakes on the Calaveras fault, California. In: Earthquake Source Mechanics
- Implications for prediction and hazard assessment from the 2004 Parkfield earthquake. Nature
- Accelerated seismic release and related aspects of seismicity patterns on earthquake faults. Pure and Applied Geophysics
- Estimating surface rupture length and magnitude of paleoearthquakes from point measurements of rupture displacement. Bulletin of the Seismological Society of America
- Paleoseismic event dating and the conditional probability of large earthquakes on the southern San Andreas fault, California. Bulletin of the Seismological Society of America
- Plate-tectonic analysis of shallow seismicity: apparent boundary width, beta, corner magnitude, coupled lithosphere thickness, and coupling in seven tectonic settings. Bulletin of the Seismological Society of America
- Equations for estimating horizontal response spectra and peak acceleration from western North America earthquakes: a summary of recent work. Seismological Research Letters
- Accelerating seismicity and stress accumulation before large earthquakes. Geophysical Research Letters
- Intermittent criticality and the Gutenberg–Richter distribution. Pure and Applied Geophysics
- An observational test of the critical earthquake concept. Journal of Geophysical Research
- Intermediate-term earthquake prediction using precursory events in the New Madrid Seismic Zone. Bulletin of the Seismological Society of America
- Long-range triggered earthquakes that continue after the wave train passes. Geophysical Research Letters
- Predictive modeling of the seismic cycle of the greater San Francisco Bay region. Journal of Geophysical Research
- Model Selection and Multimodel Inference: A Practical Information-Theoretic Approach
- Forecasting b-values for seismic events. International Journal of Bifurcation and Chaos
- The Revised 2002 California Probabilistic Seismic Hazard Maps
- Model uncertainties of the 2002 update of California seismic hazard maps. Bulletin of the Seismological Society of America
- Estimating the direct economic damage of the earthquake in Haiti
- Report to the Director, Governor's Office of Emergency Services by the California Earthquake Prediction Evaluation Council, March 2, 2004
- Report to the Director, Governor's Office of Emergency Services by the California Earthquake Prediction Evaluation Council, December 9, 2004
- Integrated seismic-hazard analysis of the Wasatch Front, Utah. Bulletin of the Seismological Society of America
- Statistical analysis of daily seismic event rate as a precursor to volcanic eruptions. Geophysical Research Letters
- An improved region–time–length algorithm applied to the 1999 Chi-Chi, Taiwan earthquake. Geophysical Journal International
- The 1999 Chi-Chi, Taiwan, earthquake as a typical example of seismic activation and quiescence. Geophysical Research Letters
- Foreshock rates from aftershock abundance. Bulletin of the Seismological Society of America
- Probability map of the next M ≥ 5.5 earthquakes in Italy. Geochemistry, Geophysics, Geosystems
- Earth tides can trigger shallow thrust fault earthquakes. Science
- A simple and testable model for earthquake clustering. Journal of Geophysical Research
- Refining earthquake clustering models. Journal of Geophysical Research
- Comparative performance of time-invariant, long-range and short-range forecasting models on the earthquake catalogue of Greece. Journal of Geophysical Research
- Clustering model constrained by the rate-and-state constitutive law: comparison with a purely stochastic ETAS model. Seismological Research Letters
- Probability gains of an epidemic-type aftershock sequence model in retrospective forecasting of M ≥ 5 earthquakes in Italy. Journal of Seismology
- Regression models and life-tables (with discussion). Journal of the Royal Statistical Society, Series B
- Earthquakes can be stress-forecast. Geophysical Journal International