We now have good knowledge of the locations of major active faults around the world, but the time of an event remains hard, if not impossible, to predict. In some cases a fault produces a series of earthquakes of similar size at regular time intervals. These are called characteristic earthquakes and are studied in detail.
One example of a characteristic earthquake sequence is the series of earthquakes along the San Andreas fault system in California, close to the village of Parkfield. Three instrumentally recorded earthquakes of magnitude 6 occurred in this region in the 20th century, in 1922, 1934 and 1966, with an average recurrence time of 22 years. Based on these observations and on reports of felt earthquakes since 1857, assuming a strictly periodic process and treating the 1934 event as an anomaly, a prediction was issued in 1985 that the next earthquake in the series would occur before 1993 [1]. The Parkfield prediction was the only scientific earthquake prediction officially recognized by the United States government [2]. The predicted earthquake finally happened in 2004. Although the timing was wrong, the size of the earthquake was comparable to the earlier events.
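To illustrate the arithmetic behind the prediction, the short Python sketch below assumes the commonly cited Parkfield sequence years (1857, 1881, 1901, 1922, 1934 and 1966) and a strictly periodic process; it is only a back-of-the-envelope version of the recurrence model of reference [1], not the model itself.

```python
# Back-of-the-envelope recurrence estimate for the Parkfield sequence.
# The event years are assumed here; see [1] for the actual recurrence model.
years = [1857, 1881, 1901, 1922, 1934, 1966]

intervals = [b - a for a, b in zip(years, years[1:])]        # [24, 20, 21, 12, 32]
mean_interval = sum(intervals) / len(intervals)              # (1966 - 1857) / 5 = 21.8 yr
print(f"mean recurrence: {mean_interval:.1f} yr")
print(f"naive next event: {years[-1] + mean_interval:.0f}")  # ~1988, hence 'before 1993'
```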
The challenge is to find out how well this latest event compares to the previous ones, how characteristic the Parkfield events are, and whether there are any precursors that could be used in a possible prediction. Local and regional stations, at epicentral distances < 1000 km, were used to compare the 1934, 1966 and 2004 records; for the 1922 event only very few records are usable. A distinct difference in waveform was observed between the 1934/1966 recordings and the 2004 record [5]. The differences can be explained by details of the rupture process: in the older events the rupture propagated from north-west to south-east, in the 2004 event in the opposite direction. Recordings from stations at larger, teleseismic distances are not affected by these details and are therefore more suitable for comparing the general features of the individual earthquakes.
Station DBN
A comparison of the source mechanism and location of the latest event in the Parkfield series with the earlier earthquakes requires a well-calibrated seismograph station with continuous recordings over a long period. Preferably, the same type of seismometer should have been in operation the whole time. Seismograph station DBN (De Bilt, the Netherlands) appears to be the only station outside the US that recorded the events of 1922, 1934 and 1966 on the same instrument, and it therefore plays an important role in the Parkfield experiment [1]. The 2004 event was recorded at the same site, but with a different, digital instrument.
Seismograph station DBN has been in operation since 1908. Most of the analogue data collected since that time are archived and made available on request. Scans of DBN records for selected events are available on-line through the SISMOS project [6]. Different seismographs were in operation over the years, but the most important DBN dataset is the continuous archive of seismograms from a set of long-period Galitzin seismometers. The two horizontal seismometers have been in operation since 1914, the vertical component since 1922. Recording is on photographic paper, three records per day. At the end of 1994 the Galitzin was replaced by a digital long-period instrument operating in a broad frequency range that includes the frequency band of the Galitzin.
Waveform comparison
Station DBN recorded the Parkfield earthquakes of 1922, 1934 and 1966 on the horizontal components of the Galitzin seismograph; unfortunately, the vertical component was installed a few months after the first event. These recordings, at an epicentral distance of 8900 km, show a striking resemblance, suggesting that the sources ruptured the same part of the fault system. To quantify this resemblance, the analogue records were scanned and digitized, and the resulting waveforms, together with the digital records of the 2004 event, were correlated [4]. Because the records show no clear body waves, due to the focal mechanism and the shallow depth, but well-developed surface waves instead, a new digitization procedure was developed that allows long time series to be sampled accurately. Corrections were made, for example, for the uneven rotation of the recording drum.
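The correlation measure itself is straightforward. The sketch below, in Python with NumPy, reduces the digitization corrections to a simple interpolation onto a uniform time base and computes the maximum normalized cross-correlation over all lags; the function names are illustrative and do not correspond to the procedure actually used in the study.

```python
import numpy as np

def resample_uniform(t, x, fs):
    """Interpolate a trace with uneven sample times (e.g. caused by uneven
    rotation of the recording drum) onto a uniform time base at rate fs."""
    t_uniform = np.arange(t[0], t[-1], 1.0 / fs)
    return np.interp(t_uniform, t, x)

def ncc(x, y):
    """Maximum normalized cross-correlation coefficient over all lags;
    values near 1 indicate nearly identical waveforms."""
    x = (x - x.mean()) / (x.std() * len(x))
    y = (y - y.mean()) / y.std()
    return np.max(np.correlate(x, y, mode="full"))
```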
The recordings of the 2004 Parkfield earthquake were made on a different type of instrument, so a simulation of the original Galitzin seismometer was required before waveforms could be compared. This process consists of removing the response of the new seismometer from the recorded time series, followed by a convolution of the resulting signal with the response of the Galitzin seismometer. A set of recursive time-domain filters was designed to implement the simulation.
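A schematic version of this simulation is sketched below, using SciPy's bilinear transform to obtain the recursive filters. The pole-zero model and the nominal Galitzin constants (12 s pendulum and galvanometer periods, near-critical damping) are assumptions for illustration only, not the calibrated station constants used in the study.

```python
import numpy as np
from scipy import signal

fs = 10.0  # sampling rate of the digitized records in Hz (illustrative)

def oscillator_poles(T, h):
    """Poles of a damped harmonic oscillator: s^2 + 2*h*w0*s + w0^2."""
    w0 = 2.0 * np.pi / T
    return np.roots([1.0, 2.0 * h * w0, w0 * w0])

# Pendulum and galvanometer of a Galitzin-type system, each modelled as a
# damped oscillator; T = 12 s and near-critical damping (h = 1) are assumed
# nominal values for this sketch.
poles = np.concatenate([oscillator_poles(12.0, 1.0), oscillator_poles(12.0, 1.0)])
zeros = np.zeros(3)  # three zeros at the origin (galvanometric recording, assumed)

# The bilinear transform turns the analogue response into the recursive (IIR)
# time-domain filter mentioned in the text.
zd, pd, kd = signal.bilinear_zpk(zeros, poles, 1.0, fs)
b_gal, a_gal = signal.zpk2tf(zd, pd, kd)

def simulate_galitzin(trace, b_inv, a_inv):
    """Deconvolve the modern instrument response (inverse filter supplied by
    the caller; in practice this step is stabilized, e.g. with a water level)
    and convolve the result with the simulated Galitzin response."""
    ground_motion = signal.lfilter(b_inv, a_inv, trace)
    return signal.lfilter(b_gal, a_gal, ground_motion)
```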
A successful simulation requires adequate knowledge of the amplitude and phase characteristics of both the new and the old system. The Galitzin sensor was calibrated on a regular basis between 1914 and 1932, and less regularly in the period until 1945. These measurements allow an estimate of the error in the amplitude and phase information; most important for this study is the estimated error of 10% in the magnification of the Galitzin. Both the old and the new seismograph have been described in detail by Dost and Haak [3]. The new seismograph was calibrated a few days after the 2004 earthquake.
Apart from the four Parkfield earthquakes, three additional earthquakes were digitized (Figure 1). These events were selected on two criteria: a) an epicentre in the same region but with different source characteristics, and b) an epicentre on the same fault system with the same source characteristics, but outside the Parkfield section. Correlations of the surface wave trains of all these recordings (4 Parkfield and 3 non-Parkfield) show that the horizontal components of the Parkfield events have a normalized correlation coefficient (NCC) > 0.8, which distinguishes them from the other events, with 0.3 < NCC < 0.50 [4].
High normalized correlations imply that the recorded phase information of the surface waves from the Parkfield events is similar and therefore that the sources are co-located. From an evaluation of the dominant frequency of the surface waves, a maximum source separation of 10-25 km was estimated. A comparison of the amplitudes shows that the events are similar in size within the accuracy of the calibration (Figure 2). These observations further support the idea of a characteristic earthquake occurring along the San Andreas fault system near Parkfield.
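The separation estimate follows from the wavelength of the dominant surface waves: two sources can only produce nearly identical phase at a teleseismic station if their separation is a small fraction, roughly a quarter, of that wavelength. The numbers below are illustrative (a ~25 s dominant period and a ~4 km/s phase velocity are assumed, not taken from the study), but they reproduce the order of magnitude:

```python
import math

T = 25.0                            # assumed dominant surface-wave period (s)
c = 4.0                             # assumed phase velocity (km/s)
wavelength = c * T                  # 100 km
max_separation = wavelength / 4.0   # 25 km, the order of the 10-25 km estimate

# A 10% calibration error in magnification translates into magnitude as
dM = math.log10(1.10)               # ~0.04 magnitude units
print(wavelength, max_separation, round(dM, 3))
```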
As a spin-off of this research, the high correlation of the recorded waveforms with those of more recent events allows a better reconstruction of the origin time of the oldest, 1922, Parkfield event, which is poorly known. The correlated waveforms arrive at DBN, on average, 40 minutes and 1 s after their origin time. For the 1922 event a deviation of 12 s was found, and it is suggested that the published origin time be corrected by the same amount.
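The correction itself is simple arrival-time arithmetic; a minimal sketch, with a purely hypothetical arrival time for the 1922 event:

```python
from datetime import datetime, timedelta

mean_delay = timedelta(minutes=40, seconds=1)  # mean waveform delay at DBN

def estimate_origin(arrival):
    """Origin time inferred from the arrival of the correlated waveform
    window, assuming the mean delay observed for the other events."""
    return arrival - mean_delay

arrival_1922 = datetime(1922, 3, 10, 12, 1, 30)  # hypothetical timestamp
print(estimate_origin(arrival_1922))
# A 12 s difference from the published origin time motivates the correction.
```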
Discussion
Understanding the behaviour of seismic activity along an active fault like the San Andreas fault system is a first step towards prediction. The Parkfield experiment shows that even where a characteristic earthquake occurs on a limited section of a well-studied fault, predicting its time of occurrence is cumbersome. Speculations on the reasons for this unsuccessful prediction include the effect of several large earthquakes in the neighbourhood, which influenced the stress pattern around Parkfield, and the assumptions behind the repeat-time models [5].
Although the area was intensely monitored, no significant precursors were detected for the 2004 event. Both the 1934 and 1966 earthquakes were preceded by a magnitude 5 foreshock 17 minutes before the mainshock; since the 2004 event was not, such foreshocks are apparently not a characteristic feature of the Parkfield earthquakes.
Conclusion
Since the recurrence times of moderate and large earthquakes along an active fault are usually of the order of decades or more, the use of historical data is essential in unravelling long-term fault behaviour. Digital seismology only really started in the 1980s, while analogue seismic records have been available since the end of the 19th century. Historical archives thus contain valuable information for the assessment of seismic hazard, and only recently have efforts been made to coordinate access to these data through the scanning and digitization of the analogue records [6]. Continued long-term monitoring, in combination with further development of seismicity models, allows the testing of hypotheses and provides further insight into the earthquake generation process.
References
[1] Bakun, W.H. and T.V. McEvilly, 1984. Recurrence Models and Parkfield, California, Earthquakes. J. Geophys. Res., 89, 3051-3058.
[2] Bakun, W.H., B. Aagaard, B. Dost, W.L. Ellsworth, J.L. Hardebeck, R.A. Harris, C. Li, M.J.S. Johnston, J. Langbein, J.J. Lienkaemper, A.J. Michael, J.R. Murray, R.M. Nadeau, P.A. Reasenberg, M.S. Reichle, E.A. Roeloffs, A. Shakal, R.W. Simpson and F. Waldhauser, 2005. Implications for prediction and hazard assessment from the 2004 Parkfield earthquake. Nature, 437, 969-974, doi: 10.1038/nature04067.
[3] Dost, B. and H.W. Haak, 2002. A comprehensive description of the KNMI seismological instrumentation. KNMI technical report TR-245, 60 pp.
[4] Dost, B. and H.W. Haak, 2006. Comparing Waveforms by Digitization and Simulation of Waveforms for Four Parkfield Earthquakes Observed in Station DBN, The Netherlands. Bull. Seismol. Soc. Am., 96, S50-S55, doi: 10.1785/0120050813.
[5] Langbein, J., R. Borcherdt, D. Dreger, J. Fletcher, J.L. Hardebeck, M. Hellweg, C. Ji, M. Johnston, J.R. Murray, R. Nadeau, M.J. Rymer and J.A. Treiman, 2005. Preliminary report on the 28 September 2004, M 6.0 Parkfield, California Earthquake. Seismol. Res. Lett., 76, 10-26.
[6] Michelini, A., B. De Simoni, A. Amato and E. Boschi, 2005. Collecting, digitizing, and distributing historical seismological data. EOS Trans. AGU, 86, 261, 266.