Are Megaquakes Clustered?
E.G. Daub, E. Ben-Naim, R.A. Guyer, and P.A. Johnson
We study statistical properties of the number of large earthquakes
over the past century. We analyze the cumulative distribution of the
number of earthquakes with magnitude larger than threshold M in time
interval T, and quantify the statistical significance of these results
by simulating a large number of synthetic random catalogs. We find
that in general, the earthquake record is random in time. This
conclusion holds whether aftershocks are removed or not, except at
magnitudes below M = 7.3. At long time intervals (T = 2-5 years), we
find that statistically significant clustering is present in the
catalog for lower magnitude thresholds (M = 7-7.2). However, this
clustering is due to a large number of earthquakes occurring in the
early part of the 20th century, when magnitudes are less certain.
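The significance test described above can be illustrated with a minimal sketch. The snippet below, a hypothetical implementation in Python (the paper does not publish code, and its exact test statistic may differ), counts the maximum number of events falling in any window of length T and estimates a p-value by comparing against synthetic catalogs in which the same number of events is placed uniformly at random in time. All function names and parameters are illustrative assumptions.

```python
import numpy as np

def max_window_count(times, T):
    """Largest number of events in any sliding window of length T (years)."""
    times = np.sort(times)
    best, j = 0, 0
    # two-pointer sweep over the sorted event times
    for i in range(len(times)):
        while times[i] - times[j] > T:
            j += 1
        best = max(best, i - j + 1)
    return best

def cluster_p_value(event_times, T, span=100.0, n_sims=10000, seed=None):
    """Fraction of synthetic random-in-time catalogs whose densest
    window of length T matches or exceeds the observed one.

    span:   length of the catalog in years (e.g. ~100 for the past century)
    n_sims: number of synthetic catalogs to simulate
    """
    rng = np.random.default_rng(seed)
    observed = max_window_count(np.asarray(event_times, dtype=float), T)
    exceed = 0
    for _ in range(n_sims):
        # synthetic catalog: same event count, uniformly random occurrence times
        synth = rng.uniform(0.0, span, size=len(event_times))
        if max_window_count(synth, T) >= observed:
            exceed += 1
    return exceed / n_sims
```

A small p-value would indicate clustering beyond what a temporally random catalog produces; values near 0.5 or larger are consistent with the abstract's conclusion that the record of large earthquakes is, in general, random in time.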