10:45 〜 12:15
[SSS05-P07] Better Hazard Through Less Precision
Many assumptions are necessary when constructing source models for seismic hazard assessment. These assumptions are often difficult to assess quantitatively because large earthquakes occur relatively slowly and data are therefore scarce. In this study we aim to provide quantitative information that can be used to better constrain key assumptions in catalog-based earthquake-source modeling in the future. Catalog-based models are often used either to complement fault-based models or to constrain the future earthquake occurrence rate entirely. Broadly speaking, there are two main types of such models, which make different assumptions about the ability of earthquake catalogs to constrain future hazard: uniform-rate models, which assume catalogs constrain only coarse, region-wide rates, and smoothed-seismicity models, which assume catalogs contain relatively precise spatial forecasting information. Smoothed-seismicity models have generally performed very well when tested over short time frames in high-seismicity regions, but the limits of their usefulness as earthquake rates decrease have not been well tested. In this study we test how earthquake rates, including potential earthquake clustering, affect whether smoothed-seismicity models truly provide better information for hazard assessment than uniform-rate models. We do so using catalogs from high-seismicity regions, under the assumptions that these catalogs are fast-forwarded representations of catalogs from lower-seismicity regions and that the basic principles of earthquake interaction are similar in both. A necessary part of this is to quantify the variability in earthquake rate from one equivalent time period to another. Our results suggest that, particularly for low-seismicity regions, spatially random models are often a better choice for hazard modeling; that the variability in earthquake rate goes far beyond what is expected under the assumption that earthquake occurrence rates are Poisson; and that declustering does not solve this problem.
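To make the contrast between the two forecast types concrete, the sketch below builds both on a 2-D grid: a uniform-rate forecast that spreads the observed count evenly over all cells, and a Gaussian-kernel smoothed-seismicity forecast centred on each epicentre. This is a minimal illustration, not the study's implementation; the catalog, grid extent, and smoothing bandwidth are hypothetical placeholders, and real smoothed-seismicity models differ in kernel choice and normalisation.

```python
import numpy as np

def uniform_forecast(n_events, grid_shape):
    """Spread the observed event count evenly over every grid cell."""
    n_cells = grid_shape[0] * grid_shape[1]
    return np.full(grid_shape, n_events / n_cells)

def smoothed_forecast(lons, lats, grid_lon, grid_lat, bandwidth=0.5):
    """Sum an isotropic Gaussian kernel centred on each epicentre."""
    glon, glat = np.meshgrid(grid_lon, grid_lat)
    rate = np.zeros_like(glon)
    for x, y in zip(lons, lats):
        d2 = (glon - x) ** 2 + (glat - y) ** 2
        rate += np.exp(-d2 / (2.0 * bandwidth ** 2))
    # Divide by the kernel's continuous-space mass so values are densities.
    return rate / (2.0 * np.pi * bandwidth ** 2)

# Hypothetical catalog: 200 epicentres clustered around (138E, 36N).
rng = np.random.default_rng(0)
lons = 138.0 + 0.3 * rng.standard_normal(200)
lats = 36.0 + 0.3 * rng.standard_normal(200)
grid_lon = np.linspace(136.0, 140.0, 40)
grid_lat = np.linspace(34.0, 38.0, 40)

u = uniform_forecast(len(lons), (40, 40))
s = smoothed_forecast(lons, lats, grid_lon, grid_lat)
```

The question the abstract poses is, in these terms, whether the spatial detail in `s` carries real predictive information for sparse catalogs, or whether the flat forecast `u` is the safer choice.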
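The rate-variability claim can likewise be made concrete with a standard dispersion check: count events in equal-length time windows and compare the variance-to-mean ratio of those counts with the value of 1 expected for a homogeneous Poisson process. The sketch below is an assumed illustration of that idea, with a hypothetical window length and synthetic event times; a ratio well above 1 indicates the super-Poissonian variability the results describe.

```python
import numpy as np

def dispersion_ratio(event_times, t_end, window):
    """Variance/mean of per-window counts: ~1 for Poisson, >1 for clustering."""
    edges = np.arange(0.0, t_end + window, window)
    counts, _ = np.histogram(event_times, bins=edges)
    return counts.var(ddof=1) / counts.mean()

rng = np.random.default_rng(1)
# Poisson-like background plus a burst of aftershock-like events at t = 42.
background = rng.uniform(0.0, 100.0, size=300)
cluster = 42.0 + rng.exponential(0.5, size=150)
times = np.concatenate([background, cluster])

print(dispersion_ratio(background, 100.0, 1.0))  # near 1: consistent with Poisson
print(dispersion_ratio(times, 100.0, 1.0))       # well above 1: overdispersed
```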