JpGU-AGU Joint Meeting 2017

Presentation information

[EE] Poster

H (Human Geosciences) » H-DS Disaster geosciences

[H-DS12] [EE] Tsunami disaster mitigation

Thu. May 25, 2017 3:30 PM - 4:45 PM Poster Hall (International Exhibition Hall HALL7)

[HDS12-P01] From Observatory Messages to Initial Tsunami Bulletin Parameters: Does the Central Limit Theorem still apply?

*Victor Sardina1, Stuart Weinstein1 (1.Pacific Tsunami Warning Center)

Keywords: tsunami bulletins, central limit theorem, source parameters accuracy, observatory messages, magnitude residuals, tsunami message delays

As part of its daily operations, the Pacific Tsunami Warning Center (PTWC) in Honolulu, Hawaii, routinely analyzes most earthquakes of magnitude 5.5 or larger occurring around the globe. Although not officially required to do so, the PTWC scientists on duty will usually issue an observatory (obs) message containing the first set of preliminary source parameters for these events. If the earthquake under analysis crosses the magnitude 6.5 threshold, however, the operational protocol requires the issuance of at least a tsunami information bulletin. For many years, scientists at the PTWC assumed that the ubiquitous central limit theorem guaranteed that including a larger number of seismic stations in the initial analyses would automatically improve the quality of the source parameters, in particular by yielding a more accurate hypocenter location and Mwp magnitude estimate. In this study we assess the validity of these assumptions and their impact on message delays based on the actual message data and statistics.

We matched 577 observatory messages issued by the PTWC between 2003 and 2016 with the corresponding official tsunami message products that followed them. We then computed the corresponding epicentral offsets, magnitude residuals, and message latencies against the source parameters listed in the International Seismological Centre (ISC-GEM), Global Centroid Moment Tensor (GCMT) Project, and National Earthquake Information Center (NEIC) online catalogs.

Analysis of these statistics reveals that 53% of the reported magnitudes did not change despite up to 20 additional minutes of processing time after issuance of the observatory message. Paradoxically, for 17% of the dataset the median magnitude residual increases from zero in the obs messages to 0.2 magnitude units in the matching bulletins that followed.
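The magnitude-residual statistic used above can be sketched as follows. All magnitudes in this snippet are made-up illustrative values, not data from the study; the "catalog" column stands in for a reference magnitude such as GCMT Mw:

```python
# Sketch of the residual computation: for each matched event, the
# magnitude residual is the reported magnitude (obs message or bulletin)
# minus the reference catalog magnitude. Values below are illustrative.

from statistics import median

def magnitude_residuals(reported, reference):
    """Per-event residuals: reported magnitude minus catalog magnitude."""
    return [round(r, 2) for r in (a - b for a, b in zip(reported, reference))]

# Hypothetical matched magnitudes for five events
obs_mags      = [6.6, 7.1, 6.8, 7.5, 6.9]   # observatory messages
bulletin_mags = [6.8, 7.1, 6.9, 7.5, 7.0]   # initial tsunami bulletins
catalog_mags  = [6.6, 7.0, 6.8, 7.4, 6.9]   # reference catalog (e.g., GCMT Mw)

obs_res  = magnitude_residuals(obs_mags, catalog_mags)
bull_res = magnitude_residuals(bulletin_mags, catalog_mags)

# Median absolute residual for each message type
print(median(map(abs, obs_res)), median(map(abs, bull_res)))  # → 0.0 0.1
```

In this toy example the extra processing time behind the bulletins buys no accuracy, mirroring the pattern the study reports for most events.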
For the remaining 30% of the events, the initial magnitude estimates do improve, with the median magnitude residual dropping from 0.3 magnitude units in the obs messages to 0.1 in the corresponding bulletins. These results indicate that for the majority (70%) of the earthquakes analyzed by the PTWC over this period, the quality of the preliminary earthquake parameters does not benefit from the additional message delays. Moreover, the statistics reveal that from 2003 to 2016 the initial source parameters included in the obs messages most often had an accuracy matching or exceeding that of the parameters in the initial tsunami messages that followed them.

Such results suggest that within this operational context the central limit theorem has limited applicability, most likely because of the rather short analysis times and limited data availability typical of the initial earthquake source characterizations conducted by the PTWC scientists for tsunami warning purposes. Nevertheless, additional message delays do seem justified when dealing with earthquakes characterized by complex ruptures or very large magnitudes. For the majority of the earthquakes processed at the PTWC, however, the results do not justify additional time spent adding more seismic stations to the initial hypocenter location analyses, or manually reviewing individual magnitude estimates, before issuing the first official message product. We conclude that, as often as not, additional processing time amounts to a waste of otherwise precious warning time, a loss that matters most in the near field.
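The central-limit-theorem expectation the study tests, that a network magnitude averaged over n station estimates should scatter like sigma/sqrt(n), can be illustrated with a toy simulation. The per-station scatter of 0.3 magnitude units and the station counts below are assumptions for illustration, not values from the study:

```python
# Toy illustration of the CLT expectation: averaging more single-station
# magnitude estimates shrinks the scatter of the network magnitude like
# sigma / sqrt(n). Noise level and station counts are assumed.

import random
from statistics import mean, pstdev

random.seed(0)
TRUE_MW = 7.0
SIGMA = 0.3          # assumed per-station scatter, magnitude units

def network_magnitude(n_stations):
    """Average n noisy single-station magnitude estimates."""
    return mean(random.gauss(TRUE_MW, SIGMA) for _ in range(n_stations))

for n in (5, 20, 80):
    estimates = [network_magnitude(n) for _ in range(2000)]
    # empirical scatter vs. the sigma/sqrt(n) prediction
    print(n, round(pstdev(estimates), 3), round(SIGMA / n**0.5, 3))
```

The simulated scatter does shrink with station count, which is the theoretical benefit of waiting for more data; the study's point is that in practice the short analysis windows and limited station availability of real-time operations often keep this benefit from materializing.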