9:00 AM - 9:15 AM
[J06-1-03] Correlation between Coulomb Stress Rate Change Imparted by Two Slow Slip Events and Seismic Rate Change in Lower Cook Inlet of the Alaska-Aleutian Subduction Zone
We identified two long-term slow slip events (SSEs) in Lower Cook Inlet, southcentral Alaska (1995.0-2004.8 and 2009.85-2011.81) by inverting GPS site velocities (Li et al. 2016). The earlier SSE (SSE1) lasted at least nine years, with Mw ~7.8 and an average slip rate of ~82 mm/yr. The later SSE (SSE2) occupied nearly the same area as SSE1, within the 40-60 km depth range, and lasted ~2 years, with Mw ~7.2 and an average slip rate of ~91 mm/yr.
To assess whether SSEs can significantly trigger earthquakes outside the slow-slip area, and thereby contribute to earthquake early warning, we resolved the Coulomb stress rate changes on receiver faults in the crust and in the slab using two different definitions of the receiver faults. Our results showed that the significantly higher stress rate during SSE1 was followed by a significant decrease in seismicity rate after SSE1 ended. However, the area in the slab that experienced a significant increase in stress rate due to SSE1 slip did not show a clear pattern of decreasing seismicity rate after SSE1 ended. No area of significantly increased stress rate due to SSE2 was identified in the crust. The slab area that experienced a significant stress rate increase due to SSE2 slip showed a seismicity rate increase right after SSE2 started and a seismicity rate decrease after SSE2 ended.
We also modeled the seismicity in potential triggering areas using the rate/state stress transfer model (Dieterich, 1994). The observed seismicity is well fit by the increase in stress rate during the SSE1 period compared with that of the post-SSE1 period. However, the stress rate due to SSE2 slip was not large enough to explain the cumulative seismicity through time. Explaining the post-SSE2 seismicity rate change requires an additional stress rate contribution, such as a slab pull stress rate that increases after the SSE starts rather than remaining uniform over all time periods.
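The rate/state stress transfer model invoked above can be sketched numerically. The snippet below implements the standard Dieterich (1994) closed-form response of seismicity rate to a step change in stressing rate; the stressing-rate ratio and decay timescale are illustrative assumptions, not values from this study.

```python
import numpy as np

def dieterich_rate(t, sdot_ratio, t_a, r0=1.0):
    """Seismicity rate R(t) after a step in stressing rate at t = 0
    (Dieterich, 1994):
        R(t) = r0 / [ (1/k) + (1 - 1/k) * exp(-t / t_a) ]
    where k = Sdot_new / Sdot_ref is the stressing-rate ratio and
    t_a = a*sigma / Sdot_new is the characteristic decay time.
    """
    k = sdot_ratio
    return r0 / ((1.0 / k) + (1.0 - 1.0 / k) * np.exp(-t / t_a))

# Illustrative values only (not from this abstract): a 5x increase in
# Coulomb stressing rate at SSE onset, with t_a = 0.5 yr.
t = np.linspace(0.0, 5.0, 200)                     # years after onset
R = dieterich_rate(t, sdot_ratio=5.0, t_a=0.5)
# R starts at the background rate r0 and relaxes toward k * r0,
# i.e., the long-term seismicity rate scales with the stressing rate.
```

This captures the qualitative behavior described for SSE2: a seismicity rate increase right after the stressing rate steps up, and (by symmetry, with k < 1) a decrease after it steps back down.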