5:15 PM - 6:30 PM
[HCG27-P02] Model and “Unexpected”
Keywords: model, unexpected, earthquake science
Since the Great East Japan Earthquake disaster, I have served on several committees reexamining earthquake countermeasures, including those for the Nankai Trough earthquake. Under the slogan of "No More Unexpected," we were asked to discuss models of the largest possible earthquakes, and various models were proposed. Ground motion and tsunami were computed from these models to estimate damage. The resulting forecast of extremely large damage, once announced, shook society. Since then, experts in neighboring fields and the media have repeatedly cited this estimate to support their own arguments, so that only these numbers have become widespread. I suspect some people are disgusted with this situation.
What, then, is a model in the first place? Earth science uses a wide variety of models, from conceptual qualitative models to precise numerical models with many degrees of freedom and analog models built from real materials. Just as the sandpile model, composed of nothing but addition and subtraction, produces a power law, even a highly abstracted model can reveal the essence of a phenomenon. Analog models are built at laboratory scale to probe the mechanisms of phenomena that cannot be reproduced at full scale. Models are extremely useful for understanding phenomena, because a third party running them with the same settings obtains similar results.
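The sandpile model mentioned above really is just addition and subtraction: grains are added one at a time, and any cell holding four or more grains topples, giving one grain to each neighbor. A minimal sketch in the Bak–Tang–Wiesenbeld spirit follows (the grid size, grain count, and seed are arbitrary illustrative choices, not taken from the text):

```python
import random

def run_sandpile(size=20, grains=5000, seed=0):
    """Minimal 2D sandpile: drop grains at random cells; a cell with
    4 or more grains topples, sending one grain to each of its four
    neighbours (grains falling off the edge are lost). Returns the
    avalanche size (number of topplings) triggered by each grain."""
    rng = random.Random(seed)
    grid = [[0] * size for _ in range(size)]
    sizes = []
    for _ in range(grains):
        x, y = rng.randrange(size), rng.randrange(size)
        grid[y][x] += 1
        topples = 0
        unstable = [(x, y)] if grid[y][x] >= 4 else []
        while unstable:
            cx, cy = unstable.pop()
            if grid[cy][cx] < 4:
                continue  # already relaxed by an earlier pop
            grid[cy][cx] -= 4
            topples += 1
            for nx, ny in ((cx+1, cy), (cx-1, cy), (cx, cy+1), (cx, cy-1)):
                if 0 <= nx < size and 0 <= ny < size:
                    grid[ny][nx] += 1
                    if grid[ny][nx] >= 4:
                        unstable.append((nx, ny))
        sizes.append(topples)
    return sizes

sizes = run_sandpile()
# Many tiny avalanches and a few huge ones: the heavy-tailed size
# distribution characteristic of self-organized criticality.
```

Once the pile has self-organized to its critical state, small avalanches vastly outnumber large ones, which is exactly the power-law behavior the abstract alludes to.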
On the other hand, society counts on the predictive power of models. Models of atmospheric phenomena have long been used to forecast weather and climate, and information based on these forecasts is now part of our daily lives. Following this success, models of earthquake generation and related disasters are used for long-term probabilistic forecasting, strong ground motion prediction, and tsunami forecasting. Based on their results, countermeasures are drawn up, and building codes are set for critical structures such as nuclear power plants and bridges.
However, every model rests on assumptions in which complicating factors are omitted. Its range of application should therefore be limited, yet don't some discussions unconsciously step beyond it? For example, the dislocation model widely used in earthquake research is itself an approximate solution; the assumption of infinitesimal deformation breaks down in the close vicinity of the fault. Long-term probability is computed as a point process in which only earthquakes above a certain size are extracted from the time series of earthquake occurrence. In predicting strong ground motion and tsunami, analyses assume a homogeneous half-space or a stratified structure, and the fault is reduced to an extremely simple rectangle. Real faults and structures are very complicated, but these complexities are stripped away. The impact of the stripped-away elements is not assessed; it is treated as an uncertainty that is difficult to quantify. Perhaps the "unexpected" lurks there. We cannot rule the "unexpected" out.
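The point-process reduction described here can be made concrete with a toy calculation: threshold a catalog by magnitude, take the mean recurrence interval of the surviving events, and convert it to a 30-year probability under a stationary Poisson assumption. The catalog below is synthetic and purely illustrative, and real long-term evaluations typically use renewal models (e.g. Brownian passage time) rather than this simple Poisson sketch:

```python
import math

# Synthetic catalog of (year, magnitude) pairs -- illustrative
# values only, not a real earthquake record.
catalog = [(1900, 6.8), (1910, 7.9), (1935, 7.6), (1950, 6.5),
           (1962, 8.1), (1980, 7.7), (2004, 7.9)]

# Step 1: reduce the full time series to a point process by keeping
# only events at or above a threshold magnitude.
threshold = 7.5
events = [year for year, mag in catalog if mag >= threshold]

# Step 2: mean recurrence interval from the inter-event times.
intervals = [b - a for a, b in zip(events, events[1:])]
mu = sum(intervals) / len(intervals)  # mean interval in years

# Step 3: probability of at least one such event in the next
# 30 years, assuming a stationary Poisson process.
p30 = 1 - math.exp(-30 / mu)
```

Notice how much is discarded in Step 1: every event below the threshold, all magnitude information above it, and any clustering structure. That is precisely the kind of stripping-away the paragraph above warns about.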
Only researchers can criticize, from a professional standpoint, the validity of the models on which damage estimates are based. Why not speak up, rather than merely feel bitter?