Verne Global

12 February 2018

HPC’s role in seismological study and advanced earthquake analysis

Written by Spencer Lamb

Spencer is Verne Global's Director of Research and heads up our work with European education, research and life science organisations. He also has a keen interest in HPC and machine learning.

Researchers have typically taken an empirical approach to earthquake study, using data collected about past events to predict the effects of a future earthquake. As high performance computing (HPC) becomes more prevalent, though, these empirical methods are making way for a new paradigm of earthquake analysis based on high-granularity models, which can provide more accurate insight into the causes and effects of an earthquake, and at speeds far beyond what was possible just a few years ago.

One of the most recent, high-profile applications of numerical seismology took place at the Ludwig-Maximilians-Universität Munich (LMU) and Technical University of Munich (TUM), where researchers used the entire capacity of the 3-petaflop SuperMUC system at the Leibniz Supercomputing Centre (LRZ) to better probe one of the largest earthquakes in recent years, the devastating Sumatra-Andaman Earthquake of 2004.

To accomplish this, the researchers built a computational model of the earthquake’s “subduction zone”, the region where plates of the Earth’s crust collide deep below the surface, and used it to simulate the earthquake and the resulting tsunami together. This allowed them to better understand the unpredictable and often disproportionate relationship between an earthquake’s magnitude and the size of the tsunami it creates.
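As a rough illustration of how such a coupled simulation can be structured (this is not the LMU/TUM team’s actual code, which relies on far more sophisticated solvers), the sketch below shows a one-way coupling in Python: a hypothetical rupture source gradually uplifts the seafloor, and a heavily simplified shallow-water solver propagates the resulting tsunami wave.

```python
import numpy as np

# Toy one-way coupling: a simplified "rupture" source uplifts the seafloor,
# and a linear shallow-water solver propagates the resulting tsunami.
# Every number and equation here is drastically simplified and illustrative.

g = 9.81           # gravity (m/s^2)
depth = 4000.0     # uniform ocean depth (m)
dx = 5000.0        # grid spacing (m)
n = 400            # 1-D grid points (real models are 3-D and far larger)
dt = 0.5 * dx / np.sqrt(g * depth)   # CFL-limited time step (s)

eta = np.zeros(n)  # sea-surface elevation (m)
u = np.zeros(n)    # depth-averaged velocity (m/s)
x = np.arange(n)

def seafloor_uplift(step, rupture_steps=50):
    """Hypothetical rupture source: gradual uplift near the grid centre."""
    if step >= rupture_steps:
        return np.zeros(n)
    return 0.02 * np.exp(-((x - n // 2) ** 2) / 200.0)  # metres per step

for step in range(500):
    # One-way coupling: the rupture model feeds uplift into the water column.
    eta += seafloor_uplift(step)

    # Semi-implicit update of the linearised shallow-water equations.
    u -= dt * g * np.gradient(eta, dx)
    eta -= dt * depth * np.gradient(u, dx)

print(f"peak wave height after {500 * dt:.0f} s: {eta.max():.2f} m")
```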

Simulating the subduction zone accurately, and accounting for the shallow angle at which the tectonic plates collided, requires a very large multi-physics model, the largest ever created for this purpose. In this case, the researchers stretched a three-dimensional mesh consisting of 200 million elements and more than 100 billion degrees of freedom over an area extending from India to Thailand. The simulation ran for over nine minutes and used over 50 trillion operations in total, making it not only the largest but also the longest-running earthquake simulation ever performed.
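To get a feel for where a figure like 100 billion degrees of freedom comes from, here is a back-of-envelope estimate in Python. The per-element numbers (nine elastic quantities and a fifth-order polynomial basis, typical of high-order discontinuous Galerkin solvers) are assumptions for illustration, not the published configuration:

```python
# Back-of-envelope estimate of the mesh's degrees of freedom.
# The per-element figures below are illustrative assumptions, not the
# published configuration of the LMU/TUM simulation.

elements = 200_000_000    # tetrahedral elements in the mesh (from the article)
quantities = 9            # elastic wave quantities carried per point (assumed)
polynomial_order = 5      # assumed order of the spatial basis

# Basis functions for a 3-D polynomial basis of the given order:
basis_functions = ((polynomial_order + 1) * (polynomial_order + 2)
                   * (polynomial_order + 3)) // 6

dof = elements * quantities * basis_functions
print(f"basis functions per element: {basis_functions}")   # 56
print(f"estimated degrees of freedom: {dof:.2e}")          # ~1.0e+11
```

With those assumed values, 200 million elements work out to roughly 1.0 × 10^11 unknowns, in line with the figure quoted above.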

In the United States, a team of researchers from the Department of Energy’s Lawrence Berkeley National Laboratory in Berkeley, California is looking to the coming era of exascale HPC as a means of mitigating the threat of earthquakes, a threat that looms particularly large over the San Francisco Bay Area. To do this they have adapted the SW4 geophysics code, which simulates seismic wave propagation, to build the first end-to-end simulation of a high-frequency earthquake and its impact on local infrastructure. High-frequency earthquakes, which generate ground motion in the range of 2 to 10 hertz and are known to cause more damage to smaller structures such as houses and low-rise buildings, are not as easily studied as low-frequency earthquakes, whose ground motions play out over much longer periods.

The model developed by the Berkeley team has already simulated the effects of a magnitude 6.5 earthquake at 3 hertz, along with its effects on buildings within a 100-square-kilometre area. As the HPC systems supporting the research approach the exascale level of computational power the model was designed for, its accuracy will increase, helping to protect the region’s many historic low-rise buildings, and billions of dollars in property, from the next inevitable earthquake in the area.
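A rough scaling argument shows why pushing from a few hertz towards 10 hertz demands exascale machines: resolving shorter wavelengths requires a proportionally finer grid in all three dimensions, plus smaller time steps, so the cost grows roughly with the fourth power of frequency. The Python sketch below uses assumed, representative values (domain size, minimum wave speed, points per wavelength); it is not SW4’s actual configuration:

```python
# Illustrative scaling of grid size with the highest resolved frequency.
# Domain size, wave speed and points-per-wavelength are representative
# assumptions, not SW4's actual parameters.

def grid_points(f_max_hz, domain_km=100.0, min_velocity_mps=500.0,
                points_per_wavelength=8):
    """Estimate the 3-D grid points needed to resolve waves up to f_max_hz."""
    wavelength_m = min_velocity_mps / f_max_hz        # shortest wavelength
    spacing_m = wavelength_m / points_per_wavelength  # required grid spacing
    points_per_side = domain_km * 1000.0 / spacing_m
    return points_per_side ** 3

for f in (2, 3, 5, 10):
    print(f"{f:>2} Hz: ~{grid_points(f):.1e} grid points")
```

In this toy estimate, going from 2 hertz to 10 hertz multiplies the grid-point count by a factor of 125 before the shorter time step is even taken into account.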

HPC is also finding novel uses within seismological study outside the realm of numerical modelling. The MyShake project, led by Qingkai Kong at the University of California, Berkeley, has taken a creative approach to earthquake detection. Kong, a native of a seismically active region in central China, developed MyShake to take advantage of the now ubiquitous mobile device in order to “crowdsource” seismic activity detection and send timely warnings to app users within a detection zone. According to the paper that outlines the concept, the MyShake software, which currently has 250,000 downloads, is already sensitive enough to detect magnitude 5 earthquakes at a distance of 10 km using just the sensors on an average mobile phone. The app uses the data it collects, along with HPC resources provided by Berkeley’s Savio computing cluster running TensorFlow and Spark, to fine-tune its detection algorithm and better differentiate between seismic activity and ordinary human activity.
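To give a flavour of the classification problem at the heart of this approach, the Python sketch below trains a tiny TensorFlow model to separate “earthquake-like” from “everyday-motion” accelerometer windows. The features, synthetic data and network architecture are invented for illustration; MyShake’s published classifier uses its own feature set, training data and model:

```python
import numpy as np
import tensorflow as tf

# Toy stand-in for MyShake-style classification: separate "earthquake-like"
# from "human-activity" accelerometer windows using a few summary features.
# The features and data below are synthetic and purely illustrative.

rng = np.random.default_rng(0)
n = 2000

# Hypothetical features per window: peak amplitude, dominant frequency (Hz),
# and a measure of how sustained the shaking is.
quake = np.column_stack([
    rng.normal(0.8, 0.2, n),   # stronger, lower-frequency, sustained motion
    rng.normal(3.0, 1.0, n),
    rng.normal(0.7, 0.1, n),
])
human = np.column_stack([
    rng.normal(0.3, 0.2, n),   # weaker, higher-frequency, bursty motion
    rng.normal(8.0, 2.0, n),
    rng.normal(0.3, 0.1, n),
])

X = np.vstack([quake, human]).astype("float32")
y = np.concatenate([np.ones(n), np.zeros(n)]).astype("float32")

model = tf.keras.Sequential([
    tf.keras.Input(shape=(3,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=5, batch_size=64, validation_split=0.2, verbose=0)

print("accuracy on the synthetic set:", model.evaluate(X, y, verbose=0)[1])
```

In practice the hard part is not the network itself but gathering labelled examples of real shaking versus walking, driving and phone handling, which is where the crowdsourced data and the Savio cluster come in.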

When further developed, the MyShake technology could help shut down vulnerable infrastructure before an earthquake causes damage, give individuals forewarning to seek shelter, and help defray the cost of seismic stations, which are often prohibitively expensive to build and maintain.

Reliable earthquake prediction is still many years off, if it is possible at all. But the sustained efforts of the geological community, working hand in hand with the HPC community, continue to yield steady progress in our understanding of earthquakes and how best to prepare for and protect ourselves against them. As HPC continues to give researchers insight into the physics at work below the Earth’s surface, we can expect advances in both fields that make living on top of it safer for people in seismically active regions.
