Research News

Computing power helps the Southern California Earthquake Center gain new insights about earthquakes

August 20, 2015

Earthquakes occur on a massive scale and often originate deep below the surface of the Earth, making them notoriously difficult to predict.

The Southern California Earthquake Center (SCEC) and its lead scientist Thomas Jordan use massive computing power made possible by the National Science Foundation (NSF) to improve the understanding of earthquakes. In doing so, SCEC is helping to provide long-term earthquake forecasts and more accurate hazard assessments.

One SCEC effort in particular, the PressOn project, aims to create more physically realistic, wave-based earthquake simulations using an earthquake model the center developed, called CyberShake, which calculates how earthquake waves ripple through a 3-D model of the ground.

The latest NSF-funded supercomputers, capable of performing quadrillions of calculations every second, make this more accurate approach to studying earthquakes possible.

The Earth’s rigid outer shell is broken into plates that ride on the hot, slowly flowing rock of the mantle below. Most earthquakes result from these plates moving relative to one another, a process called plate tectonics.

The edges of plates are rough. Those edges get stuck on one another while the rest of the plates keep moving, storing up energy, much like a stretched rubber band. When the plate edges finally come unstuck (like letting go of one end of the rubber band), all the pent-up energy is released and the plate jerks into place. Aftershocks happen when the plate overshoots its equilibrium point and continues to readjust over the following days to years.

Three distinct types of wave are generated by an earthquake: primary waves, secondary waves and surface waves. Each has a unique behavior and a distinct signature. The characteristics, timing and damage pattern of these waves differ by distance from the origin of the earthquake and the type of rock or soil they move through.
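As an illustrative aside, not part of the SCEC study: because primary waves outrun secondary waves, the lag between their arrivals at a seismometer gives a rough distance to the quake's origin. A minimal sketch, assuming typical crustal speeds of about 6 km/s for P waves and 3.5 km/s for S waves:

```python
VP_KM_S = 6.0   # assumed typical crustal P-wave speed, km/s
VS_KM_S = 3.5   # assumed typical crustal S-wave speed, km/s

def distance_from_sp_lag(sp_lag_seconds: float) -> float:
    """Epicentral distance (km) implied by the S-minus-P arrival-time lag."""
    # lag = d/vs - d/vp  =>  d = lag / (1/vs - 1/vp)
    return sp_lag_seconds / (1.0 / VS_KM_S - 1.0 / VP_KM_S)

print(distance_from_sp_lag(10.0))  # a 10-second lag implies roughly 84 km
```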

Given detailed information about the geological material in specific areas, physics-based, 3-D wave propagation simulations are able to calculate how earthquake waves will move through the Earth and how strong the ground motions will be when the waves reach the surface.
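At their core, such simulations step the wave equation forward in time on a grid of points representing the Earth. The following is a minimal one-dimensional sketch with illustrative values, not SCEC's actual 3-D code:

```python
import numpy as np

C = 3000.0    # assumed wave speed, m/s
DX = 100.0    # grid spacing, m
DT = 0.02     # time step, s; CFL number C*DT/DX = 0.6 keeps the scheme stable
NX, NT = 401, 300

u_prev = np.zeros(NX)          # displacement at the previous time step
u_curr = np.zeros(NX)          # displacement at the current time step
u_curr[NX // 2] = 1.0          # impulsive "source" at the domain centre

r2 = (C * DT / DX) ** 2
for _ in range(NT):
    u_next = np.zeros(NX)
    # second-order finite differences in space and time (interior points only;
    # the domain edges stay fixed at zero)
    u_next[1:-1] = (2 * u_curr[1:-1] - u_prev[1:-1]
                    + r2 * (u_curr[2:] - 2 * u_curr[1:-1] + u_curr[:-2]))
    u_prev, u_curr = u_curr, u_next

# two pulses now travel outward symmetrically from the source
```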

In 2014, the SCEC team investigated the earthquake potential of the Los Angeles Basin, where the Pacific and North American Plates grind past each other along the San Andreas Fault. In this study, the simulation showed earthquake waves trapped, and reverberating, within the Los Angeles Basin. The resulting ground motions were much stronger than Jordan and his team expected.

“These basins act as essentially big bowls of jelly that shake during earthquakes and therefore very much affect the motion,” Jordan said.

SCEC’s simulations vary in the maximum frequency of the seismic waves they resolve, measured in cycles per second, or hertz. As that frequency increases, so does the potential for damage, and so does the complexity of the simulation. Structures such as buildings and bridges are most vulnerable to damage from seismic waves between 1 and 10 hertz.

The team first simulated individual earthquakes at 4 hertz. They then performed a simulation involving a large ensemble of earthquakes at 1 hertz (simulating many more quakes required lowering the maximum simulated frequency) to calculate a probabilistic seismic hazard model for the Los Angeles area. A seismic hazard model describes the probability that an earthquake will occur in a given geographic area, within a given window of time, with ground motion intensity exceeding a given threshold.
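The "probability of exceedance" in such a hazard model is conventionally derived by treating exceedances as a Poisson process. A minimal sketch of that standard formula, with illustrative numbers rather than figures from SCEC's actual pipeline:

```python
import math

def prob_exceedance(annual_rate: float, years: float) -> float:
    """Probability of at least one ground-motion exceedance in the window,
    assuming exceedances arrive as a Poisson process."""
    return 1.0 - math.exp(-annual_rate * years)

# an annual exceedance rate of 1/475 over 50 years gives roughly the
# "10% in 50 years" hazard level often quoted in building codes
print(round(prob_exceedance(1.0 / 475.0, 50.0), 3))
```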

Starting in April 2015 and continuing over seven weeks, SCEC used the NSF-funded Blue Waters supercomputer at the National Center for Supercomputing Applications and the Department of Energy-funded Titan supercomputer at the Oak Ridge Leadership Computing Facility to calculate the first 1 hertz CyberShake hazard model specific to the Los Angeles Basin. This simulation doubled the maximum simulated frequency of the previous year’s CyberShake seismic hazard model, doubling the resolution of the simulated ground motions.

Even though the number of calculations required increased as the maximum simulated frequency of the earthquake went up, the tremendous computing power of Blue Waters and Titan reduced the time needed for these calculations from months to weeks.
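A rough way to see why the calculations grow so quickly (a standard scaling argument, not figures from the article): doubling the maximum resolved frequency halves the required grid spacing in each of three spatial dimensions and, for numerical stability, halves the time step as well, so the work grows roughly as the fourth power of the frequency ratio.

```python
def cost_factor(freq_ratio: float) -> float:
    """Approximate work multiplier for a 3-D time-domain wave simulation when
    the maximum resolved frequency is scaled by freq_ratio: 2x finer grids in
    each of three spatial dimensions plus 2x more time steps."""
    return freq_ratio ** 4

print(cost_factor(2.0))   # doubling frequency costs ~16x the grid-point updates
```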

Scientists believe seismic hazard analyses need to simulate earthquake frequencies above 10 hertz to realistically capture the full dynamics of a potential event. SCEC’s work is paving the way for those simulations. Physics-based, 3-D earthquake simulations at 10 hertz, once a distant dream, are now on the horizon.

Aaron Dubrow,

(703) 292-4489

Tricia Barker,

(217) 265-8013

  • CyberShake map for 336 sites in the Los Angeles region shows regions of high and low hazard risk.

  • Blue Waters will enable researchers to investigate challenging and heretofore impossible problems.

  • Titan is a supercomputer at Oak Ridge National Laboratory for use in a variety of science projects.


Investigators

Thomas Jordan
Kim Olsen
Yifeng Cui
Jacobo Bielak
Philip Maechling

Related Institutions/Organizations

University of Southern California
University of Illinois at Urbana-Champaign


Illinois
Oak Ridge, Tennessee
Los Angeles, California

Related Programs


Leadership-Class System Acquisition – Creating a Petascale Computing Environment for Science and Engineering

Related Awards

#1450451 SI2-SSI: Community Software for Extreme-Scale Computing in Earthquake System Science
#1148493 SI2-SSI: A Sustainable Community Software Framework for Petascale Earthquake Modeling
#1238993 Sustained-Petascale In Action: Blue Waters Enabling Transformative Science And Engineering

Years Research Conducted

– 2018

Related Agencies
U.S. Geological Survey

Source: NSF News

