Machine Learning-Detected Signal Predicts Time to Earthquake

Machine-learning research published today in two related papers in Nature Geoscience reports the detection of seismic signals that accurately predict the Cascadia fault’s slow slippage, a type of failure observed to precede large earthquakes in other subduction zones.

Los Alamos National Laboratory researchers applied machine learning to analyze Cascadia data and discovered the megathrust broadcasts a constant tremor, a fingerprint of the fault’s displacement. More importantly, they found a direct parallel between the loudness of the fault’s acoustic signal and its physical changes. Cascadia’s groans, previously discounted as meaningless noise, foretold its fragility.
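In the team’s earlier laboratory work, this kind of analysis amounted to computing statistical features over short windows of the continuous seismic signal and regressing them against fault behavior with tree-based models. The sketch below illustrates that general idea; the synthetic data, the feature set, and the use of scikit-learn’s GradientBoostingRegressor are all illustrative assumptions, not the authors’ actual pipeline or data.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

# Illustrative only: synthetic "continuous tremor" whose amplitude is tied
# to a hidden slip rate, standing in for real Cascadia seismic data.
rng = np.random.default_rng(0)
n_windows, window_len = 2000, 500
slip_rate = np.abs(np.sin(np.linspace(0, 20, n_windows))) + 0.1
windows = [rng.normal(0.0, s, window_len) for s in slip_rate]

def features(w):
    # Simple statistics of the signal window (variance, mean amplitude,
    # a higher moment, and a high percentile), in the spirit of the
    # features used in the lab-quake studies.
    return [
        w.var(),
        np.abs(w).mean(),
        ((w - w.mean()) ** 4).mean(),
        np.percentile(np.abs(w), 90),
    ]

X = np.array([features(w) for w in windows])
y = slip_rate

# Hold out the final windows (no shuffling, to mimic forecasting in time).
X_train, X_test, y_train, y_test = train_test_split(X, y, shuffle=False)
model = GradientBoostingRegressor().fit(X_train, y_train)
print("R^2 on held-out windows:", model.score(X_test, y_test))
```

Because the synthetic signal’s variance is constructed to track the hidden slip rate, the regressor recovers it almost perfectly; the point of the sketch is only to show how a statistical “loudness” feature can carry predictive information that raw waveforms appear to hide.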

“Cascadia’s behavior was buried in the data. Until machine learning revealed precise patterns, we all discarded the continuous signal as noise, but it was full of rich information. We discovered a highly predictable sound pattern that indicates slippage and fault failure,” said Los Alamos scientist Paul Johnson. “We also found a precise link between the fragility of the fault and the signal’s strength, which can help us more accurately predict a megaquake.”

The new papers were authored by Johnson, Bertrand Rouet-Leduc and Claudia Hulbert from the Laboratory’s Earth and Environmental Sciences Division, Christopher Ren from the Laboratory’s Intelligence and Space Research Division and collaborators at Pennsylvania State University.

Read more at DOE/Los Alamos National Laboratory

Photo Credit: Angelo_Giordano via Pixabay