Scientists Develop a New Method for Detecting Overlooked Weak Earthquakes
Stanford scientists have developed an algorithm that can detect earthquakes previously overlooked because they were too weak. These quakes are recorded in enormous ground-motion measurement databases but are never flagged as actual earthquakes. Referred to as Fingerprint And Similarity Thresholding, the new technique is regarded as a major advance in the detection of small earthquakes, or microquakes.
The new technique, abbreviated FAST, detects small temblors that traditional methods dismiss as too weak to count as earthquakes. One trend in seismology over the past decade is the use of waveform similarity to search for weak quakes: seismologists employ a technique called template matching, which compares an earthquake’s seismic wave pattern against wave signatures logged in a database. Template matching, however, is slow; it takes a long time to produce results and also requires, in advance, a clear idea of the signal being sought.
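Template matching can be sketched in a few lines. This is an illustrative toy, not the seismologists' production code: a known waveform (the "template") slides along a made-up continuous record, and every offset whose normalized cross-correlation exceeds a threshold is flagged as a likely repeat of the same quake.

```python
# Toy sketch of waveform template matching (illustrative data and thresholds).
import math

def normalized_cross_correlation(template, window):
    """Zero-mean normalized cross-correlation of two equal-length signals."""
    n = len(template)
    mt = sum(template) / n
    mw = sum(window) / n
    num = sum((t - mt) * (w - mw) for t, w in zip(template, window))
    den = math.sqrt(sum((t - mt) ** 2 for t in template) *
                    sum((w - mw) ** 2 for w in window))
    return num / den if den else 0.0

def template_match(template, continuous, threshold=0.8):
    """Return sample offsets where the template correlates above threshold."""
    n = len(template)
    return [i for i in range(len(continuous) - n + 1)
            if normalized_cross_correlation(template, continuous[i:i + n]) >= threshold]

# The template reappears, scaled down, inside an otherwise quiet record.
template = [0.0, 1.0, -1.0, 0.5, 0.0]
continuous = [0.0] * 10 + [x * 0.3 for x in template] + [0.0] * 10
print(template_match(template, continuous))  # -> [10]
```

Note that the weaker, scaled-down repeat still correlates perfectly, which is why template matching finds small quakes at all; the cost is that every template must be slid across the entire record.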
FAST circumvents both drawbacks. It takes all the recorded data from a seismic station and cuts the continuous signal into segments, each only a few seconds long. The segments are then compressed into compact representations referred to as “fingerprints” so they can be processed more quickly, and the fingerprints are sorted into groups according to their similarity.
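A heavily simplified sketch of that front end follows. The published method builds fingerprints from spectrogram images, wavelet coefficients, and hashing; this toy version, on made-up data, just cuts the record into overlapping windows and binarizes each one by which chunks carry above-average energy.

```python
# Toy sketch of FAST's front end: window the continuous signal, then
# compress each window into a compact binary "fingerprint".
def windows(signal, width, step):
    """Slice the continuous signal into overlapping fixed-width windows,
    tagged with their start offset."""
    return [(i, signal[i:i + width])
            for i in range(0, len(signal) - width + 1, step)]

def fingerprint(window, bits=4):
    """Compress a window into `bits` booleans: does each chunk carry more
    average energy (mean absolute amplitude) than the window overall?"""
    chunk = len(window) // bits
    overall = sum(abs(x) for x in window) / len(window)
    return tuple(
        sum(abs(x) for x in window[j * chunk:(j + 1) * chunk]) / chunk > overall
        for j in range(bits))

# Quiet background with the same short burst occurring twice.
signal = ([0.0] * 32 + [3.0, -3.0] * 8 + [0.0] * 48
          + [3.0, -3.0] * 8 + [0.0] * 32)
for start, w in windows(signal, width=32, step=16):
    print(start, fingerprint(w))
# Windows covering the two bursts yield identical fingerprints.
```

The point of the compression is that two windows containing the same repeating event collapse to the same tiny fingerprint, so they can later be matched without ever comparing raw waveforms.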
Details of the new technique are presented in an article published in the journal Science Advances.
The process FAST performs is comparable to how music-identification services like Shazam work; in fact, most reports about FAST highlight this Shazam-like operation. Gregory Beroza of the Stanford University Department of Geophysics acknowledges that FAST was inspired by Shazam, whose sophistication as a song-finding service impressed him a few years ago. The similarity between songs and earthquakes may not be immediately apparent, but in essence FAST searches for pairs of fingerprints that share common features and maps them back to the time windows they came from.
Shazam works by analyzing an audio sample and searching a large database (of around 11 million songs) for a matching acoustic fingerprint. The fingerprint is derived from a time-frequency graph, or spectrogram, of the audio and compared against the stored fingerprints to find the matching song.
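A spectrogram is simply the short-time frequency content of a signal: slice it into frames, take a discrete Fourier transform of each frame, and keep the magnitudes. Services like Shazam fingerprint peaks in that picture; the stdlib sketch below only illustrates the underlying representation, on a synthetic tone.

```python
# Minimal spectrogram sketch: DFT magnitudes of non-overlapping frames.
import cmath

def dft_magnitudes(frame):
    """Magnitude of each non-redundant DFT bin of one frame."""
    n = len(frame)
    return [abs(sum(x * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t, x in enumerate(frame)))
            for k in range(n // 2)]

def spectrogram(signal, frame=8):
    """One row of bin magnitudes per non-overlapping frame."""
    return [dft_magnitudes(signal[i:i + frame])
            for i in range(0, len(signal) - frame + 1, frame)]

# A pure tone shows up as a single bright bin in every frame.
tone = [1.0, 0.0, -1.0, 0.0] * 4           # cos(pi*t/2), sampled
for row in spectrogram(tone):
    print([round(m, 2) for m in row])      # energy concentrated in bin 2
```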
Earthquakes that occur along the same fault produce similar fingerprints, regardless of their magnitude or reported intensity, and regardless of whether they struck a decade ago or only recently. It’s much like songs: a song can be sung at different volumes and with slight variations, but Shazam will still find it as long as it has the right fingerprint.
The Need to Detect Microquakes
So why detect microquakes at all? Will they help in predicting future earthquakes? Unfortunately, earthquake prediction remains a remote possibility. FAST, however, can make it easier to estimate the frequency and likelihood of larger earthquakes and to pinpoint the areas where they are likely to occur. Microquakes, unlike the dreaded megaquakes, pose no threat to people or structures, but monitoring them is useful for analyzing seismic activity and for making rough forecasts about major earthquakes.
The Need for Greater Efficiency in Handling Earthquake Data
In addition to the need to detect microquakes to improve the accuracy of seismological analysis and prediction, there is a need for new systems and techniques that can handle huge amounts of data. Over the years, seismology has accumulated archives so large that they overwhelm current processing algorithms. Observational seismology depends on identifying seismic events in continuous data, so it needs something like FAST to keep up.
What makes FAST efficient with large amounts of data is its sorting process, much like grouping similar documents in a filing cabinet so they can be found easily later on. Instead of comparing a fingerprint against virtually every fingerprint stored in the database, FAST compares it only against those in the same group. According to Gregory Beroza, the Wayne Loel Professor at Stanford and one of the researchers involved in the project, FAST performs searches about 3,000 times faster than conventional techniques.
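The filing-cabinet idea can be illustrated with a toy index. FAST actually uses locality-sensitive hashing so that near-identical fingerprints also land in the same group; this stdlib sketch, over made-up window data, buckets exact fingerprints only, which is enough to show why matching pairs fall out without an all-pairs comparison.

```python
# Toy "filing cabinet": bucket fingerprints, then compare only within buckets.
from collections import defaultdict

# Hypothetical database: window start time -> binary fingerprint.
database = {
    0:  (False, False, False, False),
    16: (False, False, True,  True),
    32: (True,  True,  False, False),
    80: (False, False, True,  True),
    96: (True,  True,  False, False),
}

# Index: fingerprint -> list of window start times that produced it.
buckets = defaultdict(list)
for start, fp in database.items():
    buckets[fp].append(start)

# Matching pairs come straight out of the buckets; windows in different
# buckets are never compared at all.
pairs = [(a, b) for starts in buckets.values()
         for i, a in enumerate(starts) for b in starts[i + 1:]]
print(pairs)  # -> [(16, 80), (32, 96)]
```

With n windows, the all-pairs approach costs on the order of n² comparisons, while bucketing keeps the work proportional to the bucket sizes, which is where the claimed speedup comes from.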