the Creative Commons Attribution 4.0 License.
A dynamics-informed deep learning method for future estimation of laboratory stick-slip
Abstract. Modelling fault activity is vital for earthquake monitoring, risk management, and early warning. Studies of laboratory earthquakes are instrumental in modelling natural fault ruptures and in improving our understanding of natural earthquake dynamics. Recently, deep learning methods have proven effective in predicting instantaneous fault stress in laboratory settings and slow slip events on Earth. However, these methods struggle to produce stable predictions of the future state because they fail to capture the complex dynamics of highly nonlinear laboratory fault-slip systems. To address this, we introduce the Hankel Koopman Auto-encoder (HKAE), a novel method inspired by dynamical systems theory. The HKAE models the dynamics of the laboratory fault system and provides a continuous estimate of its future state. Deployed on experiments with different slip behaviours, it shows superior ability to predict shear stress variation within a slip cycle as well as slip activity over longer-term seismic cycles. The HKAE surpasses conventional deep learning methods for time series prediction, achieving better statistical evaluation metrics, such as RMSE and R2, at two prediction horizons. We also find that the HKAE models the slip dynamics better than purely statistical approaches, as evidenced by its more accurate modelling of slip timing and slip-cycle intervals, and by its ability to summarize the quasi-periodic dynamics as an operator learned from a small number of samples, yielding more robust beyond-horizon predictions. The capability of the HKAE to decompose and model complex temporal dynamics highlights its potential for sparsely observed geophysical systems with quasi-periodic characteristics, such as natural fault activity.
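The abstract's core construction, a Hankel (time-delay) embedding on which a linear Koopman-style operator is fit, can be illustrated generically. This is not the authors' HKAE (their repository is linked below); it is a minimal sketch of the standard embedding-plus-linear-propagator idea, with hypothetical function names:

```python
import numpy as np

def hankel_embed(x, delay_dim):
    """Stack time-delayed copies of a 1-D series into a Hankel matrix.

    Each column holds `delay_dim` consecutive samples, so a linear
    (Koopman-style) operator can be fit between successive columns
    in the lifted space.
    """
    x = np.asarray(x, dtype=float)
    n_cols = x.size - delay_dim + 1
    return np.column_stack([x[i:i + n_cols] for i in range(delay_dim)]).T

# Toy quasi-periodic signal standing in for a shear-stress series.
x = np.sin(np.linspace(0, 8 * np.pi, 200))
H = hankel_embed(x, delay_dim=16)
print(H.shape)  # (16, 185)

# Least-squares linear propagator between successive lifted states
# (a DMD-style surrogate for the learned Koopman operator).
A, *_ = np.linalg.lstsq(H[:, :-1].T, H[:, 1:].T, rcond=None)
print(A.shape)  # (16, 16)
```

Rolling the operator `A` forward from the last observed column gives a beyond-horizon forecast in the lifted space; the HKAE replaces the fixed linear lift with a learned autoencoder.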
Status: open (until 23 May 2024)
RC1: 'Comment on gmd-2024-46', Anonymous Referee #1, 25 Apr 2024
Yue et al. developed an algorithm (HKAE) to perform time series forecasting. The algorithm exploits concepts derived from dynamical systems theory and Koopman theory, and it uses an autoencoder architecture to realise the link between the two. They applied the algorithm to laboratory earthquake data, and in particular to shear stress time series.
I think that the idea and the results are interesting. Nonetheless, there are several points that need to be better explained and/or further developed.
I have two major concerns about the manuscript.
1) The first problem that I see comes from your interpretations. You state that your algorithm outperforms the existing ones (e.g., you tested LSTM, TCN and MLP). But on many occasions this is not true. Your Figures 8 and 10 show that, especially for the immediate future, LSTM performs better than HKAE. I recommend that you not oversell the HKAE algorithm.
2) The second problem comes from the pre-processing of the data. Reading the code, I noted that there is a pre-processing step to smooth the data. In the manuscript you do not mention any filtering or smoothing step. The smoothing function that you use is taken from the statsmodels package, and it takes the closest data to perform a local linear regression. The closest data in a 1-dim time series can come from both the past and the future. This means that when you smooth the data you are introducing information from the future. This is a problem if you want to evaluate forecasting performances. You need to clarify how many data from the future are used to smooth the data.
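The referee's leakage concern can be demonstrated with a toy contrast between a symmetric and a causal smoother. This is a generic numpy sketch (not the statsmodels routine used in the paper's code): a two-sided window, like a LOWESS neighbourhood that includes later samples, responds to an event before it happens, while a trailing window does not:

```python
import numpy as np

def centered_smooth(x, k):
    # Symmetric window: each output uses k past AND k future samples,
    # analogous to a two-sided LOWESS neighbourhood -> leaks future data.
    return np.convolve(x, np.ones(2 * k + 1) / (2 * k + 1), mode="same")

def causal_smooth(x, k):
    # Trailing window: each output uses only the current and k past samples.
    out = np.empty_like(x, dtype=float)
    for i in range(len(x)):
        out[i] = x[max(0, i - k):i + 1].mean()
    return out

x = np.zeros(20)
x[10] = 1.0                              # impulse ("slip event") at t = 10
c = centered_smooth(x, k=2)
t = causal_smooth(x, k=2)
print(np.flatnonzero(c)[0], np.flatnonzero(t)[0])  # 8 10
```

The centered smoother becomes nonzero at t = 8, two samples before the event, which is exactly the kind of future information that would contaminate a forecasting benchmark.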
Two other problems, less critical but still important, are the following.
One concerns reproducibility. In order to reproduce the results, it is important that you add a README.txt file to your repository. Furthermore, you should add a requirements.txt file listing the packages that you used. This is good practice, so that readers can create a local virtual environment and reproduce your results. Some comments in your code are not in English, and you should translate them.
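The environment capture the referee asks for is a standard two-step recipe (the environment name `hkae-env` is illustrative):

```shell
# In the environment used for the experiments, record exact versions:
pip freeze > requirements.txt

# A reader can then rebuild the environment locally:
python -m venv hkae-env
source hkae-env/bin/activate
pip install -r requirements.txt
```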
Finally, it is not easy to follow the reasoning and the various steps, mainly because of the overall poor English structure. I tried to write down the corrections myself, but after line 70 I gave up because there were too many corrections to suggest. In the Acknowledgments section you mention that the manuscript was polished with GPT-4. Sometimes the feeling is that entire paragraphs were written automatically, without a proper logical connection to the next part. I highly recommend you ask a native English speaker to review and edit the manuscript.
For detailed comments, please see the attached file.
Model code and software
Hankel Koopman Auto Encoder Enjiang Yue, Mengjiao Qin, Linshu Hu, Sensen Wu, and Zhenhong Du https://zenodo.org/records/10846361
Viewed
HTML | PDF | XML | Total | BibTeX | EndNote
---|---|---|---|---|---
181 | 28 | 18 | 227 | 18 | 12