Deep learning methods for underground deformation time-series prediction

Research output: Chapter in Book/Report/Conference proceeding › Conference article in proceedings › Scientific › peer-review


Abstract

Prediction is a broad concept, so it must be defined specifically for underground deformation time-series data. To address this problem, this paper employs an advanced deep learning model, Bi-LSTM-AM (a bidirectional long short-term memory network with an attention mechanism). The results show its applicability to practical engineering. The proposed model is compared with other basic deep learning models, including long short-term memory (LSTM), Bi-LSTM, gated recurrent units (GRU), and temporal convolutional networks (TCN). These models cover the two most common forms of deep learning for time-series prediction: recurrent neural networks (RNN) and convolutional neural networks (CNN). This research is expected to benefit underground deformation time-series prediction.
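The architecture named in the abstract combines a bidirectional LSTM with an attention layer that weights the hidden states over time before the final prediction. The following is a minimal, generic sketch of that idea in PyTorch; the layer sizes, window length, and additive attention form are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn as nn

class BiLSTMAttention(nn.Module):
    """Generic Bi-LSTM with attention for one-step-ahead time-series
    prediction. Hyperparameters are illustrative, not from the paper."""

    def __init__(self, n_features=1, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden,
                            batch_first=True, bidirectional=True)
        self.attn = nn.Linear(2 * hidden, 1)  # scores each time step
        self.head = nn.Linear(2 * hidden, 1)  # regression output

    def forward(self, x):
        # x: (batch, time, features)
        h, _ = self.lstm(x)                     # (batch, time, 2*hidden)
        w = torch.softmax(self.attn(h), dim=1)  # attention weights over time
        context = (w * h).sum(dim=1)            # weighted sum of hidden states
        return self.head(context)               # (batch, 1)

model = BiLSTMAttention()
x = torch.randn(4, 30, 1)  # 4 windows of 30 past deformation readings
y = model(x)
print(y.shape)
```

The baselines listed in the abstract (LSTM, Bi-LSTM, GRU, TCN) can be compared against this model by swapping the recurrent block while keeping the same sliding-window inputs and regression head.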
Original language: English
Title of host publication: Expanding Underground - Knowledge and Passion to Make a Positive Impact on the World - Proceedings of the ITA-AITES World Tunnel Congress, WTC 2023
Subtitle of host publication: Proceedings of the ITA-AITES World Tunnel Congress 2023 (WTC 2023), 12-18 May 2023, Athens, Greece
Editors: Georgios Anagnostou, Andreas Benardos, Vassilis P. Marinos
Place of publication: London
Publisher: CRC Press
Pages: 2775-2781
Number of pages: 7
Edition: 1st
ISBN (Electronic): 978-1-003-34803-0
DOIs
Publication status: Published - 12 Apr 2023
MoE publication type: A4 Conference publication
Event: World Tunnel Congress - Athens, Greece
Duration: 12 May 2023 - 18 May 2023

Conference

Conference: World Tunnel Congress
Abbreviated title: WTC
Country/Territory: Greece
City: Athens
Period: 12/05/2023 - 18/05/2023

Keywords

  • underground engineering
  • time-series
  • deep learning
  • deformation prediction
  • machine learning
