Journal of Northeastern University (Natural Science) ›› 2023, Vol. 44 ›› Issue (11): 1631-1638. DOI: 10.12068/j.issn.1005-3026.2023.11.016

• Resources and Civil Engineering •

Shield Load Prediction Method Based on Deep Learning with a Fused Multi-attention Mechanism

CHEN Cheng1, SHI Pei-xin1, WANG Zhan-sheng2, JIA Peng-jiao1   

  1. School of Rail Transportation, Soochow University, Suzhou 215000, China; 2. Suzhou Rail Transit Group Co., Ltd., Suzhou 215004, China
  • Published: 2023-12-05
  • Corresponding author: CHEN Cheng
  • About the authors: CHEN Cheng (b. 1993), male, from Yancheng, Jiangsu; Ph.D. candidate at Soochow University. SHI Pei-xin (b. 1975), male, from Liyang, Jiangsu; professor and doctoral supervisor at Soochow University.
  • Supported by:
    National Natural Science Foundation of China (52278405).

Shield Load Prediction Method Based on Deep Learning with Multiattention Mechanism

CHEN Cheng1, SHI Pei-xin1, WANG Zhan-sheng2, JIA Peng-jiao1   

  1. School of Rail Transportation, Soochow University, Suzhou 215000, China; 2. Suzhou Rail Transit Group Co., Ltd., Suzhou 215004, China
  • Published:2023-12-05
  • Contact: SHI Pei-xin
  • About the authors: CHEN Cheng (b. 1993), male, from Yancheng, Jiangsu; Ph.D. candidate at Soochow University. SHI Pei-xin (b. 1975), male, from Liyang, Jiangsu; professor and doctoral supervisor at Soochow University.
  • Supported by:
    National Natural Science Foundation of China (52278405).

Abstract: Shield load is a key performance indicator of a shield machine, and accurate load prediction is important for ensuring safe, efficient tunneling and the stability of the surrounding environment. Given the limited accuracy of traditional prediction methods, this study takes the high-dimensional and time-series characteristics of the data as its starting point and proposes a hybrid model combining a convolutional neural network, a bidirectional long short-term memory network, and an attention mechanism (CNN-BiLSTM-Multiattention, CBM) to predict shield load accurately. The model not only extracts the high-dimensional and time-series features of the data but also highlights the importance of high-dimensional features and key time-step information. Experiments show that, compared with four existing models, the proposed model outperforms them on three evaluation metrics, achieving prediction accuracies of 94.2% for thrust and 96.2% for torque.

Key words: deep learning; attention mechanism; time series feature; high dimension; load prediction

Abstract: Shield load is the main performance indicator of a shield machine, and accurate load prediction is essential for ensuring the safe, efficient operation of the shield and the stability of the surrounding environment. Recognizing the limited accuracy of traditional prediction methods, this paper proposes a hybrid model (CBM) that combines a convolutional neural network (CNN), a bidirectional long short-term memory network (BiLSTM), and an attention mechanism to predict shield load accurately from the high-dimensional and time-series characteristics of the data. The proposed model not only extracts the high-dimensional and time-series features of the data but also highlights the importance of high-dimensional features and key time-node information. Experimental results show that, compared with existing models, the proposed model achieves higher prediction performance, with prediction accuracies of 94.2% for thrust and 96.2% for torque.
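The CNN-BiLSTM-attention pipeline described in the abstract (convolutional feature extraction, bidirectional temporal modeling, attention-weighted pooling, regression head) can be sketched in PyTorch as below. Layer sizes, kernel width, the additive-attention form, and the two-output head (thrust, torque) are illustrative assumptions, not the paper's exact configuration.

```python
import torch
import torch.nn as nn

class CBM(nn.Module):
    """Sketch of a CNN-BiLSTM-attention hybrid for sequence regression.

    Hyperparameters are illustrative assumptions, not the paper's settings.
    """
    def __init__(self, n_features, conv_channels=32, lstm_hidden=64, n_outputs=2):
        super().__init__()
        # 1-D convolution over the time axis extracts local high-dimensional features
        self.conv = nn.Sequential(
            nn.Conv1d(n_features, conv_channels, kernel_size=3, padding=1),
            nn.ReLU(),
        )
        # bidirectional LSTM captures forward and backward temporal dependencies
        self.bilstm = nn.LSTM(conv_channels, lstm_hidden,
                              batch_first=True, bidirectional=True)
        # a linear layer scores each time step; softmax turns scores into weights
        self.attn = nn.Linear(2 * lstm_hidden, 1)
        # regression head predicts the load targets, e.g. thrust and torque
        self.head = nn.Linear(2 * lstm_hidden, n_outputs)

    def forward(self, x):                                 # x: (batch, time, features)
        h = self.conv(x.transpose(1, 2)).transpose(1, 2)  # (batch, time, channels)
        h, _ = self.bilstm(h)                             # (batch, time, 2*hidden)
        w = torch.softmax(self.attn(h), dim=1)            # (batch, time, 1) weights
        ctx = (w * h).sum(dim=1)                          # attention-pooled context
        return self.head(ctx)                             # (batch, n_outputs)

model = CBM(n_features=10)
y = model(torch.randn(4, 20, 10))  # 4 samples, 20 time steps, 10 features
print(y.shape)                     # torch.Size([4, 2])
```

The attention weights make each prediction a learned weighted sum over time steps, which is how the model can emphasize key time-node information rather than treating all steps equally.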

Key words: deep learning; attention mechanism; time series feature; high-dimension; load prediction
