
Simultaneous estimation of basement depth and density contrast by gravity anomaly via multi-task deep learning / Wang, Lin; Florio, Giovanni; Fedi, Maurizio; Xiong, Shengqing; Wang, Wanyin. - In: JOURNAL OF APPLIED GEOPHYSICS. - ISSN 0926-9851. - 240:105781(2025). [10.1016/j.jappgeo.2025.105781]

Simultaneous estimation of basement depth and density contrast by gravity anomaly via multi-task deep learning

Florio, Giovanni; Fedi, Maurizio
2025

Abstract

We propose a multi-task deep learning (DL) method to simultaneously estimate the basement depth and the density contrast from gravity field anomalies. The method is based on a specially designed hybrid architecture comprising a convolutional neural network (CNN) branch and a multilayer perceptron (MLP) branch. This hybrid architecture fully leverages the benefits of multi-task DL, enabling simultaneous estimation of basement depth and density contrast from a gravity map given as input. In the training phase, useful statistical prior information is incorporated from a global basin dataset. Our idea is that learning from such a dataset helps restrict the solution to a limited domain, thus leading to a reasonable estimate of the basement depth and the density contrast. We use a Deep Convolutional Generative Adversarial Network (DCGAN) to generate high-quality basement-depth maps based on a global catalog of basins. The preliminary real basement maps originate from re-interpolations and nonstandard coordinate transformations of the sediment data inside the global basins, and additional basement samples are generated by the trained DCGAN, thereby forming our dataset. We apply the method to a synthetic dataset and to two real cases, demonstrating its feasibility and effectiveness. The results show good performance of our DL architecture not only for the estimated basement models but also for the density contrast. The method is thus a valid candidate tool for practical applications, especially when constraint information is lacking in complex real cases.
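The data flow of a two-branch multi-task network of the kind the abstract describes can be sketched minimally in numpy. Everything below is an illustrative assumption, not the authors' architecture: layer counts, kernel sizes, the 16x16 input grid, and the helper names (`conv2d_valid`, `hybrid_forward`) are hypothetical; the point is only to show one shared feature map feeding a CNN head (depth map) and an MLP head (scalar density contrast).

```python
import numpy as np

def conv2d_valid(x, k):
    """Naive single-channel 2-D 'valid' convolution, for illustration only."""
    H, W = x.shape
    kh, kw = k.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

def hybrid_forward(gravity_map, rng):
    """Toy forward pass of a hypothetical hybrid multi-task network:
    CNN branch -> basement-depth map; MLP branch -> scalar density contrast."""
    # Shared convolutional feature extraction (a single layer for brevity).
    feat = np.maximum(conv2d_valid(gravity_map, rng.standard_normal((3, 3)) * 0.1), 0.0)  # ReLU

    # CNN branch: a second convolution produces the depth-map output.
    depth_map = conv2d_valid(feat, rng.standard_normal((3, 3)) * 0.1)

    # MLP branch: flatten the shared features, two dense layers -> one scalar.
    v = feat.ravel()
    h = np.maximum(v @ (rng.standard_normal((v.size, 16)) * 0.01), 0.0)
    density_contrast = float(h @ (rng.standard_normal(16) * 0.01))
    return depth_map, density_contrast

rng = np.random.default_rng(0)
g = rng.standard_normal((16, 16))  # stand-in for a 16x16 gravity anomaly map
d, rho = hybrid_forward(g, rng)
print(d.shape)  # (12, 12): two 3x3 'valid' convolutions shrink 16x16 by 4
```

In a real implementation both heads would be trained jointly with a weighted sum of a per-pixel depth loss and a scalar density-contrast loss, which is what makes the setup multi-task.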
Files for this item:
No files are associated with this item.

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11588/1019818
Citations
  • PMC: n/a
  • Scopus: 0
  • Web of Science: 0