Explainable Deep Neural Networks for Seismic Reflectivity Inversion
Abstract
Reflectivity inversion, or reflectivity model building, is an important inverse problem in seismic imaging. Assuming a piecewise-constant impedance profile, which gives rise to a convolutional data model, we present a novel deep neural network architecture, namely, the Nonuniform Sparse Proximal Average Network (NuSPAN), to solve the problem within the framework of model-based prior learning (Mache et al., 2021a, 2021b). The architecture is inspired by sparse recovery algorithms. Greedy and iterative sparse-recovery techniques have been employed for this task, with the latter typically posing it as an l1-norm regularized least-squares problem. Although the l1 norm is a widely used sparsity-enforcing regularizer, it suffers from estimation bias (Candes et al., 2008). Recently, neural networks have been deployed to solve the reflectivity inversion problem; for example, the feedforward network of Kim and Nakata (2018) outperformed conventional techniques in support recovery but showed suboptimal amplitude recovery. We develop a data-driven nonuniform sparse regularization based on a composite prior constructed from a convex combination of weighted convex and nonconvex penalties. Learning the prior from seismic data, instead of fixing it in advance, enables accurate estimation of the sparse reflectivity. NuSPAN is a model-based prior-learning network constructed within the paradigm of deep unfolding (Gregor and LeCun, 2010), and it combines the advantages of iterative and data-driven techniques. Deep unfolding also gives rise to interpretable architectures, unlike ad hoc networks. We demonstrate the efficacy of NuSPAN for amplitude and support recovery on both synthetic and simulated data (the Marmousi2 model) and show that its accuracy exceeds that of state-of-the-art techniques. On the Marmousi2 model, NuSPAN achieves 600x faster inference than FISTA (Beck and Teboulle, 2009), the next-best technique. Such a speedup is attractive when handling large volumes of data.
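To illustrate the deep-unfolding idea behind architectures of this kind: with the convolutional model y = Wr + n, where W is the convolution matrix of the source wavelet and r the sparse reflectivity, the l1-regularized problem min_r ||y - Wr||^2 + lambda*||r||_1 can be solved by ISTA, and each ISTA iteration can be unrolled into a network layer with learned parameters. The PyTorch sketch below is a minimal, hypothetical example of such unrolling with a learned soft-threshold (the l1 proximal step); it is not the authors' NuSPAN code, which instead learns a weighted proximal average of convex and nonconvex penalties. All class and variable names are illustrative assumptions.

```python
# Minimal deep-unfolding sketch (NOT the authors' NuSPAN implementation):
# each layer mimics one ISTA iteration
#   r_{k+1} = prox( r_k - step * W^T (W r_k - y) ),
# with prox taken here as soft-thresholding, the l1 proximal operator.
import torch
import torch.nn as nn

class UnfoldedISTALayer(nn.Module):
    """One unrolled ISTA iteration with learnable step size and threshold."""
    def __init__(self):
        super().__init__()
        self.step = nn.Parameter(torch.tensor(0.1))     # gradient step (1/L), learned
        self.thresh = nn.Parameter(torch.tensor(0.01))  # threshold (lambda/L), learned

    def forward(self, r, y, W):
        # Gradient step on the data-fidelity term ||y - W r||^2
        z = r - self.step * (W.t() @ (W @ r - y))
        # Soft-thresholding: proximal operator of the l1 norm
        return torch.sign(z) * torch.clamp(z.abs() - self.thresh, min=0.0)

class UnfoldedISTANet(nn.Module):
    """K unrolled iterations, trainable end-to-end on (y, r) pairs."""
    def __init__(self, K=10):
        super().__init__()
        self.layers = nn.ModuleList([UnfoldedISTALayer() for _ in range(K)])

    def forward(self, y, W):
        r = torch.zeros(W.shape[1], 1)
        for layer in self.layers:
            r = layer(r, y, W)
        return r

# Toy usage: y = W r + noise, with W standing in for a wavelet convolution matrix.
n = 64
W = torch.randn(n, n) / n**0.5
r_true = torch.zeros(n, 1)
r_true[[5, 20, 41], 0] = torch.tensor([1.0, -0.7, 0.4])  # sparse reflectivity spikes
y = W @ r_true + 0.01 * torch.randn(n, 1)
net = UnfoldedISTANet(K=15)
r_hat = net(y, W)  # untrained forward pass; training would fit step/threshold per layer
```

In NuSPAN, the soft-thresholding step above would be replaced by a learned combination of proximal operators of several convex and nonconvex penalties, which is what yields the nonuniform, data-adaptive sparse regularization described in the abstract.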
- Publication: AGU Fall Meeting Abstracts
- Pub Date: December 2021
- Bibcode: 2021AGUFM.S15F0312M