Mitigation of Spatial Nonstationarity with Vision Transformers
Abstract
Spatial nonstationarity, in which feature statistical distributions vary by location, is ubiquitous in many natural settings: in geological reservoirs due to geomechanical compaction trends, in mineral deposits due to sedimentation processes, in hydrology due to atmosphere-topography interactions, and in metallurgy due to differential cooling. Conventional geostatistical modeling workflows rely on the assumption of stationarity to support geostatistical inference of spatial properties. This assumption is often unrealistic for nonstationary spatial data, which has motivated a variety of nonstationary spatial modeling workflows such as trend and residual decomposition and collocated cosimulation.
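Trend and residual decomposition, one of the conventional workflows mentioned above, can be illustrated with a minimal sketch. The snippet below assumes 2D scattered samples and a low-order polynomial trend; the function name, synthetic data, and polynomial order are illustrative assumptions, not details from the abstract.

```python
import numpy as np

def fit_polynomial_trend(x, y, z, order=1):
    """Fit a low-order polynomial trend z ~ f(x, y) by least squares.

    x, y : 1D arrays of sample coordinates
    z    : 1D array of the observed property at those locations
    Returns the trend coefficients and the fitted trend values.
    """
    # Design matrix for a first-order (planar) trend: [1, x, y]
    terms = [np.ones_like(x), x, y]
    if order >= 2:
        terms += [x * y, x**2, y**2]
    A = np.column_stack(terms)
    coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)
    trend = A @ coeffs
    return coeffs, trend

# Synthetic nonstationary example: a regional trend plus stationary residual
rng = np.random.default_rng(0)
x, y = rng.uniform(0, 100, 500), rng.uniform(0, 100, 500)
z = 0.05 * x - 0.02 * y + rng.normal(0.0, 1.0, 500)

coeffs, trend = fit_polynomial_trend(x, y, z, order=1)
residual = z - trend  # the residual is (approximately) stationary and can be
                      # modeled with conventional geostatistics; the trend is
                      # added back after kriging or simulation
```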
The advent of deep learning technologies has made it possible to model spatial relationships with ease. However, there is no general guidance addressing the mitigation of spatial nonstationarity in geospatial contexts. In this work, we demonstrate the impact of, and then explore the mitigation of, two common types of geostatistical spatial nonstationarity using self-attention (transformer) models. The first type arises when spatial features are larger than the modeling domain; that is, the subsurface properties may not appear stationary locally but would be stationary over a larger domain. The second type is a spatial trend formed by regional spatial processes. Our results show that self-attention models can mitigate the impacts of these two types of spatial nonstationarity on deep learning models, reducing relative errors by at least 10% and 15%, respectively. These results demonstrate the ability of self-attention networks to model large-scale spatial relationships while mitigating spatial nonstationarity. Our work provides guidance and best practices for deep learning applications to nonstationary datasets commonly encountered in natural settings.
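The abstract does not specify the exact network architecture used. As a rough illustration of the kind of self-attention (vision-transformer-style) model described, the following minimal PyTorch sketch patchifies a 2D property grid and applies global self-attention; the class names, patch size, embedding width, depth, and head count are arbitrary assumptions for illustration only.

```python
import torch
import torch.nn as nn

class SpatialSelfAttentionBlock(nn.Module):
    """One pre-norm transformer encoder block over spatial patch tokens."""
    def __init__(self, dim=64, heads=4):
        super().__init__()
        self.norm1 = nn.LayerNorm(dim)
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.norm2 = nn.LayerNorm(dim)
        self.mlp = nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(),
                                 nn.Linear(4 * dim, dim))

    def forward(self, tokens):
        h = self.norm1(tokens)
        attn_out, _ = self.attn(h, h, h)  # every patch attends to every other
        tokens = tokens + attn_out        # patch: long-range spatial context
        tokens = tokens + self.mlp(self.norm2(tokens))
        return tokens

class SpatialViT(nn.Module):
    """Patchify a 2D property grid, add positional embeddings, apply attention."""
    def __init__(self, grid=64, patch=8, dim=64, heads=4, depth=4):
        super().__init__()
        n_patches = (grid // patch) ** 2
        self.embed = nn.Conv2d(1, dim, kernel_size=patch, stride=patch)
        self.pos = nn.Parameter(torch.zeros(1, n_patches, dim))
        self.blocks = nn.ModuleList(
            SpatialSelfAttentionBlock(dim, heads) for _ in range(depth))
        self.head = nn.Linear(dim, 1)  # e.g. a per-patch regression target

    def forward(self, x):  # x: (batch, 1, grid, grid)
        tokens = self.embed(x).flatten(2).transpose(1, 2)  # (batch, n_patches, dim)
        tokens = tokens + self.pos
        for blk in self.blocks:
            tokens = blk(tokens)
        return self.head(tokens)

# Example: a batch of 2 nonstationary 64x64 property grids
model = SpatialViT()
out = model(torch.randn(2, 1, 64, 64))  # -> (2, 64 patch tokens, 1)
```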
- Publication: AGU Fall Meeting Abstracts
- Pub Date: December 2022
- Bibcode: 2022AGUFMIN33A..05L