Mesh Conflation of Oblique Photogrammetric Models Using Virtual Cameras and Truncated Signed Distance Field
Abstract
Conflating/stitching 2.5D raster digital surface models (DSMs) into a larger one is a common practice in geoscience applications; however, conflating full-3D mesh models, such as those from oblique photogrammetry, is extremely challenging. In this letter, we propose a novel approach to address this challenge by conflating multiple full-3D oblique photogrammetric models into a single, seamless mesh for high-resolution site modeling. Given two or more individually collected and created photogrammetric meshes, we first create a virtual camera field (with a panoramic field of view) to incubate virtual spaces represented by a Truncated Signed Distance Field (TSDF), an implicit volumetric representation well suited to linear 3D fusion; we then adaptively leverage the truncation bound of the meshes in the TSDF to conflate them into a single, accurate, full-3D site model. Using drone-based 3D meshes, we show that our approach significantly improves upon traditional methods for model conflation, opening new possibilities for creating extremely large and accurate full-3D mesh models in support of geoscience and environmental applications.
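The abstract notes that the TSDF is "friendly for linear 3D fusion". As a point of reference only, the sketch below shows the standard running weighted-average TSDF fusion rule (in the style of Curless and Levoy), not the paper's specific adaptive use of the truncation bound; the function name `fuse_tsdf` and the per-voxel weight grids are illustrative assumptions.

```python
import numpy as np

def fuse_tsdf(tsdf, weights, new_tsdf, new_weights):
    """Standard running weighted-average TSDF fusion (Curless-Levoy style).

    tsdf, weights         : accumulated voxel grids of truncated signed
                            distances and their per-voxel weights
    new_tsdf, new_weights : the contribution of one input mesh, rendered
                            into the same voxel grid (e.g. via virtual views)
    Returns the updated (tsdf, weights) grids.
    """
    w_sum = weights + new_weights
    # Linear fusion: weighted average of signed distances where either
    # grid has observations; keep the old value where neither does.
    fused = np.where(
        w_sum > 0,
        (weights * tsdf + new_weights * new_tsdf) / np.maximum(w_sum, 1e-9),
        tsdf,
    )
    return fused, w_sum

# Toy usage: fuse two overlapping models rasterized into one voxel grid
# (random values stand in for real truncated signed distances).
grid_shape = (64, 64, 64)
tsdf = np.zeros(grid_shape)
weights = np.zeros(grid_shape)
for model_tsdf, model_w in [
    (np.random.uniform(-1, 1, grid_shape), np.ones(grid_shape)),
    (np.random.uniform(-1, 1, grid_shape), np.ones(grid_shape)),
]:
    tsdf, weights = fuse_tsdf(tsdf, weights, model_tsdf, model_w)
```

After fusion, a surface mesh would typically be extracted from the zero level set of the fused TSDF (e.g. with marching cubes).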
- Publication: IEEE Geoscience and Remote Sensing Letters
- Pub Date: 2023
- DOI: 10.1109/LGRS.2023.3298321
- arXiv: arXiv:2308.12139
- Bibcode: 2023IGRSL..2098321S
- Keywords: Computer Science - Computer Vision and Pattern Recognition
- E-Print: 5 Figures