ICESat-2 Noise Filtering Using ConvPoint Neural Network
Abstract
ICESat-2's primary mission is to monitor the elevation of the Earth's ice sheets using a photon-counting lidar. Its secondary objective is to provide information on forest heights, an essential input for estimating the Earth's carbon budget. Recently published results show that, in the presence of dense tropical vegetation, terrain elevation estimation errors grow proportionally with tree height, while top-of-canopy measurements occasionally exhibit outlier values due to atmospheric conditions such as fog or low-lying clouds, effects exacerbated by the sensitivity of the lidar's photon detectors. It is therefore necessary to filter out noise photon events before estimating forest heights and terrain elevations. Here we propose a neural-network algorithm that separates noise photon events from signal events returned by the terrain and tree canopy. The approach leverages both the geometry and the local signal density of overlapping airborne lidar measurements to train a deep neural network to separate signal from noise photon events. A ConvPoint architecture for 3D point clouds is applied to the raw ICESat-2 ATL03 point cloud records, producing reliable and continuous noise-filtering results even in densely vegetated areas. The benefit of the ConvPoint architecture over traditional 2D convolutional neural networks is that it operates directly on the 3D photon data, without conversion to an intermediate representation such as an image. We validate our ICESat-2 terrain and top-of-canopy elevations in dense tropical forests of Mexico, Belize, Guatemala, and Honduras by comparing the neural-network-classified data with high-density airborne lidar data from seven test sites. The results show better consistency and improved terrain and top-of-canopy estimates compared with the current ICESat-2 noise-filtering algorithm delivered in the ATL08 product.
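The abstract itself contains no code; to make the point-based classification idea concrete, below is a minimal PyTorch sketch of a ConvPoint-style continuous point convolution, here approximated by an MLP applied to neighbor offsets, followed by a per-photon signal/noise head. The names (`PointConv`, `PhotonClassifier`), the brute-force k-nearest-neighbor scheme, the single local-density input feature, and all layer sizes are illustrative assumptions, not the authors' implementation.

```python
# Sketch of a ConvPoint-style classifier for ATL03 photon events.
# Hypothetical shapes and features; not the authors' code.
import torch
import torch.nn as nn

class PointConv(nn.Module):
    """Continuous convolution: neighbor features are weighted by an MLP
    applied to their 3D offsets from the center photon, then averaged."""
    def __init__(self, in_ch, out_ch, k=16):
        super().__init__()
        self.k = k
        # Maps a 3D offset to a per-channel spatial weight.
        self.weight_net = nn.Sequential(
            nn.Linear(3, 32), nn.ReLU(), nn.Linear(32, in_ch)
        )
        self.linear = nn.Linear(in_ch, out_ch)

    def forward(self, xyz, feats):
        # xyz: (N, 3) photon coordinates; feats: (N, C) per-photon features.
        # Brute-force kNN for clarity (real pipelines would use a KD-tree).
        d = torch.cdist(xyz, xyz)                     # (N, N) pairwise distances
        idx = d.topk(self.k, largest=False).indices   # (N, k) nearest neighbors
        nbr_xyz = xyz[idx]                            # (N, k, 3)
        nbr_feat = feats[idx]                         # (N, k, C)
        offsets = nbr_xyz - xyz.unsqueeze(1)          # relative positions
        w = self.weight_net(offsets)                  # (N, k, C) spatial weights
        agg = (w * nbr_feat).mean(dim=1)              # (N, C) aggregated features
        return self.linear(agg)                       # (N, out_ch)

class PhotonClassifier(nn.Module):
    """Two stacked point convolutions and a binary signal/noise head."""
    def __init__(self, in_ch=1, k=16):
        super().__init__()
        self.conv1 = PointConv(in_ch, 32, k)
        self.conv2 = PointConv(32, 64, k)
        self.head = nn.Linear(64, 2)  # 0 = noise, 1 = signal

    def forward(self, xyz, feats):
        h = torch.relu(self.conv1(xyz, feats))
        h = torch.relu(self.conv2(xyz, h))
        return self.head(h)

# Example: classify 1024 synthetic photons with one local-density feature.
xyz = torch.rand(1024, 3)
density = torch.rand(1024, 1)
logits = PhotonClassifier(in_ch=1)(xyz, density)  # (1024, 2) class logits
```

In a sketch like this, labels for training would come from the overlapping airborne lidar data described above, with photons near validated ground or canopy surfaces marked as signal; the key design point is that the convolution consumes raw 3D coordinates directly, with no rasterization to an image.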
- Publication: AGU Fall Meeting Abstracts
- Pub Date: December 2022
- Bibcode: 2022AGUFMIN22A..04V