Detecting extreme events in streaming satellite data
Abstract
Near real-time detection of extreme events such as wildfires, cyclones, or floods in satellite data is becoming crucial for disaster management. Although several Earth-observing satellites provide information about disasters, those in geostationary orbit deliver data at intervals as frequent as every minute, practically a video from space. In this work, we present two frameworks to detect and localize wildfires in satellite videos: 1) an unsupervised framework based on future frame prediction that detects anomalies in the videos, and 2) a transfer learning based approach that detects and localizes wildfires in the videos using Yolov5.
In the first approach, we predict future frames from past frames and compute the difference between each predicted frame and its ground truth to find anomalous events. A generative adversarial network with a U-Net generator is used to generate the future frames. To produce high-quality predictions, a motion (temporal) constraint is introduced alongside gradient and intensity losses. The model is trained on normal videos and tested on abnormal videos (containing wildfires). The peak signal-to-noise ratio (PSNR) between the predicted frame and the ground truth is calculated for every frame; a lower PSNR indicates a larger prediction error and hence a likely anomaly. The PSNR values of all frames in each video are normalized to the range 0 to 1 to form a regularity score, and a frame can then be flagged as anomalous or normal against a chosen threshold. Anomaly-detection performance is evaluated with the ROC curve obtained by varying this threshold on the regularity scores, and the area under the curve (AUC) is used as the summary metric. We obtain an AUC of 0.73 on the satellite videos. The predicted future frames also capture the movement of clouds and smoke at future time steps.

In the second approach, we trained several deep learning models, including Inception, AlexNet, and Yolov5, on a fire-image dataset and used the trained models to detect fire events in the satellite video frames. Yolov5 performed best at detecting and localizing fire in the satellite videos. This is a supervised framework, since the models are trained on a labeled dataset. We will present a complete pipeline to detect and localize multiple extreme events in videos from geostationary satellites.
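As a concrete illustration of the scoring step in the first approach, the minimal sketch below computes the per-frame PSNR between predicted and ground-truth frames, min-max normalizes it within each video to obtain the regularity score, and evaluates detection with ROC AUC. It assumes frames are NumPy arrays scaled to [0, 255]; the function names and the scikit-learn dependency are illustrative choices, not the authors' actual implementation.

```python
# Minimal sketch of the PSNR-based regularity scoring described above.
# Assumes predicted and ground-truth frames are arrays scaled to [0, 255];
# all names here are illustrative, not the authors' actual code.
import numpy as np
from sklearn.metrics import roc_auc_score


def psnr(pred, gt, max_val=255.0):
    """Peak signal-to-noise ratio between a predicted frame and its ground truth."""
    mse = np.mean((pred.astype(np.float64) - gt.astype(np.float64)) ** 2)
    if mse == 0:
        return np.inf
    return 10.0 * np.log10(max_val ** 2 / mse)


def regularity_scores(pred_frames, gt_frames):
    """Per-frame PSNR, min-max normalized to [0, 1] within one video.

    Low scores correspond to poorly predicted (likely anomalous) frames.
    """
    p = np.array([psnr(pf, gf) for pf, gf in zip(pred_frames, gt_frames)])
    return (p - p.min()) / (p.max() - p.min() + 1e-8)


def evaluate(scores, labels):
    """ROC AUC obtained by sweeping the threshold on the regularity scores.

    labels: 1 for anomalous frames, 0 for normal frames.
    roc_auc_score sweeps the threshold implicitly; anomaly score = 1 - regularity.
    """
    return roc_auc_score(labels, 1.0 - scores)
```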
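The second, supervised approach can be sketched as follows: fine-tuned Yolov5 weights (here a hypothetical file fire_yolov5.pt) are loaded through the torch.hub interface published by the ultralytics/yolov5 repository and applied frame by frame to a satellite video clip. The video path, weight file, and confidence threshold are assumptions for illustration, not the authors' pipeline.

```python
# Sketch of running a transfer-learned Yolov5 fire detector on satellite video frames.
import cv2
import torch

# Load custom (fine-tuned) weights into the standard YOLOv5 model; the weight
# path is hypothetical.
model = torch.hub.load("ultralytics/yolov5", "custom", path="fire_yolov5.pt")
model.conf = 0.4  # confidence threshold for reported detections

cap = cv2.VideoCapture("geostationary_clip.mp4")  # hypothetical satellite video
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # YOLOv5 expects RGB input; OpenCV decodes frames as BGR.
    results = model(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    # Each detection row: x_min, y_min, x_max, y_max, confidence, class index.
    for *box, conf, cls in results.xyxy[0].tolist():
        print(f"fire candidate at {box}, confidence {conf:.2f}")
cap.release()
```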
- Publication: AGU Fall Meeting Abstracts
- Pub Date: December 2022
- Bibcode: 2022AGUFMIN56A..06A