Explainable Artificial Intelligence for Solar Flare Prediction
Abstract
Solar flares have the potential to disrupt Earth's technology infrastructure and negatively impact the health and safety of space travelers. Consequently, predicting these events is of great interest to the scientific community and a benefit to society as a whole. The complexity of solar flare data, combined with the fact that the physical processes behind flare occurrence are not yet well understood, suggests that the flare prediction problem is a good candidate for the application of machine learning techniques. To that end, machine learning algorithms have already been shown to be effective in forecasting solar flares, but it is often unclear why a particular model has made a given prediction. This lack of transparency not only makes it difficult to better understand what causes solar flares, but also poses challenges for having our predictions trusted. These enigmatic black-box models raise similar concerns in machine learning applications across many different fields, fueling an industry-wide desire for models whose predictions are less mysterious. Enhancing our models with Explainable Artificial Intelligence (XAI), an emerging paradigm in machine learning grounded in the principles of accountability, responsibility, and transparency, has the potential to remove the shroud of uncertainty around our results, deepening our understanding of flares in general and strengthening the reliability of our predictions. Using Local Interpretable Model-Agnostic Explanations (LIME), a Python library for explaining machine learning models, we investigated the integration of XAI with flare prediction models and explored the potential benefits of a transparent model whose decisions come with easily understood explanations.
LIME's highly interpretable diagrams show which features contributed positively or negatively to a particular prediction and quantify the relative importance of each feature in the decision, providing transparency for experts and non-experts alike.
- Publication: AGU Fall Meeting Abstracts
- Pub Date: December 2021
- Bibcode: 2021AGUFMNG45B0576F