Can you trust interpolations? A look at combining multi-viewpoint data from three spacecraft
Abstract
We are homogenizing 25 years of full-disk extreme ultraviolet (EUV) images, from 1996 to the present, and making them machine-learning (ML) ready and accessible via the cloud by combining historically restored EUV imaging from the STEREO and SOHO eras with present-day datasets. The data synthesis will provide 360-degree maps of the Sun, and the combined cadences will allow better use of less frequently sampled EUV instruments alongside higher-cadence sets. This dataset will enable many science investigations that benefit from multi-viewpoint observations, including studies of the long-term evolution of active regions, long-range interactions (e.g., sympathetic flaring), total solar irradiance, and more.
Our work requires creating homogeneous data from source instruments with different resolutions, cadences, and slightly different wavelengths, as well as handling cross-calibration. In spatial resampling, maintaining photometric accuracy while also preserving feature evolution across a mix of 2048x2048, 1024x1024, and 512x512 images involves tradeoffs. Differences in cadence likewise require experimentation to yield cross-comparable results: when are data frames temporally 'close enough', and are time-interpolated frames valid for quiescent and/or active-region studies? We also solicit input on which machine-learning-suitable problems our dataset should target, including those that benefit from multi-viewpoint observations.
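As an illustration of the two operations discussed above, the sketch below shows a flux-conserving rebin (e.g., 2048x2048 to 1024x1024) and a linear time interpolation between two frames. This is a minimal example assuming plain NumPy arrays of calibrated counts; it is not the project's actual pipeline, and the function names, the block-sum rebinning, and the linear time weighting are illustrative assumptions only.

```python
# Minimal sketch (not the authors' pipeline) of two operations the abstract raises:
# flux-conserving downsampling of a full-disk EUV image, and linear time
# interpolation between two frames. NumPy arrays and the specific factors are
# assumptions for illustration.
import numpy as np

def downsample_flux_conserving(img: np.ndarray, factor: int) -> np.ndarray:
    """Rebin a 2D image by an integer factor, summing pixels within each block
    so that total counts (photometry) are preserved, e.g. 2048x2048 -> 1024x1024."""
    ny, nx = img.shape
    assert ny % factor == 0 and nx % factor == 0, "factor must divide the image size"
    return img.reshape(ny // factor, factor, nx // factor, factor).sum(axis=(1, 3))

def interpolate_frames(frame_a: np.ndarray, t_a: float,
                       frame_b: np.ndarray, t_b: float,
                       t: float) -> np.ndarray:
    """Linearly interpolate pixel intensities between two co-aligned frames
    taken at times t_a < t_b to an intermediate time t (same time units)."""
    w = (t - t_a) / (t_b - t_a)
    return (1.0 - w) * frame_a + w * frame_b

# Example usage with synthetic data:
img = np.random.poisson(100.0, size=(2048, 2048)).astype(float)
small = downsample_flux_conserving(img, 2)            # shape (1024, 1024)
print(img.sum(), small.sum())                         # totals agree

f0 = np.random.poisson(100.0, size=(512, 512)).astype(float)
f1 = np.random.poisson(100.0, size=(512, 512)).astype(float)
mid = interpolate_frames(f0, 0.0, f1, 720.0, 360.0)   # halfway between frames 12 min apart
```

Summing rather than averaging within blocks keeps total counts fixed, which is one way to state the photometric-accuracy requirement; whether simple linear interpolation in time is adequate is precisely the validity question the abstract poses for quiescent versus active-region studies.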
- Publication: AGU Fall Meeting Abstracts
- Pub Date: December 2022
- Bibcode: 2022AGUFMSH45D2372V