Structural changes in nonlocal denoising models arising through bi-level parameter learning
Abstract
We introduce a unified framework based on bi-level optimization schemes to deal with parameter learning in the context of image processing. The goal is to identify the optimal regularizer within a family depending on a parameter in a general topological space. Our focus lies on the situation with non-compact parameter domains, which is, for example, relevant when the commonly used box constraints are disposed of. To overcome this lack of compactness, we propose a natural extension of the upper-level functional to the closure of the parameter domain via Gamma-convergence, which captures possible structural changes in the reconstruction model at the edge of the domain. Under two main assumptions, namely, Mosco-convergence of the regularizers and uniqueness of minimizers of the lower-level problem, we prove that the extension coincides with the relaxation, thus admitting minimizers that relate to the parameter optimization problem of interest. We apply our abstract framework to investigate a quartet of practically relevant models in image denoising, all featuring nonlocality. The associated families of regularizers exhibit qualitatively different parameter dependence, describing a weight factor, an amount of nonlocality, an integrability exponent, and a fractional order, respectively. After the asymptotic analysis that determines the relaxation in each of the four settings, we finally establish theoretical conditions on the data that guarantee structural stability of the models and give examples of when stability is lost.
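The bi-level structure described in the abstract can be sketched in generic notation; the symbols below (parameter space $P$, upper-level loss $J$, regularizer $R_p$, datum $f$) are illustrative placeholders and not taken from the paper itself:

```latex
% Hedged sketch of a bi-level parameter-learning problem in denoising.
% Upper level: select the parameter p in a (possibly non-compact) space P
% by the quality J of the resulting reconstruction u_p; lower level:
% u_p is the denoised image produced by the regularizer R_p.
\min_{p \in P} \; J(u_p)
\quad \text{subject to} \quad
u_p \in \operatorname*{arg\,min}_{u} \; \tfrac{1}{2}\,\|u - f\|_{L^2}^{2} + R_p(u),
```

When $P$ is not compact, a minimizing sequence of parameters may escape to the boundary of $P$; the abstract's extension via Gamma-convergence assigns a limiting (possibly structurally different) model to such boundary points.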
- Publication: arXiv e-prints
- Pub Date: September 2022
- DOI: 10.48550/arXiv.2209.06256
- arXiv: arXiv:2209.06256
- Bibcode: 2022arXiv220906256D
- Keywords: Mathematics - Analysis of PDEs