Decentralized Implicit Differentiation
Abstract
The ability to differentiate through optimization problems has unlocked numerous applications, from optimization-based layers in machine learning models to complex design problems formulated as bilevel programs. It has been shown that exploiting problem structure can yield significant computational gains for optimization and, in some cases, enable distributed computation. One should expect that this structure can be similarly exploited for gradient computation. In this work, we discuss a decentralized framework for computing gradients of constraint-coupled optimization problems. First, we show that this framework results in significant computational gains, especially for large systems, and provide sufficient conditions for its validity. Second, we leverage exponential decay of sensitivities in graph-structured problems toward building a fully distributed algorithm with convergence guarantees. Finally, we use the methodology to rigorously estimate marginal emissions rates in power systems models. Specifically, we demonstrate how the distributed scheme allows for accurate and efficient estimation of these important emissions metrics on large dynamic power system models.
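To make the core idea of differentiating through an optimization problem concrete, the sketch below is a minimal, hypothetical illustration (not the paper's decentralized scheme): it differentiates the solution of an equality-constrained quadratic program with respect to its parameters by applying the implicit function theorem to the KKT system. The toy problem and all names (`Q`, `A`, `C`, `solve_kkt`, `solution_jacobian`) are assumptions introduced here for illustration only.

```python
import numpy as np

# Hypothetical toy problem (not the paper's model): an equality-constrained QP
#   minimize   0.5 * x^T Q x - c(theta)^T x
#   subject to A x = b
# whose KKT conditions form the linear system
#   [Q A^T; A 0] [x; nu] = [c(theta); b],
# so the solution map x*(theta) is defined implicitly by that system.

rng = np.random.default_rng(0)
n, m, p = 6, 2, 3                       # variables, constraints, parameters

Q = np.eye(n)                           # positive definite objective Hessian
A = rng.standard_normal((m, n))         # full-row-rank constraint matrix
b = rng.standard_normal(m)
C = rng.standard_normal((n, p))         # c(theta) = C @ theta (linear, for simplicity)

def kkt_matrix():
    """Assemble the KKT coefficient matrix [Q A^T; A 0]."""
    return np.block([[Q, A.T], [A, np.zeros((m, m))]])

def solve_kkt(theta):
    """Solve the KKT system for the primal-dual pair (x*, nu*)."""
    rhs = np.concatenate([C @ theta, b])
    z = np.linalg.solve(kkt_matrix(), rhs)
    return z[:n], z[n:]

def solution_jacobian():
    """Implicit differentiation: dx*/dtheta from the KKT system.

    Differentiating  K [x; nu] = [C theta; b]  in theta gives
    K [dx/dtheta; dnu/dtheta] = [C; 0], i.e. one linear solve per parameter.
    """
    rhs = np.vstack([C, np.zeros((m, p))])
    dz = np.linalg.solve(kkt_matrix(), rhs)
    return dz[:n, :]                    # rows: entries of x, cols: entries of theta

theta = rng.standard_normal(p)
x_star, _ = solve_kkt(theta)
J = solution_jacobian()

# Finite-difference check of one Jacobian column.
eps = 1e-6
e0 = np.zeros(p); e0[0] = eps
x_pert, _ = solve_kkt(theta + e0)
assert np.allclose((x_pert - x_star) / eps, J[:, 0], atol=1e-4)
```

In this centralized sketch, a single linear solve against the KKT matrix yields the full sensitivity Jacobian; the paper's setting, by contrast, concerns performing such computations in a decentralized fashion when coupling constraints link many subsystems.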
- Publication:
- arXiv e-prints
- Pub Date:
- March 2024
- DOI:
- 10.48550/arXiv.2403.01260
- arXiv:
- arXiv:2403.01260
- Bibcode:
- 2024arXiv240301260F
- Keywords:
- Mathematics - Optimization and Control;
- Electrical Engineering and Systems Science - Systems and Control