Recursive Least Squares with Fading Regularization for Finite-Time Convergence without Persistent Excitation
Abstract
This paper extends recursive least squares (RLS) to include a time-varying regularization term, which provides the flexibility to update the least-squares regularization in real time. Existing results with constant regularization imply that the parameter-estimation error dynamics of RLS are globally attractive to zero if and only if the regressor is weakly persistently exciting. This work shows that, by extending classical RLS to include a time-varying (fading) regularization term that converges to zero, the parameter-estimation error dynamics are globally attractive to zero without weakly persistent excitation. Moreover, if the fading regularization term converges to zero in finite time, then the parameter-estimation error also converges to zero in finite time. Finally, we propose rank-1 fading regularization (R1FR) RLS, an algorithm whose fading regularization term converges to zero and which has the same computational complexity as classical RLS. Numerical examples validate the theoretical guarantees and show how R1FR-RLS can protect against over-regularization.
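As a rough illustration of the fading-regularization idea described above (not the paper's R1FR-RLS recursion, which is not reproduced here), the following Python sketch solves the regularized least-squares problem in batch form with a regularization weight that fades linearly to zero in finite time. The function name, the linear fading schedule, and the example data are illustrative assumptions.

```python
import numpy as np


def fading_reg_ls(Phi, y, theta0, R0, k_f):
    """Batch-form least squares with a fading regularization term.

    Illustrative sketch only: the regularization weight fades linearly
    to zero at step k_f (a hypothetical schedule), so later estimates
    rely entirely on the data. Phi is (k, n) with one regressor per row.
    """
    k, n = Phi.shape
    # Fading regularization: R_k -> 0, reaching zero after k_f steps.
    R_k = max(0.0, 1.0 - k / k_f) * R0
    A = Phi.T @ Phi + R_k            # regularized normal matrix
    b = Phi.T @ y + R_k @ theta0     # data term plus pull toward the prior
    return np.linalg.solve(A, b)


# Example: estimate a 2-parameter model from a regressor sequence that is
# not persistently exciting early on (all regressors along one direction
# until the last two steps).
theta_true = np.array([1.0, -2.0])
theta0 = np.zeros(2)                 # prior estimate
R0 = 10.0 * np.eye(2)                # initial regularization weight
Phi = np.array([[1.0, 0.0]] * 8 + [[1.0, 1.0]] * 2)
y = Phi @ theta_true
for k in range(1, len(Phi) + 1):
    est = fading_reg_ls(Phi[:k], y[:k], theta0, R0, k_f=10)
    print(k, est)
```

In this sketch, the early estimates are held near the prior theta0 by the regularization, and the estimate reaches theta_true once the informative regressors arrive and the regularization has faded to zero.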
- Publication: arXiv e-prints
- Pub Date: January 2025
- DOI:
- arXiv: arXiv:2501.04566
- Bibcode: 2025arXiv250104566L
- Keywords: Electrical Engineering and Systems Science - Signal Processing; Electrical Engineering and Systems Science - Systems and Control
- E-Print: Submitted to the 2025 American Control Conference