A Note on Uncertainty Quantification for Maximum Likelihood Parameters Estimated with Heuristic Based Optimization Algorithms
Abstract
Gradient-based solvers risk convergence to local optima, leading to incorrect researcher inference. Heuristic-based algorithms are able to "break free" of these local optima and eventually converge to the true global optimum. However, because they do not provide the gradient/Hessian needed to approximate the covariance matrix, and because the significantly longer computational time they require for convergence likely precludes resampling procedures for inference, researchers are often unable to quantify uncertainty in the estimates these methods produce. This note presents a simple and relatively fast two-step procedure for estimating the covariance matrix of parameters estimated with these algorithms. The procedure relies on automatic differentiation, a computational means of calculating derivatives that is popular in machine learning applications. A brief empirical example demonstrates the advantages of this procedure relative to bootstrapping and shows that its standard error estimates closely match those that would normally accompany maximum likelihood estimation with a gradient-based algorithm.
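The paper's exact implementation is not reproduced here, but the two-step procedure it describes can be illustrated with a minimal sketch: a heuristic, derivative-free global optimizer (here SciPy's differential evolution) locates the maximum likelihood estimates, and automatic differentiation (here via JAX) then supplies the Hessian of the negative log-likelihood at that optimum, whose inverse approximates the covariance matrix of the estimates. The toy normal model, the bounds, and all names below are illustrative assumptions, not taken from the paper.

```python
import jax
import jax.numpy as jnp
import numpy as np
from scipy.optimize import differential_evolution

# Toy data: draws from a normal distribution (illustrative only).
rng = np.random.default_rng(0)
data = jnp.asarray(rng.normal(loc=2.0, scale=1.5, size=500))

def neg_log_lik(theta, y):
    # theta = (mu, log_sigma); parameterizing the scale on the log
    # scale keeps the search space unconstrained.
    mu, log_sigma = theta
    sigma = jnp.exp(log_sigma)
    return -jnp.sum(jax.scipy.stats.norm.logpdf(y, loc=mu, scale=sigma))

# Step 1: heuristic (derivative-free) global search for the MLE.
result = differential_evolution(
    lambda th: float(neg_log_lik(jnp.asarray(th), data)),
    bounds=[(-10.0, 10.0), (-5.0, 5.0)],
    seed=0,
)
theta_hat = jnp.asarray(result.x)

# Step 2: automatic differentiation of the negative log-likelihood at
# the optimum yields the observed information matrix; its inverse
# approximates the covariance matrix of the estimates, and the square
# root of its diagonal gives standard errors.
info = jax.hessian(neg_log_lik)(theta_hat, data)
cov = jnp.linalg.inv(info)
std_errors = jnp.sqrt(jnp.diag(cov))

print("estimates:      ", theta_hat)
print("standard errors:", std_errors)
```

This sketch leans on the standard asymptotic result that the inverse of the observed information matrix approximates the covariance of the maximum likelihood estimator; the key point from the abstract is that the Hessian comes from automatic differentiation rather than from the optimizer itself.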
- Publication:
- arXiv e-prints
- Pub Date:
- January 2024
- DOI:
- 10.48550/arXiv.2401.07176
- arXiv:
- arXiv:2401.07176
- Bibcode:
- 2024arXiv240107176P
- Keywords:
- Economics - Econometrics