Graduated Optimization of Black-Box Functions
Abstract
Motivated by the problem of tuning hyperparameters in machine learning, we present a new approach for gradually and adaptively optimizing an unknown function using estimated gradients. We validate the empirical performance of the proposed idea on both low- and high-dimensional problems. The experimental results demonstrate the advantages of our approach for tuning high-dimensional hyperparameters in machine learning.
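The abstract does not spell out the algorithm, so the following is only a minimal sketch of the general idea it names: graduated optimization of a black-box objective using estimated gradients. The sketch assumes Gaussian smoothing with a fixed, decreasing smoothing schedule and a standard zeroth-order (two-point) gradient estimator; the function names, schedule, and step size are illustrative assumptions, not the authors' method.

```python
import numpy as np

def smoothed_grad_estimate(f, x, sigma, num_samples=20, rng=None):
    """Monte Carlo estimate of the gradient of the Gaussian-smoothed surrogate
    f_sigma(x) = E_u[f(x + sigma * u)], u ~ N(0, I), from black-box evaluations
    of f only (two-point / central-difference zeroth-order estimator)."""
    rng = np.random.default_rng() if rng is None else rng
    d = x.shape[0]
    grad = np.zeros(d)
    for _ in range(num_samples):
        u = rng.standard_normal(d)
        grad += (f(x + sigma * u) - f(x - sigma * u)) / (2.0 * sigma) * u
    return grad / num_samples

def graduated_optimize(f, x0, sigmas=(1.0, 0.3, 0.1, 0.03),
                       steps_per_level=100, lr=0.05, seed=0):
    """Graduated-optimization sketch: run gradient descent on a sequence of
    progressively less-smoothed surrogates of the black-box objective f,
    warm-starting each level from the previous one's solution."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for sigma in sigmas:                 # coarse-to-fine smoothing schedule
        for _ in range(steps_per_level):
            g = smoothed_grad_estimate(f, x, sigma, rng=rng)
            x = x - lr * g               # descend the current smoothed surrogate
    return x

if __name__ == "__main__":
    # Toy multimodal objective standing in for a hyperparameter response surface.
    def objective(x):
        return np.sum(x ** 2) + 0.5 * np.sum(np.sin(5.0 * x))

    x_star = graduated_optimize(objective, x0=np.full(5, 2.0))
    print("approximate minimizer:", x_star, "value:", objective(x_star))
```

The coarse-to-fine schedule is the essence of graduated optimization: large smoothing levels wash out shallow local minima so early iterates move toward the broad basin, while later, less-smoothed levels refine the solution. An adaptive variant, as the abstract suggests, would adjust the smoothing level from observed progress rather than follow a fixed list.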
- Publication: arXiv e-prints
- Pub Date: June 2019
- DOI: 10.48550/arXiv.1906.01279
- arXiv: arXiv:1906.01279
- Bibcode: 2019arXiv190601279S
- Keywords: Computer Science - Machine Learning; Mathematics - Optimization and Control; Statistics - Machine Learning; 90C26; G.1.6
- E-Print: Accepted Workshop Submission for the 6th ICML Workshop on Automated Machine Learning