Title
Multi-Fidelity Automatic Hyper-Parameter Tuning via Transfer Series Expansion
Abstract
Automatic machine learning (AutoML) aims to automatically choose the best configuration for machine learning tasks. However, a single configuration evaluation can be very time-consuming, particularly on learning tasks with large datasets. This limitation usually prevents derivative-free optimization from realizing its full potential in a fine-grained configuration search using many evaluations. To alleviate this limitation, we propose a derivative-free optimization framework for AutoML using multi-fidelity evaluations: many low-fidelity evaluations on small data subsets and very few high-fidelity evaluations on the full dataset. However, the low-fidelity evaluations can be badly biased and need to be corrected at only a very low cost. We therefore propose Transfer Series Expansion (TSE), which learns the low-fidelity correction predictor efficiently by linearly combining a set of base predictors. The base predictors can be obtained cheaply from down-scaled and previously experienced tasks. Experimental results on real-world AutoML problems verify that the proposed framework significantly accelerates derivative-free configuration search by making use of multi-fidelity evaluations.
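The abstract describes correcting biased low-fidelity evaluations with a predictor formed as a linear combination of base predictors. A minimal sketch of that linear-combination idea, assuming a least-squares fit on a few expensive calibration points; the base predictors and data here are synthetic stand-ins, not the paper's actual code:

```python
import numpy as np

# Sketch: approximate the residual high_fidelity(x) - low_fidelity(x)
# by a weighted sum of base predictors phi_i(x). In TSE the base
# predictors would come from down-scaled, previously experienced tasks;
# here they are illustrative functions.

rng = np.random.default_rng(0)

# A few configurations where both fidelities were evaluated.
X = rng.uniform(-1, 1, size=(8, 3))  # 8 configs, 3 hyper-parameters

# Stand-in base predictors (assumed, for illustration only).
base_predictors = [
    lambda x: x[:, 0],
    lambda x: x[:, 1] ** 2,
    lambda x: np.sin(x[:, 2]),
]

# Synthetic residuals between high- and low-fidelity evaluations.
true_w = np.array([0.5, -1.0, 2.0])
Phi = np.column_stack([p(X) for p in base_predictors])
residuals = Phi @ true_w + 0.01 * rng.standard_normal(8)

# Learn the combination weights by least squares on the few
# calibration points -- the cheap part of the correction.
w, *_ = np.linalg.lstsq(Phi, residuals, rcond=None)

def corrected(low_fid, x):
    # Add the predicted residual to a cheap low-fidelity score.
    phi = np.column_stack([p(x) for p in base_predictors])
    return low_fid + phi @ w
```

Because only the combination weights are fitted, very few high-fidelity evaluations are needed, which is the point of reusing base predictors from experienced tasks.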
Year
2019
Venue
Thirty-Third AAAI Conference on Artificial Intelligence / Thirty-First Innovative Applications of Artificial Intelligence Conference / Ninth AAAI Symposium on Educational Advances in Artificial Intelligence
Field
Fidelity, Small data, Hyperparameter, Computer science, Series expansion, Artificial intelligence, Machine learning
DocType
Conference
Citations
1
PageRank
0.34
References
0
Authors
6
Name | Order | Citations | PageRank
Yi-Qi Hu | 1 | 24 | 3.46
Yang Yu | 2 | 4884 | 8.20
Wei-Wei Tu | 3 | 33 | 7.86
Qiang Yang | 4 | 170398 | 75.69
Yuqiang Chen | 5 | 67 | 3.23
Wenyuan Dai | 6 | 11424 | 9.14