Abstract |
---|
Automatic machine learning (AutoML) aims to automatically choose the best configuration for a machine learning task. However, evaluating a single configuration can be very time-consuming, particularly on learning tasks with large datasets. This limitation usually prevents derivative-free optimization from realizing its full power in a fine-grained configuration search that requires many evaluations. To alleviate this limitation, this paper proposes a derivative-free optimization framework for AutoML based on multi-fidelity evaluations: many low-fidelity evaluations on small data subsets and very few high-fidelity evaluations on the full dataset. The low-fidelity evaluations, however, can be badly biased and need to be corrected at very low cost. We therefore propose the Transfer Series Expansion (TSE), which learns the low-fidelity correction predictor efficiently by linearly combining a set of base predictors; the base predictors can be obtained cheaply from down-scaled, previously experienced tasks. Experimental results on real-world AutoML problems verify that the proposed framework significantly accelerates derivative-free configuration search by making use of multi-fidelity evaluations. |
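The core idea in the abstract, learning a correction predictor as a linear combination of base predictors fitted on a handful of high-fidelity evaluations, can be sketched as a small least-squares fit. This is an illustrative sketch of the linear-combination step only, not the paper's algorithm; the function names, the choice of base predictors, and the toy data are all assumptions.

```python
# Hypothetical sketch of the linear-combination step described in the
# abstract: fit weights w so that sum_i w_i * f_i(x) approximates the
# observed low-to-high-fidelity correction on a few configurations x.
# Names and data are illustrative, not the paper's API.

def fit_linear_combination(base_preds, configs, targets):
    """Least-squares weights for combining base predictors.

    Solves the normal equations (Phi^T Phi) w = Phi^T y with Gaussian
    elimination; fine for the small number of base predictors assumed here.
    """
    k, n = len(base_preds), len(configs)
    # Design matrix: one row per configuration, one column per base predictor.
    Phi = [[f(x) for f in base_preds] for x in configs]
    A = [[sum(Phi[r][i] * Phi[r][j] for r in range(n)) for j in range(k)]
         for i in range(k)]
    b = [sum(Phi[r][i] * targets[r] for r in range(n)) for i in range(k)]
    # Gaussian elimination with partial pivoting.
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            m = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= m * A[col][c]
            b[r] -= m * b[col]
    w = [0.0] * k
    for i in reversed(range(k)):
        w[i] = (b[i] - sum(A[i][j] * w[j] for j in range(i + 1, k))) / A[i][i]
    return w

# Toy usage: two base "correction" predictors over a scalar configuration x.
base = [lambda x: 1.0, lambda x: x]   # constant and linear base predictors
xs = [0.0, 1.0, 2.0, 3.0]             # the few high-fidelity configurations
ys = [1.0, 3.0, 5.0, 7.0]             # observed corrections (here: 1 + 2x)
w = fit_linear_combination(base, xs, ys)

def corrected(x):
    """Correction predictor: the fitted linear combination."""
    return sum(wi * f(x) for wi, f in zip(w, base))
```

The base predictors stand in for models trained cheaply on down-scaled, previously experienced tasks; only the small weight vector is fitted per new task, which is what keeps the correction cheap.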
Year | Venue | Field |
---|---|---|
2019 | Thirty-Third AAAI Conference on Artificial Intelligence / Thirty-First Innovative Applications of Artificial Intelligence Conference / Ninth AAAI Symposium on Educational Advances in Artificial Intelligence | Fidelity, Small data, Hyperparameter, Computer science, Series expansion, Artificial intelligence, Machine learning |
DocType | Citations | PageRank |
---|---|---|
Conference | 1 | 0.34 |
References | Authors |
---|---|
0 | 6 |
Name | Order | Citations | PageRank |
---|---|---|---|
Yi-Qi Hu | 1 | 24 | 3.46 |
Yang Yu | 2 | 488 | 48.20 |
Wei-Wei Tu | 3 | 33 | 7.86 |
Qiang Yang | 4 | 17039 | 875.69 |
Yuqiang Chen | 5 | 67 | 3.23 |
Wenyuan Dai | 6 | 1142 | 49.14 |