Abstract
---
AdaNet is a lightweight TensorFlow-based (Abadi et al., 2015) framework for automatically learning high-quality ensembles with minimal expert intervention. Our framework is inspired by the AdaNet algorithm (Cortes et al., 2017), which learns the structure of a neural network as an ensemble of subnetworks. We designed it to: (1) integrate with the existing TensorFlow ecosystem, (2) offer sensible default search spaces to perform well on novel datasets, (3) present a flexible API to utilize expert information when available, and (4) efficiently accelerate training with distributed CPU, GPU, and TPU hardware. The code is open-source and available at: this https URL.
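The abstract's core idea — learning a network's structure as an ensemble of subnetworks grown under a complexity-regularized objective — can be sketched in plain Python. This is an illustrative toy, not the AdaNet library's API: every name here (`grow_ensemble`, `objective`, the size-based penalty) is hypothetical, and real AdaNet trains candidate subnetworks with TensorFlow rather than selecting from a fixed pool.

```python
# Toy sketch of iterative ensemble growth in the spirit of AdaNet
# (Cortes et al., 2017). At each round, the candidate subnetwork that
# most improves a complexity-regularized objective joins the ensemble.
# All function names and the penalty form are illustrative assumptions.

def ensemble_predict(ensemble, x):
    """The ensemble's output is the sum of its subnetworks' outputs."""
    return sum(f(x) for f in ensemble)

def objective(ensemble, data, complexity_penalty=0.01):
    """Mean squared loss plus a penalty that grows with ensemble size,
    standing in for AdaNet's complexity-based regularization."""
    loss = sum((ensemble_predict(ensemble, x) - y) ** 2 for x, y in data)
    return loss / len(data) + complexity_penalty * len(ensemble)

def grow_ensemble(candidates, data, max_iterations=3):
    """Greedily add the best candidate each round; stop when no
    candidate improves the regularized objective."""
    ensemble = []
    for _ in range(max_iterations):
        best = min(candidates, key=lambda f: objective(ensemble + [f], data))
        if objective(ensemble + [best], data) >= objective(ensemble, data):
            break
        ensemble.append(best)
    return ensemble
```

For example, fitting the target y = 2x from candidates {x, 0.5x, -x} adds the subnetwork f(x) = x twice and then stops, since a third addition would raise the penalized objective.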
Year | Venue | DocType | Citations | PageRank | References
---|---|---|---|---|---
2019 | arXiv: Learning | Journal | 0 | 0.34 | 0
Authors (12)
---
Name | Order | Citations | PageRank |
---|---|---|---
Charles Weill | 1 | 1 | 0.68 |
Javier Gonzalvo | 2 | 0 | 0.34 |
Vitaly Kuznetsov | 3 | 68 | 9.33 |
Scott Yang | 4 | 0 | 0.68 |
Scott Yak | 5 | 0 | 0.68 |
Hanna Mazzawi | 6 | 48 | 6.42 |
Eugen Hotaj | 7 | 0 | 0.34 |
Ghassen Jerfel | 8 | 6 | 3.18 |
Vladimir Macko | 9 | 1 | 0.68 |
Ben Adlam | 10 | 3 | 3.45 |
Mehryar Mohri | 11 | 4502 | 448.21 |
Corinna Cortes | 12 | 6574 | 1120.50 |