Title |
---|
Everything old is new again: A multi-view learning approach to learning using privileged information and distillation. |
Abstract |
---|
We adopt a multi-view approach for analyzing two knowledge transfer settings---learning using privileged information (LUPI) and distillation---in a common framework. Under reasonable assumptions about the complexities of the hypothesis spaces, and being optimistic about the expected loss achievable by the student (in distillation) and by a transformed teacher predictor (in LUPI), we show that encouraging agreement between the teacher and the student leads to a reduced search space. As a result, an improved convergence rate can be obtained with regularized empirical risk minimization. |
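The distillation setting analyzed in the abstract can be illustrated with the standard regularized empirical-risk objective: the student minimizes its own task loss plus a penalty for disagreement with the teacher's (temperature-softened) predictions. This is a minimal sketch of that generic objective, not the paper's construction; the function names, the temperature `T`, and the mixing weight `alpha` are illustrative assumptions.

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax; T > 1 softens the distribution.
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, alpha=0.5, T=2.0):
    """Regularized ERM objective for distillation (illustrative):
    (1 - alpha) * cross-entropy against the hard labels
    + alpha * T^2 * KL(teacher || student) on temperature-softened outputs.
    The KL term penalizes teacher-student disagreement, which is the
    'encouraging agreement' regularizer discussed in the abstract."""
    n = len(labels)
    p_student = softmax(student_logits, T=1.0)
    ce = -np.log(p_student[np.arange(n), labels] + 1e-12).mean()
    ps_T = softmax(student_logits, T=T)
    pt_T = softmax(teacher_logits, T=T)
    kl = (pt_T * (np.log(pt_T + 1e-12) - np.log(ps_T + 1e-12))).sum(axis=-1).mean()
    return (1 - alpha) * ce + alpha * (T ** 2) * kl
```

When the student already agrees with the teacher, the KL term vanishes and only the empirical risk remains; disagreement strictly increases the objective, which is how the penalty shrinks the student's effective search space.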
Year | Venue | DocType
---|---|---
2019 | arXiv: Learning | Journal

Volume | Citations | PageRank
---|---|---
abs/1903.03694 | 0 | 0.34

References | Authors
---|---
8 | 1
Name | Order | Citations | PageRank
---|---|---|---
Weiran Wang | 1 | 114 | 9.99