Title
Stateless neural meta-learning using second-order gradients
Abstract
Meta-learning can be used to learn a good prior that facilitates quick learning; two popular approaches are MAML and the meta-learner LSTM, which represent important and distinct directions in meta-learning. In this work, we study the two and formally show that the meta-learner LSTM subsumes MAML, yet MAML, although less general in this sense, outperforms it in practice. We suggest that this surprising performance gap is related to second-order gradients. To gain more insight into the importance of second-order gradients, we construct a new algorithm, named TURTLE, which is simpler than the meta-learner LSTM yet more expressive than MAML. TURTLE outperforms both techniques on few-shot sine wave regression and in 50% of the tested image classification settings (without any additional hyperparameter tuning), and is competitive otherwise, at a computational cost comparable to that of second-order MAML. We also find that second-order gradients significantly increase the accuracy of the meta-learner LSTM. When MAML was introduced, one of its notable features was its use of second-order gradients; subsequent work has focused on cheaper first-order approximations. On the basis of our findings, we argue for more attention to second-order gradients.
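The distinction the abstract draws between second-order MAML and its cheaper first-order approximations can be made concrete with automatic differentiation. Below is a minimal sketch in JAX, not the authors' implementation: the linear model, data shapes, and inner learning rate are hypothetical, and only the gradient structure is the point. The full meta-gradient backpropagates through the inner adaptation step (which introduces Hessian terms), while the first-order variant evaluates the query gradient at the adapted parameters and drops those terms.

import jax
import jax.numpy as jnp

def loss(params, x, y):
    # Hypothetical model: a linear regressor on a sine-wave-style task.
    pred = x @ params["w"] + params["b"]
    return jnp.mean((pred - y) ** 2)

def inner_update(params, x_support, y_support, lr=0.01):
    # One step of task-specific adaptation on the support set.
    grads = jax.grad(loss)(params, x_support, y_support)
    return jax.tree_util.tree_map(lambda p, g: p - lr * g, params, grads)

def maml_objective(params, x_s, y_s, x_q, y_q):
    # Query loss at the adapted parameters. Differentiating this end to end
    # backpropagates through the inner gradient step, which is where the
    # second-order (Hessian-vector) terms enter.
    return loss(inner_update(params, x_s, y_s), x_q, y_q)

# Second-order MAML meta-gradient: full differentiation through adaptation.
second_order_meta_grad = jax.grad(maml_objective)

def first_order_meta_grad(params, x_s, y_s, x_q, y_q):
    # First-order approximation: take the query-set gradient at the adapted
    # parameters and treat it as the gradient with respect to the
    # initialization, dropping all second-order terms.
    adapted = inner_update(params, x_s, y_s)
    return jax.grad(loss)(adapted, x_q, y_q)

# Example usage on dummy sine-wave data (shapes are illustrative).
k1, k2 = jax.random.split(jax.random.PRNGKey(0))
params = {"w": jnp.zeros((1, 1)), "b": jnp.zeros((1,))}
x_s = jax.random.normal(k1, (5, 1))
x_q = jax.random.normal(k2, (10, 1))
g2 = second_order_meta_grad(params, x_s, jnp.sin(x_s), x_q, jnp.sin(x_q))
g1 = first_order_meta_grad(params, x_s, jnp.sin(x_s), x_q, jnp.sin(x_q))

The extra cost of the second-order variant comes entirely from differentiating through inner_update; the first-order function never builds that part of the computation graph, which is why the approximation is cheaper.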
Year
2022
DOI
10.1007/s10994-022-06210-y
Venue
Machine Learning
Keywords
Meta-learning, Few-shot learning, Deep learning, Transfer learning, 68T07, 68T45
DocType
Journal
Volume
111
Issue
9
ISSN
0885-6125
Citations
0
PageRank
0.34
References
4
Authors
3
Name             Order  Citations  PageRank
Mike Huisman     1      0          0.34
Aske Plaat       2      524        72.18
Jan N. van Rijn  3      0          0.68