Title
Sequential mastery of multiple tasks: Networks naturally learn to learn.
Abstract
We explore the behavior of a standard convolutional neural net in a setting that introduces classification tasks sequentially and requires the net to master new tasks while preserving mastery of previously learned tasks. This setting corresponds to the one human learners face as they acquire domain expertise, for example, as an individual reads a textbook chapter by chapter. Through simulations involving sequences of ten related tasks, we find reason for optimism that nets will scale well as they advance from having a single skill to becoming domain experts. We observe two key phenomena. First, _forward facilitation_---the accelerated learning of task $n+1$ having learned $n$ previous tasks---grows with $n$. Second, _backward interference_---the forgetting of the $n$ previous tasks when learning task $n+1$---diminishes with $n$. Amplifying forward facilitation is the goal of research on metalearning, and attenuating backward interference is the goal of research on catastrophic forgetting. We find that both of these goals are attained simply through broader exposure to a domain.
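The sequential-mastery protocol the abstract describes can be sketched in a few lines of code. The following is a minimal, hypothetical PyTorch illustration, not the authors' implementation: one small conv net is trained on ten tasks in order, logging steps-to-criterion on each new task (a proxy for forward facilitation) and retained accuracy on all earlier tasks (a proxy for backward interference). The synthetic task generator, network size, and 95% criterion are assumptions for illustration; the paper uses sequences of related classification tasks rather than random data.

```python
# Minimal sketch of the sequential-task protocol (assumptions noted above).
import torch
import torch.nn as nn

torch.manual_seed(0)
N_TASKS, N_CLASSES, DIM = 10, 5, 16  # assumed sizes, illustration only

def make_task():
    """Synthetic stand-in for one classification task: random inputs and labels."""
    x = torch.randn(256, 1, DIM, DIM)
    y = torch.randint(0, N_CLASSES, (256,))
    return x, y

net = nn.Sequential(  # small conv net, a stand-in for the paper's CNN
    nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(4),
    nn.Flatten(), nn.Linear(8 * 4 * 4, N_CLASSES))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

def accuracy(x, y):
    with torch.no_grad():
        return (net(x).argmax(1) == y).float().mean().item()

tasks = [make_task() for _ in range(N_TASKS)]
for n, (x, y) in enumerate(tasks):
    steps = 0
    # Forward facilitation: steps to criterion should shrink as n grows.
    while accuracy(x, y) < 0.95 and steps < 2000:
        opt.zero_grad()
        loss_fn(net(x), y).backward()
        opt.step()
        steps += 1
    # Backward interference: accuracy retained on previously mastered tasks.
    retained = [round(accuracy(px, py), 2) for px, py in tasks[:n]]
    print(f"task {n+1}: {steps} steps to criterion, prior-task acc {retained}")
```

With unrelated random tasks this loop exhibits neither effect; the paper's finding is that with _related_ tasks, steps-to-criterion falls and retained accuracy rises as $n$ grows.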
Year: 2019
Venue: arXiv: Learning
DocType: Journal
Volume: abs/1905.10837
Citations: 0
PageRank: 0.34
References: 0
Authors: 2
| Name | Order | Citations | PageRank |
| --- | --- | --- | --- |
| Guy Davidson | 1 | 0 | 0.34 |
| Michael C. Mozer | 2 | 10204 | 57.84 |