Abstract |
---|
We unify recent neural approaches to one-shot learning with older ideas of associative memory in a model for metalearning. Our model learns jointly to represent data and to bind class labels to representations in a single shot. It builds representations via slow weights, learned across tasks through SGD, while fast weights constructed by a Hebbian learning rule implement one-shot binding for each new task. On the Omniglot, Mini-ImageNet, and Penn Treebank one-shot learning benchmarks, our model achieves state-of-the-art results. |
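The binding mechanism the abstract describes can be sketched as a Hebbian outer-product memory: a slow network (not shown here) maps inputs to representations, and a fast weight matrix accumulates label–representation outer products from the single support example per class; queries are then classified by content-addressable readout. This is a minimal sketch under our own assumptions (one-hot labels, pre-computed representations); the function names are illustrative, not the paper's.

```python
import numpy as np

def hebbian_fast_weights(reps, labels, n_classes):
    """One-shot binding: accumulate the Hebbian outer product
    one_hot(label) (x) representation over the support set.
    reps: (n_support, d) representations from the slow network.
    labels: length-n_support integer class labels."""
    fast_w = np.zeros((n_classes, reps.shape[1]))
    for h, y in zip(reps, labels):
        fast_w[y] += h  # outer product with a one-hot label reduces to a row update
    return fast_w

def classify(fast_w, query_rep):
    """Content-addressable readout: return the class whose bound
    representation best matches the query representation."""
    return int(np.argmax(fast_w @ query_rep))

# Toy one-shot episode: two classes, 3-d representations, one example each.
support = np.array([[1.0, 0.0, 0.2],
                    [0.0, 1.0, 0.1]])
fast_w = hebbian_fast_weights(support, [0, 1], n_classes=2)
print(classify(fast_w, np.array([0.9, 0.1, 0.2])))  # query near class 0 → 0
```

In the actual model the fast weights are built anew for each task episode, while the representation network's slow weights are the only parameters updated by SGD across tasks.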
Year | Venue | Field
---|---|---
2018 | arXiv: Neural and Evolutionary Computing | Content-addressable memory, Metalearning, Computer science, Hebbian theory, Artificial intelligence, Treebank, Machine learning

DocType | Volume | Citations
---|---|---
Journal | abs/1807.05076 | 0

PageRank | References | Authors
---|---|---
0.34 | 0 | 2
Name | Order | Citations | PageRank
---|---|---|---
Tsendsuren Munkhdalai | 1 | 0 | 2.70
Adam P. Trischler | 2 | 161 | 17.61