Title
Deep Learning with Dynamic Computation Graphs
Abstract
Neural networks that compute over graph structures are a natural fit for problems in a variety of domains, including natural language (parse trees) and cheminformatics (molecular graphs). However, since the computation graph has a different shape and size for every input, such networks do not directly support batched training or inference. They are also difficult to implement in popular deep learning libraries, which are based on static data-flow graphs. We introduce a technique called dynamic batching, which not only batches together operations between different input graphs of dissimilar shape, but also between different nodes within a single input graph. The technique allows us to create static graphs, using popular libraries, that emulate dynamic computation graphs of arbitrary shape and size. We further present a high-level library of compositional blocks that simplifies the creation of dynamic graph models. Using the library, we demonstrate concise and batch-wise parallel implementations for a variety of models from the literature.
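The core idea the abstract describes, batching the same operation across all nodes at the same depth in every tree of a batch, can be illustrated compactly. The following is a minimal NumPy sketch of that scheduling idea, not TensorFlow Fold's actual API; the names (Leaf, Node, embed_forest, emb, W) and the TreeRNN combiner are hypothetical choices made for illustration.

```python
import numpy as np

# Illustrative sketch of dynamic batching (not TensorFlow Fold's API).
# Every node at the same depth, across every tree in the batch, is
# evaluated with a single batched operation instead of one op per node.

class Leaf:
    def __init__(self, word_id):
        self.word_id, self.depth = word_id, 0

class Node:
    def __init__(self, left, right):
        self.left, self.right = left, right
        self.depth = 1 + max(left.depth, right.depth)

def embed_forest(roots, emb, W):
    """emb: (vocab, dim) leaf embeddings; W: (2*dim, dim) combiner weights."""
    # Group all nodes by depth across the whole batch of trees.
    by_depth, stack = {}, list(roots)
    while stack:
        n = stack.pop()
        by_depth.setdefault(n.depth, []).append(n)
        if isinstance(n, Node):
            stack += [n.left, n.right]
    value = {}  # id(node) -> computed vector
    for d in sorted(by_depth):
        nodes = by_depth[d]
        if d == 0:
            # One gather covers every leaf in every tree.
            vecs = emb[[n.word_id for n in nodes]]
        else:
            # One (len(nodes), 2*dim) @ (2*dim, dim) matmul per level,
            # regardless of which tree each node came from.
            kids = np.concatenate(
                [np.stack([value[id(n.left)] for n in nodes]),
                 np.stack([value[id(n.right)] for n in nodes])], axis=1)
            vecs = np.tanh(kids @ W)
        for n, v in zip(nodes, vecs):
            value[id(n)] = v
    return np.stack([value[id(r)] for r in roots])

# Two trees of different shapes are embedded together: the whole batch
# costs one gather plus one matmul per level, not one op per node.
rng = np.random.default_rng(0)
emb, W = rng.normal(size=(100, 8)), rng.normal(size=(16, 8))
trees = [Node(Leaf(3), Node(Leaf(1), Leaf(2))), Leaf(7)]
print(embed_forest(trees, emb, W).shape)  # (2, 8)
```

Because scheduling is by depth rather than by tree, the two differently shaped trees above share the same batched operations, which is how a static data-flow graph can emulate dynamic computation graphs of arbitrary shape and size.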
Year
2017
Venue
International Conference on Learning Representations
Field
Graph theory, Graph operations, Modular decomposition, Computer science, Implicit graph, Theoretical computer science, Artificial intelligence, Graph product, Deep learning, Clique-width, Artificial neural network, Machine learning
Volume
abs/1702.02181
Citations
23
PageRank
0.81
References
6
Authors
4
Name                  Order  Citations  PageRank
Moshe Looks           1      140        11.68
Marcello Herreshoff   2      23         0.81
Delesley Hutchins     3      51         2.70
Peter Norvig          4      425        61.47