Title: Gated Graph Sequence Neural Networks
Abstract
Graph-structured data appears frequently in domains including chemistry, natural language semantics, social networks, and knowledge bases. In this work, we study feature learning techniques for graph-structured inputs. Our starting point is previous work on Graph Neural Networks (Scarselli et al., 2009), which we modify to use gated recurrent units and modern optimization techniques and then extend to output sequences. The result is a flexible and broadly useful class of neural network models that has favorable inductive biases relative to purely sequence-based models (e.g., LSTMs) when the problem is graph-structured. We demonstrate the capabilities on some simple AI (bAbI) and graph algorithm learning tasks. We then show it achieves state-of-the-art performance on a problem from program verification, in which subgraphs need to be matched to abstract data structures.
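The abstract describes replacing the original Graph Neural Network recurrence with gated recurrent units: each propagation round, a node aggregates linearly transformed states from its neighbors and then updates its own state through a GRU-style gate. A minimal NumPy sketch of one such propagation step is below; it is not the authors' code, and all weight shapes, the single shared edge type, and the random initialization are illustrative assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def ggnn_step(h, adj, W_msg, W_z, U_z, W_r, U_r, W_h, U_h):
    """One gated propagation round (hypothetical single-edge-type variant).

    h:   (n, d) node states
    adj: (n, n) adjacency matrix, adj[i, j] = 1 if node j sends to node i
    """
    a = adj @ (h @ W_msg.T)                          # aggregate neighbor messages
    z = sigmoid(a @ W_z.T + h @ U_z.T)               # update gate
    r = sigmoid(a @ W_r.T + h @ U_r.T)               # reset gate
    h_tilde = np.tanh(a @ W_h.T + (r * h) @ U_h.T)   # candidate state
    return (1 - z) * h + z * h_tilde                 # gated state update

# Usage: a 3-node chain graph with 4-dimensional node states.
rng = np.random.default_rng(0)
n, d = 3, 4
h = rng.standard_normal((n, d))
adj = np.array([[0, 1, 0],
                [1, 0, 1],
                [0, 1, 0]], dtype=float)
params = [rng.standard_normal((d, d)) * 0.1 for _ in range(7)]
h_next = ggnn_step(h, adj, *params)
print(h_next.shape)  # (3, 4)
```

Running several such rounds lets information flow along graph edges, which is the inductive bias the abstract contrasts with purely sequence-based models.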
Year: 2015
Venue: International Conference on Learning Representations
Field: Data structure, Graph algorithms, Graph, Social network, Computer science, Graph neural networks, Theoretical computer science, Artificial intelligence, Artificial neural network, Natural language semantics, Feature learning, Machine learning
DocType: Journal
Volume: abs/1511.05493
Citations: 66
PageRank: 1.38
References: 0
Authors: 4
Name               | Order | Citations | PageRank
Yujia Li           | 1     | 405       | 23.01
Daniel Tarlow      | 2     | 514       | 31.62
Marc Brockschmidt  | 3     | 475       | 28.64
Richard S. Zemel   | 4     | 49584     | 25.68