Title
The neural network pushdown automaton: model, stack and learning simulations
Abstract
In order for neural networks to learn complex languages or grammars, they must have sufficient computational power or resources to recognize or generate such languages. Though many approaches to effectively utilizing the computational power of neural networks have been discussed, an obvious one is to couple a recurrent neural network with an external stack memory, in effect creating a neural network pushdown automaton (NNPDA). This NNPDA generalizes the concept of a recurrent network so that the network becomes a more complex computing structure. This paper discusses in detail an NNPDA: its construction, how it can be trained, and how useful symbolic information can be extracted from the trained network. To effectively couple the external stack to the neural network, an optimization method is developed which uses an error function that connects the learning of the state automaton of the neural network to the learning of the operation of the external stack: push, pop, and no-operation. To minimize the error function using gradient descent learning, an analog stack is designed such that the action and storage of information in the stack are continuous. One interpretation of a continuous stack is the probabilistic storage of and action on data. After training on sample strings of an unknown source grammar, a quantization procedure extracts from the analog stack and neural network a discrete pushdown automaton (PDA). Simulations show that in learning deterministic context-free grammars - the balanced parenthesis language, 1^n0^n, and the deterministic palindrome - the extracted PDA is correct in the sense that it can correctly recognize unseen strings of arbitrary length. In addition, the extracted PDAs can be shown to be identical or equivalent to the PDAs of the source grammars which were used to generate the training strings.
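The central mechanism described in the abstract is the continuous ("analog") stack, whose push, pop, and no-operation actions take graded values so that gradient descent can adjust them. The Python sketch below is illustrative only and is not the authors' implementation; the class name, the read depth of 1.0, and the symbol representation are assumptions made for this example.

    # Illustrative sketch (not the authors' code) of a continuous ("analog") stack
    # of the kind the abstract describes: push and pop are graded actions so that
    # gradient-based training can act on the stack controls.

    class ContinuousStack:
        def __init__(self):
            # Each entry is (symbol, thickness), with thickness in (0, 1].
            self.items = []

        def push(self, symbol, strength):
            """Push `symbol` with a continuous strength in [0, 1]."""
            if strength > 0.0:
                self.items.append((symbol, strength))

        def pop(self, strength):
            """Remove a total thickness of `strength` from the top of the stack."""
            remaining = strength
            while remaining > 0.0 and self.items:
                symbol, thickness = self.items[-1]
                if thickness <= remaining:
                    self.items.pop()
                    remaining -= thickness
                else:
                    self.items[-1] = (symbol, thickness - remaining)
                    remaining = 0.0

        def read(self, depth=1.0):
            """Return a thickness-weighted blend of the symbols within `depth` of the top."""
            blend = {}
            remaining = depth
            for symbol, thickness in reversed(self.items):
                if remaining <= 0.0:
                    break
                weight = min(thickness, remaining)
                blend[symbol] = blend.get(symbol, 0.0) + weight
                remaining -= weight
            return blend

    # Example: push 'A' fully, then half-pop; 0.5 of 'A' remains on top.
    stack = ContinuousStack()
    stack.push("A", 1.0)
    stack.pop(0.5)
    print(stack.read())  # {'A': 0.5}

In this reading, a partially popped symbol survives on the stack with reduced thickness, which corresponds to the probabilistic interpretation of storage and action mentioned in the abstract; quantizing these continuous quantities together with the network's states is what yields the extracted discrete PDA.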
Year
2017
Venue
The neural network pushdown automaton: model, stack and learning simulations
Keywords
neural network pushdown automaton, technical report
Field
Computer science, Call stack, Recurrent neural network, Theoretical computer science, Artificial intelligence, Artificial neural network, Stack-based memory allocation, Embedded pushdown automaton, Nested stack automaton, Automaton, Algorithm, Pushdown automaton, Machine learning
DocType
Journal
Volume
abs/1711.05738
Citations
11
PageRank
1.17
References
18
Authors
4
Name            Order    Citations    PageRank
G. Z. Sun       1        11           1.17
C. Lee Giles    2        11154        1549.48
H. H. Chen      3        11           1.17
Y. C. Lee       4        11           1.17