Title
Latent semantic analysis of game models using LSTM
Abstract
We propose a method for identifying whether the observed behaviour of a function at an interface is consistent with the typical behaviour of a particular programming language. This is a challenging problem with significant potential applications, for example in security (intrusion detection) or compiler optimisation (profiling). To represent behaviour we use game semantics, a powerful method of semantic analysis for programming languages, which gives mathematically accurate ('fully abstract') models for a wide variety of languages. Game-semantic models are combinatorial characterisations of all possible interactions between a term and its syntactic context. Because such interactions can be concretely represented as sets of sequences, it is natural to ask whether they can be learned from examples. Concretely, we use LSTM networks, a technique that has proved effective in learning natural languages for automatic translation and text synthesis, to learn game-semantic models of sequential and concurrent versions of Idealised Algol (IA), which are algorithmically complex yet can be concisely described. We measure how accurate the learned models are as a function of the degree of the term and the number of free variables involved. Finally, we show how to use the learned model to perform latent semantic analysis between concurrent and sequential Idealised Algol.
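The abstract does not fix an implementation, so the following is only an illustrative sketch of the general idea: game-semantic plays are tokenised as sequences of moves and a small LSTM is trained on them (here as a sequence classifier in PyTorch). The class name, vocabulary size, labels, and hyper-parameters are all assumptions for the example, not the authors' actual setup.

import torch
import torch.nn as nn

class PlayClassifier(nn.Module):
    """Embeds a sequence of move tokens and classifies it with an LSTM."""
    def __init__(self, vocab_size, embed_dim=32, hidden_dim=64, num_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, num_classes)

    def forward(self, x):                 # x: (batch, seq_len) of move indices
        h, _ = self.lstm(self.embed(x))   # h: (batch, seq_len, hidden_dim)
        return self.out(h[:, -1, :])      # classify from the final hidden state

# Toy data: random fixed-length "plays" over a 10-move vocabulary with
# hypothetical binary labels (e.g. consistent vs. inconsistent interactions).
torch.manual_seed(0)
x = torch.randint(0, 10, (16, 12))        # 16 plays of length 12
y = torch.randint(0, 2, (16,))            # illustrative labels only

model = PlayClassifier(vocab_size=10)
optimiser = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(5):                    # a few illustrative training steps
    optimiser.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimiser.step()

In the paper itself the sequences come from game-semantic models of sequential and concurrent Idealised Algol rather than random data; the sketch only shows the shape of the learning pipeline.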
Year
2019
DOI
10.1016/j.jlamp.2019.04.003
Venue
Journal of Logical and Algebraic Methods in Programming
Keywords
Programming language semantics, Game semantics, Recurrent neural networks, Machine learning
Field
Free variables and bound variables, Recurrent neural network, Theoretical computer science, Compiler, Natural language, Latent semantic analysis, Game semantics, Intrusion detection system, Syntax, Mathematics
DocType
Journal
Volume
106
Issue
1
ISSN
2352-2208
Citations
0
PageRank
0.34
References
0
Authors
2
Name              Order  Citations  PageRank
Dan R. Ghica      1      346        30.34
Khulood AlYahya   2      19         5.17