Title
Abstraction refinement guided by a learnt probabilistic model
Abstract
The core challenge in designing an effective static program analysis is to find a good program abstraction: one that retains only the details relevant to a given query. In this paper, we present a new approach for automatically finding such an abstraction. Our approach uses a pessimistic strategy, which can optionally use guidance from a probabilistic model. It applies to parametric static analyses implemented in Datalog, and is based on counterexample-guided abstraction refinement. For each untried abstraction, our probabilistic model provides a probability of success, while the size of the abstraction provides an estimate of its cost in terms of analysis time. Combining these two metrics, probability and cost, our refinement algorithm picks an optimal abstraction. Our probabilistic model is a variant of the Erdős–Rényi random graph model, and it is tunable by what we call hyperparameters. We present a method to learn good values for these hyperparameters by observing past runs of the analysis on an existing codebase. We evaluate our approach on an object-sensitive pointer analysis for Java programs, with two client analyses (PolySite and Downcast).
Year
2016
DOI
10.1145/2914770.2837663
Venue
ACM SIGPLAN Notices
Keywords
Datalog, Horn, hypergraph, probability
Field
Codebase, Abstraction model checking, Static program analysis, Programming language, Abstraction, Computer science, Theoretical computer science, Artificial intelligence, Pointer analysis, Abstraction inversion, Statistical model, Datalog, Machine learning
DocType
Conference
Volume
abs/1511.01874
Issue
1
ISSN
0362-1340
Citations
8
PageRank
0.46
References
36
Authors
2
Name | Order | Citations | PageRank
Radu Grigore | 1 | 125 | 9.83
Hongseok Yang | 2 | 2313 | 115.85