Title
Lazy Model Expansion: Interleaving Grounding with Search.
Abstract
Finding satisfying assignments for the variables involved in a set of constraints can be cast as a (bounded) model generation problem: search for (bounded) models of a theory in some logic. The state-of-the-art approach to bounded model generation for rich knowledge representation languages such as Answer Set Programming (ASP) and FO(.), as well as CSP modeling languages such as Zinc, is ground-and-solve: reduce the theory to a ground or propositional one and apply a search algorithm to the resulting theory. An important bottleneck is the blow-up in the size of the theory caused by the grounding phase. Lazily grounding the theory during search is a way to overcome this bottleneck. We present a theoretical framework and an implementation in the context of the FO(.) knowledge representation language. Instead of grounding all parts of a theory, justifications are derived for some parts of it. Given a partial assignment for the grounded part of the theory and valid justifications for the formulas of the non-grounded part, the justifications provide a recipe to construct a complete assignment that satisfies the non-grounded part. When a justification for a particular formula becomes invalid during search, a new one is derived; if that fails, the formula is split into a part to be grounded and a part that can be justified. Experimental results illustrate the power and generality of this approach.
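The loop described in the abstract (keep justifications for non-grounded formulas, repair a justification when search invalidates it, and fall back to grounding when no justification can be derived) can be illustrated with a minimal sketch. This is not the authors' FO(.) implementation; every name below (LazyFormula, derive_justification, ground, lazy_model_expansion) and the toy "atom must be true" formula are illustrative assumptions.

    # A minimal sketch of a lazy model expansion round, assuming toy formulas
    # of the form "atom must be true". Not the authors' system.
    from typing import Callable, Dict, List, Optional

    Assignment = Dict[str, bool]                 # partial assignment over ground atoms
    Recipe = Callable[[Assignment], Assignment]  # recipe completing the assignment

    class LazyFormula:
        """A formula kept out of the grounding; it carries a justification instead."""

        def __init__(self, atom: str):
            self.atom = atom                     # toy formula: "atom must be true"
            self.recipe: Optional[Recipe] = None # current justification, if any

        def justification_valid(self, partial: Assignment) -> bool:
            # The recipe below sets self.atom to True, so it stays valid as
            # long as search has not assigned the atom to False.
            return self.recipe is not None and partial.get(self.atom) is not False

        def derive_justification(self, partial: Assignment) -> bool:
            # A real system analyses the formula's structure; here we can
            # justify the formula whenever search has not made it false yet.
            if partial.get(self.atom) is not False:
                self.recipe = lambda a: {**a, self.atom: True}
                return True
            return False

        def ground(self) -> None:
            # Placeholder for splitting the formula and handing the part that
            # cannot be justified to the ground solver as clauses.
            print(f"grounding (part of) formula over {self.atom}")

    def lazy_model_expansion(lazy: List[LazyFormula], partial: Assignment) -> Assignment:
        """One repair round: restore invalidated justifications, ground formulas
        that cannot be justified, then apply the recipes to complete the
        assignment for the non-grounded part."""
        for f in lazy:
            if not f.justification_valid(partial):
                if not f.derive_justification(partial):
                    f.ground()
        for f in lazy:
            if f.recipe is not None:
                partial = f.recipe(partial)
        return partial

    if __name__ == "__main__":
        formulas = [LazyFormula("p"), LazyFormula("q")]
        # Search has assigned q to False, so its formula cannot be justified
        # and (part of) it is grounded; p is completed via its justification.
        print(lazy_model_expansion(formulas, {"q": False}))

The sketch deliberately omits the search loop and backtracking; its point is the division of labour between the grounded part (handled by the solver) and the non-grounded part (handled by justifications that are repaired or grounded on demand).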
Year
2014
DOI
10.1613/jair.4591
Venue
JOURNAL OF ARTIFICIAL INTELLIGENCE RESEARCH
DocType
Journal
Volume
52
ISSN
1076-9757
Citations
10
PageRank
0.62
References
46
Authors
4
Name, Order, Citations, PageRank
Broes De Cat, 1, 64, 6.24
Marc Denecker, 2, 1626, 106.40
Peter J. Stuckey, 3, 4368, 457.58
Maurice Bruynooghe, 4, 2767, 226.05