Abstract |
---|
We propose Nester, a method for injecting neural networks into constrained structured predictors. Nester first uses a neural network to compute an initial prediction that may or may not satisfy the constraints, and then applies a constraint-based structured predictor to refine the raw predictions according to hard and soft constraints. Nester combines the advantages of its two components: the network can learn complex representations from low-level data, while the constraint program on top reasons about the high-level properties and requirements of the prediction task. An empirical evaluation on handwritten equation recognition shows that Nester achieves better performance than either component in isolation, especially when training examples are scarce, while scaling to more complex problems than other neuro-programming approaches. Nester proves especially useful for reducing errors at the semantic level of the problem, which is particularly challenging for neural network architectures. |
Year | Venue | Keywords |
---|---|---|
2021 | NeSy 2021: Neural-Symbolic Learning and Reasoning | Machine Learning, Neuro-Symbolic Integration, Structured Prediction, Constraint Programming |
DocType | Volume | ISSN |
---|---|---|
Conference | 2986 | 1613-0073 |
Citations | PageRank | References |
---|---|---|
0 | 0.34 | 0 |
Authors |
---|
3 |
Name | Order | Citations | PageRank |
---|---|---|---|
Paolo Dragone | 1 | 7 | 3.44 |
Stefano Teso | 2 | 38 | 14.21 |
Andrea Passerini | 3 | 569 | 46.88 |