Abstract |
---|
This paper establishes a link between Bayesian inference (learning) and predicate and state transformer operations from programming semantics and logic. Specifically, a very general definition of backward inference is given via first applying a predicate transformer and then conditioning. Analogously, forward inference involves first conditioning and then applying a state transformer. These definitions are illustrated in many examples in discrete and continuous probability theory and also in quantum theory. |

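The abstract's two recipes can be sketched in discrete probability. Below is a minimal illustration (not from the paper itself; the rain/wet-grass channel and all numbers are invented for this sketch): backward inference pulls an evidence predicate back along a channel and then conditions the prior, while forward inference conditions the prior and then pushes it forward along the channel.

```python
def predicate_transformer(channel, q):
    """Pull a predicate q on Y back along a channel c: X -> D(Y),
    giving c*(q)(x) = sum_y c(x)(y) * q(y)."""
    return {x: sum(dist[y] * q.get(y, 0.0) for y in dist)
            for x, dist in channel.items()}

def condition(state, p):
    """Condition a state (distribution) by a fuzzy predicate p (Bayes)."""
    weights = {x: state[x] * p.get(x, 0.0) for x in state}
    total = sum(weights.values())
    return {x: w / total for x, w in weights.items()}

def state_transformer(channel, state):
    """Push a state on X forward along c: c_*(w)(y) = sum_x w(x) * c(x)(y)."""
    out = {}
    for x, w in state.items():
        for y, pr in channel[x].items():
            out[y] = out.get(y, 0.0) + w * pr
    return out

# Illustrative channel from "rain" to "wet grass" (numbers invented)
channel = {"rain":    {"wet": 0.9, "dry": 0.1},
           "no_rain": {"wet": 0.2, "dry": 0.8}}
prior = {"rain": 0.3, "no_rain": 0.7}

# Backward inference: apply the predicate transformer, then condition
evidence = {"wet": 1.0, "dry": 0.0}
posterior = condition(prior, predicate_transformer(channel, evidence))

# Forward inference: condition first, then apply the state transformer
predicted = state_transformer(channel,
                              condition(prior, {"rain": 1.0, "no_rain": 0.0}))
```

Here `posterior` recovers the usual Bayesian update P(rain | wet) = 0.27/0.41, and `predicted` is the wet-grass distribution given rain for certain.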
Year | DOI | Venue |
---|---|---|
2016 | 10.1016/j.entcs.2016.09.038 | Electronic Notes in Theoretical Computer Science |

Keywords | Field | DocType |
---|---|---|
Inference, learning, Bayes, Kleisli category, effectus, predicate transformer, state transformer | Predicate transformer semantics, Universal generalization, Bayesian inference, Computer science, Inference, Theoretical computer science, Predicate (grammar), Probability theory, Semantics, Bayes' theorem | Journal |

Volume | ISSN | Citations |
---|---|---|
325 | 1571-0661 | 9 |

PageRank | References | Authors |
---|---|---|
0.69 | 7 | 2 |

Name | Order | Citations | PageRank |
---|---|---|---|
B. Jacobs | 1 | 1046 | 100.09 |
Fabio Zanasi | 2 | 110 | 13.89 |