Abstract
---

Several mathematical distances between probabilistic languages have been investigated in the literature, motivated by applications in language modeling, computational biology, syntactic pattern matching and machine learning. In most cases, only pairs of probabilistic regular languages were considered. In this paper we extend the previous results to pairs of languages generated by a probabilistic context-free grammar and a probabilistic finite automaton.
| Field | Value |
|---|---|
| Year | 2008 |
| DOI | 10.1016/j.tcs.2008.01.010 |
| Venue | Theor. Comput. Sci. |
| Keywords | Kullback–Leibler divergence, Probabilistic finite automata, context-free probabilistic language, Language entropy, computational biology, language modeling, Probabilistic context-free languages, probabilistic regular language, syntactic pattern matching, probabilistic finite automaton, Probabilistic language distances, machine learning, probabilistic language, mathematical distance, probabilistic context-free grammar, previous result |
| DocType | Journal |
| Volume | 395 |
| Issue | 2-3 |
| ISSN | Theoretical Computer Science |
| Citations | 9 |
| PageRank | 0.63 |
| References | 21 |
| Authors | 2 |
| Name | Order | Citations | PageRank |
|---|---|---|---|
| Mark-Jan Nederhof | 1 | 387 | 53.30 |
| Giorgio Satta | 2 | 902 | 90.85 |