Abstract | ||
---|---|---|
We present a new approach to language modeling based on dynamic Bayesian networks. The philosophy behind this architecture is to learn from data the appropriate dependency relations between the linguistic variables used in the language modeling process. It is an original and coherent framework that processes words and classes in the same model. This approach leads to new data-driven language models capable of outperforming classical ones, sometimes with lower computational complexity. We present experiments on small and medium corpora. The results show that this new technique is very promising and deserves further investigation. |
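To make the abstract's idea concrete, here is a minimal sketch of one dependency structure a dynamic Bayesian network can encode over words and classes: the classical class-based factorization P(w_1..T, c_1..T) = Π_t P(c_t | c_{t-1}) P(w_t | c_t). This is an illustrative baseline only, not the authors' method — their contribution is learning the dependency structure itself from data, which this sketch does not do. The toy corpus and hand-assigned classes are assumptions for illustration.

```python
from collections import defaultdict
import math

# Toy corpus: each token is (word, class). Classes are hand-assigned here
# purely for illustration; the paper's point is that the dependency
# structure between such variables is learned from data.
corpus = [
    ("the", "DET"), ("cat", "NOUN"), ("sat", "VERB"),
    ("the", "DET"), ("dog", "NOUN"), ("ran", "VERB"),
]

# Count class transitions (for P(c_t | c_{t-1})) and
# word emissions (for P(w_t | c_t)).
trans = defaultdict(lambda: defaultdict(int))
emit = defaultdict(lambda: defaultdict(int))
prev = "<s>"
for word, cls in corpus:
    trans[prev][cls] += 1
    emit[cls][word] += 1
    prev = cls

def prob(counts, key, sub):
    """Maximum-likelihood conditional probability from count tables."""
    total = sum(counts[key].values())
    return counts[key][sub] / total if total else 0.0

def sentence_logprob(tokens):
    """Log-probability under the factorization
    P(w_1..T, c_1..T) = prod_t P(c_t | c_{t-1}) * P(w_t | c_t)."""
    lp, prev = 0.0, "<s>"
    for word, cls in tokens:
        p = prob(trans, prev, cls) * prob(emit, cls, word)
        lp += math.log(p) if p > 0 else float("-inf")
        prev = cls
    return lp

# The held-out sentence reuses seen transitions/emissions, so its
# probability is 1 * 1 * 1 * 0.5 * 1 * 0.5 = 0.25.
lp = sentence_logprob([("the", "DET"), ("dog", "NOUN"), ("sat", "VERB")])
```

A richer DBN could add edges such as w_{t-1} → w_t or c_t → c_{t+1} jointly with word dependencies; which edges to keep is exactly what the paper proposes to decide from data.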
Year | DOI | Venue |
---|---|---|
2005 | 10.1007/11424918_47 | Canadian Conference on AI |
Keywords | Field | DocType |
linguistic variable,present experiment,new data-driven language model,dynamic bayesian network,coherent framework,rethinking language model,language modeling process,lower computational complexity,appropriate relation,new approach,new technique,computational complexity,language model | Data modeling,Architecture,Joint probability distribution,Computer science,Modeling language,Bayesian network,Artificial intelligence,Language model,Dynamic Bayesian network,Computational complexity theory | Conference |
Volume | ISSN | ISBN |
3501 | 0302-9743 | 3-540-25864-7 |
Citations | PageRank | References |
0 | 0.34 | 6 |
Authors | ||
3 | ||
Name | Order | Citations | PageRank |
---|---|---|---|
Murat Deviren | 1 | 26 | 4.65 |
Khalid Daoudi | 2 | 145 | 23.68 |
Kamel Smaïli | 3 | 120 | 25.18 |