Title
Rethinking language models within the framework of dynamic Bayesian networks
Abstract
We present a new approach to language modeling based on dynamic Bayesian networks. The philosophy behind this architecture is to learn from data the appropriate dependency relations between the linguistic variables used in the language modeling process. It is an original and coherent framework that processes words and classes in the same model. This approach leads to new data-driven language models capable of outperforming classical ones, sometimes with lower computational complexity. We present experiments on small and medium corpora. The results show that this new technique is very promising and deserves further investigation.
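The abstract's idea of handling words and classes in one model can be illustrated with a minimal sketch (not the authors' actual model): a class-based bigram model whose factorization P(w_t | c_t) · P(c_t | c_{t-1}) is one simple dependency structure a dynamic Bayesian network can encode. The corpus, class map, and function names below are illustrative assumptions.

```python
from collections import defaultdict

def train(corpus, word2class):
    """Count class-transition and word-emission statistics from sentences."""
    class_bigram = defaultdict(lambda: defaultdict(int))  # P(c_t | c_{t-1})
    emission = defaultdict(lambda: defaultdict(int))      # P(w_t | c_t)
    for sentence in corpus:
        prev = "<s>"  # sentence-start pseudo-class
        for w in sentence:
            c = word2class[w]
            class_bigram[prev][c] += 1
            emission[c][w] += 1
            prev = c
    return class_bigram, emission

def prob(sentence, word2class, class_bigram, emission):
    """Product over t of P(c_t | c_{t-1}) * P(w_t | c_t), unsmoothed."""
    p = 1.0
    prev = "<s>"
    for w in sentence:
        c = word2class[w]
        trans, emit = class_bigram[prev], emission[c]
        p *= (trans[c] / sum(trans.values())) * (emit[w] / sum(emit.values()))
        prev = c
    return p

# Toy example (hypothetical data):
corpus = [["the", "dog", "runs"], ["a", "cat", "runs"]]
word2class = {"the": "DET", "a": "DET", "dog": "N", "cat": "N", "runs": "V"}
cb, em = train(corpus, word2class)
print(prob(["the", "dog", "runs"], word2class, cb, em))  # → 0.25
```

Because probabilities are conditioned on classes rather than on individual word histories, the parameter count (and hence complexity) can be far lower than a word-level n-gram, which is in the spirit of the complexity gains the abstract mentions.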
Year
DOI
Venue
2005
10.1007/11424918_47
Canadian Conference on AI
Keywords
Field
DocType
linguistic variable,present experiment,new data-driven language model,dynamic bayesian network,coherent framework,rethinking language model,language modeling process,lower computational complexity,appropriate relation,new approach,new technique,computational complexity,language model
Data modeling,Architecture,Joint probability distribution,Computer science,Modeling language,Bayesian network,Artificial intelligence,Language model,Dynamic Bayesian network,Computational complexity theory
Conference
Volume
ISSN
ISBN
3501
0302-9743
3-540-25864-7
Citations 
PageRank 
References 
0
0.34
6
Authors
3
Name
Order
Citations
PageRank
Murat Deviren 1 26 4.65
Khalid Daoudi 2 145 23.68
Kamel Smaïli 3 120 25.18