Abstract |
---|
We present a new method for voting sets of Bayesian classifiers that are exponential in size (in the number of attributes) in polynomial time and with polynomial memory requirements. Training is linear in the number of instances in the dataset and can be performed incrementally, which allows the collection to learn from massive data streams. The method allows for flexibility in balancing computational complexity, memory requirements, and classification performance. Unlike many other incremental Bayesian methods, all statistics kept in memory are used directly in classification. Experimental results show that the classifiers perform well on both small and very large data sets, and that classification performance can be weighed against computational and memory costs. |
Year | DOI | Venue |
---|---|---|
2006 | 10.1007/11941439_28 | Australian Conference on Artificial Intelligence |
Keywords | Field | DocType
---|---|---|
massive data stream, large data set, incremental Bayesian method, Bayesian classifier, massive collection, Bayesian network classifier, memory requirement, memory cost, new method, polynomial memory requirement, computational complexity, classification performance, Bayesian method, polynomial time | Data stream mining, Polynomial, Computer science, Bayesian network, Artificial intelligence, Classifier (linguistics), Time complexity, Conditional probability table, Machine learning, Computational complexity theory, Bayesian probability | Conference
Volume | ISSN | ISBN
---|---|---|
4304 | 0302-9743 | 3-540-49787-0
Citations | PageRank | References
---|---|---|
2 | 0.37 | 6
Authors |
---|
1 |
Name | Order | Citations | PageRank |
---|---|---|---|
Remco R. Bouckaert | 1 | 484 | 82.93 |