Abstract |
---|
A fast and accurate linear supervised algorithm is presented that compares favorably with other state-of-the-art algorithms on the problem of text categorization over several real data collections. Although the algorithm was already presented in [6], no proof of its convergence was given there. From the geometric intuition behind the algorithm it is evident that it is neither a Perceptron nor a gradient descent algorithm; an algebraic proof of its convergence is therefore provided for the case of linearly separable classes. Additionally, experimental results are presented on many standard text classification datasets and on artificially generated linearly separable datasets. The proposed algorithm is simple to use and easy to implement, and it can be applied in any domain without any modification of the data or parameter estimation. |
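The abstract's convergence claim concerns the classical setting in which any mistake-driven linear rule terminates after finitely many updates once the classes are linearly separable. The sketch below is an illustration of that setting only, not the paper's algorithm (which the abstract stresses is neither a Perceptron nor gradient descent): it generates an artificial linearly separable dataset, as in the paper's experiments, and runs the textbook Perceptron on it until no mistakes remain. All names and constants here are assumptions for the demonstration.

```python
import random

def make_separable(n, seed=0):
    """Artificial 2-D dataset labeled by a fixed separator w* = (1, -1).

    Points too close to the boundary are rejected, enforcing a margin,
    which is what guarantees finite convergence for mistake-driven rules.
    """
    rng = random.Random(seed)
    data = []
    while len(data) < n:
        x = (rng.uniform(-1, 1), rng.uniform(-1, 1))
        score = x[0] - x[1]              # signed distance under w*
        if abs(score) > 0.1:             # enforce a margin of 0.1
            data.append((x, 1 if score > 0 else -1))
    return data

def perceptron(data, max_epochs=1000):
    """Textbook Perceptron: update weights on every misclassified point.

    On separable data with margin, the mistake bound (R / gamma)^2
    guarantees this loop stops; here it converges within max_epochs.
    """
    w, b = [0.0, 0.0], 0.0
    for _ in range(max_epochs):
        mistakes = 0
        for (x, y) in data:
            if y * (w[0] * x[0] + w[1] * x[1] + b) <= 0:
                w[0] += y * x[0]
                w[1] += y * x[1]
                b += y
                mistakes += 1
        if mistakes == 0:                # a full clean pass: converged
            return w, b, True
    return w, b, False

data = make_separable(200)
w, b, converged = perceptron(data)
print(converged)  # True: separable data forces finite convergence
```

The paper's contribution is an algebraic proof of the analogous finite-convergence property for its own (non-Perceptron) update rule; the code above only demonstrates the separable regime in which such proofs operate.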
Year | DOI | Venue |
---|---|---|
2010 | 10.1007/978-3-642-17316-5_8 | ADMA (1) |
Keywords | Field | DocType
---|---|---|
standard text classification datasets, linearly separable class, algebraic proof, gradient descent algorithm, perceptron-like linear supervised algorithm, real data collection, art algorithm, accurate linear supervised algorithm, text categorization, proposed algorithm, linearly separable datasets, parameter estimation, data collection, machine learning, gradient descent | Convergence (routing), Data mining, Linear separability, Algebraic number, Computer science, Intuition, Artificial intelligence, Estimation theory, Text categorization, Gradient descent, Pattern recognition, Algorithm, Perceptron, Machine learning | Conference
Volume | ISSN | ISBN
---|---|---|
6440 | 0302-9743 | 3-642-17315-2
Citations | PageRank | References
---|---|---|
1 | 0.35 | 14
Authors |
---|
2 |
Name | Order | Citations | PageRank |
---|---|---|---|
Anestis Gkanogiannis | 1 | 12 | 2.23 |
Theodore Kalamboukis | 2 | 51 | 8.43 |