Title
The Coding Divergence for Measuring the Complexity of Separating Two Sets
Abstract
In this paper we integrate two essential processes, the discretization of continuous data and the learning of a model that explains them, towards fully computational machine learning from continuous data. Discretization is fundamental for machine learning and data mining, since every continuous datum, e.g., a real-valued datum obtained by observation in the real world, must be discretized, that is, converted from analog (continuous) to digital (discrete) form, to be stored in a database. However, most machine learning methods ignore this situation: they use digital data in actual applications on a computer, while theoretically assuming analog data (usually vectors of real numbers). To bridge this gap, we propose a novel measure of the difference between two sets of data, called the coding divergence, and computationally unify the two processes of discretization and learning. Discretization of continuous data is realized by a topological mapping (in the mathematical sense) from the d-dimensional Euclidean space R^d into the Cantor space Σ^ω, and the simplest model is learned in the Cantor space, which corresponds to the minimum open set separating the given two sets of data. Furthermore, we construct a classifier using the divergence and experimentally demonstrate its robust performance. Our contribution is not only the introduction of a new measure from the computational point of view, but also the encouragement of more interaction between experimental science and machine learning.
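The abstract outlines a two-step pipeline: encode real vectors into the Cantor space Σ^ω via binary expansion, then learn the minimum open set (a union of cylinder sets, i.e., binary prefixes) separating the two samples. The Python sketch below illustrates this idea only; it assumes points lie in [0,1)^d, and the function names, the bit-interleaving encoding, and the greedy prefix search are illustrative choices, not the authors' exact construction.

```python
def encode(x, k=16):
    """Map a point x in [0,1)^d to a k*d-bit prefix of its image in the
    Cantor space {0,1}^omega by interleaving the binary expansions of
    the coordinates (one bit per dimension per precision level)."""
    bits = []
    for level in range(1, k + 1):
        for xi in x:
            bits.append(str(int(xi * 2 ** level) % 2))
    return "".join(bits)


def separating_prefixes(X_codes, Y_codes, prefix=""):
    """Collect the shortest binary prefixes (cylinder sets, whose union
    is an open set in the Cantor space) that cover every code in
    X_codes while excluding every code in Y_codes."""
    X_here = [c for c in X_codes if c.startswith(prefix)]
    if not X_here:
        return []        # no point of X falls in this cylinder
    if not any(c.startswith(prefix) for c in Y_codes):
        return [prefix]  # this cylinder already avoids Y
    if len(prefix) == len(X_here[0]):
        return []        # precision exhausted; X and Y collide here
    return [p for b in "01"
            for p in separating_prefixes(X_here, Y_codes, prefix + b)]


def separation_complexity(X, Y, k=16):
    """A divergence-like score: the mean prefix length needed to
    separate X from Y (0 if no separating prefix is found; larger
    means the sets are harder to tell apart at this precision)."""
    X_codes = [encode(x, k) for x in X]
    Y_codes = [encode(y, k) for y in Y]
    prefixes = separating_prefixes(X_codes, Y_codes)
    return sum(len(p) for p in prefixes) / max(len(prefixes), 1)


if __name__ == "__main__":
    X = [(0.12, 0.30), (0.18, 0.25)]
    Y = [(0.80, 0.91), (0.75, 0.88)]
    print(separation_complexity(X, Y))  # small value: the sets separate early
```

The paper's actual coding divergence is defined from this minimum open set; the mean-prefix-length score above is only a stand-in to make the "minimum open set" idea concrete.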
Year
2010
Venue
JMLR Workshop and Conference Proceedings
Keywords
Coding divergence, Discretization, Cantor space, Binary encoding, Computable Analysis, Computational Learning Theory
Field
Divergence, Pattern recognition, Computer science, Coding (social sciences), Artificial intelligence
DocType
Journal
Volume
13
ISSN
1938-7288
Citations
2
PageRank
0.36
References
7
Authors
2
Name | Order | Citations | PageRank
Mahito Sugiyama | 1 | 77 | 13.27
Akihiro Yamamoto | 2 | 135 | 26.84