Title
Learning Figures with the Hausdorff Metric by Fractals
Abstract
Discretization is a fundamental process in machine learning from analog data such as continuous signals. For example, discrete Fourier analysis is one of the most essential signal processing methods for learning or recognition from continuous signals. However, in that method only the time axis is discretized, so each datum is not fully discretized. To give a purely computational theoretical basis for machine learning from analog data, we construct a learning framework based on the Gold-style learning model. Using modern computability theory from the field of Computable Analysis, we show that scalable sampling of analog data can be formulated as effective Gold-style learning. On the other hand, recursive algorithms are a key way of expressing models or rules that explain analog data; for example, the FFT (Fast Fourier Transform) is a fundamental recursive algorithm for discrete Fourier analysis. In this paper we adopt fractals, since they are a general geometric counterpart of recursive algorithms, and take the targets of learning to be nonempty compact sets in Euclidean space, called figures, in order to introduce fractals into the Gold-style learning model; the Hausdorff metric can then be used to measure generalization errors. We analyze learnable classes of figures from informants (positive and negative examples) and from texts (positive examples only), and reveal a hierarchy of learnabilities under various learning criteria. Furthermore, we measure the number of positive examples, one of the complexities of learning, using the Hausdorff dimension, the central concept of Fractal Geometry, and the VC dimension, which measures the complexity of hypothesis classes in the Valiant-style learning model. This work provides a theoretical foundation for machine learning from analog data.
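For reference, the Hausdorff metric between nonempty compact sets $A, B \subseteq \mathbb{R}^n$, which the abstract uses to measure the generalization error between figures, is the standard one:

$$ d_H(A, B) = \max\Bigl\{ \sup_{a \in A} \inf_{b \in B} \lVert a - b \rVert,\ \sup_{b \in B} \inf_{a \in A} \lVert a - b \rVert \Bigr\} $$

Two figures are close under $d_H$ exactly when each is contained in a small neighborhood of the other.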
Year
2010
DOI
10.1007/978-3-642-16108-7_26
Venue
ALT
Keywords
hausdorff dimension, fundamental recursive algorithm, effective gold-style learning, discrete fourier analysis, recursive algorithm, positive example, fast fourier transformation, analog data, various learning criterion, continuous signal, signal processing, computability theory, fast fourier transform, vc dimension, fractal geometry, generalization error, hausdorff metric, machine learning, euclidean space
Field
Effective dimension, Discrete mathematics, Online machine learning, Algorithmic learning theory, Stability (learning theory), Computer science, Wake-sleep algorithm, Artificial intelligence, Computational learning theory, Hausdorff measure, Machine learning, Sample exclusion dimension
DocType
Conference
Volume
6331
ISSN
0302-9743
ISBN
3-642-16107-3
Citations
1
PageRank
0.34
References
23
Authors
4
Name                Order    Citations    PageRank
Mahito Sugiyama     1        77           13.27
Eiju Hirowatari     2        16           3.09
Hideki Tsuiki       3        56           18.28
Akihiro Yamamoto    4        135          26.84