Title
VC Dimensions of Principal Component Analysis
Abstract
Motivated by a statistical learning theoretic treatment of principal component analysis, we are concerned with the set of points in ℝ^d that lie within a certain distance of a k-dimensional affine subspace. We prove that the VC dimension of the class of such sets is within a constant factor of (k+1)(d−k+1), and then discuss the distribution of eigenvalues of a data covariance matrix by using our bounds on the VC dimension together with Vapnik's statistical learning theory. In the course of the upper-bound proof, we give a simple proof of Warren's bound on the number of sign sequences of real polynomials.
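The sets studied in the abstract are "tubes" of radius r around a k-dimensional affine subspace of ℝ^d. As a minimal illustrative sketch (not code from the paper; the function names and the NumPy representation of the subspace are chosen here for illustration), the following implements the membership test that defines one such set, and evaluates the quantity (k+1)(d−k+1) appearing in the VC dimension bound.

```python
import numpy as np

def dist_to_affine_subspace(x, p, V):
    """Distance from point x to the affine subspace p + span(columns of V)."""
    # Orthonormalize the k spanning directions (columns of the d x k matrix V).
    Q, _ = np.linalg.qr(V)
    diff = x - p
    # Remove the orthogonal projection onto the subspace directions;
    # what remains is the component perpendicular to the subspace.
    residual = diff - Q @ (Q.T @ diff)
    return np.linalg.norm(residual)

def in_tube(x, p, V, r):
    """Membership in the set of points within distance r of the subspace."""
    return dist_to_affine_subspace(x, p, V) <= r

# Example: k = 1, d = 3 (a tube of radius 0.5 around a line in R^3).
d, k = 3, 1
p = np.zeros(d)                       # a point on the subspace
V = np.array([[1.0], [0.0], [0.0]])   # d x k matrix of spanning directions
print(in_tube(np.array([0.3, 0.2, 0.1]), p, V, r=0.5))  # True
print((k + 1) * (d - k + 1))          # the (k+1)(d-k+1) quantity: 6
```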
Year
2010
DOI
10.1007/s00454-009-9236-5
Venue
Discrete & Computational Geometry
Keywords
VC dimensions, Principal component analysis, Warren's bound
Field
Statistical learning theory, Discrete mathematics, VC dimension, Combinatorics, Affine space, Polynomial, Upper and lower bounds, Statistical learning, Mathematics, Eigenvalues and eigenvectors, Principal component analysis
DocType
Journal
Volume
44
Issue
3
ISSN
0179-5376
Citations
2
PageRank
0.51
References
2
Authors
4
Name | Order | Citations | PageRank
Yohji Akama | 1 | 79 | 12.89
Kei Irie | 2 | 6 | 1.08
Akitoshi Kawamura | 3 | 102 | 15.84
Yasutaka Uwano | 4 | 2 | 0.51