Title
Dimensionality Reduction To Maximize Prediction Generalization Capability
Abstract
Predicting what comes next in a previously unseen time series of input data is a challenging task for machine learning. A novel unsupervised learning scheme termed predictive principal component analysis can extract the most informative components for predicting future inputs with low computational cost. Generalization of time series prediction remains an important open issue in machine learning; earlier methods suffer from either large generalization errors or local minima. Here, we develop an analytically solvable, unsupervised learning scheme that extracts the most informative components for predicting future inputs, which we call predictive principal component analysis (PredPCA). Our scheme can effectively remove unpredictable noise and minimize test prediction error through convex optimization. Mathematical analyses demonstrate that, provided with sufficient training samples and sufficiently high-dimensional observations, PredPCA can asymptotically identify hidden states, system parameters and dimensionalities of canonical nonlinear generative processes, with a global convergence guarantee. We demonstrate the performance of PredPCA using sequential visual inputs comprising handwritten digits, rotating three-dimensional objects and natural scenes. It reliably estimates distinct hidden states and predicts future outcomes of previously unseen test input data, based exclusively on noisy observations. The simple architecture and low computational cost of PredPCA are highly desirable for neuromorphic hardware.
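As a rough illustration of the scheme summarized above, the sketch below implements the two steps the abstract describes: a convex (ridge-regularized least-squares) regression that predicts future inputs from a window of past inputs, followed by PCA on the predicted futures so that only the predictable, most informative components are retained. This is a minimal sketch based on the abstract, not the authors' reference implementation; the function name predpca and the parameters past_window, n_components and ridge are illustrative assumptions.

```python
import numpy as np

def predpca(X, past_window=10, n_components=3, ridge=1e-6):
    """Sketch of predictive PCA: regress future inputs on past inputs,
    then apply PCA to the predicted futures.

    X : array of shape (n_features, n_timesteps), one observation per column.
    """
    d, T = X.shape
    # Stack a window of past observations as regression features:
    # phi_t = [x_t; x_{t-1}; ...; x_{t-past_window+1}], paired with target x_{t+1}.
    Phi = np.vstack(
        [X[:, past_window - 1 - k : T - 1 - k] for k in range(past_window)]
    )
    Y = X[:, past_window:]  # future observations to be predicted
    # Convex step: ridge-regularized least squares,
    # Q = Y Phi^T (Phi Phi^T + ridge * I)^(-1).
    G = Phi @ Phi.T + ridge * np.eye(Phi.shape[0])
    Q = Y @ Phi.T @ np.linalg.inv(G)
    Y_hat = Q @ Phi  # predicted future inputs
    # PCA step: top eigenvectors of the prediction covariance keep the
    # predictable components and discard unpredictable observation noise.
    C = (Y_hat @ Y_hat.T) / Y_hat.shape[1]
    eigvals, eigvecs = np.linalg.eigh(C)      # eigenvalues in ascending order
    W = eigvecs[:, ::-1][:, :n_components]    # top principal directions
    return W, W.T @ Y_hat                     # components and encoded trajectory

# Toy usage: a noisy 2-D rotation observed through a random linear map.
rng = np.random.default_rng(0)
T = 2000
angles = 0.05 * np.arange(T)
latent = np.vstack([np.cos(angles), np.sin(angles)])
A = rng.standard_normal((20, 2))
X = A @ latent + 0.5 * rng.standard_normal((20, T))
W, Z = predpca(X, past_window=20, n_components=2)
```

Because both steps reduce to a least-squares solve and an eigendecomposition, the optimization is convex with no local minima, which is the property the abstract highlights.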
Year: 2021
DOI: 10.1038/s42256-021-00306-1
Venue: NATURE MACHINE INTELLIGENCE
DocType:
Volume: 3
Issue: 5
Journal: NATURE MACHINE INTELLIGENCE
Citations: 0
PageRank: 0.34
References: 0
Authors: 2

Name            Order  Citations  PageRank
Takuya Isomura  1      0          3.04
Taro Toyoizumi  2      172        17.52