Title
Finding Underlying Factors in Time Series
Abstract
We compare four neural methods for pre-processing time series data - Principal Component Analysis (PCA) (Karhunen and Joutsensalo 1994), a neural implementation of Factor Analysis (FA), Independent Component Analysis (ICA) (Hyvarinen and Oja 1997), and Complexity Pursuit (CP) (Hyvarinen 2001) - with a view to subsequently forecasting the data with a multi-layer perceptron (MLP). Our rationale is that forecasting the underlying factors is easier than forecasting the original time series, which is a combination of these factors. The projections of the data onto the filters found by each pre-processing method were fed into the MLP, which was trained to minimise the mean square error of its forecasts. We show that forecasting the projections onto the underlying factors reduces the need to guard against overtraining the MLP. The last method, CP, achieves by far the best performance in terms of least mean square error (LMSE); FA and, in particular, ICA perform worst. Minor modifications to the CP method are shown to further improve performance in terms of LMSE.
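The pipeline the abstract describes (project the observed series onto components found by a pre-processing method, then train an MLP to forecast each projected component) can be illustrated with the following minimal sketch. This is not the authors' code: it uses scikit-learn's PCA and FastICA as stand-ins for the pre-processing step (Complexity Pursuit and the neural FA are not available there), synthetic mixed signals in place of the paper's data set, and an arbitrary 5-lag input window and MLP hyper-parameters chosen only for illustration.

```python
# Sketch of the "project onto components, then forecast with an MLP" pipeline.
# Assumptions: synthetic data, PCA/ICA as the pre-processing methods, a 5-lag
# input window, and illustrative MLP hyper-parameters.
import numpy as np
from sklearn.decomposition import PCA, FastICA
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)

# Synthetic "observed" series: linear mixtures of a few underlying factors.
n_samples, n_factors, n_observed = 2000, 3, 6
t = np.arange(n_samples)
factors = np.column_stack([
    np.sin(0.02 * t),                                # slow oscillation
    np.sign(np.sin(0.11 * t)),                       # square wave
    rng.standard_normal(n_samples).cumsum() * 0.01,  # random walk
])
mixing = rng.standard_normal((n_factors, n_observed))
observed = factors @ mixing + 0.05 * rng.standard_normal((n_samples, n_observed))

def lagged(series, window=5):
    """Build (inputs, targets) pairs: `window` past values -> next value."""
    X = np.array([series[i:i + window] for i in range(len(series) - window)])
    y = series[window:]
    return X, y

for name, transform in [("PCA", PCA(n_components=n_factors)),
                        ("ICA", FastICA(n_components=n_factors, random_state=0))]:
    projections = transform.fit_transform(observed)  # data projected onto the filters
    errors = []
    for comp in projections.T:                       # forecast each component separately
        X, y = lagged(comp)
        split = int(0.8 * len(X))
        mlp = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
        mlp.fit(X[:split], y[:split])
        errors.append(mean_squared_error(y[split:], mlp.predict(X[split:])))
    print(f"{name}: mean test MSE over components = {np.mean(errors):.4f}")
```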
Year
2002
DOI
10.1080/01969720290040614
Venue
CYBERNETICS AND SYSTEMS
Field
Overtraining, Time series, Pattern recognition, Least mean square error, Computer science, Artificial intelligence, Independent component analysis, Perceptron, Principal component analysis, Machine learning
DocType
Journal
Volume
33
Issue
4
ISSN
0196-9722
Citations
3
PageRank
0.49
References
6
Authors
2
Name        Order  Citations  PageRank
Ying Han    1      5          1.97
Colin Fyfe  2      508        55.62