Title
Linear Pooling of Sample Covariance Matrices
Abstract
We consider the problem of estimating high-dimensional covariance matrices of $K$ populations or classes in the setting where the sample sizes are comparable to the data dimension. We propose estimating each class covariance matrix as a distinct linear combination of all class sample covariance matrices. This approach is shown to reduce the estimation error when the sample sizes are limited and the true class covariance matrices share a somewhat similar structure. We develop an effective method for estimating the coefficients in the linear combination that minimize the mean squared error under the general assumption that the samples are drawn from (unspecified) elliptically symmetric distributions possessing finite fourth-order moments. To this end, we utilize the spatial sign covariance matrix, which we show (under rather general conditions) to be an asymptotically unbiased estimator of the normalized covariance matrix as the dimension grows to infinity. We also show how the proposed method can be used in choosing the regularization parameters for multiple target matrices in a single class covariance matrix estimation problem. We assess the proposed method via numerical simulation studies including an application in global minimum variance portfolio optimization using real stock data.
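The following is a minimal Python sketch of the linear-pooling idea described in the abstract: each class covariance estimate is formed as a linear combination of all class sample covariance matrices, and the spatial sign covariance matrix is computed from normalized observations. It is not the authors' estimator; in particular, the mixing coefficients `coeffs` below (built from a hypothetical uniform weight `alpha`) are placeholders, whereas the paper derives MSE-optimal coefficients under elliptical distributions.

```python
import numpy as np

def sample_covariances(data_per_class):
    """Sample covariance matrix of each class (list of n_k x p arrays)."""
    return [np.cov(X, rowvar=False) for X in data_per_class]

def spatial_sign_covariance(X):
    """Spatial sign covariance matrix: average outer product of the centered,
    unit-norm observations (used in the paper as an asymptotically unbiased
    estimator of the normalized covariance matrix)."""
    Xc = X - np.mean(X, axis=0)
    U = Xc / np.linalg.norm(Xc, axis=1, keepdims=True)
    return U.T @ U / X.shape[0]

def linear_pool(sample_covs, coeffs):
    """Estimate the class k covariance as sum_j coeffs[k, j] * S_j."""
    S = np.stack(sample_covs)                    # shape (K, p, p)
    return np.einsum('kj,jpq->kpq', coeffs, S)   # shape (K, p, p)

# Toy usage with K = 3 classes whose covariance matrices share a similar structure.
rng = np.random.default_rng(0)
p, K = 20, 3
base = np.eye(p) + 0.3 * np.ones((p, p))
data = [rng.multivariate_normal(np.zeros(p), base + 0.1 * k * np.eye(p), size=30)
        for k in range(K)]
S_list = sample_covariances(data)

# Hypothetical coefficients: shrink each class towards the average over classes.
# The paper instead estimates coefficients minimizing the mean squared error.
alpha = 0.5
coeffs = alpha * np.eye(K) + (1 - alpha) * np.full((K, K), 1.0 / K)
pooled = linear_pool(S_list, coeffs)
print(pooled.shape)  # (3, 20, 20)
```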
Year
2022
DOI
10.1109/TSP.2021.3139207
Venue
IEEE Transactions on Signal Processing
Keywords
Covariance matrix, elliptical distribution, high-dimensional, multiclass, regularization, shrinkage, spatial sign covariance matrix
DocType
Journal
Volume
70
ISSN
1053-587X
Citations
0
PageRank
0.34
References
0
Authors
3
Name | Order | Citations | PageRank
Elias Raninen | 1 | 0 | 1.35
David E. Tyler | 2 | 0 | 0.34
Esa Ollila | 3 | 351 | 33.51