Title
Linear Discriminant Analysis Under F-Divergence Measures
Abstract
In statistical inference, information-theoretic performance limits can often be expressed in terms of a notion of divergence between the underlying statistical models (e.g., in binary hypothesis testing, the total error probability is related to the total variation between the models). As the data dimension grows, computing the statistics involved in decision-making and the attendant performance limits (divergence measures) faces complexity and stability challenges. Dimensionality reduction addresses these challenges at the expense of compromising performance (the divergence shrinks due to the data processing inequality). This paper considers linear dimensionality reduction such that the divergence between the models is maximally preserved. Specifically, this paper focuses on Gaussian models and characterizes an optimal projection of the data onto a lower dimensional subspace with respect to four f-divergence measures (Kullback-Leibler, χ², Hellinger, and total variation). There are two key observations. First, projections are not necessarily along the largest modes of the covariance matrix of the data, and in some situations can even be along the smallest modes. Second, under specific regimes, the optimal design of the subspace projection is identical under all the f-divergence measures considered, lending a degree of universality to the design, independent of the inference problem of interest.
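The setup in the abstract can be illustrated with a small numerical sketch. The snippet below is a hypothetical example, not the paper's optimal design: it compares the Kullback-Leibler divergence between two Gaussian models in the full ambient dimension with the divergence after a linear projection onto a lower-dimensional subspace, and shows the loss predicted by the data processing inequality. The example models, dimensions, and the brute-force random search over projections are assumptions for illustration only; the paper characterizes the optimal projection analytically.

```python
import numpy as np

def gaussian_kl(mu0, S0, mu1, S1):
    """KL divergence D(N(mu0, S0) || N(mu1, S1)) between two Gaussians."""
    d = mu0.shape[0]
    S1_inv = np.linalg.inv(S1)
    diff = mu1 - mu0
    return 0.5 * (np.trace(S1_inv @ S0)
                  + diff @ S1_inv @ diff
                  - d
                  + np.log(np.linalg.det(S1) / np.linalg.det(S0)))

def projected_kl(A, mu0, S0, mu1, S1):
    """KL divergence after the linear projection x -> A x, with A of size k x d."""
    return gaussian_kl(A @ mu0, A @ S0 @ A.T, A @ mu1, A @ S1 @ A.T)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d, k = 6, 2  # hypothetical ambient and reduced dimensions

    # Two illustrative Gaussian models (not taken from the paper).
    mu0 = np.zeros(d)
    mu1 = rng.normal(size=d)
    B0, B1 = rng.normal(size=(d, d)), rng.normal(size=(d, d))
    S0 = B0 @ B0.T + np.eye(d)
    S1 = B1 @ B1.T + np.eye(d)

    full_kl = gaussian_kl(mu0, S0, mu1, S1)

    # Crude random search over k x d projections: the best candidate retains
    # the most divergence, but never exceeds the full-dimensional value
    # (data processing inequality).
    best_kl = max(projected_kl(rng.normal(size=(k, d)), mu0, S0, mu1, S1)
                  for _ in range(2000))

    print(f"KL in full dimension : {full_kl:.4f}")
    print(f"best projected KL    : {best_kl:.4f}  (<= full KL by the DPI)")
```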
Year: 2021
DOI: 10.1109/ISIT45174.2021.9518004
Venue: 2021 IEEE International Symposium on Information Theory (ISIT)
DocType: Conference
Citations: 0
PageRank: 0.34
References: 0
Authors: 3
Name            Order   Citations   PageRank
Anmol Dwivedi   1       0           0.34
Sihui Wang      2       0           0.34
Ali Tajer       3       0           0.34