Title
Universal and Composite Hypothesis Testing via Mismatched Divergence
Abstract
For the universal hypothesis testing problem, where the goal is to decide between the known null hypothesis distribution and some other unknown distribution, Hoeffding proposed a universal test in the 1960s. Hoeffding's universal test statistic can be written in terms of the Kullback-Leibler (K-L) divergence between the empirical distribution of the observations and the null hypothesis distribution. In this paper, a modification of Hoeffding's test is considered, based on a relaxation of the K-L divergence referred to as the mismatched divergence. The resulting mismatched test is shown to be a generalized likelihood-ratio test (GLRT) for the case where the alternate distribution lies in a parametric family of distributions characterized by a finite-dimensional parameter, i.e., it is a solution to the corresponding composite hypothesis testing problem. For certain choices of the alternate distribution, it is shown that both the Hoeffding test and the mismatched test have the same asymptotic performance in terms of error exponents. A consequence of this result is that the GLRT is optimal in differentiating a particular distribution from others in an exponential family. It is also shown that the mismatched test has a significant advantage over the Hoeffding test in terms of finite sample size performance for applications involving large alphabet distributions. This advantage is due to the difference in the asymptotic variances of the two test statistics under the null hypothesis.
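To make the abstract's two statistics concrete, the following minimal sketch (not from the paper) contrasts them on a finite alphabet: the Hoeffding statistic is the K-L divergence between the empirical distribution and the null distribution, while the mismatched statistic replaces the K-L divergence by its relaxation over a linear function class f_theta(x) = theta . psi(x). The function names, the feature matrix psi, and the use of SciPy's BFGS optimizer are illustrative assumptions, not choices prescribed by the paper.

```python
import numpy as np
from scipy.optimize import minimize

def kl_divergence(mu, pi):
    """Kullback-Leibler divergence D(mu || pi) on a finite alphabet."""
    mask = mu > 0
    return float(np.sum(mu[mask] * np.log(mu[mask] / pi[mask])))

def hoeffding_statistic(samples, pi):
    """Hoeffding-style statistic: K-L divergence between the empirical
    distribution of the samples and the null distribution pi."""
    emp = np.bincount(samples, minlength=pi.size) / len(samples)
    return kl_divergence(emp, pi)

def mismatched_statistic(samples, pi, psi):
    """Mismatched-divergence statistic for the linear function class
    f_theta(x) = theta . psi(x):
        sup_theta { <emp, f_theta> - log E_pi[exp(f_theta)] },
    a lower bound on the K-L divergence. psi is an (alphabet size x d)
    matrix of feature functions (an illustrative choice)."""
    emp = np.bincount(samples, minlength=pi.size) / len(samples)

    def neg_objective(theta):
        f = psi @ theta                       # f_theta evaluated on the alphabet
        log_mgf = np.log(pi @ np.exp(f))      # log E_pi[exp(f_theta)]
        return -(emp @ f - log_mgf)

    res = minimize(neg_objective, np.zeros(psi.shape[1]), method="BFGS")
    return -res.fun

# Toy example: null distribution on a 5-letter alphabet, one feature psi(x) = x.
pi = np.array([0.4, 0.3, 0.15, 0.1, 0.05])
psi = np.arange(5, dtype=float).reshape(-1, 1)
rng = np.random.default_rng(0)
samples = rng.choice(5, size=200, p=[0.2, 0.2, 0.2, 0.2, 0.2])
print(hoeffding_statistic(samples, pi))       # always >= the mismatched value
print(mismatched_statistic(samples, pi, psi))
```

Because the supremum defining the mismatched divergence is taken over a restricted function class, its value never exceeds the Hoeffding (K-L) value; for a linear class it coincides with the GLRT statistic for the exponential family generated by pi and psi, which is the connection the abstract describes.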
Year
2011
DOI
10.1109/TIT.2011.2104670
Venue
IEEE Transactions on Information Theory
Keywords
composite hypothesis testing, empirical distribution, generalized likelihood-ratio test, null hypothesis distribution, test statistic, Hoeffding test, mismatched test, large alphabet distribution, alternate distribution, mismatched divergence, universal test statistic, universal test, statistical distributions, entropy, testing, Kullback-Leibler, generalized likelihood ratio test, robustness, statistical testing, source coding, convergence, Kullback-Leibler divergence, source code, hypothesis testing, hypothesis test
DocType
Journal
Volume
57
Issue
3
ISSN
0018-9448
Citations
20
PageRank
1.74
References
12
Authors
5
Name                        Order   Citations   PageRank
Jayakrishnan Unnikrishnan   1       280         21.34
Dayu Huang                  2       48          7.81
S. P. Meyn                  3       205         39.53
Amit Surana                 4       78          15.15
V. V. Veeravalli            5       1224        126.52