Asymptotic behaviour and statistical applications of divergence measures in multinomial populations: a unified study

Statistical Papers, Springer Journals


Volume 36 (1) – Dec 1, 1995
29 pages


References (22)

Publisher
Springer Journals
Subject
Statistics; Statistics for Business, Management, Economics, Finance, Insurance; Probability Theory and Stochastic Processes; Economic Theory/Quantitative Economics/Mathematical Methods; Operations Research/Decision Theory
ISSN
0932-5026
eISSN
1613-9798
DOI
10.1007/BF02926015

Abstract

Divergence measures play an important role in statistical theory, especially in large-sample theories of estimation and testing. The underlying reason is that they are indices of statistical distance between probability distributions P and Q; the smaller these indices are, the harder it is to discriminate between P and Q. Many divergence measures have been proposed since the publication of the paper of Kullback and Leibler (1951). Rényi (1961) gave the first generalization of the Kullback-Leibler divergence, Jeffreys (1946) defined the J-divergences, Burbea and Rao (1982) introduced the R-divergences, Sharma and Mittal (1977) the (r,s)-divergences, Csiszár (1967) the ϕ-divergences, Taneja (1989) the generalized J-divergences and the generalized R-divergences, and so on. To carry out a unified study of their statistical properties, we propose here a generalized divergence, called the (h,ϕ)-divergence, which includes the above-mentioned divergence measures as particular cases. Under different assumptions, it is shown that the asymptotic distributions of the (h,ϕ)-divergence statistics are either normal or chi-square. The chi-square and likelihood ratio test statistics are particular cases of the (h,ϕ)-divergence test statistics considered. From these results, asymptotic distributions of entropy statistics are also derived. Applications to testing statistical hypotheses in multinomial populations are given. The Pitman and Bahadur efficiencies of tests of goodness of fit and independence based on these statistics are obtained. Finally, appendices with the asymptotic variances of many well-known divergence and entropy statistics are presented.
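As a rough illustration of the family summarized above, the sketch below assumes Csiszár's form D_ϕ(P,Q) = Σ_i q_i ϕ(p_i/q_i), with a function h applied on top for the (h,ϕ)-divergence; all function names here are illustrative, not taken from the paper. It shows how the Kullback-Leibler divergence and the Pearson chi-square distance arise as particular cases:

```python
import math

def phi_divergence(p, q, phi):
    """Csiszar phi-divergence: D_phi(P, Q) = sum_i q_i * phi(p_i / q_i)."""
    return sum(qi * phi(pi / qi) for pi, qi in zip(p, q) if qi > 0)

def h_phi_divergence(p, q, h, phi):
    """(h, phi)-divergence: an increasing function h applied to D_phi."""
    return h(phi_divergence(p, q, phi))

identity = lambda x: x

# Kullback-Leibler (1951): phi(t) = t log t, h = identity.
def kl(p, q):
    return h_phi_divergence(p, q, identity,
                            lambda t: t * math.log(t) if t > 0 else 0.0)

# Pearson chi-square distance: phi(t) = (t - 1)^2, h = identity,
# which gives sum_i (p_i - q_i)^2 / q_i.
def chi2_dist(p, q):
    return h_phi_divergence(p, q, identity, lambda t: (t - 1) ** 2)

p_hat = [0.5, 0.3, 0.2]  # e.g. observed multinomial proportions
q0 = [0.4, 0.4, 0.2]     # hypothesized distribution
print(kl(p_hat, q0))        # > 0; equals 0 iff P = Q
print(chi2_dist(p_hat, q0))  # 0.01/0.4 + 0.01/0.4 + 0 = approx. 0.05

# With n observations, 2n * D_phi(p_hat, q0) / phi''(1) is the usual
# goodness-of-fit statistic; for phi(t) = (t - 1)^2 (phi''(1) = 2) it is
# exactly Pearson's X^2, asymptotically chi-square with M - 1 degrees of
# freedom under the null hypothesis.
n = 100
print(n * chi2_dist(p_hat, q0))  # Pearson X^2 for these proportions
```

With h the identity this family reduces to Csiszár's ϕ-divergences; other choices of h recover further members of the families listed in the abstract, such as Rényi-type divergences.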