O. Onicescu (1966)
Énergie informationnelle. C. R. Acad. Sci. Paris Sér. A, 263
S. Kullback, A. Leibler (1951)
On information and sufficiency. Ann. Math. Statist., 22
I. J. Taneja (1989)
On generalized information measures and their applications. Adv. Electron. Electron Phys., 76
L. Pardo, D. Morales, M. Salicrú, M. Menéndez (1993)
The ϕ-divergence statistic in bivariate multinomial populations including stratification. Metrika, 40
M.L. Menéndez, D. Morales, L. Pardo, M. Salicrú (1992)
Some statistical applications of (r,s)-directed divergences. Utilitas Mathematica, 42
C.R. Rao (1965)
Linear statistical inference and its applications
S. Arimoto (1971)
Information-theoretical considerations on estimation problems. Information and Control, 19
H. Heyer (1982)
Information and Sufficiency
J. Havrda, F. Charvát (1967)
Quantification method of classification processes. Concept of structural a-entropy. Kybernetika, 3
R.R. Bahadur (1971)
Some Limit Theorems in Statistics
C. Shannon (1948)
A mathematical theory of communication. Bell System Tech. J., 27
B. D. Sharma, D. P. Mittal (1977)
New nonadditive measures of entropy for discrete probability distributions. J. Math. Sci., 10
K. Zografos, K. Ferentinos, T. Papaioannou (1990)
Divergence statistics: sampling properties and multinomial goodness of fit and divergence tests. Communications in Statistics - Theory and Methods, 19
L. Pardo, M. Salicrú, M. Menéndez, D. Morales (1993)
Asymptotic properties of (r,s)-directed divergences in a stratified sampling. Applied Mathematics and Computation, 55
J.J. Dik, M.C.M. Gunst (1985)
The distribution of general quadratic forms in normal variables. Statistica Neerlandica, 39
N. Cressie, T.R.C. Read (1984)
Multinomial goodness-of-fit tests. Journal of the Royal Statistical Society, Series B, 46
I. Vajda (1989)
Theory of statistical inference and information
I. Csiszár (1967)
Information-type measures of difference of probability distributions and indirect observations. Studia Sci. Math. Hungar., 2
D. Lindley (1956)
On a measure of the information provided by an experiment. Annals of Mathematical Statistics, 27
A. Rényi (1961)
On Measures of Entropy and Information
Divergence measures play an important role in statistical theory, especially in large-sample theories of estimation and testing. The underlying reason is that they are indices of statistical distance between probability distributions P and Q: the smaller these indices are, the harder it is to discriminate between P and Q. Many divergence measures have been proposed since the publication of the paper of Kullback and Leibler (1951). Rényi (1961) gave the first generalization of the Kullback-Leibler divergence, Jeffreys (1946) defined the J-divergences, Burbea and Rao (1982) introduced the R-divergences, Sharma and Mittal (1977) the (r,s)-divergences, Csiszár (1967) the ϕ-divergences, and Taneja (1989) the generalized J-divergences and generalized R-divergences, among others. In order to carry out a unified study of their statistical properties, we propose here a generalized divergence, called the (h,ϕ)-divergence, which includes the above-mentioned divergence measures as particular cases. Under different assumptions, it is shown that the asymptotic distributions of the (h,ϕ)-divergence statistics are either normal or chi-square. The chi-square and likelihood ratio test statistics are particular cases of the (h,ϕ)-divergence test statistics considered. From these results, asymptotic distributions of entropy statistics are derived as well. Applications to testing statistical hypotheses in multinomial populations are given. The Pitman and Bahadur efficiencies of tests of goodness of fit and independence based on these statistics are obtained. Finally, appendices with the asymptotic variances of many well-known divergence and entropy statistics are presented.
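The abstract does not reproduce the paper's formal definition, but the construction it describes — applying an outer function to a Csiszár ϕ-divergence — can be sketched as follows. This is a minimal illustration, assuming the common form D(h,ϕ)(P,Q) = h(Σ_i q_i ϕ(p_i/q_i)) with h increasing and h(0) = 0; the function names and the example distributions are hypothetical, not taken from the paper.

```python
import math

def phi_divergence(p, q, phi):
    """Csiszar phi-divergence: sum_i q_i * phi(p_i / q_i).

    Assumes p and q are finite discrete distributions with q_i > 0.
    """
    return sum(qi * phi(pi / qi) for pi, qi in zip(p, q))

def h_phi_divergence(p, q, h, phi):
    """(h, phi)-divergence sketch: an increasing h applied to the phi-divergence."""
    return h(phi_divergence(p, q, phi))

# Kullback-Leibler divergence as a particular case:
# phi(x) = x log x and h the identity function.
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
kl = h_phi_divergence(p, q, h=lambda d: d, phi=lambda x: x * math.log(x))
```

Other divergences mentioned in the abstract arise from different choices of h and ϕ; for instance, a Rényi-type divergence corresponds to a power function ϕ together with a logarithmic outer h.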
Statistical Papers – Springer Journals
Published: Dec 1, 1995