Purpose – In many problems involving decision-making under uncertainty, the underlying probability model is unknown but partial information is available. In some approaches to this problem, the available prior information is used to define an appropriate probability model for the system uncertainty through a probability density function (PDF). When the prior information is available as a finite sequence of moments of the unknown PDF defining the appropriate probability model for the uncertain system, the maximum entropy (ME) method derives a PDF from an exponential family to define an approximate model. This paper aims to investigate some optimality properties of the ME estimates.
Design/methodology/approach – For n > m, the exact model can be best approximated by one of an infinite number of unknown PDFs from an n-parameter exponential family. The upper bound of the divergence distance between any PDF from this family and the m-parameter exponential family PDF defined by the ME method is derived. A measure of adequacy of the model defined by the ME method is thus provided.
Findings – These results may be used to establish confidence intervals on the estimate of a function of the random variable when the ME approach is employed. Additionally, it is shown that, when working with large samples of independent observations, a PDF can be defined from an exponential family to model the uncertainty of the underlying system with measurable accuracy. Finally, a relationship with maximum likelihood estimation is established for this case.
Practical implications – The so-called known-moments problem addressed in this paper has a variety of applications in learning, blind equalization and neural networks.
Originality/value – An upper bound is derived for the error in approximating an unknown density function f(x) by its ME estimate, a PDF p(x, α) from an m-parameter exponential family obtained from m moment constraints. The error bound helps decide whether the number of moment constraints is adequate for modeling the uncertainty in the system under study. In turn, this allows one to establish confidence intervals on an estimate of some function of the random variable X, given the known moments. It is also shown how, when working with a large sample of independent observations instead of precisely known moment constraints, a density from an exponential family can be defined to model the uncertainty of the underlying system with measurable accuracy. In this case, a relationship to ML estimation is established.
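The ME construction the abstract describes — maximizing entropy subject to m moment constraints, which yields a density p(x, α) from an m-parameter exponential family — can be sketched numerically via its convex dual. The sketch below is not the paper's method, only a standard illustration of the technique; the function name, bounded support, and grid discretization are assumptions introduced here.

```python
import numpy as np
from scipy.integrate import trapezoid
from scipy.optimize import minimize

def max_entropy_pdf(moments, support=(0.0, 1.0), grid=2000):
    """Fit p(x) proportional to exp(sum_k lam_k x^k) on a bounded support
    so that its first m moments match `moments`, by minimizing the convex
    dual log Z(lam) - lam.mu of the maximum entropy problem.
    (Function name, support and grid size are illustrative choices.)"""
    a, b = support
    x = np.linspace(a, b, grid)
    m = len(moments)
    # Sufficient statistics x, x^2, ..., x^m of the exponential family.
    T = np.vstack([x ** k for k in range(1, m + 1)])
    mu = np.asarray(moments, dtype=float)

    def dual(lam):
        logq = lam @ T
        peak = logq.max()                      # stabilize the exponential
        Z = trapezoid(np.exp(logq - peak), x)  # true Z = exp(peak) * Z
        return peak + np.log(Z) - lam @ mu

    lam = minimize(dual, np.zeros(m), method="BFGS").x
    p = np.exp(lam @ T - (lam @ T).max())
    p /= trapezoid(p, x)                       # normalize to a proper PDF
    return x, p

# Matching only the first moment 0.5 on [0, 1] recovers the uniform density.
x, p = max_entropy_pdf([0.5])
```

Because the dual is convex in the natural parameters, a quasi-Newton method finds the global minimizer; the paper's contribution is then a bound on the divergence between such an m-parameter fit and densities from richer n-parameter families.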
Kybernetes – Emerald Publishing
Published: Feb 20, 2007
Keywords: Cybernetics; Error analysis; Decision making