References

I. Csiszár (1996). Almost Independence and Secrecy Capacity. Probl. Peredachi Inf., 32.
I. Csiszár, J. Körner (1981). Information Theory: Coding Theorems for Discrete Memoryless Systems. New York: Academic.
M.S. Pinsker (2005). On Estimation of Information via Variation. Probl. Peredachi Inf., 41.
M.S. Pinsker (1960). Informatsiya i informatsionnaya ustoichivost' sluchainykh velichin i protsessov (Information and Information Stability of Random Variables and Processes).
We continue the study, begun in Pinsker's paper [1], of the relationship between mutual information and variational distance; there, an upper bound for the mutual information via the variational distance was obtained. We present a simple lower bound, which in some cases is optimal or asymptotically optimal. A uniform upper bound for the mutual information via the variational distance is also derived for random variables with a finite number of values. For such random variables, we further investigate the asymptotic behaviour of the maximum of the mutual information as the variational distance tends either to zero or to its maximum value.
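To make the two quantities concrete, here is a minimal Python sketch (with a toy 2×2 joint distribution, not taken from the paper) that computes the mutual information I(X;Y) and the variational distance v between a joint distribution and the product of its marginals. It then checks Pinsker's classical inequality I ≥ v²/2 (in nats), which bounds the variation via the information; the bounds studied in the abstract go in the reverse direction as well.

```python
import math

# Toy joint distribution of (X, Y) over a 2x2 alphabet (hypothetical numbers).
pxy = [[0.3, 0.1],
       [0.1, 0.5]]

px = [sum(row) for row in pxy]        # marginal distribution of X
py = [sum(col) for col in zip(*pxy)]  # marginal distribution of Y

# Mutual information I(X;Y) = D(P_XY || P_X x P_Y), in nats.
mi = sum(p * math.log(p / (px[i] * py[j]))
         for i, row in enumerate(pxy)
         for j, p in enumerate(row) if p > 0)

# Variational distance v = sum over (x, y) of |p(x, y) - p(x)p(y)|.
v = sum(abs(pxy[i][j] - px[i] * py[j])
        for i in range(len(px)) for j in range(len(py)))

# Pinsker's classical inequality: I(X;Y) >= v^2 / 2.
print(mi, v, mi >= v * v / 2)
```

For this distribution v = 0.56, so the inequality guarantees I ≥ 0.1568 nats; the computed mutual information is indeed larger.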
Problems of Information Transmission – Springer Journals
Published: Apr 20, 2007