Let X and Y be discrete random variables with probability distributions P_X and P_Y, respectively. A necessary and sufficient condition is obtained for the existence of an α-coupling of these random variables, i.e., for the existence of a joint distribution under which Pr{X = Y} = α, where α, 0 ≤ α ≤ 1, is a given constant. This problem is closely related to the problem of determining the minima of the divergences D(P_Z ‖ P_X) and D(P_X ‖ P_Z) over all probability distributions P_Z of a random variable Z, given P_X and subject to the condition Pr{Z = X} = α. An explicit solution to this problem is also obtained.
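One side of the existence question can be illustrated with a standard fact about couplings (not the paper's own construction): by the classical maximal-coupling identity, the largest achievable value of Pr{X = Y} over all joint distributions with marginals P_X and P_Y equals Σ_x min(P_X(x), P_Y(x)) = 1 − d_TV(P_X, P_Y). A minimal sketch, with hypothetical example distributions:

```python
def max_coupling_prob(p, q):
    """Largest achievable Pr{X = Y} over all couplings of distributions
    p and q, given as dicts mapping outcomes to probabilities."""
    support = set(p) | set(q)
    return sum(min(p.get(x, 0.0), q.get(x, 0.0)) for x in support)

def total_variation(p, q):
    """Total variation distance d_TV(p, q)."""
    support = set(p) | set(q)
    return 0.5 * sum(abs(p.get(x, 0.0) - q.get(x, 0.0)) for x in support)

# Hypothetical distributions for illustration only.
p = {"a": 0.5, "b": 0.3, "c": 0.2}
q = {"a": 0.4, "b": 0.4, "d": 0.2}

alpha_max = max_coupling_prob(p, q)        # min-sum: 0.4 + 0.3 = 0.7
tv_bound = 1.0 - total_variation(p, q)     # agrees with alpha_max
print(alpha_max, tv_bound)
```

The two quantities coincide (up to floating-point rounding), so any α exceeding this value admits no coupling; the paper characterizes exactly which α in [0, 1] are achievable.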
Problems of Information Transmission – Springer Journals
Published: Jul 7, 2015