The problem of determining both the maximum and minimum entropy of a random variable Y, as well as the maximum absolute value of the difference between the entropies of Y and another random variable X, is considered under the condition that the probability distribution of X is fixed and the error probability (i.e., the probability that the values of X and Y do not coincide) is given. An exact expression for the minimum entropy of Y is derived. Conditions under which the entropy of Y attains its maximum value are identified; in the remaining cases, lower and upper bounds are obtained for the maximum entropy of Y and for the maximum absolute value of the difference between the entropies of Y and X.
Problems of Information Transmission – Springer Journals
Published: Oct 16, 2014
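To make the setup concrete, here is a minimal sketch (an illustration assumed for this summary, not taken from the paper): it fixes a distribution for X, constructs one admissible joint distribution with error probability eps = P(X ≠ Y), and evaluates H(Y) and |H(Y) − H(X)|. The paper's results bound these quantities over all joint distributions satisfying the constraint; the channel chosen below is just one feasible point, not the extremal one.

```python
import numpy as np

def entropy(p):
    """Shannon entropy (in bits) of a probability vector, skipping zero entries."""
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]
    return -np.sum(nz * np.log2(nz))

# Fixed distribution of X and a given error probability eps = P(X != Y).
p_x = np.array([0.6, 0.3, 0.1])
eps = 0.1

# One admissible channel: keep Y = X with probability 1 - eps, otherwise
# spread the eps mass uniformly over the other symbols.  Each row sums to 1,
# and P(X != Y) = sum_i p_x[i] * eps = eps, so the constraint holds.
n = len(p_x)
channel = (1 - eps) * np.eye(n) + eps / (n - 1) * (np.ones((n, n)) - np.eye(n))
p_y = p_x @ channel  # marginal distribution of Y

print(f"H(X)            = {entropy(p_x):.4f} bits")
print(f"H(Y)            = {entropy(p_y):.4f} bits")
print(f"|H(Y) - H(X)|   = {abs(entropy(p_y) - entropy(p_x)):.4f} bits")
```

Sweeping over all channels with the same error probability (rather than this single uniform-spreading choice) would trace out the feasible range of H(Y) whose endpoints the paper characterizes.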