G. Rota (2001). Twelve problems in probability no one likes to bring up.
C. Rao (1982). Diversity and dissimilarity coefficients: A unified approach. Theoretical Population Biology, 21.
E. Simpson (1949). Measurement of Diversity. Nature, 163.
C. Tsallis (1988). Possible generalization of Boltzmann–Gibbs statistics. Journal of Statistical Physics, 52.
A. Turing, I. Good (1979). Studies in the History of Probability and Statistics. XXXVII.
A. Kolmogorov (1983). Combinatorial foundations of information theory and the calculus of probabilities. Russian Mathematical Surveys, 38.
D. Ellerman (2017). Logical information theory: new logical foundations for information theory. Log. J. IGPL, 25.
C. Ricotta, L. Szeidl (2006). Towards a unifying approach to diversity measures: bridging the gap between the Shannon entropy and Rao's quadratic index. Theoretical Population Biology, 70(3).
C. Bennett (2003). Quantum Information: Qubits and Quantum Error Correction. International Journal of Theoretical Physics, 42.
D. Ellerman (2014). An introduction to partition logic. Log. J. IGPL, 22.
M. Rejewski (1981). How Polish Mathematicians Broke the Enigma Cipher. Annals of the History of Computing, 3.
D. Ellerman (2009). Counting distinctions: on the conceptual foundations of Shannon's information theory. Synthese, 168.
J. Kung, G. Rota, C. Yan (2009). Combinatorics: The Rota Way.
This book presents a new foundation for information theory in which the notion of information is defined in terms of distinctions, differences, distinguishability, and diversity. The direct measure is logical entropy, the quantitative measure of the distinctions made by a partition. Shannon entropy is a transform or re-quantification of logical entropy for Claude Shannon's "mathematical theory of communications." The interpretation of the logical entropy of a partition is the two-draw probability of getting a distinction of the partition (a pair of elements distinguished by the partition), so it realizes a dictum of Gian-Carlo Rota: $$\frac{\text{Probability}}{\text{Subsets}} \approx \frac{\text{Information}}{\text{Partitions}}.$$ Andrei Kolmogorov suggested that information should be defined independently of probability, so logical entropy is first defined in terms of the set of distinctions of a partition, and then a probability measure on the set defines the quantitative version of logical entropy. We give a history of the logical entropy formula, which goes back to Corrado Gini's 1912 "index of mutability" and has been rediscovered many times.
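The abstract's two-draw interpretation can be made concrete: for a partition with block probabilities $p_B$, the logical entropy is $h(\pi) = 1 - \sum_B p_B^2$, the probability that two independent draws fall in different blocks. A minimal sketch (the example partition and point set are illustrative, not from the book):

```python
from fractions import Fraction

def logical_entropy(block_probs):
    """Logical entropy h(pi) = 1 - sum(p_B^2): the probability that
    two independent draws land in different blocks of the partition,
    i.e. that the drawn pair is a distinction of the partition."""
    return 1 - sum(p * p for p in block_probs)

# Partition of a 4-element set {0,1,2,3} into blocks {0,1}, {2}, {3},
# with equiprobable points: block probabilities 1/2, 1/4, 1/4.
probs = [Fraction(1, 2), Fraction(1, 4), Fraction(1, 4)]
print(logical_entropy(probs))  # 5/8

# Cross-check by brute force: count the ordered pairs that the
# partition distinguishes, out of all 4 x 4 ordered pairs.
blocks = [{0, 1}, {2}, {3}]

def block_of(x):
    for i, b in enumerate(blocks):
        if x in b:
            return i

n = 4
distinctions = sum(1 for x in range(n) for y in range(n)
                   if block_of(x) != block_of(y))
print(Fraction(distinctions, n * n))  # 5/8
```

Both computations agree because each unordered distinction is counted twice among the ordered pairs, matching the two-draw probability exactly.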
Published: Sep 2, 2021
Keywords: Information-as-distinctions; Logical entropy; History of the formula