The transition to density matrices in QM is facilitated by reformulating the 'classical' (i.e., non-quantum) results about logical entropy using density matrices. The transition to the quantum version of logical entropy is then made using the semi-algorithmic procedure of "linearization": given a concept applied to sets, apply that concept to the basis set of a vector space, and whatever it linearly generates gives the corresponding vector-space concept. Classically, the logical entropy $h(f^{-1})$ of the inverse-image partition of $f:U\rightarrow \mathbb{R}$ with point probabilities on $U$ is the product probability measure $p\times p$ on $U\times U$ applied to the ditset $\operatorname{dit}(f^{-1})$. This is linearized in quantum logical information theory by replacing ditsets with qudit spaces. Another set of classical definitions starts with two probability distributions $p=(p_{1},\ldots ,p_{n})$ and $q=(q_{1},\ldots ,q_{n})$ over the same index set. Since probability distributions are supplied by density matrices, the corresponding quantum logical notions are developed starting with two density matrices $\rho$ and $\tau$. Finally, we show how to generalize the Hamming distance, and that the generalization turns out to coincide with the existing quantum notion of the Hilbert-Schmidt distance between two density matrices, which is the quantum version of the squared Euclidean distance $d(p\Vert q)$ between two probability distributions in 'classical' logical information theory.
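To make the abstract's quantities concrete, here is a minimal Python sketch (an illustration only, not code from the paper; the function names and toy data are invented). It computes the classical logical entropy of an inverse-image partition as the product measure of its ditset, the usual quantum logical entropy $1-\operatorname{tr}[\rho^{2}]$ of a density matrix (an assumed standard form), and the Hilbert-Schmidt distance $\operatorname{tr}[(\rho-\tau)^{2}]$, the quantum counterpart of the squared Euclidean distance $d(p\Vert q)=\sum_{i}(p_{i}-q_{i})^{2}$.

import numpy as np

def logical_entropy_partition(f_values, p):
    # Classical logical entropy h(f^{-1}): the product measure p x p applied to
    # the ditset {(u, u') : f(u) != f(u')} of the inverse-image partition.
    p = np.asarray(p, dtype=float)
    f_values = np.asarray(f_values)
    same_block = f_values[:, None] == f_values[None, :]
    return float(np.outer(p, p)[~same_block].sum())

def quantum_logical_entropy(rho):
    # Quantum logical entropy 1 - tr(rho^2) (assumed form; complement of purity).
    return float(1.0 - np.trace(rho @ rho).real)

def hilbert_schmidt_distance(rho, tau):
    # Hilbert-Schmidt distance tr[(rho - tau)^2], the quantum analogue of the
    # squared Euclidean distance d(p||q) = sum_i (p_i - q_i)^2.
    diff = rho - tau
    return float(np.trace(diff @ diff).real)

# Toy check: for diagonal (classical) density matrices the quantum notions
# reduce to the classical ones.
p = [0.5, 0.25, 0.25]
f = [0, 1, 1]                              # f^{-1} has blocks {u1} and {u2, u3}
rho, tau = np.diag(p), np.diag([1/3, 1/3, 1/3])
print(logical_entropy_partition(f, p))     # 0.5    = 1 - (0.5^2 + 0.5^2)
print(quantum_logical_entropy(rho))        # 0.625  = 1 - sum_i p_i^2
print(hilbert_schmidt_distance(rho, tau))  # ~0.0417 = sum_i (p_i - 1/3)^2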
Published: Sep 2, 2021
Keywords: Density matrices; Linearization; Quantum logical entropy; Projective measurement; Quantum Hamming distance; Hilbert-Schmidt norm