Conditional quantum entropy

The '''conditional quantum entropy''' is an [[entropy measure]] used in [[quantum information theory]]. It is a generalization of the [[conditional entropy]] of [[classical information theory]]. For a bipartite state <math>\rho^{AB}</math>, the conditional entropy is written <math>S(A|B)_\rho</math>, or <math>H(A|B)_\rho</math>, depending on the notation being used for the [[von Neumann entropy]].
 
For the remainder of the article, we use the notation <math>S(\cdot)</math> for the [[von Neumann entropy]], which we simply call "entropy".
 
== Definition ==
 
Given a bipartite quantum state <math>\rho^{AB}</math>, the entropy of the entire system is <math>S(AB)_\rho \ \stackrel{\mathrm{def}}{=}\ S(\rho^{AB})</math>, and the entropies of the subsystems are <math>S(A)_\rho \ \stackrel{\mathrm{def}}{=}\ S(\rho^A) = S(\mathrm{tr}_B\rho^{AB})</math> and <math>S(B)_\rho</math>. The von Neumann entropy measures how uncertain we are about the value of the state; how much the state is a [[mixed state (physics)|mixed state]]. The [[joint quantum entropy]] <math>S(AB)_\rho</math> measures our uncertainty about the [[joint system]] which contains both subsystems.
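The quantities above can be computed numerically. The following is an illustrative sketch (not part of the article) using NumPy; the helper names <code>von_neumann_entropy</code> and <code>partial_trace_B</code> are my own, and the entropy is taken in bits (base-2 logarithm).

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -tr(rho log2 rho), in bits."""
    eigvals = np.linalg.eigvalsh(rho)
    eigvals = eigvals[eigvals > 1e-12]  # convention: 0 log 0 = 0
    return float(-np.sum(eigvals * np.log2(eigvals)))

def partial_trace_B(rho_AB, dA, dB):
    """Trace out subsystem B of a (dA*dB) x (dA*dB) density matrix,
    yielding the reduced state rho^A = tr_B rho^{AB}."""
    return np.trace(rho_AB.reshape(dA, dB, dA, dB), axis1=1, axis2=3)

# A maximally mixed qubit is "as mixed as possible": S = 1 bit.
rho = np.eye(2) / 2
print(von_neumann_entropy(rho))  # 1.0
```

Reshaping the joint density matrix into a rank-4 tensor and contracting the two B indices is a standard way to take the partial trace without an external library.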
 
By analogy with the classical conditional entropy, one defines the conditional quantum entropy as <math>S(A|B)_\rho \ \stackrel{\mathrm{def}}{=}\ S(AB)_\rho - S(B)_\rho</math>.
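Unlike its classical counterpart, this quantity can be negative. A minimal NumPy sketch (assumed example, not from the article) checks this for the maximally entangled Bell state <math>|\Phi^+\rangle = (|00\rangle + |11\rangle)/\sqrt{2}</math>: the joint state is pure, so <math>S(AB)_\rho = 0</math>, while the reduced state of B is maximally mixed with <math>S(B)_\rho = 1</math> bit.

```python
import numpy as np

def entropy(rho):
    """Von Neumann entropy in bits."""
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]
    return float(-np.sum(w * np.log2(w)))

# Bell state |Phi+> = (|00> + |11>)/sqrt(2).
phi = np.zeros(4)
phi[0] = phi[3] = 1 / np.sqrt(2)
rho_AB = np.outer(phi, phi)

# Reduced state of B: trace out A by contracting the A indices.
rho_B = np.trace(rho_AB.reshape(2, 2, 2, 2), axis1=0, axis2=2)

# S(A|B) = S(AB) - S(B) = 0 - 1 = -1: negative conditional entropy.
S_cond = entropy(rho_AB) - entropy(rho_B)
print(round(S_cond, 6))  # -1.0
```

A negative value signals entanglement between A and B; this is exactly the regime given operational meaning by state merging, discussed below.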
 
An equivalent (and more intuitive) operational definition of the quantum conditional entropy (as a measure of the [[quantum communication]] cost or surplus when performing [[quantum state]] merging) was given by [[Michał Horodecki]], [[Jonathan Oppenheim]], and [[Andreas Winter]] in their paper "Quantum Information can be negative" [http://arxiv.org/abs/quant-ph/0505062].