Metrics

`qudit.tools` provides four classes of information-theoretic functionals: `Fidelity`, `Entropy`, `Info`, and `Distance`.

All inputs can be statevectors (1D arrays) or density matrices (2D arrays). Methods automatically promote pure-state statevectors to density matrices.

Fidelity

Fidelity measures overlap between quantum states and channels.

| Method | Description |
| --- | --- |
| `Fidelity.default(rho, sigma)` | Uhlmann fidelity $F(\rho,\sigma)$ |
| `Fidelity.channel(kraus, rho)` | Apply a Kraus channel $\mathcal{E}(\rho)=\sum_k K_k\rho K_k^\dagger$ |
| `Fidelity.entanglement(R, E, codes)` | Entanglement fidelity for encode→noise→recovery |
| `Fidelity.bare_qubit(R, E, state)` | Average fidelity of a single logical state through noise→recovery |
| `Fidelity.cafaro(kraus)` | Cafaro proxy $F_e=\sum_k \lvert\mathrm{Tr}(K_k)\rvert^2/N^2$ |
| `Fidelity.negativity(rho, dA, dB)` | Negativity $\mathcal{N}(\rho)=(\Vert\rho^{T_B}\Vert_1-1)/2$ |
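The negativity in the last row can be sanity-checked with a plain NumPy partial transpose; `negativity` below is an illustrative helper, not the `qudit.tools` implementation:

```python
import numpy as np

def negativity(rho, dA, dB):
    """N(rho) = (||rho^{T_B}||_1 - 1) / 2, via a partial transpose on B."""
    r = rho.reshape(dA, dB, dA, dB)
    rho_tb = r.transpose(0, 3, 2, 1).reshape(dA * dB, dA * dB)  # swap the two B indices
    lam = np.linalg.eigvalsh(rho_tb)  # rho^{T_B} stays Hermitian
    return 0.5 * (np.abs(lam).sum() - 1.0)

# Bell state: maximal negativity 0.5 for two qubits
psi = (np.kron([1, 0], [1, 0]) + np.kron([0, 1], [0, 1])) / np.sqrt(2)
rho = np.outer(psi, psi.conj())
print(negativity(rho, 2, 2))  # 0.5
```

A separable state such as the maximally mixed two-qubit state gives negativity 0, since its partial transpose is already positive.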

State fidelity

For pure statevectors, fidelity is $F=|\langle\psi|\phi\rangle|^2$. For density matrices, $F(\rho,\sigma)=\left(\mathrm{Tr}\sqrt{\sqrt{\rho}\,\sigma\sqrt{\rho}}\right)^2$.

```python
from qudit import Basis
from qudit.tools import Fidelity

Ket = Basis(2)

psi = Ket("0")
phi = (Ket("0") + Ket("1")).norm()

print(Fidelity.default(psi, phi))  # 0.5
```
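The density-matrix formula above can be verified with plain NumPy; `uhlmann_fidelity` is an illustrative helper built on an eigendecomposition-based matrix square root, not the library's implementation:

```python
import numpy as np

def _sqrtm_psd(m):
    """Matrix square root of a positive semidefinite Hermitian matrix."""
    w, v = np.linalg.eigh(m)
    w = np.clip(w, 0.0, None)  # guard against tiny negative eigenvalues
    return (v * np.sqrt(w)) @ v.conj().T

def uhlmann_fidelity(rho, sigma):
    """F(rho, sigma) = (Tr sqrt( sqrt(rho) sigma sqrt(rho) ))^2."""
    s = _sqrtm_psd(rho)
    return float(np.real(np.trace(_sqrtm_psd(s @ sigma @ s))) ** 2)

ket0 = np.array([1, 0], dtype=complex)
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
print(uhlmann_fidelity(np.outer(ket0, ket0.conj()), np.outer(plus, plus.conj())))  # 0.5
```

For the pure states above this reduces to the overlap $|\langle 0|+\rangle|^2 = 1/2$, matching the library example.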

Entanglement fidelity

Used to benchmark quantum error correction: how well does a code+recovery pipeline preserve the logical subspace under noise?

$$F_e = \langle QR|\,(\mathcal{R}\circ\mathcal{E})\big(|QR\rangle\langle QR|\big)\,|QR\rangle$$

where $|QR\rangle = \frac{1}{\sqrt{k}}\sum_i |\bar{i}\rangle|i\rangle$ is the purification of the maximally mixed code state.
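For a channel with Kraus operators $K_k$ acting on a state $\rho$, this definition is equivalent to $F_e=\sum_k \lvert\mathrm{Tr}(\rho K_k)\rvert^2$. A minimal NumPy check (the helper is illustrative, not the `qudit.tools` API):

```python
import numpy as np

def entanglement_fidelity(rho, kraus):
    """F_e = sum_k |Tr(rho K_k)|^2 for Kraus operators K_k."""
    return float(sum(abs(np.trace(rho @ K)) ** 2 for K in kraus))

rho = np.eye(2) / 2  # maximally mixed qubit

# Identity channel: nothing is lost
print(entanglement_fidelity(rho, [np.eye(2)]))  # 1.0

# Depolarizing channel at p = 3/4 (fully depolarizing)
I2, X, Z = np.eye(2), np.array([[0, 1], [1, 0]]), np.diag([1.0, -1.0])
Y = np.array([[0, -1j], [1j, 0]])
p = 0.75
kraus = [np.sqrt(1 - p) * I2] + [np.sqrt(p / 3) * P for P in (X, Y, Z)]
print(entanglement_fidelity(rho, kraus))  # 0.25
```

The Pauli terms are traceless, so only the identity component contributes, giving $F_e = 1 - p$ for the depolarizing channel.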

```python
from qudit.noise import Process, Recovery
from qudit.tools import Fidelity
import numpy as np

# Codewords as rows of a (2, 16) array: a two-word code on four qubits
code = np.array([
    [1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1.0],
    [0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0.0],
], dtype=np.complex64)
code /= np.linalg.norm(code, axis=1)[:, None]

ops = Process.GAD(2, 4, Y=0.01, p=0.001)
rec = Recovery.petz(ops, code)

fid = Fidelity.entanglement(rec, ops, code)
print(fid)  # close to 1.0 for small noise
```

Entropy

Entropy computes various entropic quantities. All density-matrix methods accept statevectors and promote them to ρ=|ψψ|.

| Method | Formula | Notes |
| --- | --- | --- |
| `Entropy.neumann(rho)` | $S(\rho)=-\mathrm{Tr}(\rho\log_2\rho)$ | Default entropy |
| `Entropy.shannon(probs)` | $H(p)=-\sum_i p_i\log_2 p_i$ | Classical probabilities |
| `Entropy.tsallis(rho, q)` | $S_q=(1-\sum_i\lambda_i^q)/(q-1)$ | $q\to1$ recovers von Neumann |
| `Entropy.renyi(rho, alpha)` | $S_\alpha=\frac{1}{1-\alpha}\log_2\sum_i\lambda_i^\alpha$ | $\alpha\to1$ recovers von Neumann |
| `Entropy.hartley(probs)` | $H_0=\log_2\lvert\mathrm{supp}(p)\rvert$ | Cardinality of support |
| `Entropy.unified(rho, q, alpha)` | $(q,\alpha)$-entropy family | Interpolates Tsallis/Rényi |
| `Entropy.relative(rho, sigma)` | $D(\rho\Vert\sigma)=\mathrm{Tr}[\rho(\log\rho-\log\sigma)]$ | Quantum KL divergence |
| `Entropy.conditional(rho, dA, dB)` | $S(A\mid B)=S(AB)-S(B)$ | Bipartite state required |

All methods accept an optional base keyword (default 2.0) to change the logarithm base.

```python
from qudit.tools import Entropy
import numpy as np

rho = np.array([[0.5, 0], [0, 0.5]])

print(Entropy.neumann(rho))         # 1.0  (maximally mixed qubit)
print(Entropy.tsallis(rho, q=2))    # 0.5  (linear entropy)
print(Entropy.renyi(rho, alpha=2))  # 1.0
```

NOTE

Entropy.default is an alias for Entropy.neumann. Shannon entropy takes a probability vector (not a density matrix).
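All spectral entropies in the table reduce to functions of the eigenvalues $\lambda_i$ of $\rho$. A NumPy sketch of the von Neumann and Rényi cases (illustrative helpers, not the library code):

```python
import numpy as np

def neumann(rho, base=2.0):
    """S(rho) = -sum_i lambda_i log(lambda_i), over nonzero eigenvalues."""
    lam = np.linalg.eigvalsh(rho)
    lam = lam[lam > 1e-12]  # 0 log 0 := 0
    return float(-(lam * np.log(lam)).sum() / np.log(base))

def renyi(rho, alpha, base=2.0):
    """S_alpha = log(sum_i lambda_i^alpha) / (1 - alpha)."""
    lam = np.linalg.eigvalsh(rho)
    return float(np.log((lam ** alpha).sum()) / np.log(base) / (1.0 - alpha))

rho = np.diag([0.5, 0.5])
print(neumann(rho))         # 1.0
print(renyi(rho, alpha=2))  # 1.0
```

For the maximally mixed qubit every Rényi order gives 1 bit, since all eigenvalues are equal.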

Info

Info derives higher-level correlations from entropies on bipartite states ρAB.

| Method | Formula | Description |
| --- | --- | --- |
| `Info.mutual(rho, dA, dB)` | $I(A:B)=S(A)+S(B)-S(AB)$ | Quantum mutual information |
| `Info.coherent(rho_AB, dA, dB)` | $I_c(A\rangle B)=S(B)-S(AB)$ | Coherent information |
| `Info.conditional(rho, dA, dB)` | see below | Two conventions |

Info.conditional has a true_case flag:

  • true_case=True (default): measurement-induced conditional entropy on B given a projective measurement on A
  • true_case=False: algebraic conditional entropy $S(AB)-S(A)$
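The algebraic quantities above can be reproduced with plain NumPy partial traces; `mutual_info` below is an illustrative helper, not the library implementation:

```python
import numpy as np

def _S(rho):
    """Von Neumann entropy in bits."""
    lam = np.linalg.eigvalsh(rho)
    lam = lam[lam > 1e-12]
    return float(-(lam * np.log2(lam)).sum())

def mutual_info(rho, dA, dB):
    """I(A:B) = S(A) + S(B) - S(AB) via partial traces."""
    r = rho.reshape(dA, dB, dA, dB)
    rho_A = np.einsum('ijkj->ik', r)  # trace out B
    rho_B = np.einsum('ijil->jl', r)  # trace out A
    return _S(rho_A) + _S(rho_B) - _S(rho)

psi = (np.kron([1, 0], [1, 0]) + np.kron([0, 1], [0, 1])) / np.sqrt(2)
rho = np.outer(psi, psi.conj())
print(mutual_info(rho, 2, 2))  # 2.0
```

For the Bell state both marginals are maximally mixed ($S(A)=S(B)=1$) while the joint state is pure ($S(AB)=0$), giving the maximal value of 2 bits.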
```python
from qudit.tools import Info
import numpy as np

# Bell state (maximally entangled)
psi = (np.kron([1, 0], [1, 0]) + np.kron([0, 1], [0, 1])) / np.sqrt(2)
rho = np.outer(psi, psi.conj())

print(Info.mutual(rho, 2, 2))    # 2.0  (maximum for two qubits)
print(Info.coherent(rho, 2, 2))  # 1.0
```

Distance

Distance measures distinguishability between quantum states.

| Method | Formula | Description |
| --- | --- | --- |
| `Distance.trace(rho, sigma)` | $\frac{1}{2}\Vert\rho-\sigma\Vert_1$ | Operational distinguishability |
| `Distance.bures(rho, sigma)` | $\sqrt{2-2\sqrt{F(\rho,\sigma)}}$ | Metric on density matrices |
| `Distance.jensen_shannon(rho, sigma)` | $\frac{1}{2}\big(D(\rho\Vert m)+D(\sigma\Vert m)\big),\ m=\frac{\rho+\sigma}{2}$ | Symmetric, bounded in $[0,1]$ |
| `Distance.relative_entropy(rho, sigma)` | $D(\rho\Vert\sigma)=\mathrm{Tr}[\rho(\log\rho-\log\sigma)]$ | Alias for `Entropy.relative` |
```python
from qudit.tools import Distance
import numpy as np

rho = np.array([[1, 0], [0, 0]], dtype=complex)        # |0><0|
sigma = np.array([[0.5, 0], [0, 0.5]], dtype=complex)  # maximally mixed

print(Distance.trace(rho, sigma))  # 0.5
print(Distance.bures(rho, sigma))  # ~0.765
```
TIP

All four classes accept both np.ndarray density matrices and 1D pure statevectors; 1D inputs are automatically promoted to rank-1 density matrices where needed.