# qml.math.max_entropy

max_entropy(state, indices, base=None, check_state=False, c_dtype='complex128')[source]

Compute the maximum entropy of a state vector or density matrix on a given subsystem. All interfaces are supported: NumPy, Autograd, Torch, TensorFlow, and JAX.

$S_{\text{max}}( \rho ) = \log( \text{rank} ( \rho ))$

Parameters
• state (tensor_like) – (2**N) state vector or (2**N, 2**N) density matrix.

• indices (list(int)) – List of indices in the considered subsystem.

• base (float) – Base for the logarithm. If None, the natural logarithm is used.

• check_state (bool) – If True, the function will check the state validity (shape and norm).

• c_dtype (str) – Complex floating point precision type.

Returns

The maximum entropy of the considered subsystem.

Return type

float
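To make the formula above concrete, here is a minimal NumPy sketch of $S_{\text{max}}(\rho) = \log(\text{rank}(\rho))$ for one qubit of a two-qubit pure state. The function name `max_entropy_sketch` and its `keep` argument are hypothetical; this is not the PennyLane implementation, which works for arbitrary numbers of wires and all supported interfaces.

```python
import numpy as np

# Hypothetical sketch (not the PennyLane implementation): maximum entropy
# of the reduced state on one qubit of a two-qubit pure state.
def max_entropy_sketch(state, keep, base=None):
    """Maximum entropy log(rank(rho)) of the qubit at index `keep` (0 or 1)."""
    psi = np.asarray(state, dtype=complex).reshape(2, 2)
    # Reduced density matrix: trace out the other qubit.
    rho = psi @ psi.conj().T if keep == 0 else psi.T @ psi.conj()
    rank = np.linalg.matrix_rank(rho)
    entropy = np.log(rank)
    # Change of logarithm base, mirroring the `base` keyword.
    return entropy / np.log(base) if base is not None else entropy

x = np.array([1, 0, 0, 1]) / np.sqrt(2)   # maximally entangled Bell state
print(max_entropy_sketch(x, keep=0))          # log(2) ≈ 0.6931
print(max_entropy_sketch(x, keep=0, base=2))  # 1.0
```

For the Bell state, the reduced density matrix is the maximally mixed state $\mathbb{1}/2$, so its rank is 2 and the maximum entropy is $\log 2$.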

Example

The maximum entropy of a subsystem can be obtained for any state vector. Here is an example for the maximally entangled state, where the subsystem entropy is maximal (the default logarithm base is e, i.e., the natural logarithm).

>>> import numpy as np
>>> x = [1, 0, 0, 1] / np.sqrt(2)
>>> max_entropy(x, indices=[0])
0.6931472


The logarithm base can be changed. For example:

>>> max_entropy(x, indices=[0], base=2)
1.0


The maximum entropy can also be computed directly from a density matrix. For example:

>>> y = [[1/2, 0, 0, 1/2], [0, 0, 0, 0], [0, 0, 0, 0], [1/2, 0, 0, 1/2]]
>>> max_entropy(y, indices=[0])
0.6931472
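The density-matrix result can be cross-checked with plain NumPy (an independent sketch, not the library code): trace out qubit 1 of `y` and take the log of the rank of the reduced state.

```python
import numpy as np

# Cross-check of the density-matrix example using a manual partial trace.
y = np.array([[0.5, 0, 0, 0.5],
              [0,   0, 0, 0  ],
              [0,   0, 0, 0  ],
              [0.5, 0, 0, 0.5]])
rho4 = y.reshape(2, 2, 2, 2)        # axes: (row qubit 0, row qubit 1, col qubit 0, col qubit 1)
rho0 = np.einsum("ikjk->ij", rho4)  # partial trace over qubit 1
print(rho0)                         # [[0.5, 0.], [0., 0.5]]
print(np.log(np.linalg.matrix_rank(rho0)))  # log(2) ≈ 0.6931
```

The reduced state is again $\mathbb{1}/2$, matching the state-vector example above.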


The maximum entropy is always greater than or equal to the von Neumann entropy. In this maximally entangled example, they are equal:

>>> vn_entropy(x, indices=[0])
0.6931472


In general, however, the von Neumann entropy is lower:

>>> x = [np.cos(np.pi/8), 0, 0, -1j*np.sin(np.pi/8)]
>>> vn_entropy(x, indices=[1])
0.4164955
>>> max_entropy(x, indices=[1])
0.6931472
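The inequality can be verified numerically with plain NumPy (an independent check, not the library code): diagonalize the reduced density matrix, compute $S_{\text{vn}} = -\sum_i \lambda_i \log \lambda_i$ from its nonzero eigenvalues, and compare against $\log(\text{rank})$.

```python
import numpy as np

# Numerical check that the von Neumann entropy never exceeds log(rank).
x = np.array([np.cos(np.pi / 8), 0, 0, -1j * np.sin(np.pi / 8)])
psi = x.reshape(2, 2)
rho = psi.T @ psi.conj()            # reduced density matrix on qubit 1

evals = np.linalg.eigvalsh(rho)
evals = evals[evals > 1e-12]        # drop numerically zero eigenvalues
vn = -np.sum(evals * np.log(evals)) # von Neumann entropy
s_max = np.log(len(evals))          # maximum entropy log(rank)
print(vn, s_max)                    # ≈ 0.4165, 0.6931
```

Here the reduced state has eigenvalues $\cos^2(\pi/8)$ and $\sin^2(\pi/8)$: both are nonzero, so the rank (and hence the maximum entropy) is the same as in the maximally entangled case, while the von Neumann entropy is strictly smaller because the spectrum is not uniform.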