qml.math.jacobian

jacobian(f, argnums=0)

Compute the Jacobian in a jax-like manner for any interface.

Parameters:
  • f (Callable) – a function with a vector-valued output

  • argnums (Sequence[int] | int) – which arguments to differentiate

Returns:

a function with the same signature as f that returns the jacobian

Return type:

Callable

Note that this function follows the same design as jax: by default, it returns the Jacobian with respect to the first argument only, whether or not the other arguments are trainable.

>>> import jax, torch, tensorflow as tf
>>> def f(x, y):
...     return x * y
>>> qml.math.jacobian(f)(qml.numpy.array([2.0, 3.0]), qml.numpy.array(3.0))
array([[3., 0.],
       [0., 3.]])
>>> qml.math.jacobian(f)(jax.numpy.array([2.0, 3.0]), jax.numpy.array(3.0))
Array([[3., 0.],
       [0., 3.]], dtype=float32)
>>> x_torch = torch.tensor([2.0, 3.0], requires_grad=True)
>>> y_torch = torch.tensor(3.0, requires_grad=True)
>>> qml.math.jacobian(f)(x_torch, y_torch)
tensor([[3., 0.],
        [0., 3.]])
>>> qml.math.jacobian(f)(tf.Variable([2.0, 3.0]), tf.Variable(3.0))
<tf.Tensor: shape=(2, 2), dtype=float32, numpy=
array([[3., 0.],
       [0., 3.]], dtype=float32)>

argnums can be provided to differentiate multiple arguments.

>>> qml.math.jacobian(f, argnums=(0, 1))(x_torch, y_torch)
(tensor([[3., 0.],
         [0., 3.]]),
 tensor([2., 3.]))
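
Since argnums also accepts a single integer, passing argnums=1 should differentiate only the second argument. The following is a sketch of the expected behaviour, assuming that (as in jax) a single integer returns the Jacobian directly rather than a one-element tuple:

>>> qml.math.jacobian(f, argnums=1)(x_torch, y_torch)  # d(x*y)/dy = x
tensor([2., 3.])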

Jax can handle taking Jacobians of outputs with any pytree shape:

>>> def pytree_f(x):
...     return {"a": 2*x, "b": 3*x}
>>> qml.math.jacobian(pytree_f)(jax.numpy.array(2.0))
{'a': Array(2., dtype=float32, weak_type=True),
 'b': Array(3., dtype=float32, weak_type=True)}
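
Nested containers should also work through the jax pathway. The following is only a sketch, assuming the jax interface relies on jax's own pytree-aware differentiation (exact repr details such as dtypes and weak_type flags may differ):

>>> def nested_f(x):
...     return {"a": (2 * x, x**2)}
>>> qml.math.jacobian(nested_f)(jax.numpy.array(3.0))
{'a': (Array(2., dtype=float32), Array(6., dtype=float32))}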

Torch, on the other hand, can only differentiate tensor and tuple-of-tensor outputs:

>>> def tuple_f(x):
...     return x**2, x**3
>>> qml.math.jacobian(tuple_f)(torch.tensor(2.0))
(tensor(4.), tensor(12.))
>>> qml.math.jacobian(pytree_f)(torch.tensor(2.0))
TypeError: The outputs of the user-provided function given to jacobian must be
either a Tensor or a tuple of Tensors but the given outputs of the user-provided
function has type <class 'dict'>.

Tensorflow and autograd can only handle array-valued outputs (a possible workaround is sketched after these examples):

>>> qml.math.jacobian(tuple_f)(qml.numpy.array(2.0))
ValueError: autograd can only differentiate with respect to arrays, not <class 'tuple'>
>>> qml.math.jacobian(tuple_f)(tf.Variable(2.0))
ValueError: qml.math.jacobian does not work with tensorflow and non-tensor outputs.
Got (<tf.Tensor: shape=(), dtype=float32, numpy=4.0>,
<tf.Tensor: shape=(), dtype=float32, numpy=8.0>) of type <class 'tuple'>.
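
One way to work around this limitation (a sketch rather than documented behaviour) is to stack the outputs into a single array before differentiating, for example with qml.math.stack; the exact array type of the result may differ by interface:

>>> def stacked_f(x):
...     return qml.math.stack([x**2, x**3])  # single array-valued output
>>> qml.math.jacobian(stacked_f)(qml.numpy.array(2.0))
array([ 4., 12.])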