qml.math.in_backprop

in_backprop(tensor, interface=None)[source]

Returns True if the tensor is considered to be in a backpropagation environment. It works for the Autograd, TensorFlow, and JAX interfaces. Unlike requires_grad(), which only checks whether a tensor is differentiable, this function checks whether a gradient is actually being computed for the tensor.

Parameters
• tensor (tensor_like) – input tensor

• interface (str) – The name of the interface. Will be determined automatically if not provided.

Example

>>> x = tf.Variable([0.6, 0.1])
>>> in_backprop(x)
False
>>> with tf.GradientTape() as tape:
...     print(in_backprop(x))
True