qml.math.in_backprop

in_backprop(tensor, interface=None)
Returns True if the tensor is in a backpropagation environment; works for Autograd, TensorFlow, and JAX. Unlike requires_grad(), which only checks whether the tensor is differentiable, in_backprop checks whether a gradient is actually being computed for it.

Parameters
- tensor (tensor_like) – input tensor
- interface (str) – The name of the interface. Will be determined automatically if not provided.

Returns
- bool – whether the tensor is in a backpropagation environment
Example
>>> import tensorflow as tf
>>> from pennylane.math import in_backprop
>>> x = tf.Variable([0.6, 0.1])
>>> in_backprop(x)
False
>>> with tf.GradientTape() as tape:
...     print(in_backprop(x))
True
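The same distinction can be illustrated with the Autograd interface. The following is a minimal sketch, assuming (per the description above) that in_backprop only returns True while a gradient is actually being computed, for example inside a function being differentiated by qml.grad:

>>> import pennylane as qml
>>> from pennylane import numpy as pnp
>>> x = pnp.array([0.6, 0.1], requires_grad=True)
>>> qml.math.requires_grad(x)  # the tensor is differentiable, so True
True
>>> qml.math.in_backprop(x)  # but no gradient is being computed yet
False
>>> def cost(x):
...     print(qml.math.in_backprop(x))  # True inside the gradient computation
...     return pnp.sum(x ** 2)
>>> _ = qml.grad(cost)(x)
True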
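A similar sketch for the JAX interface, assuming in_backprop detects the tracer objects that JAX substitutes for concrete arrays while jax.grad evaluates a function:

>>> import jax
>>> import jax.numpy as jnp
>>> import pennylane as qml
>>> x = jnp.array([0.6, 0.1])
>>> qml.math.in_backprop(x)  # a concrete array, not being differentiated
False
>>> def cost(x):
...     print(qml.math.in_backprop(x))  # True: x is a JAX tracer here
...     return jnp.sum(x ** 2)
>>> _ = jax.grad(cost)(x)
True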