Note that python_grad_func needs the same interface as ops.RegisterGradient (https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/framework/function.py#L349): it receives the op and the incoming gradient, not the raw input tensor.
Here is the modified code example:
import tensorflow as tf
from tensorflow.python.framework import function


def my_op_grad(op, grad):  # takes (op, grad) instead of my_op_grad(x)
    return tf.sigmoid(op.inputs[0])


@function.Defun(a=tf.float32, python_grad_func=my_op_grad)
def my_op(a):
    return tf.identity(a)


def main(unused_argv):
    a = tf.Variable(tf.constant([-5., 4., -3., 2., 1.], dtype=tf.float32))
    sess = tf.Session()
    sess.run(tf.initialize_all_variables())
    # workaround for bug github.com/tensorflow/tensorflow/issues/3710
    a = tf.identity(a)
    grad = tf.gradients(my_op(a), [a])[0]
    result = sess.run(grad)
    print(result)
    sess.close()
输出:
[ 0.00669286 0.98201376 0.04742587 0.88079709 0.7310586 ]
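As a sanity check: since my_op_grad returns tf.sigmoid(op.inputs[0]), the gradient of my_op(a) with respect to a is just the elementwise sigmoid of the input values. A minimal NumPy sketch (independent of TensorFlow) reproduces the numbers above:

```python
import numpy as np

# The same input values fed to my_op in the example above.
a = np.array([-5., 4., -3., 2., 1.], dtype=np.float32)

# sigmoid(a) is what the custom python_grad_func returns as the gradient.
sigmoid = 1.0 / (1.0 + np.exp(-a))
print(sigmoid)  # matches the TensorFlow output above
```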