Extending torch.autograd
Adding operations to autograd requires implementing a new Function subclass for each operation. Every new function requires you to implement two methods:
- forward() - the code that performs the operation, computing outputs from the input arguments.
- backward() - gradient formula. It will be given as many Variable arguments as there were outputs, with each of them representing the gradient w.r.t. that output. It should return as many Variables as there were inputs, with each of them containing the gradient w.r.t. its corresponding input.
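To make this concrete, below is a minimal sketch of a custom Function implementing a fully connected (linear) operation. The class name LinearFunction and the choice of operation are illustrative; the staticmethod forward/backward pattern with a ctx context object follows the torch.autograd.Function API of recent PyTorch releases, where Variable has been merged into Tensor.

```python
import torch
from torch.autograd import Function

# Inherit from Function
class LinearFunction(Function):

    # forward() computes output = input @ weight.T + bias
    @staticmethod
    def forward(ctx, input, weight, bias=None):
        # Save tensors that the gradient formula will need
        ctx.save_for_backward(input, weight, bias)
        output = input.mm(weight.t())
        if bias is not None:
            output += bias.unsqueeze(0).expand_as(output)
        return output

    # backward() receives one gradient per output of forward()
    # and must return one gradient per input of forward()
    @staticmethod
    def backward(ctx, grad_output):
        input, weight, bias = ctx.saved_tensors
        grad_input = grad_weight = grad_bias = None

        # Compute only the gradients that autograd actually requests
        if ctx.needs_input_grad[0]:
            grad_input = grad_output.mm(weight)
        if ctx.needs_input_grad[1]:
            grad_weight = grad_output.t().mm(input)
        if bias is not None and ctx.needs_input_grad[2]:
            grad_bias = grad_output.sum(0)

        return grad_input, grad_weight, grad_bias
```

The function is invoked through apply rather than by instantiating the class, e.g. output = LinearFunction.apply(input, weight, bias). Returning None for a gradient tells autograd that the corresponding input does not require one.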