How to Write Custom Autograd Functions in PyTorch
A practical guide to torch.autograd.Function for ML engineers: implementing custom forward and backward passes, following the ctx.save_for_backward rules, writing numerically stable operations, using straight-through estimation for quantisation-aware training, handling non-differentiable inputs, and verifying correctness with gradcheck and gradgradcheck.
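As a taste of what the guide covers, here is a minimal sketch of a custom autograd Function: a hand-written square operation with an explicit backward pass, saved-tensor handling via ctx.save_for_backward, and a gradcheck verification. The class name MySquare is illustrative, not part of any PyTorch API.

```python
import torch
from torch.autograd import Function


class MySquare(Function):
    """Computes y = x**2 with a hand-written backward pass."""

    @staticmethod
    def forward(ctx, x):
        # Save the input; backward needs it to compute dy/dx = 2x.
        ctx.save_for_backward(x)
        return x * x

    @staticmethod
    def backward(ctx, grad_output):
        # Retrieve the saved input and apply the chain rule:
        # grad_x = grad_output * dy/dx = grad_output * 2x.
        (x,) = ctx.saved_tensors
        return grad_output * 2 * x


# gradcheck compares the analytic backward against finite differences.
# It requires double precision for reliable numerical tolerances.
x = torch.randn(5, dtype=torch.double, requires_grad=True)
assert torch.autograd.gradcheck(MySquare.apply, (x,))
```

Custom Functions are invoked through `.apply` (e.g. `y = MySquare.apply(x)`) rather than by instantiating the class; the later sections build on this same skeleton for the stable-numerics and straight-through-estimator examples.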