Merge pull request #239 from jeshraghian/ahenkes1-patch-1
Update snntorch.surrogate.rst
jeshraghian authored Sep 24, 2023
2 parents c84be2c + 45e9679 commit b756d17
Showing 1 changed file with 2 additions and 2 deletions.
4 changes: 2 additions & 2 deletions docs/snntorch.surrogate.rst
@@ -3,7 +3,7 @@ snntorch.surrogate

By default, PyTorch's autodifferentiation tools are unable to calculate the analytical derivative of the spiking neuron graph.
The discrete nature of spikes makes it difficult for ``torch.autograd`` to calculate a gradient that facilitates learning.
- :mod:`snntorch` overrides the default gradient by using :mod:`snntorch.LIF.Heaviside`.
+ :mod:`snntorch` overrides the default gradient by using :mod:`snntorch.surrogate.ATan`.

Alternative gradients are also available in the :mod:`snntorch.surrogate` module.
These represent either approximations of the backward pass or probabilistic models of firing as a function of the membrane potential.
@@ -29,7 +29,7 @@ How to use surrogate
^^^^^^^^^^^^^^^^^^^^^^^^

The surrogate gradient must be passed as the ``spike_grad`` argument to the neuron model.
- If ``spike_grad`` is left unspecified, it defaults to :mod:`snntorch.neurons.Heaviside`.
+ If ``spike_grad`` is left unspecified, it defaults to :mod:`snntorch.surrogate.ATan`.
In the following example, we apply the fast sigmoid surrogate to :mod:`snntorch.Synaptic`.

Example::
