Description
While working through Tutorial 4, I found that the Synaptic neuron model does not reproduce the expected behaviour. When passing the syn and mem values in the function call, lif1(spk_in[step], syn, mem), both values tend to infinity. I also ran the Google Colab version of the snnTorch tutorial code and hit the same issue.
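For reference, here is a minimal sketch of the kind of simulation loop that exposes the issue, assuming the Tutorial 4 setup; the decay constants, input, and number of steps below are illustrative, not the tutorial's exact values:

```python
import torch
import snntorch as snn

# Illustrative values only; Tutorial 4 may use different constants and inputs.
num_steps = 200
lif1 = snn.Synaptic(alpha=0.9, beta=0.8)

spk_in = (torch.arange(num_steps) % 10 == 0).float()  # periodic input spikes
syn, mem = lif1.init_synaptic()

for step in range(num_steps):
    # Passing syn and mem back in explicitly is what triggers the bug:
    # with the faulty assignment, both values grow without bound.
    spk, syn, mem = lif1(spk_in[step], syn, mem)

print(syn, mem)  # diverges on the affected version; stays finite after the fix
```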
What I Did
When reviewing the Synaptic code, I observed that when syn and mem are passed as parameters, the forward function overwrites the syn attribute (self.syn) with the mem value instead of the syn value. This behaviour can be seen in lines 218-219:
if not syn == None:
self.syn = mem
Changing the assignment to the following statement removes the bug:
if not syn == None:
self.syn = syn
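To illustrate why overwriting syn with mem makes the state diverge, here is a simplified, reset-free toy model of the two update rules; it is not the snnTorch implementation, only a sketch of the effect of the faulty assignment:

```python
import torch

# Simplified synaptic-LIF updates without threshold/reset, purely to show
# the effect of replacing syn with mem on every call.
alpha, beta = 0.9, 0.8          # illustrative decay constants
inp = torch.ones(50) * 0.1      # constant input current

def run(overwrite_syn_with_mem: bool):
    syn = torch.tensor(0.0)
    mem = torch.tensor(0.0)
    for x in inp:
        if overwrite_syn_with_mem:
            syn = mem           # the buggy assignment: syn is replaced by mem
        syn = alpha * syn + x   # synaptic current: decay + input
        mem = beta * mem + syn  # membrane potential: decay + synaptic current
    return syn.item(), mem.item()

print(run(overwrite_syn_with_mem=True))   # grows rapidly
print(run(overwrite_syn_with_mem=False))  # settles to a finite fixed point
```

In this simplified model, the buggy assignment turns the membrane update into mem ← (alpha + beta)·mem + input, and since alpha + beta is typically greater than 1, the state grows geometrically, which would match the divergence observed in the tutorial.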