Noisy spiking neural nets implementations. #230
Conversation
This could be an interesting addition - thanks for your effort. Before deciding whether to merge this in its present form, whether it might be better suited outside of the neuron class (i.e., in the model definition), or whether membrane noise should be treated as a separate layer (e.g., kind of like graded spikes), I have a few questions.
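To make the design question concrete, here is a minimal, hypothetical sketch of the two options being weighed; `NoisyLeaky`, `MembraneNoise`, and their parameters are illustrative names, not part of the existing snnTorch API.

```python
import torch
import torch.nn as nn

# Option A (hypothetical): noise baked into the neuron's state update.
class NoisyLeaky(nn.Module):
    def __init__(self, beta=0.9, sigma=0.2, threshold=1.0):
        super().__init__()
        self.beta, self.sigma, self.threshold = beta, sigma, threshold

    def forward(self, x, mem):
        # Gaussian membrane noise injected inside the neuron update
        mem = self.beta * mem + x + self.sigma * torch.randn_like(mem)
        spk = (mem > self.threshold).float()
        return spk, mem - spk * self.threshold  # soft reset

# Option B (hypothetical): noise as a standalone layer,
# composed with a deterministic neuron in the model definition.
class MembraneNoise(nn.Module):
    def __init__(self, sigma=0.2):
        super().__init__()
        self.sigma = sigma

    def forward(self, mem):
        return mem + self.sigma * torch.randn_like(mem)
```

Option A keeps the noisy dynamics self-contained in one neuron class; Option B keeps the neuron deterministic and lets users opt in to noise per layer.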
Thanks for your reply!

2.1- In terms of performing machine learning tasks,

2.2- Reproducing real neural activity. Trial-by-trial variability is common in neural recordings, and the noisy LIF method can readily reproduce this variability. For example, using the correlation coefficient (CC) of firing rates as a measure (ranging from 0 to 1, higher is better), a noisy SNN can yield a CC increase of up to 0.2 and fit the activity of real neurons accurately (see the sketch after this comment).

2.3- As mentioned in the great review [2], the surrogate gradient is adapted from the straight-through estimator (STE) in the quantized-NN field. This means we can use it to optimize our SNNs, but it does not provide useful insight into how neurobiology would achieve similar optimization. In this regard, the noise-driven learning induced by noisy LIF provides a new perspective.

2.4- For neuromorphic hardware deployment. The input-level and spike-level perturbations mentioned in 2.1 have practical meaning for neuromorphic implementations, which may encounter sensor noise and circuit noise. Using noisy SNNs can therefore help to robustly deploy pre-trained models on mixed-signal neuromorphic hardware [3].

[1] Gerstner, Wulfram, et al. Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. Cambridge University Press, 2014.
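Since 2.2 leans on the firing-rate CC measure, here is a minimal sketch of how it could be computed; `firing_rate_cc`, its arguments, and the binning scheme are hypothetical, not taken from the PR.

```python
import torch

# Hypothetical sketch: correlation coefficient (CC) between the model's
# binned firing rates and recorded firing rates, as discussed in 2.2.
def firing_rate_cc(model_spikes, real_rates, bin_size=10):
    # model_spikes: [time, neurons] binary spike tensor (time % bin_size == 0)
    # real_rates:   [time // bin_size, neurons] recorded firing rates
    t, n = model_spikes.shape
    model_rates = model_spikes.reshape(t // bin_size, bin_size, n).mean(dim=1)
    stacked = torch.stack([model_rates.flatten(), real_rates.flatten()])
    return torch.corrcoef(stacked)[0, 1]
```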
Great, thank you for your detailed reply! This looks like a nice addition. There seems to be some code that should be removed or modified.
Finally, could you point me to the line where the noise is applied so I can look at it in closer detail? Let me know if you need any help with the above changes.
Thanks for your effort! I've made the requested changes, as follows.
Re:
Re:
Re: (Here I use the line numbers in the latest commit.) They were added to
In addition, for the sake of runtime efficiency, I removed the arctangent-noise implementation in this commit, because there is currently no good way to efficiently implement its sampling process.
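For readers following the question of where the noise is applied, here is a hedged sketch of where membrane noise typically enters a noisy LIF update; `noisy_lif_step` and its parameter names are illustrative, not the PR's actual API, and Gaussian noise is assumed.

```python
import torch

# Minimal sketch of a noisy LIF step (illustrative names, Gaussian noise
# assumed; not the exact implementation merged in this PR).
def noisy_lif_step(x, mem, beta=0.9, sigma=0.2, threshold=1.0):
    eps = sigma * torch.randn_like(mem)  # membrane noise sample
    mem = beta * mem + x + eps           # noisy leaky integration
    spk = (mem > threshold).float()      # threshold crossing
    mem = mem - spk * threshold          # soft reset after a spike
    return spk, mem
```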
Hi @genema, would it be possible to add some unit tests to the PR in order to check the functionality of the new features?
As requested, module tests have been added to the
From my point of view, everything is fine! I'll run the tests and let @jeshraghian make the final decision.
All tests are passing! I will merge it into a new branch. We will check out some internals (where to put what) and then merge it into master! Thank you very much for your extensive contribution!
Thank you for your efforts! While writing the unit tests, I noticed that existing tests based on "repeated computation produces the same results" may not be fully applicable to noisy LIF units. Nevertheless, I examined their functionality as thoroughly as possible in my local tests. Please feel free to contact me if you find any problems :)
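For illustration, here is a hypothetical sketch of two ways such stochastic units can still be unit-tested, assuming the `noisy_lif_step` sketch from earlier in this thread is in scope; neither test is taken from the actual PR.

```python
import torch

# Strategy 1 (hypothetical): fix the RNG seed so "repeated computation"
# becomes exactly reproducible even for a noisy unit.
def test_reproducible_with_fixed_seed():
    x, mem = torch.ones(100), torch.zeros(100)
    torch.manual_seed(0)
    spk_a, _ = noisy_lif_step(x, mem)
    torch.manual_seed(0)
    spk_b, _ = noisy_lif_step(x, mem)
    assert torch.equal(spk_a, spk_b)  # same seed -> identical spikes

# Strategy 2 (hypothetical): check statistical properties instead of
# exact outputs.
def test_membrane_noise_statistics():
    x, mem = torch.zeros(10000), torch.zeros(10000)
    # high threshold so no spikes occur; with zero input, the membrane
    # std should approximate sigma
    _, mem_out = noisy_lif_step(x, mem, sigma=0.5, threshold=10.0)
    assert abs(mem_out.std().item() - 0.5) < 0.05
```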
Hi,
I have added a basic implementation of a noisy LIF neuron for building noisy spiking neural nets (NSNNs). NSNN is a theoretical framework, introduced in the literature (nsnn), that introduces internal noise into the SNN model by considering more realistic noisy neuronal dynamics from biophysics, and uses that noise as a computational and learning resource.
Despite extensive research on spiking neural networks (SNNs), most studies are built on deterministic models, overlooking the inherently non-deterministic, noisy nature of neural computation. We introduce the noisy spiking neural network (NSNN) and the noise-driven learning rule (NDL) by incorporating noisy neuronal dynamics to exploit the computational advantages of noisy neural processing. NSNN provides a theoretical framework that yields scalable, flexible, and reliable computation. We demonstrate that NSNN leads to spiking neural models with competitive performance, improved robustness against challenging perturbations compared with deterministic SNNs, and better reproduction of probabilistic neural computation in neural coding. We hope this implementation can enrich your impressive repo.