
Noisy spiking neural nets implementations. #230

Merged: 11 commits, Aug 15, 2023

Conversation

@genema commented Jul 26, 2023

Hi,
I have added a basic implementation of the noisy LIF neuron for building noisy spiking neural nets (NSNNs). The NSNN is a theoretical framework, introduced in the literature (nsnn), that brings internal noise into the SNN model by considering more biophysically realistic noisy neuronal dynamics and uses that noise as a computational and learning resource.

Despite extensive research on spiking neural networks (SNNs), most studies are built on deterministic models, overlooking the inherently non-deterministic, noisy nature of neural computation. We introduce the noisy spiking neural network (NSNN) and the noise-driven learning rule (NDL), incorporating noisy neuronal dynamics to exploit the computational advantages of noisy neural processing. The NSNN provides a theoretical framework that yields scalable, flexible, and reliable computation. We demonstrate that NSNNs lead to spiking models with competitive performance, improved robustness against challenging perturbations compared to deterministic SNNs, and a better ability to reproduce probabilistic neural computation in neural coding. We hope this implementation can enrich your impressive repo.

@jeshraghian (Owner) commented

This could be an interesting addition - thanks for your effort.
As this seems to be a relatively new and not yet widely adopted approach, it could be good to understand the components a little better, as there is a lot of code reuse from the Leaky class.

So before deciding whether to merge this in its present form, or if it might be better suited outside of the neuron class (i.e., in the model definition), or if membrane noise should be treated as a separate layer (e.g., kind of like graded spikes), I have a few questions.

  • How do the different noise distributions help (i.e., arctangent, gaussian, etc.)?
  • Have you had successful experimental demonstrations where this significantly helped? It'd be great to have an example that shows their benefit.

@genema (Author) commented Jul 31, 2023

> This could be an interesting addition - thanks for your effort. As this seems to be a relatively new and not yet widely adopted approach, it could be good to understand the components a little better, as there is a lot of code reuse from the Leaky class.
>
> So before deciding whether to merge this in its present form, or if it might be better suited outside of the neuron class (i.e., in the model definition), or if membrane noise should be treated as a separate layer (e.g., kind of like graded spikes), I have a few questions.
>
> 1- How do the different noise distributions help (i.e., arctangent, gaussian, etc.)?
> 2- Have you had successful experimental demonstrations where this significantly helped? It'd be great to have an example that shows their benefit.

Thanks for your reply!
1- Based on our current results, different types of noise perform similarly (in our tests, after carefully adjusting the noise scale, the performance of the different noise types is roughly the same). As a theoretical framework for SNN research based on noisy spiking models (more specifically, we considered the LIF model), the noisy LIF model we use is mainly based on the classical input-noise and output-noise models in [1]; that is, a model whose membrane potential follows a Wiener process, whose discretization yields additive Gaussian noise. Other noise types, such as logistic and triangular, are extensions of this. As far as the current theory is concerned, we support any continuous noise (epsilon) distribution that is symmetric about zero, i.e., epsilon and -epsilon are identically distributed.
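
(To make the "symmetric noise" requirement concrete, here is a minimal sketch, not the PR code, of how zero-symmetric noise of a few supported families could be sampled and added to the membrane potential; the function and argument names are hypothetical.)

```python
import torch

def sample_symmetric_noise(shape, noise_type="gaussian", scale=0.2):
    """Draw noise epsilon whose distribution is symmetric about zero
    (epsilon and -epsilon are identically distributed)."""
    if noise_type == "gaussian":
        # discretized Wiener process -> additive Gaussian membrane noise
        return torch.randn(shape) * scale
    if noise_type == "logistic":
        # inverse-CDF sampling of a logistic(0, scale) distribution
        u = torch.rand(shape).clamp(1e-6, 1 - 1e-6)
        return scale * torch.log(u / (1.0 - u))
    if noise_type == "triangular":
        # difference of two uniforms is symmetric triangular on [-scale, scale]
        return scale * (torch.rand(shape) - torch.rand(shape))
    raise ValueError(f"unknown noise_type: {noise_type}")

# e.g. add membrane noise during an update U[t+1] = beta*U[t] + I[t+1] + eps
beta, u, i_in = 0.9, torch.zeros(3), torch.ones(3)
u = beta * u + i_in + sample_symmetric_noise(u.shape, noise_type="logistic", scale=0.1)
```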

2.1- In terms of performing machine learning tasks,

  • Generally, the spiking model built on noisy LIF (Noisy SNN) is comparable to the ordinary LIF (SNN).
    Taking recognition accuracy on the CIFAR-10 benchmark as an example, noisy SNN vs. SNN is 92.87% vs. 93.18% (residual network) and 93.90% vs. 91.88% (CIFARNet). On CIFAR-100 (noisy SNN vs. SNN), it is 69.57% vs. 70.15% (residual network) and 73.36% vs. 72.25% (CIFARNet). On benchmarks that are usually prone to overfitting, noisy SNN has a stable advantage due to the regularization effect of the internal noise. For example, on DVS-CIFAR (noisy SNN vs. SNN) it is 74.3% vs. 71.74% (residual network) and 76.97% vs. 75.5%, and on DVS-Gesture it is 96.9% vs. 95.8%.

  • When facing perturbations of the spiking model, such as input perturbations (adversarial attacks for static image inputs, random addition and deletion of events for event streams) and internal spike perturbations (randomly adding and deleting spikes generated layer by layer), noisy spiking networks show significantly higher resilience in these challenging perturbed situations.
    For example, for static image input under adversarial attacks of the same intensity, accuracy drops to (noisy SNN vs. SNN) 63.23% vs. 39.07% (CIFAR-10, residual net) and 53.95% vs. 39.38% (residual net). For event-stream input under event perturbations of the same intensity, noisy SNN and SNN reach 70.28% and 64.98%, respectively. When directly perturbing the internal spike firing states, it is (noisy SNN vs. SNN) 71.04% vs. 23.53% (CIFAR-10, residual net), 42.07% vs. 15.28% (CIFAR-100, residual net), and 67.27% vs. 53.6% (DVS-CIFAR, residual net).

2.2- Reproducing real neural activity. Trial-by-trial variability is common in neural recordings, and the noisy LIF model can easily reproduce this variability. For example, using the correlation coefficient (CC) of firing rates as the measure (0~1, higher is better), noisy SNN brings a CC increase of up to 0.2 and fits the activity of real neurons accurately.
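
(As a rough illustration of the metric, a Pearson correlation between per-neuron firing rates could be computed as in the sketch below; this is a generic example, not the evaluation code from the paper.)

```python
import torch

def firing_rate_cc(spikes_a, spikes_b):
    """Pearson correlation coefficient between per-neuron firing rates.

    spikes_a, spikes_b: binary spike tensors of shape (time, num_neurons),
    e.g. recorded vs. model-generated spike trains.
    """
    rate_a = spikes_a.float().mean(dim=0)  # mean firing rate per neuron
    rate_b = spikes_b.float().mean(dim=0)
    a = rate_a - rate_a.mean()
    b = rate_b - rate_b.mean()
    return (a * b).sum() / (a.norm() * b.norm() + 1e-12)
```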

2.3- As mentioned in the great review [2], the surrogate gradient is adapted from the STE in the quantized-NN field. This means we can use it to optimize SNNs, but it does not provide useful insight into how neurobiology could achieve similar optimization. In this regard, the noise-driven learning induced by the noisy LIF model provides a new perspective.
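
(To make the connection concrete: in a noisy LIF neuron, firing is Bernoulli with probability CDF_eps(U - U_thr), so the derivative used in the backward pass is the noise density itself rather than a hand-picked surrogate. The sketch below illustrates this for Gaussian noise; it is a conceptual example, not the PR's NDL implementation.)

```python
import torch

class NoisyFire(torch.autograd.Function):
    """S ~ Bernoulli(Phi((U - U_thr) / sigma)) with Gaussian membrane noise.

    The backward pass uses the Gaussian density, which plays the role that a
    hand-picked surrogate gradient plays for deterministic SNNs.
    Usage: spk = NoisyFire.apply(mem, 1.0, 0.3)
    """

    @staticmethod
    def forward(ctx, mem, threshold, sigma):
        # firing probability = Gaussian CDF of (mem - threshold)
        p_fire = 0.5 * (1.0 + torch.erf((mem - threshold) / (sigma * 2 ** 0.5)))
        ctx.save_for_backward(mem)
        ctx.threshold, ctx.sigma = threshold, sigma
        return torch.bernoulli(p_fire)

    @staticmethod
    def backward(ctx, grad_output):
        (mem,) = ctx.saved_tensors
        z = (mem - ctx.threshold) / ctx.sigma
        # derivative of the firing probability w.r.t. mem = Gaussian density
        pdf = torch.exp(-0.5 * z ** 2) / (ctx.sigma * (2 * torch.pi) ** 0.5)
        return grad_output * pdf, None, None
```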

2.4- For neuromorphic hardware deployment: the input-level and spike-level perturbations mentioned in 2.1 have practical meaning for neuromorphic implementations, which may encounter sensor noise and circuit noise. Using a noisy SNN can therefore help to robustly deploy pre-trained models on mixed-signal neuromorphic hardware [3].

[1] Gerstner, Wulfram, et al. Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. Cambridge University Press, 2014.
[2] Eshraghian, Jason K., et al. "Training spiking neural networks using lessons from deep learning." arXiv preprint arXiv:2109.12894 (2021).
[3] Work from Giacomo Indiveri's group, e.g., "ML-HW codesign of noise-robust TinyML models ...", "Supervised training for spiking neural networks for robust deployment ..."

@jeshraghian (Owner) commented

Great, thank you for your detailed reply! This looks like a nice addition.

There seems to be some code that should be removed or modified.

NoisyLIF

  • Lines 516-523 don't appear to be in use
  • I don't think Lines 525-593 are in use. We can safely delete them, but will need to include an initialization for init_noisylif. I think we can just rename the init_leaky function to init_noisylif

NoisyLeaky

  • Please define \epsilon after Line 45
  • Line 73 noisy_type --> noise_type
  • In the docs on Line 94: this isn't clear to me; I don't understand what is meant by eps=-eps
  • Line 98: "For instance, std for the gaussian noise, scale for the logistic noise, etc." I don't understand what is meant here. Do you mean the scale defines the standard deviation of the gaussian noise, and a multiplier factor to all others? I think it'd be better to explicitly state this.
  • Delete print statement on Line 186

Finally, could you point me to the line where the noise is applied so I can look in closer detail? Let me know if you need any help with the above changes.

@genema (Author) commented Aug 11, 2023

Thanks for your effort! I've made the requested changes as follows.

> NoisyLIF
>
> • Lines 516-523 don't appear to be in use
> • I don't think Lines 525-593 are in use. We can safely delete them, but will need to include an initialization for init_noisylif. I think we can just rename the init_leaky function to init_noisylif

Re: _neurons/neurons.py class NoisyLIF:

  • The unused function _V_register_buffer (lines 516-523) has been deleted.
  • init_rleaky, init_synaptic, init_rsynaptic, init_lapicque, and init_alpha (lines 525-593) have been deleted. The function init_leaky has been renamed to init_noisyleaky.

> NoisyLeaky
>
> • Please define \epsilon after Line 45
> • Line 73 noisy_type --> noise_type
> • In the docs on Line 94: this isn't clear to me; I don't understand what is meant by eps=-eps
> • Line 98: "For instance, std for the gaussian noise, scale for the logistic noise, etc." I don't understand what is meant here. Do you mean the scale defines the standard deviation of the gaussian noise, and a multiplier factor to all others? I think it'd be better to explicitly state this.
> • Delete print statement on Line 186

Re: _neurons/noisyleaky.py :

  • line 45: $\epsilon$ definition added.
  • line 73: typo fixed.
  • line 94: doc updated.
  • line 98: I've reorganized this part of the description to make it clearer.
  • line 186: deleted.

> Finally, could you point me to the line where the noise is applied so I can look in closer detail? Let me know if you need any help with the above changes.

Re: (here I use the line numbers from the latest commit)

They were added to neurons.py in the latest commit, which may make the implementation more modular. Specifically, you can find the membrane voltage update process with noise (not included in commits before 03f5c87, as the final form of the implementation had not yet been determined) in lines 569, 634-640, and 671. This membrane voltage update is described by $U[t+1] = \beta U[t] + I_{\text{in}}[t+1] - R\,U_{\text{thr}} + \epsilon$. The probabilistic firing process is $S[t+1] \sim \mathrm{Bernoulli}\big(\mathbb{P}(S[t+1] = 1)\big)$, where $\mathbb{P}(S[t+1]=1) = \mathrm{CDF}_\epsilon (U[t+1]-U_{\text{thr}})$; the noise-driven learning updates were already included in earlier commits.
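
(For readers without the diff open, here is a minimal sketch of one step of the recurrence written out above, transcribing the two equations literally with Gaussian noise and treating R as the previous spike for a soft reset; the names are hypothetical and this is not the merged code.)

```python
import torch

def noisy_leaky_step(u, i_in, spk_prev, beta=0.9, u_thr=1.0, sigma=0.3):
    """One step of the noisy LIF update described above:
        U[t+1] = beta*U[t] + I_in[t+1] - R*U_thr + eps
        S[t+1] ~ Bernoulli(CDF_eps(U[t+1] - U_thr))
    R is taken to be the previous spike (soft reset), an assumption of this sketch.
    """
    eps = torch.randn_like(u) * sigma                  # additive Gaussian membrane noise
    u_next = beta * u + i_in - spk_prev * u_thr + eps  # noisy membrane update
    p_fire = 0.5 * (1.0 + torch.erf((u_next - u_thr) / (sigma * 2 ** 0.5)))  # Gaussian CDF
    spk = torch.bernoulli(p_fire)                      # probabilistic firing
    return spk, u_next
```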

In addition, for the sake of runtime efficiency, I removed the implementation of arctangent noise in this commit, because there is currently no good way to implement its sampling process efficiently.

@ahenkes1 added the enhancement label on Aug 14, 2023
@ahenkes1 (Collaborator) commented

Hi @genema , would it be possible to add some unit tests to the PR in order to check the functionality of the new features?

@genema (Author) commented Aug 15, 2023

> Hi @genema, would it be possible to add some unit tests to the PR in order to check the functionality of the new features?

As requested, unit tests have been added to the test folder and pass under pytest.
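
(For context, a test for a stochastic neuron might look roughly like the sketch below; it assumes NoisyLeaky mirrors the Leaky call interface and is importable from snntorch, which may differ from the exact tests added to the PR.)

```python
import torch
import snntorch as snn

def test_noisyleaky_outputs_are_spikes():
    torch.manual_seed(0)             # stochastic layer: fix the RNG so the test is reproducible
    lif = snn.NoisyLeaky(beta=0.9)   # assumed constructor, mirroring snn.Leaky
    mem = torch.zeros(10)
    for _ in range(5):
        spk, mem = lif(torch.rand(10), mem)
        assert spk.shape == mem.shape == (10,)
        assert torch.all((spk == 0) | (spk == 1))  # outputs are binary spikes
```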

@ahenkes1 (Collaborator) left a comment

From my point of view, everything is fine! I'll run the tests and let @jeshraghian take the final decision.

@ahenkes1 changed the base branch from master to noisy_leaky on August 15, 2023, 11:47
@ahenkes1 (Collaborator) commented

All tests are passing! I will merge it to a new branch. We will check out some internals (where to put what) and then merge it to master! Thank you very much for your extensive contribution!

@ahenkes1 merged commit fb8f2f9 into jeshraghian:noisy_leaky on Aug 15, 2023
2 checks passed
@genema (Author) commented Aug 16, 2023

> All tests are passing! I will merge it to a new branch. We will check out some internals (where to put what) and then merge it to master! Thank you very much for your extensive contribution!

Thank you for your efforts! While writing the unit tests, I noticed that existing tests based on "repeated computation produces the same results" may not be fully applicable to noisy LIF units. In any case, I examined their functionality as thoroughly as possible in my local tests. Please feel free to contact me if you find any problems :)
