NMNIST tutorial is added #67

Open
wants to merge 4 commits into base: main
Changes from 1 commit
101 changes: 101 additions & 0 deletions tutorials/lava/lib/dl/netx/nmnist/Trained/accuracy.txt
@@ -0,0 +1,101 @@
# Train Test
0.801933 0.945200
0.907750 0.949300
0.921467 0.948500
0.929283 0.961600
0.935350 0.965100
0.940217 0.962600
0.943083 0.964600
0.947050 0.967700
0.948183 0.971400
0.951867 0.970100
0.952467 0.970700
0.954117 0.971600
0.954650 0.974300
0.957717 0.973900
0.957400 0.976500
0.957783 0.976400
0.960300 0.977500
0.960233 0.975200
0.960883 0.978500
0.962850 0.976200
0.962550 0.978400
0.963967 0.977000
0.965200 0.976900
0.964583 0.980100
0.965033 0.977800
0.967050 0.980000
0.966933 0.981000
0.967300 0.981500
0.968483 0.980100
0.968900 0.981700
0.969117 0.980800
0.970300 0.981400
0.969217 0.981800
0.970117 0.982600
0.969650 0.982700
0.971133 0.981800
0.971833 0.981800
0.970983 0.984200
0.972217 0.982600
0.971983 0.982400
0.971883 0.982400
0.972850 0.982500
0.973100 0.982900
0.973600 0.981800
0.974300 0.981900
0.973533 0.983700
0.973517 0.983200
0.974517 0.983400
0.974217 0.984000
0.974300 0.983500
0.974883 0.982700
0.974883 0.983400
0.975650 0.983400
0.975167 0.983400
0.975767 0.984000
0.976217 0.983700
0.975950 0.984000
0.976350 0.983900
0.976850 0.984500
0.976633 0.984400
0.976917 0.984800
0.976200 0.984000
0.977100 0.984700
0.977050 0.984100
0.976767 0.983800
0.976400 0.985800
0.977667 0.985500
0.977383 0.984300
0.977300 0.985800
0.977933 0.985500
0.977633 0.984800
0.978333 0.985600
0.978650 0.985400
0.977933 0.985100
0.978017 0.986000
0.978067 0.985300
0.978667 0.985900
0.978567 0.986700
0.978917 0.986100
0.979300 0.986300
0.978850 0.984000
0.978617 0.985300
0.978733 0.984900
0.979117 0.985800
0.979317 0.985700
0.979083 0.985500
0.979800 0.985600
0.979667 0.985800
0.979217 0.985400
0.980300 0.985900
0.980517 0.986200
0.980217 0.986200
0.980033 0.985300
0.980183 0.986100
0.981033 0.984900
0.981183 0.985700
0.981250 0.985000
0.980783 0.985300
0.981417 0.986400
0.981300 0.986400
101 changes: 101 additions & 0 deletions tutorials/lava/lib/dl/netx/nmnist/Trained/loss.txt
@@ -0,0 +1,101 @@
# Train Test
0.392544 0.176659
0.239416 0.167550
0.210520 0.149321
0.194289 0.125025
0.177958 0.109535
0.171833 0.118700
0.164164 0.111476
0.158248 0.100817
0.152601 0.100551
0.149008 0.098966
0.145127 0.102916
0.141551 0.097027
0.138394 0.083000
0.134470 0.092456
0.132844 0.086755
0.128918 0.085847
0.125912 0.079843
0.122580 0.083085
0.118639 0.074248
0.117358 0.073422
0.115235 0.072302
0.113196 0.067478
0.109428 0.075555
0.109808 0.067043
0.108042 0.082480
0.104578 0.071933
0.103017 0.064316
0.101517 0.062741
0.101174 0.061511
0.099525 0.060693
0.098340 0.062145
0.097604 0.067442
0.096623 0.059294
0.095129 0.067488
0.095098 0.058241
0.095281 0.061556
0.092980 0.057933
0.094452 0.062046
0.090770 0.059484
0.092471 0.075752
0.090367 0.058769
0.090964 0.058157
0.089748 0.059631
0.089739 0.060288
0.089227 0.064481
0.089888 0.055119
0.089511 0.059206
0.089242 0.061213
0.089111 0.061570
0.092063 0.055300
0.089502 0.060598
0.088578 0.052308
0.086110 0.059129
0.085914 0.056834
0.084754 0.055252
0.086317 0.054813
0.084645 0.064556
0.085033 0.055137
0.082931 0.054086
0.084955 0.059385
0.084632 0.052532
0.085062 0.055110
0.082406 0.053513
0.082559 0.054450
0.080746 0.058334
0.081946 0.075627
0.080497 0.057141
0.080500 0.055179
0.078971 0.053432
0.079405 0.051200
0.077730 0.053164
0.078549 0.049746
0.076051 0.049622
0.077405 0.047853
0.075366 0.046746
0.077024 0.052136
0.075466 0.047198
0.074504 0.048256
0.073298 0.044842
0.072507 0.050104
0.070690 0.050632
0.070856 0.047994
0.070303 0.052489
0.070981 0.049408
0.069325 0.048902
0.070418 0.046527
0.067466 0.044350
0.068549 0.043540
0.068011 0.045525
0.066175 0.048599
0.064212 0.046466
0.063411 0.046759
0.063887 0.049415
0.062605 0.046096
0.062057 0.047598
0.062520 0.044356
0.061909 0.056894
0.063045 0.046516
0.061905 0.040300
0.063753 0.040203
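
The accuracy.txt and loss.txt logs above share the same two-column layout: one row per epoch, train value first, test value second, with a commented header. A minimal sketch for inspecting them, assuming the files sit under the Trained/ directory added by this PR:

import numpy as np
import matplotlib.pyplot as plt

# np.loadtxt skips the '# Train Test' header line automatically.
acc = np.loadtxt('tutorials/lava/lib/dl/netx/nmnist/Trained/accuracy.txt')
loss = np.loadtxt('tutorials/lava/lib/dl/netx/nmnist/Trained/loss.txt')

fig, (ax0, ax1) = plt.subplots(1, 2, figsize=(10, 4))
ax0.plot(acc[:, 0], label='Train')
ax0.plot(acc[:, 1], label='Test')
ax0.set_xlabel('Epoch')
ax0.set_ylabel('Accuracy')
ax0.legend()
ax1.plot(loss[:, 0], label='Train')
ax1.plot(loss[:, 1], label='Test')
ax1.set_xlabel('Epoch')
ax1.set_ylabel('Loss')
ax1.legend()
fig.tight_layout()
plt.show()

The last row of accuracy.txt corresponds to roughly 98.1% training and 98.6% test accuracy.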
Binary file not shown.
Binary file not shown.
117 changes: 117 additions & 0 deletions tutorials/lava/lib/dl/netx/nmnist/nmnist.py
@@ -0,0 +1,117 @@
# Copyright (C) 2022 Intel Corporation
# SPDX-License-Identifier: BSD-3-Clause
import glob
import os
import zipfile
import h5py
import numpy as np
import matplotlib.pyplot as plt
import torch

import lava.lib.dl.slayer as slayer


def augment(event):
    x_shift = 4
    y_shift = 4
    theta = 10
    xjitter = np.random.randint(2*x_shift) - x_shift
    yjitter = np.random.randint(2*y_shift) - y_shift
    ajitter = (np.random.rand() - 0.5) * theta / 180 * 3.141592654
    sin_theta = np.sin(ajitter)
    cos_theta = np.cos(ajitter)
    # Rotate about the origin using the original coordinates, then shift.
    x, y = event.x, event.y
    event.x = x * cos_theta - y * sin_theta + xjitter
    event.y = x * sin_theta + y * cos_theta + yjitter
    return event


class NMNISTDataset():
    """NMNIST dataset.

    Parameters
    ----------
    path : str, optional
        path of dataset root, by default 'data'
    train : bool, optional
        train/test flag, by default True
    sampling_time : int, optional
        sampling time of event data, by default 1
    sample_length : int, optional
        length of sample data, by default 300
    transform : None or lambda or fx-ptr, optional
        transformation method. None means no transform. By default None.
    download : bool, optional
        enable/disable automatic download, by default True
    """
    def __init__(
        self, path='data',
        train=True,
        sampling_time=1, sample_length=300,
        transform=None, download=True,
    ):
        super(NMNISTDataset, self).__init__()
        self.path = path

        if train:
            data_path = path + '/Train'
            source = 'https://www.dropbox.com/sh/tg2ljlbmtzygrag/'\
                'AABlMOuR15ugeOxMCX0Pvoxga/Train.zip'
        else:
            data_path = path + '/Test'
            source = 'https://www.dropbox.com/sh/tg2ljlbmtzygrag/'\
                'AADSKgJ2CjaBWh75HnTNZyhca/Test.zip'

        if download is True:
            attribution_text = '''
            NMNIST dataset is freely available here:
            https://www.garrickorchard.com/datasets/n-mnist

            (c) Creative Commons:
            Orchard, G.; Cohen, G.; Jayawant, A.; and Thakor, N.
            "Converting Static Image Datasets to Spiking Neuromorphic
            Datasets Using Saccades",
            Frontiers in Neuroscience, vol.9, no.437, Oct. 2015
            '''.replace(' '*12, '')
            if train is True:
                print(attribution_text)

            if len(glob.glob(f'{data_path}/')) == 0:  # dataset does not exist
                print(
                    f'NMNIST {"training" if train is True else "testing"} '
                    'dataset is not available locally.'
                )
                print('Attempting download (This will take a while) ...')
                os.system(f'wget {source} -P {self.path}/ -q --show-progress')
                print('Extracting files ...')
                with zipfile.ZipFile(data_path + '.zip') as zip_file:
                    for member in zip_file.namelist():
                        zip_file.extract(member, self.path)
                print('Download complete.')
        else:
            # With automatic download disabled, the dataset must already
            # be present on disk.
            assert len(glob.glob(f'{data_path}/')) != 0, \
                f'Dataset does not exist. Either set download=True '\
                f'or download it from '\
                f'https://www.garrickorchard.com/datasets/n-mnist '\
                f'to {data_path}/'

        self.samples = glob.glob(f'{data_path}/*/*.bin')  # TODO

        self.sampling_time = sampling_time
        self.num_time_bins = int(sample_length/sampling_time)
        self.transform = transform

    def __getitem__(self, i):
        filename = self.samples[i]
        label = int(filename.split('/')[-2])
        event = slayer.io.read_2d_spikes(filename)
        if self.transform is not None:
            event = self.transform(event)
        spike = event.fill_tensor(
            np.zeros((2, 34, 34, self.num_time_bins)),
Contributor:
I guess the only difference between the dataloader here and the dataloader for training is this line, so I would suggest deriving this NMNISTDataset class from tutorials.lava.lib.dl.slayer.nmnist.NMNISTDataset and just overriding __getitem__(). It would just be a conversion of the spike from torch to numpy: spike = spike_torch.cpu().data.numpy()

Author:
Oh, I see; I did not know that NMNIST is included in the library. I just took the NMNIST slayer tutorial as a reference. Thanks for the advice.

(A sketch of the suggested subclass follows this file's diff below.)
            sampling_time=self.sampling_time,
        )
        return spike.reshape(-1, self.num_time_bins), label

    def __len__(self):
        return len(self.samples)
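
Following the inline review above, a minimal sketch of the suggested refactor: derive the dataset from the slayer NMNIST tutorial's dataset class and override only __getitem__() so it hands back a numpy spike tensor. The import path and base-class name follow the review comment and are assumptions about the repository layout, not a tested implementation.

# Sketch only: adjust the import path to the actual module location.
from tutorials.lava.lib.dl.slayer.nmnist.nmnist import (
    NMNISTDataset as SlayerNMNISTDataset,
)


class NMNISTDataset(SlayerNMNISTDataset):
    """NMNIST dataset that yields numpy spike tensors for netx inference."""

    def __getitem__(self, i):
        # The slayer dataset returns a torch tensor; convert it to numpy so
        # it can be fed to the Lava processes used in this tutorial.
        spike_torch, label = super().__getitem__(i)
        spike = spike_torch.cpu().data.numpy()
        return spike, label


# Hypothetical usage, mirroring the constructor arguments documented above:
# testing_set = NMNISTDataset(train=False, transform=augment)
# spike, label = testing_set[0]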
