I tried using psp0 to convert the inputs into binary spikes; the psp0 output is then fed to the SDNN layers.
Is there a way to verify that the input was actually converted? I don't see much of a difference while training: whether the network takes in normal channels or binary spikes, it computes at the same rate.
My forward operation looks something like this. Secondly, I can't change the number of output channels, so I have to create separate blocks.
```python
self.blocks = torch.nn.ModuleList([
    slayer.block.sigma_delta.Conv(sdnn_params1, 2, 2, 3, padding=1),
    slayer.block.sigma_delta.Conv(sdnn_params1, 2, 2, 1, padding=0),
])
```
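On the output-channel question: assuming the Conv block's positional arguments map to `(neuron_params, in_channels, out_channels, kernel_size)`, the channel count can differ per block as long as each block's input channels match the previous block's output. A plain-PyTorch stand-in (hypothetical shapes, `torch.nn.Conv2d` substituted for the slayer block) sketches the idea:

```python
import torch

# Torch-only stand-in for chaining conv blocks with *different* channel counts.
# Assumption: slayer.block.sigma_delta.Conv(params, in_ch, out_ch, kernel, ...)
# takes channel counts the same way these Conv2d layers do.
blocks = torch.nn.ModuleList([
    torch.nn.Conv2d(2, 8, 3, padding=1),  # 2 -> 8 channels
    torch.nn.Conv2d(8, 4, 1, padding=0),  # 8 -> 4 channels
])

x = torch.rand(1, 2, 16, 16)  # B, C, H, W (illustrative shape)
for block in blocks:
    x = block(x)  # each block consumes the previous block's output

print(x.shape)  # torch.Size([1, 4, 16, 16])
```

The key constraint is only that consecutive blocks agree on channel count, not that every block share the same `out_channels`.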
```python
def forward(self, x):
    count = []
    event_cost = 0
    device = torch.device("cuda")  # or "cpu" to run on the CPU
    scale = 1  # use 1 << 12 as the scale factor for integer simulation
    decay = torch.FloatTensor([0.1 * scale]).to(device)
    initial_state = torch.FloatTensor([0]).to(device)
    threshold = 0.5
    B, C, H, W, T = x.shape
    psp = slayer.neuron.dynamics.leaky_integrator._li_dynamics_fwd(
        x, decay=decay, state=initial_state, w_scale=scale, threshold=threshold
    )
    x = psp
    for block in self.blocks:
        x = block(x)  # chain blocks: feed each block the previous output, not psp
        if hasattr(block, 'neuron'):
            event_cost += event_rate_loss(x)
            count.append(
                torch.sum((torch.abs(x[..., 1:]) > 0).to(x.dtype)).item()
            )
    return x, event_cost, torch.FloatTensor(count).reshape((1, -1)).to(x.device)
```
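One quick way to check that the conversion really produced binary spikes is to inspect the tensor itself: a binary spike train should contain only the values 0 and 1 (up to floating-point tolerance), while a graded input will have many intermediate values. A minimal torch-only sketch (the `is_binary_spikes` helper and the shapes here are made up for illustration, not part of the lava-dl API):

```python
import torch

def is_binary_spikes(t: torch.Tensor, atol: float = 1e-6) -> bool:
    """True if every element of `t` is numerically 0 or 1."""
    return bool(torch.all((t.abs() < atol) | ((t - 1).abs() < atol)))

# Example: a graded (non-binary) tensor vs. a thresholded, binarized one.
graded = torch.rand(2, 3, 8, 8, 16)  # B, C, H, W, T
spikes = (graded > 0.5).float()      # values in {0, 1}

print(is_binary_spikes(graded))  # False
print(is_binary_spikes(spikes))  # True
```

Running the same check on the tensor your conversion step returns (e.g. the `psp` in the forward pass above) would show directly whether it is binary or still graded; `torch.unique(psp)` is another quick diagnostic.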
Hi @bamsumit @tangores,
Originally posted by @Kristi1217 in #225 (comment)