Lava Deep Learning 0.5.0 #265
mgkwill announced in Announcements
Lava Deep Learning v0.5.0 Release Notes
November 9, 2023
What's Changed
New Features and Improvements
Lava-dl SLAYER now has extended support for training and inference of video object detection networks, along with the associated pre- and post-processing utilities used for object detection. The object detection module is available as `lava.lib.dl.slayer.obd` and consists of the following submodules (a usage sketch follows the list):

- `obd.yolo_base`
- `obd.models`
- `obd.dataset`
- `obd.bbox.metrics`
- `obd.{bbox, dataset}.utils`

Extensive tutorials for object detection are also available.
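As a minimal sketch of how the pieces fit together; the model and dataset constructor names below are assumptions and may differ from the actual `obd` API, so see the object detection tutorials for the real entry points:

```python
from lava.lib.dl.slayer import obd  # object detection module added in 0.5.0

# Hypothetical model from obd.models and dataset from obd.dataset;
# the class names and arguments here are illustrative assumptions.
model = obd.models.TinyYOLOv3(num_classes=11)   # assumed constructor
dataset = obd.dataset.BDD(root='data/bdd100k')  # assumed constructor

frames, targets = dataset[0]              # video frames and bounding-box targets
predictions = model(frames.unsqueeze(0))  # batched forward pass
```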
In addition, the lava-dl SLAYER tutorials now include an XOR regression tutorial as a basic example for getting started with lava-dl training.
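For orientation, here is a minimal training sketch in the spirit of that tutorial; the CUBA neuron parameters, network sizes, and spike encoding are illustrative choices, not the tutorial's exact values:

```python
import torch
import lava.lib.dl.slayer as slayer

# Illustrative CUBA neuron parameters; the tutorial's values may differ.
neuron_params = {'threshold': 1.0, 'current_decay': 0.5, 'voltage_decay': 0.5}

# Two-layer spiking network: 2 inputs -> 8 hidden -> 1 output.
net = torch.nn.Sequential(
    slayer.block.cuba.Dense(neuron_params, 2, 8),
    slayer.block.cuba.Dense(neuron_params, 8, 1),
)
optimizer = torch.optim.Adam(net.parameters(), lr=0.01)

# XOR inputs rate-encoded as spike trains of shape (batch, neurons, time).
rates = torch.tensor([[0., 0.], [0., 1.], [1., 0.], [1., 1.]]) * 0.5 + 0.1
x = (torch.rand(4, 2, 100) < rates.unsqueeze(-1)).float()
y = torch.tensor([[0.], [1.], [1.], [0.]])  # target output rates

out = net(x)                                # output spikes (batch, 1, time)
loss = torch.nn.functional.mse_loss(out.mean(dim=-1), y)
loss.backward()
optimizer.step()
```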
Finally, lava-dl SLAYER now supports SpikeMoid loss, the official implementation of the spike-based loss of the same name, which enables more advanced tuning of SNNs for classification.
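A hedged usage sketch, assuming the loss is exposed as `slayer.loss.SpikeMoid` with default parameters; the constructor name and arguments should be verified against the lava-dl documentation:

```python
import torch
import lava.lib.dl.slayer as slayer

# Assumed entry point and defaults; verify against the lava-dl docs.
criterion = slayer.loss.SpikeMoid()

spikes = (torch.rand(4, 10, 100) > 0.9).float()  # (batch, classes, time)
labels = torch.randint(0, 10, (4,))              # class indices
loss = criterion(spikes, labels)
```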
Lava-dl NetX now lets users configure inference of fully connected layers with sparse synapses instead of the default dense synapses. This allows the network to leverage the compression offered by sparse synapses when the fully connected weights are sparse enough. It is as simple as setting `sparse_fc_layer=True` when initializing a `netx.hdf5.Network`.

`netx.hdf5.Network` also supports global control of the spike exponent (the fractional portion of the spike message) via the `spike_exp` keyword. This gives users finer-grained control over network behavior and can help avoid data overflow on Loihi hardware. A combined usage sketch follows.
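A minimal sketch combining both options; the HDF5 file name is hypothetical, and the `spike_exp` value is an arbitrary illustrative choice:

```python
from lava.lib.dl import netx

net = netx.hdf5.Network(
    net_config='network.net',  # hypothetical path to a trained network export
    sparse_fc_layer=True,      # infer fully connected layers as sparse synapses
    spike_exp=6,               # illustrative global spike exponent
)
```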
In addition, lava-dl NetX now includes sequential modules in `netx.modules`. These modules allow the creation of PyTorch-style callable constructs whose behavior is described in their `forward` function. They can also run non-critical but expensive management tasks between calls in a parallel thread so that the main execution flow is not blocked.
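The following is a conceptual sketch of the pattern these modules enable, not the actual `netx.modules` base class: the module's computation lives in `forward`, and expensive but non-critical housekeeping runs on a background thread.

```python
import threading

class SequentialModule:
    """Conceptual stand-in for a netx.modules-style callable (not the real API)."""

    def __call__(self, *args, **kwargs):
        result = self.forward(*args, **kwargs)
        # Offload non-critical management so the caller is not blocked.
        threading.Thread(target=self.housekeeping, daemon=True).start()
        return result

    def forward(self, x):
        return x  # override with the module's actual computation

    def housekeeping(self):
        pass  # e.g., buffer management or logging between calls
```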
`netx.modules.Quantize` and `netx.modules.Dequantize` are now pre-built to provide consistent quantization and dequantization to and from the fixed-precision representation used in the NetX network. Their usage can be seen in the YOLO SDNN inference on Lava and Loihi tutorial.
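A hedged sketch of the intended round trip; the constructor arguments are assumptions, and the YOLO SDNN inference tutorial shows the authoritative usage:

```python
import torch
from lava.lib.dl import netx

quantize = netx.modules.Quantize(exp=6)      # assumed argument name and value
dequantize = netx.modules.Dequantize(exp=6)  # assumed argument name and value

frame = torch.rand(3, 448, 448)  # illustrative input
fixed = quantize(frame)          # float -> fixed-precision representation
restored = dequantize(fixed)     # fixed-precision -> float
```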
Bug Fixes and Other Changes

Breaking Changes
Known Issues
New Contributors
Full Changelog: v0.4.0...v0.5.0