The lava-dl library now supports automated generation of Lava processes for a trained network described by an HDF5 network configuration file, using our Network Exchange (NetX) library.
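As a rough sketch, a trained network exported to the HDF5 network exchange format can be turned into a Lava process with NetX as shown below. The file name `network.net` is a hypothetical example, and the snippet assumes lava-dl is installed:

```python
# Illustrative sketch: generate a Lava process from a trained network
# stored in an HDF5 network configuration file ('network.net' is hypothetical).
from lava.lib.dl import netx

# netx.hdf5.Network builds the Lava process hierarchy (layers, neurons,
# connections) described by the HDF5 network configuration.
net = netx.hdf5.Network(net_config='network.net')

# The resulting object is an ordinary Lava process and can be inspected
# or wired into a larger process graph via its in/out ports.
print(net)
```
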
- Released the Network Exchange (NetX) library to support automated creation of Lava processes for a deep network. We support the HDF5 network exchange format; support for more formats will be introduced in future releases. (PR #30, Issue #29)
- Fixed a bug with the pre-hook quantization function on conv blocks (PR #13)
- No breaking changes in this release
- Known issue: training with GPU for lava-dl SLAYER on Windows machines.
- Create PULL_REQUEST_TEMPLATE.md & ISSUE_TEMPLATE.md by @mgkwill in lava-nc#27
- Hardware neuron parameter exchange and fixed-precision instruction compatibility by @bamsumit in lava-nc#25
- Pilotnet link fix by @bamsumit in lava-nc#31
- Bugfix: CUBA neuron normalization applied to current state by @bamsumit in lava-nc#35
- Netx by @bamsumit in lava-nc#30
- Streamline PilotNet SNN notebook with RefPorts by @bamsumit in lava-nc#37
- Fix for failing tests/lava/lib/dl/netx/test_hdf5.py by @bamsumit in lava-nc#44
- Update ci-build.yml by @mgkwill in lava-nc#42
- Install by @mgkwill in lava-nc#45
Full Changelog: https://github.com/lava-nc/lava-dl/compare/v0.1.1...v0.2.0