Pruning neural networks directly with back-propagation
Updated Jul 21, 2021 · Python
This repository contains a PyTorch implementation of the article "The Lottery Ticket Hypothesis: Finding Sparse, Trainable Neural Networks" and an application of this hypothesis to reinforcement learning.
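The lottery ticket procedure the repository above implements boils down to two steps: prune the smallest-magnitude weights after training, then rewind the survivors to their initial values. A minimal numpy sketch of one such round (function names `magnitude_mask` and `lottery_ticket_round` are illustrative, not the repository's API):

```python
import numpy as np

def magnitude_mask(weights, sparsity):
    """Binary mask keeping the largest-magnitude weights;
    `sparsity` is the fraction of weights to remove."""
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)  # number of weights to prune
    if k == 0:
        return np.ones_like(weights)
    threshold = np.partition(flat, k - 1)[k - 1]  # k-th smallest magnitude
    return (np.abs(weights) > threshold).astype(weights.dtype)

def lottery_ticket_round(init_weights, trained_weights, sparsity):
    """One round of iterative magnitude pruning with rewinding:
    drop the smallest trained weights, reset survivors to their
    original initialization (the candidate 'winning ticket')."""
    mask = magnitude_mask(trained_weights, sparsity)
    return mask * init_weights, mask
```

In the full procedure this round is repeated, retraining the masked network from the rewound weights each time until the target sparsity is reached.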
A study verifying the generality of the lottery ticket hypothesis on LSTMs and NLP tasks.
Reimplementation of Sparse Variational Dropout in Keras-Core/Keras 3.0
Robustness-Reinforced Knowledge Distillation with Correlation Distance and Network Pruning, IEEE Transactions on Knowledge and Data Engineering 2024
Network acceleration methods
Counting currency from video using RepNet as a base model.
Code for the project "SNIP: Single-Shot Network Pruning"
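SNIP prunes in a single shot before training: each connection is scored by the magnitude of weight times gradient on one minibatch, and only the top-scoring fraction is kept. A hedged numpy sketch of the scoring step (the function name `snip_mask` and its signature are assumptions, not the project's actual interface):

```python
import numpy as np

def snip_mask(weights, grads, keep_ratio):
    """Single-shot pruning mask in the spirit of SNIP: score each
    connection by |weight * gradient| from one minibatch taken
    before training, then keep the top `keep_ratio` fraction."""
    scores = np.abs(weights * grads)
    k = max(1, int(keep_ratio * scores.size))  # connections to keep
    threshold = np.sort(scores.ravel())[-k]    # k-th largest score
    return (scores >= threshold).astype(weights.dtype)
```

Because the mask is fixed before any training, the remaining weights can then be trained sparsely from scratch with no prune-retrain cycles.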
Make Structured Pruning Methods Smooth and Adaptive: Decay Pruning Method (DPM) is a novel smooth and dynamic pruning approach that can be seamlessly integrated with various existing structured pruning methods, providing significant improvements.
collection of works aiming at reducing model sizes or the ASIC/FPGA accelerator for machine learning
The official code for our ACCV2022 poster paper: Network Pruning via Feature Shift Minimization.
Implementation of AutoSlim using TensorFlow 2
Sparse variational dropout in TensorFlow 2
[Preprint] Why is the State of Neural Network Pruning so Confusing? On the Fairness, Comparison Setup, and Trainability in Network Pruning
Pytorch implementation of our paper (TNNLS) -- Pruning Networks with Cross-Layer Ranking & k-Reciprocal Nearest Filters
[ICCV 2017] Learning Efficient Convolutional Networks through Network Slimming
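Network Slimming selects channels rather than individual weights: the batch-norm scale factors (gamma) are trained with an L1 penalty, so small gammas mark unimportant channels that can be pruned against a global threshold. A minimal sketch of the selection step, assuming per-layer gamma vectors as input (the helper `slim_channels` is illustrative, not the paper's official code):

```python
import numpy as np

def slim_channels(gammas, prune_ratio):
    """Channel selection in the spirit of Network Slimming:
    pool |gamma| across all layers, set a global cutoff at the
    `prune_ratio` quantile, and return per-layer indices of the
    channels that survive."""
    all_g = np.abs(np.concatenate([g.ravel() for g in gammas]))
    threshold = np.quantile(all_g, prune_ratio)  # global gamma cutoff
    return [np.flatnonzero(np.abs(g) > threshold) for g in gammas]
```

Using one global threshold lets the pruning ratio vary per layer, so layers whose channels matter less end up thinner automatically.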
Channel-Prioritized Convolutional Neural Networks for Sparsity and Multi-fidelity
Reducing the computational overhead of Deep CNNs through parameter pruning and tensor decomposition.
[ICLR'23] Trainability Preserving Neural Pruning (PyTorch)