Easily create and optimize PyTorch networks as in the Deep Rewiring paper (https://igi-web.tugraz.at/PDF/241.pdf). Install using 'pip install deep_rewire'
Robustness of Sparse Multilayer Perceptrons for Supervised Feature Selection
Neural Network Sparsification via Pruning
Adaptive Sparsity Level during Training for Efficient Time Series Forecasting with Transformers
Master's Thesis Project - Lottery Tickets contain independent subnetworks when trained on independent tasks.
Neural Networks with Sparse Weights in Rust using GPUs, CPUs, and FPGAs via CUDA, OpenCL, and oneAPI
Simple C++ implementation of a sparsely connected multi-layer neural network using OpenMP and CUDA for parallelization.
Sparse Matrix Library for GPUs, CPUs, and FPGAs via CUDA, OpenCL, and oneAPI
Characterization study repository for pruning, a popular way to compress a DL model. This repo also investigates optimal sparse tensor layouts for pruned networks.
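As background for the pruning-related repositories above: magnitude pruning compresses a model by zeroing the weights with the smallest absolute value. A minimal sketch using PyTorch's built-in `torch.nn.utils.prune` module (not the specific code of any repo listed here):

```python
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

def magnitude_prune(layer: nn.Linear, amount: float) -> nn.Linear:
    """Zero the `amount` fraction of weights with smallest |w| (L1 unstructured pruning)."""
    prune.l1_unstructured(layer, name="weight", amount=amount)
    prune.remove(layer, "weight")  # bake the mask into the weight tensor
    return layer

layer = nn.Linear(64, 64)
magnitude_prune(layer, amount=0.5)
sparsity = float((layer.weight == 0).float().mean())
print(f"weight sparsity: {sparsity:.2f}")  # ≈ 0.50
```

Note that pruned weights are stored as explicit zeros in a dense tensor; realizing memory or speed gains from that sparsity requires a suitable sparse tensor layout, which is exactly what the characterization study above investigates.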
This is the repository for the SNN-22 Workshop paper on "Generalization and Memorization in Sparse Neural Networks".
Official implementation of "Sparser spiking activity can be better: Feature Refine-and-Mask spiking neural network for event-based visual recognition" (Neural Networks 2023)
Implementation of the article "The Lottery Ticket Hypothesis: Finding Sparse, Trainable Neural Networks"
A neural net with a terminal-based testing program.
[ICLR 2022] "Peek-a-Boo: What (More) is Disguised in a Randomly Weighted Neural Network, and How to Find It Efficiently", by Xiaohan Chen, Jason Zhang and Zhangyang Wang.
[Machine Learning Journal (ECML-PKDD 2022 journal track)] A Brain-inspired Algorithm for Training Highly Sparse Neural Networks
[TMLR] Supervised Feature Selection with Neuron Evolution in Sparse Neural Networks
PyTorch Implementation of TopKAST
Implementation for the paper "SpaceNet: Make Free Space For Continual Learning" in PyTorch.
Code for "Variational Depth Search in ResNets" (https://arxiv.org/abs/2002.02797)
Add a description, image, and links to the sparse-neural-networks topic page so that developers can more easily learn about it.
To associate your repository with the sparse-neural-networks topic, visit your repo's landing page and select "manage topics."