
Neural-Network-Pruning

Exploring model compression algorithms and experimenting with new ideas for NN pruning.

related-work

High-Level Approaches

1: Using a GAN to learn the distribution of 'Good Pruning Masks'.

Use existing algorithms such as Iterative Magnitude Pruning (IMP) to generate a variety of channel-pruning masks for a given network, each achieving reasonable test accuracy. Then train a GAN to learn the distribution of these good-enough masks; a small sketch of the mask-collection step follows below.

Details and Experiments.
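As a rough illustration of the mask-collection step (not the repository's actual pipeline), the sketch below assumes a small placeholder PyTorch CNN and a simple L1-magnitude channel criterion standing in for an IMP prune-retrain loop; each round yields one binary channel mask, and the stacked masks form the dataset a GAN could be trained on.

```python
import torch
import torch.nn as nn

def channel_mask(conv: nn.Conv2d, keep_ratio: float) -> torch.Tensor:
    """Binary mask over output channels, keeping the highest-L1-norm filters."""
    scores = conv.weight.detach().abs().sum(dim=(1, 2, 3))  # one score per filter
    k = max(1, int(keep_ratio * scores.numel()))
    mask = torch.zeros_like(scores)
    mask[scores.topk(k).indices] = 1.0
    return mask

# Placeholder model; in the actual experiments this would be the network being pruned.
model = nn.Sequential(nn.Conv2d(3, 16, 3), nn.ReLU(), nn.Conv2d(16, 32, 3))

# Each prune round (e.g. one IMP iteration, with retraining omitted here) yields one
# concatenated channel-mask vector; a collection of such vectors from many runs is
# the "good mask" dataset the GAN would learn from.
masks = []
for keep_ratio in (0.9, 0.7, 0.5):
    round_mask = torch.cat([channel_mask(m, keep_ratio)
                            for m in model.modules() if isinstance(m, nn.Conv2d)])
    masks.append(round_mask)

mask_dataset = torch.stack(masks)  # shape: (num_masks, total_channels)
print(mask_dataset.shape)          # torch.Size([3, 48])
```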

2: Enhancing AutoML-based pruning by using 'Learned Filter Pruning Criteria' instead of a purely magnitude-based criterion.

The idea is essentially to combine two papers: Learning Filter Pruning Criteria for Deep Convolutional Neural Networks Acceleration (LFPC, CVPR 2020) and AMC: AutoML for Model Compression and Acceleration on Mobile Devices (ECCV 2018).
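A minimal structural sketch of how the two ideas could fit together, assuming a toy PyTorch model: the per-layer prune ratios stand in for an AMC-style agent's actions, and the pluggable scoring functions stand in for LFPC's learned, per-layer choice of criterion. The model, ratios, and function names are illustrative placeholders, not the papers' implementations.

```python
import torch
import torch.nn as nn
from typing import Callable

# A pluggable filter-scoring criterion: plain AMC-style pruning fixes one magnitude
# rule, while the learned-criteria idea selects a criterion per layer.
Criterion = Callable[[torch.Tensor], torch.Tensor]

def l1_criterion(weight: torch.Tensor) -> torch.Tensor:
    return weight.abs().sum(dim=(1, 2, 3))

def l2_criterion(weight: torch.Tensor) -> torch.Tensor:
    return weight.pow(2).sum(dim=(1, 2, 3)).sqrt()

def prune_filters(conv: nn.Conv2d, prune_ratio: float, criterion: Criterion) -> None:
    """Zero out the lowest-scoring filters in place (structured channel pruning)."""
    scores = criterion(conv.weight.detach())
    n_prune = int(prune_ratio * scores.numel())
    if n_prune == 0:
        return
    drop = scores.topk(n_prune, largest=False).indices
    with torch.no_grad():
        conv.weight[drop] = 0.0
        if conv.bias is not None:
            conv.bias[drop] = 0.0

# Placeholder model, per-layer ratios (standing in for an AMC agent's actions),
# and per-layer criteria (standing in for the learned choice of criterion).
model = nn.Sequential(nn.Conv2d(3, 16, 3), nn.ReLU(), nn.Conv2d(16, 32, 3))
plan = [(model[0], 0.25, l1_criterion), (model[2], 0.50, l2_criterion)]
for conv, ratio, crit in plan:
    prune_filters(conv, ratio, crit)
```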

Useful Links

  1. https://pytorch.org/tutorials/intermediate/pruning_tutorial.html (a minimal usage sketch follows this list)
  2. https://pytorch.org/tutorials/beginner/blitz/cifar10_tutorial.html
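For quick reference against the first tutorial link, a minimal example of PyTorch's built-in `torch.nn.utils.prune` utilities; `ln_structured` with `dim=0` corresponds to channel (filter) pruning. The layer here is a throwaway example module.

```python
import torch.nn as nn
import torch.nn.utils.prune as prune

conv = nn.Conv2d(3, 16, kernel_size=3)  # throwaway example layer

# Unstructured: zero the 30% of individual weights with smallest L1 magnitude.
prune.l1_unstructured(conv, name="weight", amount=0.3)

# Structured: zero 50% of output channels (filters) by L2 norm, i.e. channel pruning.
prune.ln_structured(conv, name="weight", amount=0.5, n=2, dim=0)

# Make the pruning permanent: fold the mask into `weight` and remove the reparametrization.
prune.remove(conv, "weight")
```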

NN Pruning and Dataset Distillation

  1. https://arxiv.org/pdf/2006.05929.pdf
  2. https://openaccess.thecvf.com/content/CVPR2022/papers/Cazenavette_Dataset_Distillation_by_Matching_Training_Trajectories_CVPR_2022_paper.pdf
