lucaslie / torchprune

163 stars · 5 watchers · 23 forks · 6.78 MB

A research library for pytorch-based neural network pruning, compression, and more.

Home Page: https://people.csail.mit.edu/lucasl/

License: MIT License

Python 27.64% Dockerfile 0.06% Shell 72.29%
neural-networks filter-pruning weight-pruning coresets deep-learning generalization-ability pytorch machine-learning tensor-decomposition sparsification

torchprune's Introduction

torchprune

Main contributors of this code base: Lucas Liebenwein, Cenk Baykal.

Please check individual paper folders for authors of each paper.

Papers

This repository contains code to reproduce the results from the following papers:

| Paper | Venue | Title |
|-------|-------|-------|
| Node | NeurIPS 2021 | Sparse Flows: Pruning Continuous-depth Models |
| ALDS | NeurIPS 2021 | Compressing Neural Networks: Towards Determining the Optimal Layer-wise Decomposition |
| Lost | MLSys 2021 | Lost in Pruning: The Effects of Pruning Neural Networks beyond Test Accuracy |
| PFP | ICLR 2020 | Provable Filter Pruning for Efficient Neural Networks |
| SiPP | SIAM 2022 | SiPPing Neural Networks: Sensitivity-informed Provable Pruning of Neural Networks |

Packages

The repo also contains two stand-alone Python packages that can be used for any desired pruning experiment:

| Package | Location | Description |
|---------|----------|-------------|
| torchprune | ./src/torchprune | Run any of the implemented pruning algorithms. Also contains utilities for using pre-defined networks (or your own network) and for standard datasets. |
| experiment | ./src/experiment | Run pruning experiments and compare multiple pruning methods for different prune ratios. Each experiment is configured via a .yaml configuration file. |
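
To give a feel for the intended workflow, here is a minimal sketch of driving the torchprune package directly. The specific names below (tp.util.net.NetHandle, tp.ALDSNet, compress) are assumptions based on the package's conventions, not a verified API; consult src/torchprune/README.md for the authoritative usage.

import torch
import torchvision
import torchprune as tp  # installed via any of the setup options below

# Wrap a standard torchvision model so the pruning code can traverse it.
# NOTE: NetHandle, ALDSNet, and compress() are assumptions based on the
# package's conventions; check src/torchprune/README.md for the real API.
net = tp.util.net.NetHandle(torchvision.models.resnet18(), "resnet18")

# Small synthetic calibration set; data-informed methods need a loader and loss.
images, labels = torch.randn(32, 3, 224, 224), torch.randint(0, 1000, (32,))
loader = torch.utils.data.DataLoader(
    torch.utils.data.TensorDataset(images, labels), batch_size=8)
loss_fn = torch.nn.CrossEntropyLoss()

# Wrap the network in a pruning method (here ALDS) and compress it so that
# roughly 50% of the parameters are kept.
pruned_net = tp.ALDSNet(net, loader, loss_fn)
pruned_net.compress(keep_ratio=0.5)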

Paper Reproducibility

The code for each paper is implemented in the respective packages. In addition, each paper has a separate folder that contains additional information about the paper as well as scripts and parameter configurations to reproduce the exact results from the paper (see the example command below the table).

| Paper | Location |
|-------|----------|
| Node | paper/node |
| ALDS | paper/alds |
| Lost | paper/lost |
| PFP | paper/pfp |
| SiPP | paper/sipp |
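
For illustration, experiments are launched by pointing the experiment package's entry point at a parameter file. The invocation pattern below matches the one used throughout the repo; the specific .yaml path is only an example, and the actual configurations live in the paper folders listed above.

# run from the repository root; the yaml path is an example, see the
# paper folders above for the actual configuration files
python -m experiment.main param/cifar/prune/resnet20.yaml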

Setup

We provide three ways to install the codebase:

  1. GitHub repo + full conda environment
  2. Installation via pip
  3. Docker image

1. GitHub Repo

Clone the GitHub repo:

git clone git@github.com:lucaslie/torchprune.git
# (or your favorite way to pull a repo)

We recommend installing the packages in a separate conda environment. To create and activate a new conda environment, run:

conda create -n prune python=3.8 pip
conda activate prune

To install all required dependencies and both packages, run:

pip install -r misc/requirements.txt

Note that this will also install pre-commit hooks for clean commits :-)

2. Pip Installation

To install each package separately, with minimal dependencies and without manually cloning the repo, run the following commands:

# "torchprune" package
pip install git+https://github.com/lucaslie/torchprune/#subdirectory=src/torchprune

# "experiment" package
pip install git+https://github.com/lucaslie/torchprune/#subdirectory=src/experiment

Note that the experiment package does not automatically install the torchprune package.

3. Docker Image

You can simply pull the Docker image from our Docker Hub:

docker pull liebenwein/torchprune

You can run it interactively with

docker run -it liebenwein/torchprune bash

For your reference, the Dockerfile is included in the repo.
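
If you want data or results to persist outside the container, a common Docker pattern is to mount a host directory. The container-side path below is an arbitrary example, not one defined by the image:

# persist results on the host; /mnt/results is an example mount point
docker run -it -v "$(pwd)/results:/mnt/results" liebenwein/torchprune bash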

More Information and Usage

Check out the following READMEs in the sub-directories to find out more about using the codebase.

| README | More Information |
|--------|------------------|
| src/torchprune/README.md | How to prune neural networks, how to set up and use the datasets, how to implement custom pruning methods, and how to add your own datasets and networks. |
| src/experiment/README.md | How to configure and run your own experiments, and how to reproduce the paper results. |
| paper/node/README.md | More information on the Node paper. |
| paper/alds/README.md | More information on the ALDS paper. |
| paper/lost/README.md | More information on the Lost paper. |
| paper/pfp/README.md | More information on the PFP paper. |
| paper/sipp/README.md | More information on the SiPP paper. |

Citations

Please cite the respective papers when using our work.

@article{liebenwein2021sparse,
  title={Sparse flows: Pruning continuous-depth models},
  author={Liebenwein, Lucas and Hasani, Ramin and Amini, Alexander and Rus, Daniela},
  journal={Advances in Neural Information Processing Systems},
  volume={34},
  pages={22628--22642},
  year={2021}
}

@inproceedings{liebenwein2021alds,
  author={Lucas Liebenwein and Alaa Maalouf and Dan Feldman and Daniela Rus},
  booktitle={Advances in Neural Information Processing Systems},
  title={Compressing Neural Networks: Towards Determining the Optimal Layer-wise Decomposition},
  url={https://arxiv.org/abs/2107.11442},
  volume={34},
  year={2021}
}

@article{liebenwein2021lost,
  title={Lost in Pruning: The Effects of Pruning Neural Networks beyond Test Accuracy},
  author={Liebenwein, Lucas and Baykal, Cenk and Carter, Brandon and Gifford, David and Rus, Daniela},
  journal={Proceedings of Machine Learning and Systems},
  volume={3},
  year={2021}
}

@inproceedings{liebenwein2020provable,
  title={Provable Filter Pruning for Efficient Neural Networks},
  author={Lucas Liebenwein and Cenk Baykal and Harry Lang and Dan Feldman and Daniela Rus},
  booktitle={International Conference on Learning Representations},
  year={2020},
  url={https://openreview.net/forum?id=BJxkOlSYDH}
}

SiPPing Neural Networks (Weight Pruning)

@article{baykal2022sensitivity,
  title={Sensitivity-informed provable pruning of neural networks},
  author={Baykal, Cenk and Liebenwein, Lucas and Gilitschenski, Igor and Feldman, Dan and Rus, Daniela},
  journal={SIAM Journal on Mathematics of Data Science},
  volume={4},
  number={1},
  pages={26--45},
  year={2022},
  publisher={SIAM}
}


torchprune's Issues

How to use ALDS in my model

I tried to use ALDS on my own network, but I got this error: 'Mymodel' object has no attribute 'compressible_layers'.
I followed the process you recommend, but I still run into this problem.
Thank you!

What is the code runtime environment?

I encountered the following error while installing the torchprune package. I have tried many fixes but cannot resolve it:

"Installed e:\app\aconda\envs\prune\lib\site-packages\torchprune-2.0.0-py3.8.egg
Processing dependencies for torchprune==2.0.0
Traceback (most recent call last):
File "E:\app\aconda\envs\prune\lib\site-packages\pkg_resources_init_.py", line 3109, in _dep_map
return self._dep_map
File "E:\app\aconda\envs\prune\lib\site-packages\pkg_resources_init
.py", line 2902, in getattr
raise AttributeError(attr)
AttributeError: _DistInfoDistribution__dep_map

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "E:\app\aconda\envs\prune\lib\site-packages\pkg_resources_vendor\packaging\requirements.py", line 35, in init
parsed = _parse_requirement(requirement_string)
File "E:\app\aconda\envs\prune\lib\site-packages\pkg_resources_vendor\packaging_parser.py", line 64, in parse_requirement
return _parse_requirement(Tokenizer(source, rules=DEFAULT_RULES))
File "E:\app\aconda\envs\prune\lib\site-packages\pkg_resources_vendor\packaging_parser.py", line 82, in _parse_requirement
url, specifier, marker = _parse_requirement_details(tokenizer)
File "E:\app\aconda\envs\prune\lib\site-packages\pkg_resources_vendor\packaging_parser.py", line 120, in _parse_requirement_details
specifier = _parse_specifier(tokenizer)
File "E:\app\aconda\envs\prune\lib\site-packages\pkg_resources_vendor\packaging_parser.py", line 216, in _parse_specifier
parsed_specifiers = _parse_version_many(tokenizer)
File "E:\app\aconda\envs\prune\lib\site-packages\pkg_resources_vendor\packaging_parser.py", line 231, in _parse_version_many
tokenizer.raise_syntax_error(
File "E:\app\aconda\envs\prune\lib\site-packages\pkg_resources_vendor\packaging_tokenizer.py", line 165, in raise_syntax_error
raise ParserSyntaxError(
pkg_resources.extern.packaging._tokenizer.ParserSyntaxError: .* suffix can only be used with == or != operators
numpy (>=1.19.*) ; python_version >= "3.7"
~~~~~~~^

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "setup.py", line 12, in
setuptools.setup(
File "E:\app\aconda\envs\prune\lib\site-packages\setuptools_init_.py", line 107, in setup
return distutils.core.setup(*attrs)
File "E:\app\aconda\envs\prune\lib\site-packages\setuptools_distutils\core.py", line 185, in setup
return run_commands(dist)
File "E:\app\aconda\envs\prune\lib\site-packages\setuptools_distutils\core.py", line 201, in run_commands
dist.run_commands()
File "E:\app\aconda\envs\prune\lib\site-packages\setuptools_distutils\dist.py", line 969, in run_commands
self.run_command(cmd)
File "E:\app\aconda\envs\prune\lib\site-packages\setuptools\dist.py", line 1233, in run_command
super().run_command(command)
File "E:\app\aconda\envs\prune\lib\site-packages\setuptools_distutils\dist.py", line 988, in run_command
cmd_obj.run()
File "E:\app\aconda\envs\prune\lib\site-packages\setuptools\command\install.py", line 84, in run
self.do_egg_install()
File "E:\app\aconda\envs\prune\lib\site-packages\setuptools\command\install.py", line 140, in do_egg_install
cmd.run(show_deprecation=False)
File "E:\app\aconda\envs\prune\lib\site-packages\setuptools\command\easy_install.py", line 442, in run
self.easy_install(spec, not self.no_deps)
File "E:\app\aconda\envs\prune\lib\site-packages\setuptools\command\easy_install.py", line 690, in easy_install
return self.install_item(None, spec, tmpdir, deps, True)
File "E:\app\aconda\envs\prune\lib\site-packages\setuptools\command\easy_install.py", line 740, in install_item
self.process_distribution(spec, dist, deps)
File "E:\app\aconda\envs\prune\lib\site-packages\setuptools\command\easy_install.py", line 788, in process_distribution
distros = WorkingSet([]).resolve(
File "E:\app\aconda\envs\prune\lib\site-packages\pkg_resources_init_.py", line 834, in resolve
new_requirements = dist.requires(req.extras)[::-1]
File "E:\app\aconda\envs\prune\lib\site-packages\pkg_resources_init_.py", line 2822, in requires
dm = self.dep_map
File "E:\app\aconda\envs\prune\lib\site-packages\pkg_resources_init
.py", line 3111, in _dep_map
self.__dep_map = self.compute_dependencies()
File "E:\app\aconda\envs\prune\lib\site-packages\pkg_resources_init
.py", line 3121, in compute_dependencies
reqs.extend(parse_requirements(req))
File "E:\app\aconda\envs\prune\lib\site-packages\pkg_resources_init
.py", line 3174, in init
super(Requirement, self).init(requirement_string)
File "E:\app\aconda\envs\prune\lib\site-packages\pkg_resources_vendor\packaging\requirements.py", line 37, in init
raise InvalidRequirement(str(e)) from e
pkg_resources.extern.packaging.requirements.InvalidRequirement: .
suffix can only be used with == or != operators
numpy (>=1.19.*) ; python_version >= "3.7"
~~~~~~~^"

Generalization of the Repo

Hi, thank you for your research; it is quite interesting.
By the way, could I ask whether this repo generalizes to other CNN models, i.e., is it already plug-and-play for other models?

Thanks in advance :)

FileNotFoundError: This parameter file does not exist

Thanks for sharing!
I want to reproduce the ALDS experiments.
When I execute the command python -m experiment.main param/cifar/prune/resnet20.yaml, I get the following error: FileNotFoundError: This parameter file does not exist.
I tried to trace the code but still can't solve the problem.
Maybe I misunderstood some part; I hope you can clear it up for me, thank you very much!

Reproduce the informative features analysis?

Dear authors,

Thanks for the nice paper and code repo. Is there a way to use this code to reproduce the informative-features analysis (Lost in Pruning / what is preserved)?

Thanks in advance!

export to onnx

Hi
Thanks for sharing this awesome code!

I noticed that some custom PyTorch modules are used in the trained PyTorch models.
I'm wondering whether one could export the fine-tuned model to ONNX after, for example, applying the ALDS method?
Or is there any plan to support this?

It would be great if the trained model could be exported to ONNX for deployment.

code for reproducing your results

Hi,

I would like to run the sparse flows method proposed in your paper "Sparse Flows: Pruning Continuous-depth Models".
Could you please share links to some of the examples from the paper?

Thanks

where is the lib?

Hi, when I use torchprune, it shows ModuleNotFoundError: No module named 'torchprune.util.external.ffjord.lib'.
I can't find the lib folder in the code; maybe git ignored the lib files?

ResNet56 architecture

src/provable_pruning/provable_pruning/util/models/cnn/models/cifar/resnet.py

In your code for ResNet on the CIFAR dataset, block = Bottleneck if depth >= 44 else BasicBlock. This point is really confusing.

Are you sure you used Bottleneck and not BasicBlock when you conducted your experiments? As far as I know, Bottleneck is designed for ResNets on ImageNet data, like ResNet50. I read your paper recently, and I am now reading your code. Could you please explain what your ResNet56 architecture looks like?

Discussion on the role of some classes

Hi,

I'm trying to understand the structure of the core pruning classes. I wanted to start this thread to see if you can explain the roles of the classes below and how they generally interact with the main net class:

  • Allocator
  • Pruner
  • Sparsifier
  • Tracker

Any other extra information is also much appreciated.
