nathanhubens / fasterai

FasterAI: Prune and Distill your models with FastAI and PyTorch

Home Page: https://nathanhubens.github.io/fasterai/

License: Apache License 2.0

pruning compression fastai pytorch knowledge-distillation

fasterai's Introduction

Fasterai


Features · Installation · Tutorials · Community · Citing · License

fasterai is a library created to make neural networks smaller and faster. It relies on common network compression techniques such as pruning, knowledge distillation, and the Lottery Ticket Hypothesis.

The core feature of fasterai is its sparsifying capability, built on 4 main modules: granularity, context, criteria, and schedule. Each of these modules is highly customizable, allowing you to adapt them to your needs or even come up with your own!

Project Documentation

Visit the Read The Docs project page or read the following README to learn more about using fasterai.


Features

1. Sparsifying


Make your model sparse (i.e. prune it) according to a:

  • Sparsity: the percentage of weights that will be replaced by zeros
  • Granularity: the granularity at which you operate the pruning (removing weights, vectors, kernels, or filters)
  • Context: prune each layer independently (local pruning) or the whole model at once (global pruning)
  • Criteria: the criteria used to select which weights to remove (magnitude, movement, ...)
  • Schedule: the schedule you want to use for pruning (one-shot, iterative, gradual, ...)

This can be achieved by using the SparsifyCallback(sparsity, granularity, context, criteria, schedule).
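Conceptually, with the magnitude criterion at weight granularity in a local context, sparsifying amounts to masking the smallest weights of each layer. A minimal plain-PyTorch sketch of the idea (illustrative only, not fasterai's actual implementation):

```python
import torch
import torch.nn as nn

def sparsify_weights(weight, sparsity):
    """Zero the `sparsity` percent of weights with smallest magnitude
    (local context, weight granularity, magnitude criterion)."""
    k = int(weight.numel() * sparsity / 100)
    if k == 0:
        return weight
    threshold = weight.abs().flatten().kthvalue(k).values  # k-th smallest magnitude
    mask = (weight.abs() > threshold).float()              # keep only the larger weights
    return weight * mask

conv = nn.Conv2d(3, 8, 3)                                  # 8*3*3*3 = 216 weights
sparse_w = sparsify_weights(conv.weight.data, sparsity=50)
```

The callback applies this kind of masking during training, following the chosen schedule.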

2. Pruning


Once your model has nodes made useless by zeroed weights, they can be removed so that they are no longer part of the network.

This can be achieved by using the Pruner() method.
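As a rough illustration of what physical pruning does (a simplified sketch, not the Pruner internals): filters whose weights are all zero can be dropped by rebuilding the layer with fewer output channels:

```python
import torch
import torch.nn as nn

def prune_zero_filters(conv):
    """Return a new Conv2d keeping only filters with at least one non-zero weight."""
    w = conv.weight.data
    keep = w.flatten(1).abs().sum(1) != 0                  # surviving output filters
    new = nn.Conv2d(conv.in_channels, int(keep.sum()), conv.kernel_size,
                    bias=conv.bias is not None)
    new.weight.data = w[keep]
    if conv.bias is not None:
        new.bias.data = conv.bias.data[keep]
    return new

conv = nn.Conv2d(3, 8, 3)
conv.weight.data[:4] = 0.          # pretend the first 4 filters were sparsified away
pruned = prune_zero_filters(conv)  # a physically smaller layer with 4 output channels
```

In a real network the next layer's input channels must be shrunk accordingly, which is what the actual pruner takes care of.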

3. Regularization


Instead of explicitly making your network sparse, let it train towards sparse connections by pushing its weights to be as small as possible.

Regularization can be applied to groups of weights, following the same granularities as for sparsifying, i.e.:

  • Granularity: the granularity at which you operate the regularization (weights, vectors, kernels, filters, ...)

This can be achieved by using the RegularizationCallback(granularity).
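Under the hood this kind of regularization is typically a group-lasso-style penalty added to the training loss: one norm per group of weights, pushing whole groups toward zero together. A hedged plain-PyTorch sketch of the idea (not fasterai's exact code):

```python
import torch
import torch.nn as nn

def group_reg(model, granularity="filter"):
    """Sum of per-group norms over conv layers; penalizing it shrinks whole groups."""
    reg = torch.tensor(0.)
    for m in model.modules():
        if isinstance(m, nn.Conv2d):
            if granularity == "weight":
                reg = reg + m.weight.abs().sum()                   # plain L1 on weights
            else:
                reg = reg + m.weight.flatten(1).norm(dim=1).sum()  # one norm per filter
    return reg

model = nn.Sequential(nn.Conv2d(3, 8, 3), nn.ReLU())
penalty = 1e-4 * group_reg(model)  # would be added to the task loss at each step
```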

4. Knowledge Distillation


Distill the knowledge acquired by a big model into a smaller one, by using the KnowledgeDistillation callback.
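The classic soft-target distillation loss (Hinton et al.) that such a callback computes can be sketched in plain PyTorch; this is illustrative and not necessarily fasterai's exact formulation:

```python
import torch
import torch.nn.functional as F

def soft_target_loss(student_logits, teacher_logits, T=2.0):
    """KL divergence between temperature-softened teacher and student distributions."""
    return F.kl_div(F.log_softmax(student_logits / T, dim=-1),
                    F.softmax(teacher_logits / T, dim=-1),
                    reduction="batchmean") * T * T

s = torch.randn(4, 10)  # student logits for a batch of 4
t = torch.randn(4, 10)  # teacher logits for the same batch
loss = soft_target_loss(s, t)
```

In practice this term is blended with the ordinary cross-entropy loss on the ground-truth labels.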

5. Lottery Ticket Hypothesis


Find the winning ticket in your network, i.e. the initial subnetwork able to attain performance at least similar to that of the whole network.
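The procedure behind this idea: snapshot the weights at initialization, train and sparsify, then rewind the surviving connections to their initial values. A minimal sketch (the mask here is random, purely for illustration):

```python
import copy
import torch
import torch.nn as nn

torch.manual_seed(0)
model = nn.Linear(10, 2)
initial_state = copy.deepcopy(model.state_dict())     # snapshot at initialization

# ... training and sparsification would happen here, producing a mask ...
mask = (torch.rand_like(model.weight) > 0.5).float()  # surviving connections (illustrative)

model.load_state_dict(initial_state)                  # rewind to the initial weights
model.weight.data *= mask                             # the "winning ticket" subnetwork
```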


Quick Start

0. Import fasterai

from fasterai.sparse.all import *

1. Create your model with fastai

learn = cnn_learner(dls, model)

2. Get your fasterai callback

sp_cb = SparsifyCallback(sparsity, granularity, context, criteria, schedule)

3. Train your model to make it sparse!

learn.fit_one_cycle(n_epochs, cbs=sp_cb)

Installation

pip install git+https://github.com/nathanhubens/fasterai.git

or

pip install fasterai

Tutorials


Join the community

Join our discord server to meet other FasterAI users and share your projects!


Citing

@software{Hubens,
  author       = {Nathan Hubens},
  title        = {fasterai},
  year         = 2022,
  publisher    = {Zenodo},
  version      = {v0.1.6},
  doi          = {10.5281/zenodo.6469868},
  url          = {https://doi.org/10.5281/zenodo.6469868}
}

License

Apache-2.0 License.


fasterai's People

Contributors: nathanhubens

fasterai's Issues

[QUESTION] Apply tutorial to fine_tune output

Hi,
I'm pretty new to fastai and just got to know fasterai; I was trying to apply the concepts from the Get Started tutorial to my own code.
Currently I'm doing something like the following:

from fastai.vision import all as fst
from fastai.learner import Learner
from fastai.vision.learner import vision_learner
from fastai.tabular.core import df_shrink
# Code to get the datasets...

# Create dataloaders
data_loaders = data_block.dataloaders(tmp_dir, bs=config["batch_size"])  # config is a dictionary containing useful info
# Create model
base_model = getattr(fst, config["model"])
model = vision_learner(data_loaders, base_model, pretrained=True, lr=0.001,
                       metrics=fst.error_rate)

# Launch training
model.fine_tune(epochs)

And this works fine, I'm able to save and use the model with no issues. But then when I try to do Knowledge Distillation:

from fasterai.distill.all import *
from fastai.vision.all import *
# FasterAI tutorial replica
student = Learner(data_loaders, getattr(fst, config["model"]),
                  metrics=[accuracy])
kd_cb = KnowledgeDistillationCallback(model, SoftTarget)
student.fit_one_cycle(10, 1e-4, cbs=kd_cb)

I'm getting the following error:

File "/home/deeplearning/workspace/refactor/.venv/lib/python3.9/site-packages/fastai/torch_core.py", line 649, in trainable_params
    return [p for p in m.parameters() if p.requires_grad]
AttributeError: 'function' object has no attribute 'parameters'

Other important stuff:

  • Python version: 3.9
  • FastAI version: 2.7.11
  • FasterAI version: 0.1.13

Can you please help me?
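For what it's worth, the traceback suggests that Learner received the architecture function itself (e.g. resnet18) rather than an instantiated model: vision_learner instantiates the architecture for you, while plain Learner expects an nn.Module. A minimal illustration with a hypothetical stand-in architecture function:

```python
import torch.nn as nn

def my_arch(num_classes=2):       # stand-in for getattr(fst, config["model"])
    return nn.Sequential(nn.Flatten(), nn.Linear(784, num_classes))

arch = my_arch     # a plain function: it has no .parameters(), hence the AttributeError
model = my_arch()  # calling it yields an nn.Module, which is what Learner expects
```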

[QUESTION] Layer with no bias while pruning?

Hello again,

Continuing the example from my last issue (I'm trying to compress a resnet18 vision_learner model by following the Get Started tutorial), when performing the following operation:

# Launch training
teacher.fine_tune(epochs)

# Distill
m = globals()[config["model"]] # resnet18()
student = Learner(data_loaders, m(num_classes=int(teacher.n_out)),
                  metrics=[accuracy])
kd_cb = KnowledgeDistillationCallback(teacher.model, SoftTarget)
student.fit_one_cycle(epochs, 1e-4, cbs=kd_cb)

# Sparsify
sp_cb = SparsifyCallback(sparsity=50, granularity='filter',
                         context='global', criteria=large_final,
                         schedule=cos)
student.fit(5, 1e-5, cbs=sp_cb)

# Prune
pruner = Pruner()
pruned_model = pruner.prune_model(student.model)

I'm encountering the following error.

 File "/home/deeplearning/workspace/refactor/src/pipeline/training/training_classification.py", line 66, in train
    pruned_model = pruner.prune_model(student.model)
  File "/home/deeplearning/workspace/refactor/.venv/lib/python3.9/site-packages/fasterai/sparse/pruner.py", line 135, in prune_model
    new_m, new_next_m = self.prune_conv(m, layers[layer_names[next_conv_ix]]) # Prune the current conv layer
  File "/home/deeplearning/workspace/refactor/.venv/lib/python3.9/site-packages/fasterai/sparse/pruner.py", line 32, in prune_conv
    new_weights, new_biases, new_next_weights = self.filters_to_keep(layer, nxt_layer)
  File "/home/deeplearning/workspace/refactor/.venv/lib/python3.9/site-packages/fasterai/sparse/pruner.py", line 23, in filters_to_keep
    biases_keep = layer.bias.index_select(0, ixs[0]).data
AttributeError: 'NoneType' object has no attribute 'index_select'

Could you please give me a hand?
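For context, ResNet-style conv layers are typically created with bias=False (the following BatchNorm makes a bias redundant), so layer.bias is None, while the bias slicing in the traceback appears to assume a bias is present. A small illustration plus a None-safe version of that slicing (a sketch, not an official patch):

```python
import torch
import torch.nn as nn

conv = nn.Conv2d(64, 64, 3, bias=False)  # typical ResNet conv: no bias term
ixs = torch.tensor([0, 2, 5])            # indices of filters to keep

# conv.bias is None here, so calling .index_select on it raises AttributeError.
# A defensive version of the bias slicing:
new_bias = conv.bias.index_select(0, ixs).data if conv.bias is not None else None
```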
