
Label-free-Network-Compression

Caffe implementation of "Learning Compression from Limited Unlabeled Data" (ECCV2018). Quantizing full-precision deep neural networks to 4-bit using only 1K unlabeled images.

How to use?

Part I. Create Quantized Model and Prototxt
# Python2.7
cd ./python
vim config.py # edit pycaffe_path / model_name / train_dataset path / val_dataset path according to your env
python weights_quan.py # quantize weights to 4-bit
python renorm.py # Batch-Norm re-normalization in CPU mode
python activations_quan.py # quantize activations to 8-bit
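
For orientation, the sketch below illustrates the kind of per-output-channel 4-bit weight quantization that weights_quan.py performs. The function name and the exact quantizer (symmetric, uniform, max-abs scaling) are assumptions and may differ from the repository's implementation.

# Hedged sketch (not the repository's exact code): symmetric uniform 4-bit
# quantization of a Caffe conv weight blob, one scaling factor per output channel.
import numpy as np

def quantize_weights_4bit(W, n_bits=4):
    """W: float32 array of shape (out_channels, in_channels, kh, kw)."""
    q_max = 2 ** (n_bits - 1) - 1                # 7 for signed 4-bit
    W_flat = W.reshape(W.shape[0], -1)
    scale = np.abs(W_flat).max(axis=1) / q_max   # per-channel scale alpha
    scale[scale == 0] = 1.0                      # guard against all-zero channels
    W_q = np.clip(np.round(W_flat / scale[:, None]), -q_max - 1, q_max)
    return W_q.reshape(W.shape), scale           # integer-valued weights and alpha

The quantized integer weights would then be written back into the Conv blob and alpha into the newly-added scale layer described under "Network Structure" below.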
Part II. Test on validation set
  1. Add act_quantize.cpp and act_quantize.cu to your_caffe_root/src/caffe/layers/.
  2. Add act_quantize.hpp to your_caffe_root/include/caffe/layers/.
  3. make all -j2
  4. make pycaffe
  5. ./build/tools/caffe test --weights /your/BN_quantized_caffemodel/in/config.py --model /your/val_prototxt/in/config.py --gpu XX --iterations 1000 # val_batch_size = 50 by default (Line 10 in config.py)
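
If you prefer to drive the evaluation from pycaffe rather than the caffe binary, a minimal sketch is shown below. The accuracy blob names ('accuracy', 'accuracy_top5') and the placeholder paths are assumptions that depend on your validation prototxt.

# Hedged pycaffe sketch of the same evaluation; blob names and paths are assumptions.
import caffe

caffe.set_mode_gpu()
caffe.set_device(0)

net = caffe.Net('/your/val_prototxt/in/config.py',            # model definition
                '/your/BN_quantized_caffemodel/in/config.py', # quantized weights
                caffe.TEST)

iters = 1000                        # 1000 iterations x val_batch_size 50 = 50K images
top1 = top5 = 0.0
for _ in range(iters):
    out = net.forward()
    top1 += float(out['accuracy'])
    top5 += float(out['accuracy_top5'])
print('Top-1: %.2f%%  Top-5: %.2f%%' % (100 * top1 / iters, 100 * top5 / iters))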
WARNING:

renorm.py uses 1K images to update the Batch-Norm parameters by default. The memory consumption can be quite large for deep networks (>12 GB). You can edit Line 8 in config.py to reduce it.
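
Conceptually, the re-normalization re-estimates each BatchNorm layer's mean and variance from the ~1K unlabeled images, which can be done in small batches to bound memory. The sketch below illustrates this idea with pycaffe; it is an assumption about what renorm.py does, not a copy of it, and details such as the preprocessed images array are hypothetical.

# Hedged sketch: re-estimate BatchNorm statistics from a small unlabeled set,
# processing the images in small batches to limit memory. A faithful
# implementation may need to update layers sequentially or run BN in training mode.
import numpy as np
import caffe

def renorm_bn(net, images, batch_size=50):
    bn_layers = [name for name, layer in zip(net._layer_names, net.layers)
                 if layer.type == 'BatchNorm']
    bottoms = {name: net.bottom_names[name][0] for name in bn_layers}  # BN input blobs
    sums = {name: 0.0 for name in bn_layers}
    sqs = {name: 0.0 for name in bn_layers}
    count = {name: 0 for name in bn_layers}

    for start in range(0, len(images), batch_size):
        batch = np.asarray(images[start:start + batch_size])  # preprocessed (N, 3, 224, 224)
        net.blobs['data'].reshape(*batch.shape)
        net.blobs['data'].data[...] = batch
        net.forward()
        for name in bn_layers:
            x = net.blobs[bottoms[name]].data                 # (N, C, H, W)
            sums[name] += x.sum(axis=(0, 2, 3))
            sqs[name] += (x ** 2).sum(axis=(0, 2, 3))
            count[name] += x.shape[0] * x.shape[2] * x.shape[3]

    for name in bn_layers:
        mean = sums[name] / count[name]
        var = sqs[name] / count[name] - mean ** 2
        net.params[name][0].data[...] = mean                  # running mean
        net.params[name][1].data[...] = var                   # running variance
        net.params[name][2].data[...] = 1.0                   # moving-average scale factor
    # call net.save('renormalized.caffemodel') afterwards to persist the new statistics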

Results:

Models     | Weights | Activations | Top-1 (%) | Top-5 (%)
-----------|---------|-------------|-----------|----------
AlexNet-BN | 32-bit  | 32-bit      | 60.43     | 82.47
ReNorm     | 4-bit   | 8-bit       | 60.12     | 82.22
ResNet-18  | 32-bit  | 32-bit      | 69.08     | 89.03
ReNorm     | 4-bit   | 8-bit       | 67.48     | 88.02
ResNet-50  | 32-bit  | 32-bit      | 75.30     | 92.11
ReNorm     | 4-bit   | 8-bit       | 73.82     | 91.33
VGG16-BN   | 32-bit  | 32-bit      | 70.44     | 89.94
ReNorm     | 4-bit   | 8-bit       | 69.15     | 89.52
Details:
  1. We report 224x224 single-crop accuracy (crops taken from 256xN / Nx256 resized images) on the ImageNet validation set. BN parameters are updated using 1K randomly selected unlabeled training images.

  2. We quantize the first and last layers to 8-bit using a fixed-point quantizer (see the sketch below).
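
A minimal sketch of such an 8-bit fixed-point quantizer (a power-of-two step size chosen from the maximum absolute value) is given below; it is an illustration, not necessarily the exact scheme used by the scripts.

# Hedged sketch of a signed 8-bit fixed-point quantizer with a power-of-two step.
import numpy as np

def quantize_fixed_point(x, n_bits=8):
    q_max = 2 ** (n_bits - 1) - 1                            # 127 for signed 8-bit
    m = np.abs(x).max()
    frac_bits = int(np.floor(np.log2(q_max / (m + 1e-12))))  # step so that m fits in range
    step = 2.0 ** (-frac_bits)
    x_q = np.clip(np.round(x / step), -q_max - 1, q_max)
    return x_q * step                                        # de-quantized fixed-point values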

Network Structure:

We add a scale layer after each quantized convolution layer, i.e. the full-precision weights are factored into 4-bit quantized weights and a per-output-channel scaling factor,

W ≈ α · W_q,  so that  Conv(x, W) ≈ α ⊙ Conv(x, W_q),

and the parameters are stored as follows:

  • Blob[0] in Conv : the quantized weights W_q
  • Blob[1] in Conv : the convolution bias
  • Blob[0] in newly-added scale layer : the scaling factor α
  • Blob[1] in newly-added scale layer : the scale-layer bias
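
With this layout, the effective full-precision weights of a quantized convolution can be reconstructed in pycaffe roughly as follows; the layer names conv1 and conv1_scale and the file names are hypothetical and depend on the generated prototxt.

# Hypothetical layer/file names; reconstructs W ~= alpha * W_q from the two layers.
import caffe

net = caffe.Net('quantized_deploy.prototxt', 'quantized.caffemodel', caffe.TEST)

W_q   = net.params['conv1'][0].data          # 4-bit integer weight values
alpha = net.params['conv1_scale'][0].data    # per-output-channel scaling factor
W_approx = alpha[:, None, None, None] * W_q  # broadcast over (out, in, kh, kw)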

Reference:

@inproceedings{Label-free,
  author    = {Xiangyu He and
               Jian Cheng},
  title     = {Learning Compression from Limited Unlabeled Data},
  booktitle = {Computer Vision - {ECCV} 2018 - 15th European Conference, Munich,
               Germany, September 8-14, 2018, Proceedings, Part {I}},
  pages     = {778--795},
  year      = {2018}
}
