Knowledge Distillation for Super-Resolution

Introduction

This repository is the official implementation of the paper "FAKD: Feature-Affinity Based Knowledge Distillation for Efficient Image Super-Resolution" from ICIP 2020. In this work, we propose a novel and efficient knowledge distillation framework for SR, named Feature Affinity-based Knowledge Distillation (FAKD), which transfers the structural knowledge of a heavy teacher model to a lightweight student model. To transfer structural knowledge effectively, FAKD distills the second-order statistical information of feature maps and trains a lightweight student network with low computational and memory cost. Experimental results demonstrate the efficacy of our method and its superiority over other knowledge-distillation-based methods in terms of both quantitative and visual metrics.

[Figure: overview of the FAKD framework]
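
The core operation can be sketched in a few lines of PyTorch. The snippet below is a minimal illustration rather than the repository's exact implementation: it assumes the affinity matrix holds the cosine similarity between every pair of spatial positions in a feature map, and it matches teacher and student affinities with an L1 loss (the helper names spatial_affinity and affinity_distillation_loss are ours).

import torch
import torch.nn.functional as F

def spatial_affinity(feat):
    # feat: (B, C, H, W) feature map from one network stage.
    b, c, h, w = feat.shape
    f = feat.view(b, c, h * w)
    f = F.normalize(f, p=2, dim=1)          # unit-norm channel vectors
    return torch.bmm(f.transpose(1, 2), f)  # (B, HW, HW) affinity matrix

def affinity_distillation_loss(student_feat, teacher_feat):
    # L1 distance between the student's and teacher's second-order
    # (pairwise spatial similarity) statistics.
    return F.l1_loss(spatial_affinity(student_feat),
                     spatial_affinity(teacher_feat))

Because the affinity matrix is (HW x HW) regardless of channel count, teacher and student features do not need the same width, which is what allows distilling into a slimmer student.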

Main Results

Here are the quantitative results (PSNR and SSIM) of RCAN and SAN with and without FAKD. The Teacher Network (TN) and Student Network (SN) share the same network architecture but differ in network depth or width.

[Figure: quantitative results table]

Quick Start

Dependencies

  • python 3.6.9
  • pytorch 1.1.0
  • skimage 0.15.0
  • numpy 1.16.4
  • imageio 2.6.1
  • matplotlib
  • tqdm

Data Preparation

We use the DIV2K dataset as the training set, which you can download from here, and four benchmark datasets (Set5, Set14, B100, Urban100) as the test set, which you can download from here.

Unpack the tar file and arrange the data directory as follows. Then set the dir_data argument in code/option.py to {DATA_ROOT}.

${DATA_ROOT}
|-- DIV2K
|-- benchmark
    |-- Set5
    |-- Set14
    |-- B100
    |-- Urban100
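
Before training, a few lines of Python can confirm the layout matches what the code expects. This helper is purely illustrative and not part of the repository:

import os

DATA_ROOT = "/path/to/data"  # the value to set as dir_data in code/option.py

# Verify the directory layout described above is in place.
expected = ["DIV2K"] + [os.path.join("benchmark", d)
                        for d in ("Set5", "Set14", "B100", "Urban100")]
for rel in expected:
    path = os.path.join(DATA_ROOT, rel)
    print(("ok      " if os.path.isdir(path) else "MISSING ") + path)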

Training

Download the teacher model from here and place it in the teacher_checkpoint folder.

python train.py --ckp_dir overall_distilation/rcan/SA_x4/ --scale 4 --teacher [RCAN] --model RCAN --alpha 0.5 --feature_loss_used 1 --feature_distilation_type 10*SA --features [1,2,3] --epochs 200 --save_results --chop --patch_size 192
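
As a rough guide to these options: --alpha balances the distillation signal against the ordinary pixel-wise reconstruction loss, --features selects which intermediate stages are matched, and 10*SA applies a weight of 10 to the spatial-affinity term. The sketch below is our reading of that combination under those assumptions; the authoritative version is the repository's training code. It reuses affinity_distillation_loss from the earlier snippet.

import torch.nn.functional as F

def fakd_total_loss(sr, hr, student_feats, teacher_feats,
                    alpha=0.5, sa_weight=10.0):
    # Pixel-wise L1 reconstruction loss between the student's SR output
    # and the ground-truth HR image (as in EDSR-style training).
    rec_loss = F.l1_loss(sr, hr)
    # Spatial-affinity distillation over the selected feature stages;
    # teacher features are detached so no gradient reaches the teacher.
    fa_loss = sum(affinity_distillation_loss(s, t.detach())
                  for s, t in zip(student_feats, teacher_feats))
    # Assumed weighting: reconstruction plus alpha-scaled affinity terms.
    return rec_loss + alpha * sa_weight * fa_loss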

More training scripts can be found in code/scripts.

Testing

Download the distilled RCANx4 model from here and evaluate it:

python test.py --ckp_path <checkpoint path> --TS S --scale 4 --model RCAN --n_resgroups 10 --n_resblocks 6

Acknowledgement

The code is built on EDSR (PyTorch). We thank the authors for sharing their code.
