Seunghyun Lee πŸ‘‹ sseung0703

  • Welcome to my GitHub page. I am a Ph.D. student at Inha University in South Korea.
    My research areas are machine learning and deep learning, with a particular focus on light-weighting convolutional neural networks through techniques such as knowledge distillation and filter pruning.
    You can find my curriculum vitae here.
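
As a minimal sketch of one of these research topics (illustrative code only, not from any repository listed here), knowledge distillation trains a small student network to match the temperature-softened output distribution of a larger teacher:

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax; a higher T produces a softer distribution.
    e = np.exp((z - z.max(axis=-1, keepdims=True)) / T)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=4.0):
    # KL divergence between the softened teacher and student distributions,
    # scaled by T^2 as in Hinton et al.'s classic formulation.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return float((p * (np.log(p) - np.log(q))).sum(axis=-1).mean() * T * T)
```

The loss is zero when the student exactly matches the teacher and grows as the two distributions diverge; in practice it is mixed with the ordinary cross-entropy loss on the ground-truth labels.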

ML libraries 🧱

  • Tensorflow (1.x and 2.x): Professional
  • Pytorch: Upper intermediate
  • JAX: Upper intermediate

Academic activity πŸ•Ή

  • Google Developers Expert since May 2022
  • Leader of deep learning paper study group: link
  • Major contributor to the implementation project for "Putting NeRF on a Diet" in the 🤗HuggingFace X GoogleAI Flax/JAX Community Week Event (won 2nd prize! 😆)
  • Have served as a reviewer for CVPR, ICCV, ECCV, and other venues.

Publication πŸ“œ

First author of

  • "Fast Filter Pruning via Coarse-to-Fine Neural Architecture Search and Contrastive Knowledge Transfer" in IEEE TNNLS (2023) [paper]
  • "Ensemble Knowledge Guided Sub-network Search and Fine-tuning for Filter Pruning" at ECCV 2022 [paper] [code]
  • "Interpretable Embedding Procedure Knowledge Transfer via Stacked Principal Component Analysis and Graph Neural Network" at AAAI 2021 [paper] [code]
  • "Knowledge Transfer via Decomposing Essential Information in Convolutional Neural Networks" in IEEE TNNLS (2020) [paper] [TF1 code, TF2 code]
  • "Filter Pruning and Re-Initialization via Latent Space Clustering" in IEEE Access (2020) [paper]
  • "Transformation of Non-Euclidean Space to Euclidean Space for Efficient Learning of Singular Vectors" in IEEE Access (2020) [paper]
  • "Graph-based Knowledge Distillation by Multi-head Attention Network" at BMVC 2019 (oral) [paper] [code]
  • "Self-supervised Knowledge Distillation Using Singular Value Decomposition" at ECCV 2018 [paper] [TF1 code, TF2 code]

Co-author of

  • "CFA: Coupled-hypersphere-based Feature Adaptation for Target-Oriented Anomaly Localization" in IEEE Access (2022) [paper] [code]
  • "Balanced knowledge distillation for one-stage object detector" in Neurocomputing (2022) [paper]
  • "Vision Transformer for Small-Size Datasets" as an arXiv preprint [paper] [code]
  • "Contextual Gradient Scaling for Few-Shot Learning" at WACV 2022 [paper] [code]
  • "Zero-Shot Knowledge Distillation Using Label-Free Adversarial Perturbation With Taylor Approximation" in IEEE Access (2021) [paper] [code]
  • "Channel Pruning Via Gradient Of Mutual Information For Light-Weight Convolutional Neural Networks" at ICIP 2020 [paper]
  • "Real-time purchase behavior recognition system based on deep learning-based object detection and tracking for an unmanned product cabinet" in ESWA (2020) [paper]
  • "Metric-Based Regularization and Temporal Ensemble for Multi-Task Learning using Heterogeneous Unsupervised Tasks" at ICCVW 2019 [paper]
  • "MUNet: macro unit-based convolutional neural network for mobile devices" at CVPRW 2018 [paper]

and more 🎓

Seunghyun Lee's Projects

ab_distillation

Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons (AAAI 2019)

access_kd

This project addresses a problem with Zero-Shot Knowledge Transfer (ZSKT, NeurIPS 2019).

ekg

Ensemble Knowledge Guided Sub-network Search and Fine-tuning for Filter Pruning

gala_tf2.0

TensorFlow 2.0 implementation of "Symmetric Graph Convolutional Autoencoder for Unsupervised Graph Representation Learning" (ICCV 2019)

graph2vec_tf

This repository contains the TensorFlow implementation of the paper "graph2vec: Learning distributed representations of graphs".

iepkt

Implementation of "Interpretable Embedding Procedure Knowledge Transfer" (AAAI 2021)

kd_methods_with_tf

Knowledge distillation methods implemented with TensorFlow (currently 11 (+1) methods; more will be added).

lightweighting_cookbook

This project builds a neural network training and light-weighting cookbook covering three kinds of light-weighting solutions: knowledge distillation, filter pruning, and quantization.
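
As a hedged illustration of one of these three solutions (function names below are hypothetical, not from the cookbook), a common baseline for filter pruning ranks each convolutional filter by its L1 norm and keeps only the highest-scoring fraction:

```python
import numpy as np

def l1_filter_ranking(conv_weight):
    # conv_weight: (out_channels, in_channels, kH, kW).
    # Score each output filter by the L1 norm of its weights.
    scores = np.abs(conv_weight).reshape(conv_weight.shape[0], -1).sum(axis=1)
    return np.argsort(scores)[::-1]  # filters ordered from largest to smallest norm

def prune_filters(conv_weight, keep_ratio=0.5):
    # Keep the top fraction of filters; in a real network the removed filters'
    # matching input channels must also be dropped from the next layer.
    k = max(1, int(conv_weight.shape[0] * keep_ratio))
    keep = np.sort(l1_filter_ranking(conv_weight)[:k])
    return conv_weight[keep], keep
```

After pruning, the slimmed network is typically fine-tuned (often with knowledge distillation from the unpruned model) to recover accuracy.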

maml

Code for "Model-Agnostic Meta-Learning for Fast Adaptation of Deep Networks"

mhgd

Presentation materials for Multi-head Graph Distillation (BMVC 2019 oral)

ocr_kor

λ”₯λŸ¬λ‹μ„ ν™œμš©ν•œ ν•œκΈ€λ¬Έμž₯ OCR 연ꡬ
