yoshitomo-matsubara / torchdistill
A coding-free framework built on PyTorch for reproducible deep learning studies. 25 knowledge distillation methods presented at CVPR, ICLR, ECCV, NeurIPS, ICCV, etc. are implemented so far. Trained models, training logs, and configurations are available to ensure reproducibility and benchmarking.
Home Page: https://yoshitomo-matsubara.net/torchdistill/
License: MIT License
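To illustrate the knowledge distillation methods the framework implements, here is a minimal sketch of the classic distillation objective (Hinton et al.'s softened-logit KL term combined with cross-entropy). This is a generic PyTorch example, not torchdistill's API; the function name and hyperparameter defaults (`temperature`, `alpha`) are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, targets,
                      temperature=4.0, alpha=0.5):
    # Soft-target term: KL divergence between temperature-softened
    # student and teacher distributions, scaled by T^2 to keep
    # gradient magnitudes comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * (temperature ** 2)
    # Hard-target term: standard cross-entropy against ground-truth labels.
    hard = F.cross_entropy(student_logits, targets)
    return alpha * soft + (1.0 - alpha) * hard
```

In torchdistill itself, losses and training stages like this are declared in YAML configuration files rather than written by hand, which is what "coding-free" refers to.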