g2sd's Issues
Could you show more details of the layer decay schedule when training the student for downstream tasks?
A layer decay schedule is mentioned in Section 5.1, Implementation details: "To avoid deteriorating the general representations obtained from the previous stage, a layer decay schedule is adopted to train the student model for all downstream tasks."
Could you share more details of this layer decay schedule, or point me to the code/reference for it?
Thanks
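For reference, layer-wise lr decay for ViT fine-tuning is commonly implemented along the following lines (this is a hedged sketch in the style of MAE-style fine-tuning code; the function name and the decay value 0.75 are assumptions, not taken from the G2SD code):

```python
# Sketch of a layer-wise lr decay schedule for a ViT student.
# Layer 0 = patch/pos embedding, layers 1..num_layers = transformer blocks,
# layer num_layers + 1 = classification head. Each parameter group's lr is
# base_lr * scale, so earlier layers are updated more gently, preserving
# the general representations from the previous (distillation) stage.

def layer_decay_scales(num_layers: int, decay: float = 0.75):
    """Return per-layer lr multipliers, smallest for the embedding,
    1.0 for the head."""
    return [decay ** (num_layers + 1 - i) for i in range(num_layers + 2)]

scales = layer_decay_scales(12, decay=0.75)  # ViT-S/B have 12 blocks
# e.g. the head uses base_lr * 1.0, the embedding base_lr * 0.75**13
```

In practice each named parameter is mapped to its layer index (embedding, block number, or head) and placed into the corresponding optimizer parameter group.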
I hit some problems when evaluating the released weights. Could you provide the eval procedure?
When I evaluated the weight file you released on Google Drive, I got errors, with accuracy equal to 0 across the board. This is my command:
python main_finetune_dis.py --eval --resume "/data_hdd1/wjl/checkpoint-199.pth" --model vit_small_patch16 --batch_size 16 --data_path '/data/ILSVRC2012/ImageNet1000' --eval_data_path '/data/ILSVRC2012/ImageNet1000' --teacher_path '/data_hdd1/wjl/mae_finetuned_vit_base.pth' --finetune "/data_hdd1/wjl/checkpoint-199.pth" --teacher_model vit_base_patch16
checkpoint-199.pth was downloaded from vit_s_gd_sd_classification on Google Drive. In fact, I ran into the same problem when fine-tuning. Looking forward to your reply, and thanks a lot.
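When accuracy collapses to 0, one common cause (a guess, not confirmed for this repo) is a state_dict key mismatch: the checkpoint nests weights under a 'model' entry or carries a DDP 'module.' prefix, so `load_state_dict(..., strict=False)` silently loads nothing. A hypothetical debugging sketch, with all names assumed:

```python
# Hypothetical sketch: normalize checkpoint keys before load_state_dict,
# then inspect missing/unexpected keys instead of ignoring them.

def align_keys(ckpt: dict) -> dict:
    """Unwrap a nested 'model' entry and strip a DDP 'module.' prefix."""
    state = ckpt.get("model", ckpt)
    return {k.removeprefix("module."): v for k, v in state.items()}

# Toy checkpoint shaped like a DDP-saved MAE-style file; a real one would
# come from torch.load("checkpoint-199.pth", map_location="cpu").
ckpt = {"model": {"module.head.weight": 0, "module.head.bias": 0}}
print(sorted(align_keys(ckpt)))  # → ['head.bias', 'head.weight']
```

After aligning keys, `msg = model.load_state_dict(state, strict=False)` returns `missing_keys`/`unexpected_keys`; printing them usually reveals whether the weights were actually loaded.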
Missing ViT_ti_GD_SD_classification weights
Thanks for releasing the model weights! However, there is no checkpoint.pth file in the "ViT_ti_GD_SD_classification" folder at the Google Drive link. Is this file missing? Could you please upload the ViT-Tiny weights again? I'd really appreciate it!
detectron2
How can I solve ImportError: cannot import name 'build_lr_scheduler_distill' from 'detectron2.solver.lr_scheduler'?
[NEED HELP] Can't reproduce the Generic Distillation experiment on ImageNet1k
Hi! Thanks for the great work!
I encountered some difficulties reproducing the generic distillation experiments. I used the official mae_pretrain_base.pth as the teacher model and mae_vit_small_patch16_dec256d4b as the student network. I kept the original code unchanged and only modified the GD.sh script to run the Generic Distillation experiment, but the loss only decreased during the first few epochs and was stuck at about 0.2 for the remaining epochs, far from the official result in the Google Drive log (about 0.02). To rule out my training setup, I verified it on the image classification task with Specific Distillation: after the first epoch of SD, I only got about 36% accuracy on ImageNet1k (the official log reports 56.34%). I also changed the seed to 1 and 42 and ran the GD experiment on ImageNet1k again, but got similar losses (0.19/0.18). I've checked the implementation details in the paper and found no conflict.
So I can't reproduce the Generic Distillation experiment. Is there something I need to modify in the code, or something I've done wrong? How can I reproduce the GD experiment? I'd really appreciate it if someone could help me. Thank you very much!
Here is my GD.sh script https://drive.google.com/file/d/121dMog7rW5I7gGz8U2c-DzDM4s_x3JOW/view?usp=drive_link
And the program running log before the training epochs https://drive.google.com/file/d/15Z9yHOuR1Tzp8JdOwSYvEN7SGJZtfUfy/view?usp=drive_link
And my python environment (pip list) https://drive.google.com/file/d/1dWQPJ87wc6anvwmsNkVC0aXh9GmQxL_S/view?usp=drive_link
And the output log file https://drive.google.com/file/d/1uv53nyKq_-msFIoZA5H76brSdwHyBvzc/view?usp=drive_link
And the tensorboard result images: lr: https://drive.google.com/file/d/1KCGBFOFMM3zPA6QVUu8PK3PbD8Jf_0O4/view?usp=drive_link loss: https://drive.google.com/file/d/1pSQLhKV4K3eErnm9RA5yeLD4GCluJhnx/view?usp=drive_link
And finally my SD log for the first 4 epochs:
{"train_lr": 0.0003996802557953637, "train_loss": 5.64891408925815, "train_loss_gt": 5.9685534642373534, "train_loss_dis": 5.3292747152318585, "test_loss": 2.9740411103988182, "test_acc1": 36.42200002670288, "test_acc5": 64.59000001724243, "epoch": 0, "n_parameters": 22436048}
{"train_lr": 0.0011996802557953635, "train_loss": 4.463338032424402, "train_loss_gt": 5.090619727695207, "train_loss_dis": 3.8360563386544335, "test_loss": 2.264968312212399, "test_acc1": 48.64200002410889, "test_acc5": 74.98000002075196, "epoch": 1, "n_parameters": 22436048}
{"train_lr": 0.001999680255795364, "train_loss": 4.1618424449369105, "train_loss_gt": 4.8575777951285515, "train_loss_dis": 3.4661070930181173, "test_loss": 2.061447801638623, "test_acc1": 52.56200001815796, "test_acc5": 77.82600002288818, "epoch": 2, "n_parameters": 22436048}
{"train_lr": 0.002799680255795364, "train_loss": 4.027493118024845, "train_loss_gt": 4.751268744706917, "train_loss_dis": 3.303717490866316, "test_loss": 1.9335608093106016, "test_acc1": 54.88400001312256, "test_acc5": 79.80000002593994, "epoch": 3, "n_parameters": 22436048}
Release weights
I'm very interested in your work! I would appreciate it if you could release the weights as soon as possible. Thank you very much!
ADE20K results of tiny model
Need env config detail!
The greatest work I've ever seen! Could you please provide the Python/torch versions and a requirements.txt file for G2SD? Thank you so much!