mechanicalsea / lighthubert
LightHuBERT: Lightweight and Configurable Speech Representation Learning with Once-for-All Hidden-Unit BERT
License: MIT License
Hello, I have a question about the 10-hour ASR fine-tuning experiment in your paper.
Could you describe the procedure for this experiment (or share a link I can refer to)?
I want to conduct my own 10-hour ASR fine-tuning experiments using fairseq.
Thanks!
This work is great, and the performance of LightHuBERT is even better than HuBERT-Large (according to the reported results).
So I was wondering how to train a LightHuBERT model. Could you open-source the training code?
Hi, this work is awesome and helps me a lot. But I don't know how to save a subnet into a checkpoint; could you suggest a way to get a checkpoint of a subnet? Thanks a lot.
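For anyone with the same question, here is a minimal sketch based on the supernet API shown in this repo's README (sample_subnet, set_sample_config, calc_sampled_param_num). The checkpoint layout (saving the weights together with the subnet config) is my own convention, not an official format.

```python
# Minimal sketch: sample a subnet from the LightHuBERT supernet and save it.
# API follows the repo README; the saved-dict layout is an assumption.
import torch
from lighthubert import LightHuBERT, LightHuBERTConfig

ckpt = torch.load("/path/to/lighthubert_small.pt", map_location="cpu")
cfg = LightHuBERTConfig(ckpt["cfg"]["model"])
model = LightHuBERT(cfg)
model.load_state_dict(ckpt["model"], strict=False)
model.eval()

subnet = model.supernet.sample_subnet()  # sample a subnet configuration
model.set_sample_config(subnet)          # activate that subnet
print(f"sampled subnet params: {model.calc_sampled_param_num()}")

# Save the (full) supernet weights together with the subnet config, so the
# same subnet can be re-activated after loading. This does not prune the
# weights; it only records which subnet to activate.
torch.save({"model": model.state_dict(), "subnet": subnet}, "subnet_ckpt.pt")
```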
Hello Mr. Wang!
First of all, I would like to thank you for your work and effort to make it open source.
I've been working on the robustness of speech representation learning (SRL) models, and I'm trying to reproduce the downstream models from SUPERB.
Do you have the checkpoint files generated when training the SUPERB models? If not, could you share the parameters used in the config.yaml files for those tasks? With those, I could reproduce the numbers in the table.
Best regards,
Heitor
Hi,
I'm trying to reproduce lighthubert_stage1 and lighthubert_small, but I'm seeing a large performance gap. Could you please share more details of your training process (such as the learning rate, scheduler, or loss function code) for the stage 1 and stage 2 training?
Thank you very much
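Not an official answer, but for others stuck at the same point, below is a minimal sketch of a generic hidden-state distillation loss in PyTorch. The L1 + cosine combination is an assumption for illustration (in the spirit of distillation-based speech models such as DistilHuBERT); it is not confirmed to be the exact objective used for stage 1 or stage 2.

```python
# Generic hidden-state distillation loss: an illustrative assumption,
# NOT the authors' confirmed stage 1 / stage 2 objective.
import torch
import torch.nn.functional as F

def distill_loss(student_h: torch.Tensor, teacher_h: torch.Tensor,
                 lambda_cos: float = 1.0) -> torch.Tensor:
    """student_h, teacher_h: (batch, time, dim) hidden states."""
    l1 = F.l1_loss(student_h, teacher_h)                              # elementwise match
    cos = 1.0 - F.cosine_similarity(student_h, teacher_h, dim=-1).mean()  # directional match
    return l1 + lambda_cos * cos
```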
Hi, thank you for uploading the code. It is really helpful. :)
Would it be possible to also upload the code to train the pipeline end-to-end? Thank you!
Hi,
Thanks for your great work! I have some questions about the two-stage training. I'd appreciate it if you could share more details.
In Stage 2 (Once-for-All Training), which model is used as the teacher? Is it the original HuBERT base, or the distilled model from Stage 1?
Also, how is the small supernet initialized in Stage 2? I guess it is also initialized with the distilled model from Stage 1, even though their sizes are different?
Thank you for your time!
Hello!
Thanks for the great work!
My colleague @edward0804 and I are thinking about integrating lighthubert into S3PRL to enable more research.
Instead of copying all the lighthubert code into S3PRL, we are wondering whether adding a setup.py to this repo would be a good alternative: we could then simply install it, enable lighthubert in the S3PRL codebase, and point interested users to this repo for the actual implementation.
I have made a minimal fork for this, so lighthubert can be installed in S3PRL after the commit s3prl/s3prl@07c5bd8, and @edward0804 is working on adding a wrapper for lighthubert. Do you think it would be good to add an official setup.py? :)
Thanks!
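For reference, a minimal setup.py could look like the sketch below. This is only an illustration: the version number, dependency list, and python_requires are my assumptions, not project metadata confirmed by the authors.

```python
# setup.py: a minimal sketch; version and install_requires are assumptions.
from setuptools import setup, find_packages

setup(
    name="lighthubert",
    version="0.1.0",
    description=("LightHuBERT: Lightweight and Configurable Speech "
                 "Representation Learning with Once-for-All Hidden-Unit BERT"),
    url="https://github.com/mechanicalsea/lighthubert",
    license="MIT",
    packages=find_packages(include=["lighthubert", "lighthubert.*"]),
    install_requires=["torch>=1.8", "numpy"],
    python_requires=">=3.6",
)
```

With this in place, S3PRL could depend on the package directly (e.g. `pip install git+https://github.com/mechanicalsea/lighthubert`) instead of vendoring the code.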