Comments (5)
The dataset is actually quite okay, but we have had some issues with the translation of a few languages so far, so we will have to redo a few of them in the coming weeks.
As for oasst2: yes, it takes longer, but you can already do that with the current scripts if you want to. The same goes for Mistral.
Mixtral we are not sure about yet, but we will support it in the future too.
from llama2lang.
Yeah, madlad was broken on special characters. I've fixed this now; try again.
Yes, I can confirm that it works.
For this translation step, based on your experience, do you think 12 GB of VRAM could be enough? (In Colab I see it under 6 GB of VRAM at the moment, just started, so I was wondering if I could translate on-prem instead.) Thanks
from llama2lang.
You can always get it to work by lowering batch_size and/or using quantization, but both slow down translation, unfortunately. We will be looking into adding many more translation models to pick from and making it easy to switch between them, so hopefully in the (near) future this will become easier.
For now I would say that if your (target) language is well supported in OPUS (meaning there is a direct model from at least English and Spanish; check HF), you should go with those. In other cases you have to opt for madlad, but tweak the batch size and quantization (--madlad_quant for 8-bit and --madlad_quant4 for 4-bit).
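The selection rule above can be sketched roughly as follows. This is a hypothetical helper, not part of llama2lang itself: the function name and the `opus_pairs` set are illustrative (in practice you would check Hugging Face for a direct model), though the model id patterns (`Helsinki-NLP/opus-mt-<src>-<tgt>`, `google/madlad400-3b-mt`) follow the usual HF repository naming.

```python
# Hypothetical sketch of the "OPUS if available, else madlad" rule.
# `pick_translation_model` and `opus_pairs` are illustrative names.

def pick_translation_model(source: str, target: str, opus_pairs: set) -> tuple:
    """Prefer a direct OPUS model when one exists for the language pair;
    otherwise fall back to madlad with 8-bit quantization to fit in VRAM."""
    pair = f"{source}-{target}"
    if pair in opus_pairs:
        # Direct OPUS models follow the Helsinki-NLP/opus-mt-<src>-<tgt> naming.
        return f"Helsinki-NLP/opus-mt-{pair}", None
    # No direct model: fall back to madlad, quantized (cf. --madlad_quant).
    return "google/madlad400-3b-mt", "8bit"

# `opus_pairs` would come from checking HF; a toy example here.
print(pick_translation_model("en", "es", {"en-es", "en-fr"}))
print(pick_translation_model("en", "sw", {"en-es", "en-fr"}))
```

The point of encoding it this way is that the fallback (madlad plus quantization) trades speed for memory, exactly as described above.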
from llama2lang.
> if your (target) language is well supported in OPUS (meaning from at least English and Spanish, there is a direct model, check HF) you should go with those.
Thanks for the details.
As for "translating the dataset": yes, last week I went through all the steps and used what you already made (big thanks!) all the way through, without the initial training, by reusing your QLoRA instead. The thing is that this morning I had a chance to run some tests with the resulting model, and it was not really good. I guess this might depend on the dataset rather than the translation (actually I saw there is a bigger version of the same OpenAssistant dataset, oasst2, but that probably requires much more translation time).
Last question: if I use another base model, say one based on Mistral 7B, I guess it's as simple as changing the base model arguments, with no other points of attention?
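A minimal sketch of what such a swap might look like, assuming a config-driven finetuning setup; `finetune_args`, `DEFAULTS`, and the exact keys are illustrative, not the project's actual CLI, and the one caveat worth flagging in code is that prompt/thread templates and tokenizer special tokens can differ between model families:

```python
# Hypothetical sketch: swapping the base model for finetuning.
# `finetune_args` and `DEFAULTS` are illustrative, not llama2lang's API.

DEFAULTS = {
    "batch_size": 4,
    "use_qlora": True,
}

def finetune_args(base_model: str, **overrides) -> dict:
    """Build a finetuning config. In principle only `base_model` changes,
    but when switching model families, double-check the prompt/thread
    template and the tokenizer's special tokens as well."""
    return {**DEFAULTS, "base_model": base_model, **overrides}

llama_cfg = finetune_args("meta-llama/Llama-2-7b-hf")
mistral_cfg = finetune_args("mistralai/Mistral-7B-v0.1")
print(mistral_cfg)
```

Under this assumption, the swap really is a one-argument change; the template check is the "other attention point" that most often bites in practice.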
(Apologies for being off topic; the ticket is fine and I'm about to close it.)
from llama2lang.
Related Issues (20)
- Feature request: ChatML support
- Madlad: unrecognized arguments: --model_size 7b
- Got this error during finetuning
- Support finetuning from local disk too
- Issue with THREAD_TEMPLATE
- Sample example for finetuning
- Feedback on the Hindi finetuned model
- SeamlessM4T-v2 default (medium) model removed from huggingface
- nllb.py and madlad.py point to the incorrect HF repositories
- Error running benchmark.py with seamless
- Dataset chat format independent
- Problem with run_inference.py
- Best translation model for Turkish
- Translating takes too long (how to finetune with QLoRA?)
- Question: How would I do this with Phi 2?
- Question: translating a monolingual HF dataset
- Can you make a dataset of LLaMa3-8B translated into Japanese?
- [Question] What framework is able to load this adapter and serve the resulting model as an OpenAI endpoint?
- [Bug] Error with benchmarking: 'NoneType' object is not iterable