Comments (4)
Hey Jo! Thank you for this! Getting full Vespa compatibility is definitely something I want included very soon.
I don't think `HF_ColBERT`, in practice, does much more than `VespaColBERT` here in terms of model initialisation.
There's a potential clarity issue in inheriting from `BertPreTrainedModel`: at the moment, all the open-source ColBERT models are BERT-based (and the attempts with RoBERTa appear universally worse), but it might be an issue in the future if people initialise from better models, or if a new, shinier encoder-only model appears and becomes the logical base rather than plain BERT. In practice, I don't think it'll actually hinder anything, since you're only using components that you're manually loading/overwriting.
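For illustration, one way to stay encoder-agnostic (a hedged sketch on my part; `GenericColBERT` is a hypothetical name, not RAGatouille or Vespa code) is to compose transformers' `AutoModel` rather than inherit from `BertPreTrainedModel`:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
from transformers import AutoModel

class GenericColBERT(nn.Module):
    """Hypothetical model-agnostic ColBERT-style wrapper (illustrative only)."""

    def __init__(self, name_or_path: str, dim: int = 128):
        super().__init__()
        # AutoModel resolves the right encoder class (BERT, RoBERTa, ...) by itself
        self.encoder = AutoModel.from_pretrained(name_or_path)
        self.linear = nn.Linear(self.encoder.config.hidden_size, dim, bias=False)

    def forward(self, input_ids, attention_mask):
        hidden = self.encoder(input_ids, attention_mask=attention_mask).last_hidden_state
        # Project per-token embeddings down to the ColBERT dim, then L2-normalise
        return F.normalize(self.linear(hidden), p=2, dim=-1)
```

The wrapper composes the encoder instead of subclassing it, so swapping the base model is just a different `name_or_path`.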
The `bert` and `linear` properties should both be easy to export: since RAGatouille uses the normal ColBERT training code, the model that gets trained will have them, as they're defined features of `BaseColBERT`. The `bert` property here is a bit of a misnomer, as it actually stores any underlying model, even if it isn't BERT 🤔
JaColBERT was trained with RAGatouille-like utils (although they were quite different back then!), and this works as expected (all models have similar, properly-initialised weights):
```python
import torch
from colbert.infra import ColBERTConfig
from colbert.modeling.base_colbert import BaseColBERT
from colbert.modeling.hf_colbert import class_factory

# VespaColBERT is the exporter class from the snippet above
VespaJaColBERT = VespaColBERT.from_pretrained("bclavie/JaColBERT", dim=128)

HFColBERT = class_factory("bclavie/JaColBERT")
HFJaColBERT = HFColBERT.from_pretrained("bclavie/JaColBERT", colbert_config=ColBERTConfig(dim=128))

# name the instance distinctly so it doesn't shadow the BaseColBERT class
base_jacolbert = BaseColBERT("bclavie/JaColBERT")

assert torch.equal(HFJaColBERT.linear.weight, VespaJaColBERT.linear.weight) and \
    torch.equal(HFJaColBERT.linear.weight, base_jacolbert.linear.weight)
assert torch.equal(HFJaColBERT.bert.embeddings.word_embeddings.weight,
                   VespaJaColBERT.bert.embeddings.word_embeddings.weight) and \
    torch.equal(HFJaColBERT.bert.embeddings.word_embeddings.weight,
                base_jacolbert.bert.embeddings.word_embeddings.weight)

# assert torch.equal(VespaJaColBERT.linear.weight, VespaColBERTV2.linear.weight) crashes as expected, too!
```
I think adding your current exporter mostly as-is (if you're fine with me re-using the code!) would be fine, and having a `ragatouille.models.export_to_vespa` function, also callable via `RAGTrainer` and `RAGPreTrainedModel`, would make sense.
I'm also planning on adding an `export_to_huggingface_hub()` function soon (requiring a credential as an env variable), and will make sure it has an `export_vespa_onnx: bool` parameter 😄
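For the Hub export, something along these lines could work (hypothetical sketch: the function name comes from the comment above, but the signature, the `HF_TOKEN` variable name, and the injectable `api` parameter are my assumptions):

```python
import os

def export_to_huggingface_hub(model_dir: str, repo_id: str,
                              export_vespa_onnx: bool = False, api=None):
    """Hypothetical sketch: push a trained ColBERT checkpoint to the HF Hub.

    Reads the credential from an env variable; `api` defaults to
    huggingface_hub.HfApi and is injectable for testing.
    """
    token = os.environ.get("HF_TOKEN")
    if token is None:
        raise ValueError("Set the HF_TOKEN environment variable first.")
    if api is None:
        from huggingface_hub import HfApi
        api = HfApi(token=token)
    if export_vespa_onnx:
        # Hypothetical hook: a Vespa ONNX export would be written into
        # model_dir here before uploading.
        pass
    api.upload_folder(folder_path=model_dir, repo_id=repo_id)
    return repo_id
```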
from ragatouille.
Thanks for that! Having this type of utility would make it a lot easier to promote RAGatouille as a way to train ColBERT models!
> I think that adding your current exporter, mostly as-is (if you're fine with me re-using the code!) would be fine, and have a ragatouille.models.export_to_vespa function which could be called via RAGTrainer and RAGPreTrainedModel as well would make sense.
Fantastic, would greatly appreciate it! I'm going to try to open a PR to upload onnx artifacts to colbert-ir/colbertv2.0, cc @okhat
(tentative implementation in #19)
Implemented in #19