# finetune-jina-clip-v2

Thanks to https://huggingface.co/jinaai/jina-clip-v2, a good pretrained model supporting multiple languages is available. I want to further finetune the model on our own domain-specific data, but there is no public training code for jina-clip-v2, so I wrote this project to train it.

Organize your own data in `dataset_own.py`. That file is not reproduced here; the sketch below shows one way such an image-text dataset could look (the CSV layout, column names, and class name are assumptions, not the project's actual format).
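```python
# Minimal sketch of a dataset_own.py-style dataset, assuming image-text pairs
# listed in a CSV with "image_path" and "caption" columns. Adapt the loading
# logic to however your own data is stored; preprocessing/tokenization is
# expected to happen later (e.g. in a collate function or the model's processor).
import csv

from PIL import Image
from torch.utils.data import Dataset


class OwnImageTextDataset(Dataset):
    """Yields (PIL image, caption string) pairs for CLIP-style fine-tuning."""

    def __init__(self, csv_path):
        with open(csv_path, newline="", encoding="utf-8") as f:
            self.rows = list(csv.DictReader(f))

    def __len__(self):
        return len(self.rows)

    def __getitem__(self, idx):
        row = self.rows[idx]
        image = Image.open(row["image_path"]).convert("RGB")
        return image, row["caption"]
```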

Then start training:

```bash
python train_clip.py
```
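At its core, CLIP-style fine-tuning optimizes a symmetric contrastive (InfoNCE) loss over matched image-text pairs. The sketch below shows that objective; the function name and temperature value are illustrative and may not match what `train_clip.py` actually does.

```python
# Symmetric contrastive (InfoNCE) loss used in CLIP-style training.
import torch
import torch.nn.functional as F


def clip_contrastive_loss(image_embeds, text_embeds, temperature=0.07):
    # L2-normalize so the dot product is a cosine similarity.
    image_embeds = F.normalize(image_embeds, dim=-1)
    text_embeds = F.normalize(text_embeds, dim=-1)

    # Pairwise similarity matrix: image i vs. caption j, scaled by temperature.
    logits = image_embeds @ text_embeds.t() / temperature

    # Matching pairs sit on the diagonal.
    targets = torch.arange(logits.size(0), device=logits.device)
    loss_i2t = F.cross_entropy(logits, targets)
    loss_t2i = F.cross_entropy(logits.t(), targets)
    return (loss_i2t + loss_t2i) / 2
```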

or start training with Accelerate:

```bash
python train_clip_accelerator.py
```
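A minimal sketch of the Accelerate pattern `train_clip_accelerator.py` presumably follows: `Accelerator.prepare()` handles device placement and multi-GPU setup, and `accelerator.backward()` replaces `loss.backward()`. The model, optimizer, dataloader, and forward signature are placeholders, not taken from the project.

```python
# Training-loop sketch using Hugging Face Accelerate (assumed pattern).
from accelerate import Accelerator


def train_one_epoch(model, optimizer, dataloader, loss_fn):
    accelerator = Accelerator()
    model, optimizer, dataloader = accelerator.prepare(model, optimizer, dataloader)

    model.train()
    for images, texts in dataloader:
        optimizer.zero_grad()
        image_embeds, text_embeds = model(images, texts)  # placeholder forward signature
        loss = loss_fn(image_embeds, text_embeds)
        accelerator.backward(loss)  # replaces loss.backward() under Accelerate
        optimizer.step()
```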

Once started, training produces log output like this:

*(screenshot: EB18249A-09CF-462D-9470-A53C27B3E49C)*