# Pretraining

We provide the pretraining script as well as the pretrained model to support finetuning on downstream tasks. More models will be added continuously.

## Pretraining Scripts

The following script runs pretraining on the MSCOCO, VG, SBU, and Conceptual Captions datasets.

```bash
$ bash scripts/train-pt.sh
```

Once the pretraining stage completes, you can run the finetuning scripts on the downstream tasks.

## Pretrained Models

| Name | Model Size | Download |
|------|------------|----------|
| ROSITA-base | 116M | model |
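Before launching finetuning, it can help to sanity-check that the downloaded checkpoint actually landed on disk and is non-empty. This is a minimal sketch, not part of the repository: the local path `checkpoints/rosita-base.pth` and the helper name `checkpoint_ready` are assumptions, and the real download location is whatever you chose when fetching the model.

```python
from pathlib import Path

# Hypothetical location of the downloaded ROSITA-base checkpoint
# (an assumption; adjust to wherever you saved the file).
DEFAULT_CKPT = Path("checkpoints/rosita-base.pth")

def checkpoint_ready(path: Path, min_bytes: int = 1) -> bool:
    """Return True if the checkpoint file exists and is at least min_bytes long."""
    return path.is_file() and path.stat().st_size >= min_bytes
```

A check like `checkpoint_ready(DEFAULT_CKPT)` before starting a long finetuning run fails fast on a missing or truncated download instead of crashing mid-training.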