# Training Code for LLM360 K2-65B

This repository contains the code for training K2-65B, a 65-billion-parameter large language model from LLM360.

> [!NOTE]
> This repository is under active development. If you have suggestions or find bugs, please open a GitHub issue or reach out.

## Environment

The simplest way to launch training is to use our Docker image. We will provide a more detailed writeup of the environment later.
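As a rough illustration, a containerized launch might look like the sketch below. The image name `llm360/k2-train` and the mount layout are assumptions for illustration, not the repository's published image:

```bash
# Hypothetical image name and mount layout; substitute the actual
# Docker image referenced by this repository.
docker pull llm360/k2-train:latest
docker run --gpus all --rm -it \
  -v "$(pwd)":/workspace -w /workspace \
  llm360/k2-train:latest \
  bash scripts/pretrain_65b.sh
```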

## Launch Training

To launch training, run:

```bash
bash scripts/pretrain_65b.sh
```
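For multi-node runs, Megatron-style launch scripts are commonly driven by rendezvous environment variables. The variable names below are assumptions and may not match what `scripts/pretrain_65b.sh` actually reads; check the script itself:

```bash
# Assumption: Megatron-style launch variables (MASTER_ADDR, MASTER_PORT,
# NNODES, NODE_RANK); verify the names the script actually honors.
MASTER_ADDR=node0 MASTER_PORT=29500 NNODES=8 NODE_RANK=0 \
  bash scripts/pretrain_65b.sh
```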

## Converting Megatron Checkpoints to HuggingFace Format

To convert model checkpoints from Megatron to HuggingFace format, run:

```bash
python convert_ckpt_to_hf.py --load_path <megatron_ckpt_dir> --save_path <huggingface_ckpt_dir>
```
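As an optional sanity check (our assumption about the output layout, not a documented step), the converted directory should parse as a standard HuggingFace checkpoint. Loading only the config avoids reading the 65B weights:

```bash
# Quick check that the converted directory is a valid HuggingFace
# checkpoint; AutoConfig reads only config.json, not the weights.
python -c "from transformers import AutoConfig; print(AutoConfig.from_pretrained('<huggingface_ckpt_dir>'))"
```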