This file contains a high-level description of the changes merged into the mergoo main branch in each release.
0.0.10:
0.0.9:
- Supports Phi3
0.0.8:
- Supports LLaMa3
- Notebook added for LLaMa3-based LLMs
0.0.7:
- Supports Mixture of Adapters (see the sketch after this list)
- Notebook added for Mixture of Adapters
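For illustration, a minimal sketch of composing a Mixture of Adapters; the `ComposeExperts` class and config keys follow the mergoo README, while the base-model and LoRA adapter IDs below are hypothetical placeholders:

```python
# Minimal sketch: composing a Mixture of Adapters with mergoo.
# ComposeExperts and the config keys follow the mergoo README;
# the base-model and adapter IDs are hypothetical placeholders.
from mergoo.compose_experts import ComposeExperts

config = {
    "model_type": "llama",                        # base architecture family
    "num_experts_per_tok": 2,                     # adapters routed per token
    "base_model_id": "meta-llama/Llama-2-7b-hf",  # shared base model (placeholder)
    "experts": [
        # Each expert is a LoRA adapter fine-tuned on the same base model.
        {"expert_name": "adapter_math", "model_id": "org/llama2-math-lora"},  # placeholder
        {"expert_name": "adapter_code", "model_id": "org/llama2-code-lora"},  # placeholder
    ],
}

merger = ComposeExperts(config)
merger.compose()                                      # merge weights into one checkpoint
merger.save_checkpoint("data/mergoo_moa_checkpoint")  # write the composed model to disk
```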
0.0.6:
- Supports recent merging methods, including Mixture-of-Experts and layer-wise merging
- Flexible merging choice for each layer
- Base models supported: Llama and Mistral
- Trainers supported: 🤗 Trainer, SFTTrainer
- Devices supported: CPU, MPS, GPU
- Training choices: fine-tune only the routers of the MoE layers, or fully fine-tune the merged LLM (see the sketch after this list)
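To illustrate the two training choices, a hedged sketch that loads a composed checkpoint and freezes everything except the MoE routers; the import path follows the mergoo README, while the checkpoint path and the `"gate"` parameter-name filter are assumptions:

```python
# Sketch of the two training choices for a composed checkpoint.
# The modeling_llama import path follows the mergoo README; the checkpoint
# path and the "gate" name filter for router weights are assumptions.
from mergoo.models.modeling_llama import LlamaForCausalLM

model = LlamaForCausalLM.from_pretrained("data/mergoo_moa_checkpoint")  # placeholder path

# Choice 1: fine-tune only the routers (gating layers) of the MoE blocks.
for name, param in model.named_parameters():
    param.requires_grad = "gate" in name  # only router weights stay trainable

# Choice 2: fully fine-tune the merged LLM instead.
# for param in model.parameters():
#     param.requires_grad = True

# The result is a regular transformers model, so it can be trained with
# 🤗 Trainer or TRL's SFTTrainer on CPU, MPS, or GPU.
```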