v1.0.0 (Beginner's Luck)
Three weeks on the heels of our tech preview, we are excited to announce the first stable release of Curated Transformers! 🎉 From this release onwards, we provide a stable API following semantic versioning guidelines. Of course, this release is also packed with new features.
✨ New features and improvements since version 0.9.0
- Support for Llama 2 (#263, #265).
- Support for Falcon's new decoder architecture, used by the 40B-parameter models (#253).
- Support for ALiBi in the self-attention layer (#252) and for Falcon models with ALiBi (#260).
- Support for `torch.compile` (#257) and TorchScript tracing for all models (#262, #266).
- New logit transforms: vocabulary masking (#245) and top-p filtering (#255).
- Support for authentication when downloading tokenizers from the Hugging Face Hub (#267).
- Many improvements to the building blocks, such as shared configuration between models (#258) and shared encoder/decoder layers (#248). As a result, most model definitions are now very short.
- The APIs and documentation were polished extensively to make the semantic versioning guarantees for 1.x.y possible.
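ALiBi (mentioned above for the self-attention layer and Falcon) replaces positional embeddings with a per-head linear bias on the attention scores, where each head's slope comes from a geometric sequence. Below is a minimal sketch of the standard slope computation for a power-of-two head count — generic ALiBi math, not Curated Transformers' internal code; the function name is hypothetical:

```python
def alibi_slopes(n_heads):
    """Per-head ALiBi slopes for a power-of-two number of heads.

    Standard ALiBi uses the geometric sequence
    2^(-8/n), 2^(-16/n), ..., 2^(-8) for n heads; each slope scales a
    linear distance penalty added to that head's attention scores.
    """
    # This sketch only handles the power-of-two case; the paper
    # interpolates extra slopes for other head counts.
    assert n_heads & (n_heads - 1) == 0, "power-of-two head count only"
    ratio = 2.0 ** (-8.0 / n_heads)
    return [ratio ** (i + 1) for i in range(n_heads)]
```

For 8 heads this yields slopes 1/2, 1/4, ..., 1/256; the bias added to a query at position `i` attending to a key at position `j <= i` is then `-slope * (i - j)`.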
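The new top-p (nucleus) logit transform keeps the smallest set of most-probable tokens whose cumulative probability reaches `p` and masks out everything else before sampling. As a rough illustration of the idea only — the function name and signature below are made up for this sketch and are not Curated Transformers' logit-transform API:

```python
import math

def top_p_filter(logits, p):
    """Nucleus (top-p) filtering: keep the smallest set of tokens whose
    cumulative probability reaches p; mask the rest with -inf."""
    # Numerically stable softmax over the logits.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Walk tokens in order of decreasing probability until the
    # cumulative mass reaches p.
    order = sorted(range(len(logits)), key=lambda i: -probs[i])
    kept, cumulative = set(), 0.0
    for i in order:
        kept.add(i)
        cumulative += probs[i]
        if cumulative >= p:
            break
    # Masked tokens get -inf, so they have zero sampling probability.
    return [x if i in kept else float("-inf") for i, x in enumerate(logits)]
```

With `p = 0.5` and logits `[2.0, 1.0, 0.0, -1.0]`, the first token alone carries about 64% of the probability mass, so only it survives the filter.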