
# Metaseq

A codebase for working with Open Pre-trained Transformers.

## Community Integrations

### Using OPT with 🤗 Transformers

The OPT 125M–30B models are now available in HuggingFace Transformers.
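
As a minimal sketch (not part of the original README), loading one of the smaller OPT checkpoints through the Transformers API might look like the following; the checkpoint name `facebook/opt-125m` and the generation settings are illustrative assumptions.

```python
# Illustrative sketch: load an OPT checkpoint via HuggingFace Transformers
# and generate a short continuation. Checkpoint name and settings are
# assumptions for demonstration, not prescribed by Metaseq.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("facebook/opt-125m")
model = AutoModelForCausalLM.from_pretrained("facebook/opt-125m")

inputs = tokenizer("Hello, my name is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```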

### Using OPT-175B with Alpa

The OPT 125M–175B models are now supported in the Alpa project, which enables serving OPT-175B with more flexible parallelism strategies on older generations of GPUs (40GB A100, V100, T4, M60, etc.).

## Getting Started in Metaseq

Follow the setup instructions here to get started.

### Documentation on workflows

### Background Info

## Support

If you have any questions, bug reports, or feature requests regarding either the codebase or the models released in the projects section, please don't hesitate to post on our GitHub Issues page.

Please remember to follow our Code of Conduct.

## Contributing

We welcome PRs from the community!

You can find information about contributing to metaseq in our Contributing document.

## The Team

Metaseq is currently maintained by the CODEOWNERS: Susan Zhang, Stephen Roller, Naman Goyal, Punit Singh Koura, Moya Chen, and Kurt Shuster.

Previous maintainers include Anjali Sridhar and Christopher Dewan.

## License

The majority of metaseq is licensed under the MIT license; however, portions of the project are available under separate license terms: