Training a model similar to OpenAI DALL-E with volunteers from all over the Internet using hivemind and dalle-pytorch (NeurIPS 2021 demo)

learning-at-home/dalle-hivemind

Training DALL-E with volunteers from all over the Internet

This repository is a part of the NeurIPS 2021 demonstration "Training Transformers Together".

In this demo, we train a model similar to OpenAI DALL-E — a Transformer "language model" that generates images from text descriptions. Training happens collaboratively — volunteers from all over the Internet contribute to the training using hardware available to them. We use LAION-400M, the world's largest openly available image-text-pair dataset with 400 million samples. Our model is based on the dalle-pytorch implementation by Phil Wang with a few tweaks to make it communication-efficient.

See details about how to join and how it works on our website.
