Training DALL-E with volunteers from all over the Internet
<p class="mb-2">
This repository is a part of the NeurIPS 2021 demonstration <u><a href="https://training-transformers-together.github.io/">"Training Transformers Together"</a></u>.
</p>
<p class="mb-2">
In this demo, we train a model similar to <u><a target="_blank" href="https://openai.com/blog/dall-e/">OpenAI DALL-E</a></u> —
a Transformer "language model" that generates images from text descriptions.
Training happens collaboratively: volunteers from all over the Internet contribute to the run using whatever hardware they have available.
We use <u><a target="_blank" href="https://laion.ai/laion-400-open-dataset/">LAION-400M</a></u>,
the world's largest openly available dataset of image-text pairs, with 400 million samples. Our model is based on
the <u><a target="_blank" href="https://github.com/lucidrains/DALLE-pytorch">dalle-pytorch</a></u> implementation
by <u><a target="_blank" href="https://github.com/lucidrains">Phil Wang</a></u> with a few tweaks to make it communication-efficient.
</p>
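<p class="mb-2">
For illustration, below is a minimal sketch of the kind of model being trained, following the dalle-pytorch README. The hyperparameters are illustrative placeholders, not the demo's actual configuration.
</p>

```python
# Minimal dalle-pytorch training sketch; hyperparameters are placeholders.
import torch
from dalle_pytorch import DiscreteVAE, DALLE

# Discrete VAE that compresses images into a grid of discrete tokens
# (in practice it is trained separately, before the Transformer).
vae = DiscreteVAE(
    image_size=256,
    num_layers=3,
    num_tokens=8192,      # size of the image-token vocabulary
    codebook_dim=512,
    hidden_dim=64,
)

# The Transformer "language model" over concatenated text and image tokens.
dalle = DALLE(
    dim=512,
    vae=vae,              # turns images into tokens and back
    num_text_tokens=10000,
    text_seq_len=256,
    depth=12,
    heads=8,
)

# One training step: learn to predict the image tokens that follow the text tokens.
text = torch.randint(0, 10000, (4, 256))
images = torch.randn(4, 3, 256, 256)
loss = dalle(text, images, return_loss=True)
loss.backward()

# After training, dalle.generate_images(text) samples image tokens
# autoregressively and decodes them back to pixels with the VAE.
```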
<p class="mb-2">
See <u><a target="_blank" href="https://training-transformers-together.github.io/">our website</a></u> for details on how the training works and how to join.
</p>
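<p class="mb-2">
The collaborative part is powered by the <u><a target="_blank" href="https://github.com/learning-at-home/hivemind">hivemind</a></u> library. Below is a rough, simplified sketch of what joining a run could look like on a volunteer's machine; the peer address, run ID, and batch sizes are placeholders, not the demo's real values.
</p>

```python
# Simplified sketch of a volunteer joining a collaborative run with hivemind.
# The peer address, run_id, and batch sizes are placeholders.
import torch
import hivemind

model = torch.nn.Linear(512, 512)  # stands in for the full DALL-E model
base_optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

# Join the swarm's distributed hash table through a known peer (placeholder address).
dht = hivemind.DHT(
    initial_peers=["/ip4/203.0.113.1/tcp/31337/p2p/QmPlaceholderPeerID"],
    start=True,
)

opt = hivemind.Optimizer(
    dht=dht,
    run_id="dalle_demo",       # placeholder name shared by all peers in this run
    optimizer=base_optimizer,
    target_batch_size=4096,    # the swarm jointly accumulates this many samples
    batch_size_per_step=8,     # how many samples this peer contributes per step
    use_local_updates=False,   # accumulate and average gradients across peers
    verbose=True,
)

# Ordinary training loop: hivemind tracks global progress, and once the swarm
# has accumulated target_batch_size samples, gradients are averaged over the
# Internet and every peer applies the same update.
for _ in range(100):
    opt.zero_grad()
    loss = model(torch.randn(8, 512)).pow(2).mean()  # dummy loss
    loss.backward()
    opt.step()
```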