Task: Text Generation


Model-GPT2-Megatron-dvc

🔥🔥🔥 Deploy the GPT-2 Megatron-LM model on VDP. The model contains 345 million parameters and was trained on text sourced from Wikipedia, RealNews, OpenWebText, and CC-Stories.

This repo contains the GPT-2 model in FasterTransformer format, managed by DVC.
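Since the model weights are tracked by DVC rather than stored in git, fetching them typically takes an extra step after cloning. A minimal sketch, assuming DVC is installed and the repo's DVC remote is already configured (the repo URL below is a placeholder, not the actual location):

```shell
# Clone the repo (placeholder URL - substitute the real repository address)
git clone https://example.com/model-gpt2-megatron-dvc.git
cd model-gpt2-megatron-dvc

# Download the DVC-tracked FasterTransformer weights from the configured remote
dvc pull

# Verify which files DVC is tracking in this repo
dvc list . --dvc-only
```

`dvc pull` reads the `.dvc` pointer files committed to git and downloads the corresponding large artifacts from the remote storage, so the git history stays small while the weights remain reproducible.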