locutus GPT-2 Text Generation

This sample walks through the end-to-end process of deploying a GPT-2 model from HuggingFace that has been fine-tuned on the writings of Homer. These are the demos shown in the "Scaling responsible MLOps with Azure Machine Learning" session.
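As a quick illustration of what the fine-tuned model does, the sketch below generates text with the HuggingFace transformers pipeline. It uses the base gpt2 checkpoint as a stand-in; the model actually produced by this repo would be loaded from its own registered name or output directory instead.

```python
# Minimal text-generation sketch using HuggingFace transformers.
# "gpt2" is a stand-in; swap in the name or path of the model
# fine-tuned on Homer once the training pipeline has produced it.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

prompt = "Sing, O goddess, the anger of Achilles"
outputs = generator(prompt, max_new_tokens=50, num_return_sequences=1)
print(outputs[0]["generated_text"])
```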

Features

This repo has a number of things that can be tried in isolation or together as an end-to-end exercise. These include:

  - Azure Machine Learning infrastructure setup
  - Experimentation notebooks
  - AzureML components and pipelines for finetuning GPT-2
  - Managed inference
  - GitHub Actions for ML component CI/CD
  - Responsible AI Dashboard

Getting Started

Prerequisites

Generally it is easier to run these exercises in the cloud, given that part of the exercise is creating custom environments. If you want to run things locally, you need either a virtual environment or a conda environment with PyTorch installed.
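If you do set up a local environment, a quick check like the one below (not part of the repo, just a convenience snippet) confirms that PyTorch is importable and whether a GPU is visible.

```python
# Quick sanity check for a local PyTorch environment.
import torch

print(f"PyTorch version: {torch.__version__}")
print(f"CUDA available:  {torch.cuda.is_available()}")
if torch.cuda.is_available():
    print(f"GPU: {torch.cuda.get_device_name(0)}")
```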

Other requirements are:

Quickstart

A detailed walkthrough of the entire process is available. Some basic instructions for getting everything kicked off:

  1. Clone the repo
  2. Azure Machine Learning Infrastructure Setup
     ./provision.ps1 -name <YOUR_APP_NAME> -location <LOCATION|i.e. westus2>
  3. Experimentation Notebooks
  4. AzureML Components and Pipelines for finetuning GPT-2 (see the pipeline sketch after this list)
  5. Managed Inference (see the endpoint sketch after this list)
  6. GitHub Actions for ML Component CI/CD
  7. Responsible AI Dashboard
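The repo drives steps 4 and 5 through its own components, pipelines, and deployment definitions; the sketches below only illustrate their general shape using the Azure ML Python SDK v2 (azure-ai-ml). The YAML path, compute, experiment, and endpoint names are placeholders, not the repo's actual assets.

```python
# Sketch: submit a fine-tuning pipeline defined in YAML via the Azure ML SDK v2.
# "pipelines/finetune_pipeline.yml" and the experiment name are placeholders
# for whatever this repo's pipeline definition is actually called.
from azure.identity import DefaultAzureCredential
from azure.ai.ml import MLClient, load_job

# Reads workspace details from a local config.json (downloaded from the portal).
ml_client = MLClient.from_config(credential=DefaultAzureCredential())

pipeline_job = load_job(source="pipelines/finetune_pipeline.yml")
submitted = ml_client.jobs.create_or_update(
    pipeline_job, experiment_name="gpt2-homer-finetune"
)
print(submitted.studio_url)
```

Once the fine-tuned model is behind a managed online endpoint, it can be invoked from the same client; the endpoint name and request file below are likewise placeholders.

```python
# Sketch: call a managed online endpoint after deployment.
import json

response = ml_client.online_endpoints.invoke(
    endpoint_name="gpt2-homer-endpoint",   # placeholder endpoint name
    request_file="sample-request.json",    # e.g. {"prompt": "Sing, O goddess"}
)
print(json.loads(response))
```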

Resources