<h1 id="top" align="center">Awesome LLMOps</h1> <p align="center"><a href="https://awesome.re"><img src="https://awesome.re/badge.svg" alt="Awesome" /></a></p> <p align="center"><img src="./cover.png" height="240" alt="Awesome LLMOps - Awesome list of LLMOps" /></p>

## Table of Contents

- [What is LLMOps?](#what-is-llmops)
- [Prompt Engineering](#prompt-engineering)
- [Models](#models)
- [Optimization](#optimization)
- [Tools (GitHub)](#tools-github)
- [Tools (Other)](#tools-other)
- [RLHF](#rlhf)
- [Awesome](#awesome)
- [Contributing](#contributing)

## What is LLMOps?

LLMOps is a specialized form of MLOps that focuses on managing the entire lifecycle of large language models (LLMs).

Starting around 2021, as LLMs evolved rapidly and the technology matured, practitioners began to focus on managing LLMs efficiently, and LLMOps, an adaptation of traditional MLOps practices to LLMs, started to be widely discussed.

### LLMOps vs MLOps

|  | LLMOps | MLOps |
|---|---|---|
| Definition | Tools and infrastructure specifically for the development and deployment of large language models | Tools and infrastructure for general machine learning workflows |
| Focus | Unique requirements and challenges of large language models | General machine learning workflows |
| Key technologies | Language models, Transformers library, human-in-the-loop annotation platforms | Kubeflow, MLflow, TensorFlow Extended |
| Key skills | NLP expertise, knowledge of large language models, data management for text data | Data engineering, DevOps, software engineering, machine learning expertise |
| Key challenges | Managing and labeling large amounts of text data, fine-tuning foundation models for specific tasks, ensuring fairness and ethics in language models | Managing complex data pipelines, ensuring model interpretability and explainability, addressing model bias and fairness |
| Industry adoption | Emerging, with a growing number of startups and companies focusing on LLMOps | Established, with a large ecosystem of tools and frameworks available |
| Future outlook | LLMOps is expected to become an increasingly important area of study as large language models become more prevalent and powerful | MLOps will continue to be a critical component of the machine learning industry, with a focus on improving efficiency, scalability, and model reliability |
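
The table above names MLflow among general MLOps tooling. As a rough illustration of how such tooling carries over to LLM work, here is a minimal sketch (not taken from any project in this list) that tracks a prompt experiment with MLflow. The `generate` function is a hypothetical stand-in for an actual model call; only the `mlflow.*` calls are real API.

```python
import mlflow


def generate(prompt: str, temperature: float) -> str:
    # Hypothetical placeholder for an actual LLM API or local model call.
    return f"(model output for {prompt!r} at temperature {temperature})"


with mlflow.start_run(run_name="prompt-experiment"):
    prompt = "Summarize the difference between LLMOps and MLOps in one sentence."
    temperature = 0.2

    # Log the LLM-specific inputs as ordinary MLflow parameters.
    mlflow.log_param("prompt", prompt)
    mlflow.log_param("temperature", temperature)

    completion = generate(prompt, temperature)

    # Store the raw completion as an artifact and a simple metric for comparison across runs.
    mlflow.log_text(completion, "completion.txt")
    mlflow.log_metric("completion_length", len(completion))
```

Runs logged this way appear in the standard MLflow UI, which is one concrete way LLM experiments can inherit existing MLOps infrastructure.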

:arrow_up: Go to top

## Prompt Engineering

:arrow_up: Go to top

## Models

| Name | Parameter size | Announcement date |
|------|----------------|-------------------|
| BERT-Large (336M) | 336 million | 2018 |
| T5 (11B) | 11 billion | 2020 |
| Gopher (280B) | 280 billion | 2021 |
| GPT-J (6B) | 6 billion | 2021 |
| LaMDA (137B) | 137 billion | 2021 |
| Megatron-Turing NLG (530B) | 530 billion | 2021 |
| T0 (11B) | 11 billion | 2021 |
| Macaw (11B) | 11 billion | 2021 |
| GLaM (1.2T) | 1.2 trillion | 2021 |
| T5 FLAN (540B) | 540 billion | 2022 |
| OPT-175B (175B) | 175 billion | 2022 |
| ChatGPT (175B) | 175 billion | 2022 |
| GPT-3.5 (175B) | 175 billion | 2022 |
| AlexaTM (20B) | 20 billion | 2022 |
| Bloom (176B) | 176 billion | 2022 |
| Bard | Not yet announced | 2023 |
| GPT-4 | Not yet announced | 2023 |
| AlphaCode (41.4B) | 41.4 billion | 2022 |
| Chinchilla (70B) | 70 billion | 2022 |
| Sparrow (70B) | 70 billion | 2022 |
| PaLM (540B) | 540 billion | 2022 |
| NLLB (54.5B) | 54.5 billion | 2022 |
| Galactica (120B) | 120 billion | 2022 |
| UL2 (20B) | 20 billion | 2022 |
| Jurassic-1 (178B) | 178 billion | 2022 |
| LLaMA (65B) | 65 billion | 2023 |
| Stanford Alpaca (7B) | 7 billion | 2023 |
| GPT-NeoX 2.0 (20B) | 20 billion | 2023 |
| BloombergGPT | 50 billion | 2023 |
| Dolly | 6 billion | 2023 |
| Jurassic-2 | Not yet announced | 2023 |
| OpenAssistant LLaMa | 30 billion | 2023 |
| Koala | 13 billion | 2023 |
| Vicuna | 13 billion | 2023 |
| PaLM 2 | Not yet announced, smaller than PaLM 1 | 2023 |
| LIMA | 65 billion | 2023 |
| MPT | 7 billion | 2023 |
| Falcon | 40 billion | 2023 |
| Llama 2 | 70 billion | 2023 |
| Google Gemini | Not yet announced | 2023 |
| Microsoft Phi-2 | 2.7 billion | 2023 |
| Grok-0 | 33 billion | 2023 |
| Grok-1 | 314 billion | 2023 |
| Solar | 10.7 billion | 2024 |
| Gemma | 7 billion | 2024 |
| Grok-1.5 | Not yet announced | 2024 |
| DBRX | 132 billion | 2024 |
| Claude 3 | Not yet announced | 2024 |
| Gemma 1.1 | 7 billion | 2024 |
| Llama 3 | 70 billion | 2024 |

:arrow_up: Go to top

## Optimization

:arrow_up: Go to top

## Tools (GitHub)

:arrow_up: Go to top

## Tools (Other)

:arrow_up: Go to top

## RLHF

:arrow_up: Go to top

## Awesome

:arrow_up: Go to top

## Contributing

We welcome contributions to the Awesome LLMOps list! If you'd like to suggest an addition or make a correction, please follow these guidelines:

  1. Fork the repository and create a new branch for your contribution.
  2. Make your changes to the README.md file.
  3. Ensure that your contribution is relevant to the topic of LLMOps.
  4. Use the following format to add your contribution (a filled-in example is shown after this list):
     `[Name of Resource](Link to Resource) - Description of resource`
  5. Add your contribution in alphabetical order within its category.
  6. Make sure that your contribution is not already listed.
  7. Provide a brief description of the resource and explain why it is relevant to LLMOps.
  8. Create a pull request with a clear title and description of your changes.
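
For example, a hypothetical entry following this format (the project below is illustrative, not necessarily an item already in this list) would look like: `[LangChain](https://github.com/langchain-ai/langchain) - Framework for composing prompts, LLM calls, and tools into applications.`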

We appreciate your contributions and thank you for helping to make the Awesome LLMOps list even more awesome!

:arrow_up: Go to top