<h1 align="center"> <br> <img height="300" src="https://github.com/go-skynet/LocalAI/assets/2420543/0966aa2a-166e-4f99-a3e5-6c915fc997dd"> <br> LocalAI <br> </h1> <p align="center"> <a href="https://github.com/go-skynet/LocalAI/fork" target="blank"> <img src="https://img.shields.io/github/forks/go-skynet/LocalAI?style=for-the-badge" alt="LocalAI forks"/> </a> <a href="https://github.com/go-skynet/LocalAI/stargazers" target="blank"> <img src="https://img.shields.io/github/stars/go-skynet/LocalAI?style=for-the-badge" alt="LocalAI stars"/> </a> <a href="https://github.com/go-skynet/LocalAI/pulls" target="blank"> <img src="https://img.shields.io/github/issues-pr/go-skynet/LocalAI?style=for-the-badge" alt="LocalAI pull-requests"/> </a> <a href='https://github.com/go-skynet/LocalAI/releases'> <img src='https://img.shields.io/github/release/go-skynet/LocalAI?&label=Latest&style=for-the-badge'> </a> </p> <p align="center"> <a href="https://hub.docker.com/r/localai/localai" target="blank"> <img src="https://img.shields.io/badge/dockerhub-images-important.svg?logo=Docker" alt="LocalAI Docker hub"/> </a> <a href="https://quay.io/repository/go-skynet/local-ai?tab=tags&tag=latest" target="blank"> <img src="https://img.shields.io/badge/quay.io-images-important.svg?" alt="LocalAI Quay.io"/> </a> </p> <p align="center"> <a href="https://twitter.com/LocalAI_API" target="blank"> <img src="https://img.shields.io/twitter/follow/LocalAI_API?label=Follow: LocalAI_API&style=social" alt="Follow LocalAI_API"/> </a> <a href="https://discord.gg/uJAeKSAGDy" target="blank"> <img src="https://dcbadge.vercel.app/api/server/uJAeKSAGDy?style=flat-square&theme=default-inverted" alt="Join LocalAI Discord Community"/> </a> </p>

💡 Get help - ❓FAQ 💭Discussions :speech_balloon: Discord :book: Documentation website
💻 Quickstart 🖼️ Models 🚀 Roadmap 🥽 Demo 🌍 Explorer 🛫 Examples
LocalAI is the free, Open Source alternative to OpenAI. LocalAI acts as a drop-in replacement REST API that's compatible with OpenAI (and Elevenlabs, Anthropic, ...) API specifications for local AI inferencing. It allows you to run LLMs and generate images and audio (and more) locally or on-prem with consumer-grade hardware, supporting multiple model families. It does not require a GPU. It is created and maintained by Ettore Di Giacinto.
Run the installer script:
```bash
curl https://localai.io/install.sh | sh
```
Or run with docker:
```bash
docker run -ti --name local-ai -p 8080:8080 localai/localai:latest-cpu

# Alternative images:
# - if you have an Nvidia GPU:
# docker run -ti --name local-ai -p 8080:8080 --gpus all localai/localai:latest-gpu-nvidia-cuda-12
# - without preconfigured models
# docker run -ti --name local-ai -p 8080:8080 localai/localai:latest
# - without preconfigured models, for Nvidia GPUs
# docker run -ti --name local-ai -p 8080:8080 --gpus all localai/localai:latest-gpu-nvidia-cuda-12

## AIO images (pre-download a set of models ready for use, see https://localai.io/basics/container/)
# docker run -ti --name local-ai -p 8080:8080 localai/localai:latest-aio-cpu
```
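Once a container is running, LocalAI exposes its OpenAI-compatible REST API on port 8080. As an illustrative sketch (not part of the official docs), here is how you might query the chat completions endpoint from Python using only the standard library; the model name assumes you have installed `llama-3.2-1b-instruct:q4_k_m` as shown below, so adjust it to whatever `local-ai models list` reports:

```python
import json
import urllib.request


def build_chat_request(model: str, prompt: str) -> dict:
    """Build an OpenAI-style chat completion payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }


def chat(base_url: str, model: str, prompt: str) -> str:
    """POST the payload to a running LocalAI instance and return the reply text."""
    payload = build_chat_request(model, prompt)
    req = urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # OpenAI-compatible responses carry the reply under choices[0].message.content.
    return body["choices"][0]["message"]["content"]


# Example (requires a running LocalAI instance on localhost:8080):
# print(chat("http://localhost:8080", "llama-3.2-1b-instruct:q4_k_m", "Hello!"))
```

Because the API follows the OpenAI specification, existing OpenAI client libraries can also be pointed at the LocalAI base URL instead.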
To load models:
```bash
# From the model gallery (see available models with `local-ai models list`, in the WebUI from the model tab, or visiting https://models.localai.io)
local-ai run llama-3.2-1b-instruct:q4_k_m

# Start LocalAI with the phi-2 model directly from huggingface
local-ai run huggingface://TheBloke/phi-2-GGUF/phi-2.Q8_0.gguf

# Install and run a model from the Ollama OCI registry
local-ai run ollama://gemma:2b

# Run a model from a configuration file
local-ai run https://gist.githubusercontent.com/.../phi-2.yaml

# Install and run a model from a standard OCI registry (e.g., Docker Hub)
local-ai run oci://localai/phi-2:latest
```
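The configuration-file route above points at a YAML model definition. A minimal illustrative sketch of what such a file can look like (the model URI reuses the phi-2 GGUF from the huggingface example; `context_size` and `temperature` are assumed values to tune for your hardware, and the full set of fields is documented at https://localai.io):

```yaml
name: phi-2
context_size: 2048
parameters:
  model: huggingface://TheBloke/phi-2-GGUF/phi-2.Q8_0.gguf
  temperature: 0.2
```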
## 📰 Latest project news
- Oct 2024: examples moved to LocalAI-examples
- Aug 2024: 🆕 FLUX-1, P2P Explorer
- July 2024: 🔥🔥 🆕 P2P Dashboard, LocalAI Federated mode and AI Swarms: https://github.com/mudler/LocalAI/pull/2723
- June 2024: 🆕 You can now browse the model gallery without LocalAI! Check out https://models.localai.io
- June 2024: Support for models from OCI registries: https://github.com/mudler/LocalAI/pull/2628
- May 2024: 🔥🔥 Decentralized P2P llama.cpp: https://github.com/mudler/LocalAI/pull/2343 (peer2peer llama.cpp!) 📖 Docs: https://localai.io/features/distribute/
- May 2024: 🔥🔥 OpenVoice: https://github.com/mudler/LocalAI/pull/2334
- May 2024: 🆕 Function calls without grammars and mixed mode: https://github.com/mudler/LocalAI/pull/2328
- May 2024: 🔥🔥 Distributed inferencing: https://github.com/mudler/LocalAI/pull/2324
- May 2024: Chat, TTS, and Image generation in the WebUI: https://github.com/mudler/LocalAI/pull/2222
- April 2024: Reranker API: https://github.com/mudler/LocalAI/pull/2121
Roadmap items: List of issues
🔥🔥 Hot topics (looking for help):
- Multimodal with vLLM and Video understanding: https://github.com/mudler/LocalAI/pull/3729
- Realtime API https://github.com/mudler/LocalAI/issues/3714
- 🔥🔥 Distributed, P2P Global community pools: https://github.com/mudler/LocalAI/issues/3113
- WebUI improvements: https://github.com/mudler/LocalAI/issues/2156
- Backends v2: https://github.com/mudler/LocalAI/issues/1126
- Improving UX v2: https://github.com/mudler/LocalAI/issues/1373
- Assistant API: https://github.com/mudler/LocalAI/issues/1273
- Moderation endpoint: https://github.com/mudler/LocalAI/issues/999
- Vulkan: https://github.com/mudler/LocalAI/issues/1647
- Anthropic API: https://github.com/mudler/LocalAI/issues/1808
If you want to help and contribute, issues up for grabs: https://github.com/mudler/LocalAI/issues?q=is%3Aissue+is%3Aopen+label%3A%22up+for+grabs%22
## 🚀 Features
- 📖 Text generation with GPTs (llama.cpp, gpt4all.cpp, ... and more)
- 🗣 Text to Audio
- 🔈 Audio to Text (audio transcription with whisper.cpp)
- 🎨 Image generation with stable diffusion
- 🔥 OpenAI-alike tools API
- 🧠 Embeddings generation for vector databases
- ✍️ Constrained grammars
- 🖼️ Download Models directly from Huggingface
- 🥽 Vision API
- 📈 Reranker API
- 🆕🖧 P2P Inferencing
- 🆕 Integrated WebUI!
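The embeddings feature pairs naturally with vector databases, where stored vectors are compared by cosine similarity. As an illustrative sketch (the hard-coded vectors below are stand-ins for what LocalAI's OpenAI-compatible `/v1/embeddings` endpoint would return for real text):

```python
import math


def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine of the angle between two embedding vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)


# Stand-in embeddings; a real pipeline would fetch these from /v1/embeddings
# and rank stored documents by similarity to the query vector.
doc = [0.1, 0.3, 0.5]
query = [0.2, 0.3, 0.4]
print(cosine_similarity(doc, query))
```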
## 💻 Usage
Check out the Getting started section in our documentation.
## 🔗 Community and integrations
Build and deploy custom containers:
WebUIs:
- https://github.com/Jirubizu/localai-admin
- https://github.com/go-skynet/LocalAI-frontend
- QA-Pilot (an interactive chat project that leverages LocalAI LLMs for rapid understanding and navigation of GitHub code repositories): https://github.com/reid41/QA-Pilot
Model galleries
Other:
- Helm chart https://github.com/go-skynet/helm-charts
- VSCode extension https://github.com/badgooooor/localai-vscode-plugin
- Terminal utility https://github.com/djcopley/ShellOracle
- Local Smart assistant https://github.com/mudler/LocalAGI
- Home Assistant https://github.com/sammcj/homeassistant-localai / https://github.com/drndos/hass-openai-custom-conversation / https://github.com/valentinfrlch/ha-gpt4vision
- Discord bot https://github.com/mudler/LocalAGI/tree/main/examples/discord
- Slack bot https://github.com/mudler/LocalAGI/tree/main/examples/slack
- Shell-Pilot (interact with LLMs using LocalAI models via pure shell scripts on your Linux or macOS system): https://github.com/reid41/shell-pilot
- Telegram bot https://github.com/mudler/LocalAI/tree/master/examples/telegram-bot
- GitHub Actions: https://github.com/marketplace/actions/start-localai
- Examples: https://github.com/mudler/LocalAI/tree/master/examples/
## 🔗 Resources
- LLM finetuning guide
- How to build locally
- How to install in Kubernetes
- Projects integrating LocalAI
- How tos section (curated by our community)
## :book: 🎥 Media, Blogs, Social
- Run Visual studio code with LocalAI (SUSE)
- 🆕 Run LocalAI on Jetson Nano Devkit
- Run LocalAI on AWS EKS with Pulumi
- Run LocalAI on AWS
- Create a Slack bot for teams and OSS projects that answers documentation questions
- LocalAI meets k8sgpt
- Question Answering on Documents locally with LangChain, LocalAI, Chroma, and GPT4All
- Tutorial to use k8sgpt with LocalAI
## Citation
If you utilize this repository or its data in a downstream project, please consider citing it with:
```bibtex
@misc{localai,
  author = {Ettore Di Giacinto},
  title = {LocalAI: The free, Open source OpenAI alternative},
  year = {2023},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/go-skynet/LocalAI}},
}
```
## ❤️ Sponsors
Do you find LocalAI useful?
Support the project by becoming a backer or sponsor. Your logo will show up here with a link to your website.
A huge thank you to our generous sponsors, who support this project by covering CI expenses, and to everyone on our Sponsor list:
<p align="center"> <a href="https://www.spectrocloud.com/" target="blank"> <img height="200" src="https://github.com/go-skynet/LocalAI/assets/2420543/68a6f3cb-8a65-4a4d-99b5-6417a8905512"> </a> <a href="https://www.premai.io/" target="blank"> <img height="200" src="https://github.com/mudler/LocalAI/assets/2420543/42e4ca83-661e-4f79-8e46-ae43689683d6"> <br> </a> </p>

## ⭐ Star history
## 📖 License
LocalAI is a community-driven project created by Ettore Di Giacinto.
MIT - Author Ettore Di Giacinto mudler@localai.io
## 🙇 Acknowledgements
LocalAI couldn't have been built without the help of great software already available from the community. Thank you!
- llama.cpp
- https://github.com/tatsu-lab/stanford_alpaca
- https://github.com/cornelk/llama-go for the initial ideas
- https://github.com/antimatter15/alpaca.cpp
- https://github.com/EdVince/Stable-Diffusion-NCNN
- https://github.com/ggerganov/whisper.cpp
- https://github.com/saharNooby/rwkv.cpp
- https://github.com/rhasspy/piper
## 🤗 Contributors
This is a community project, a special thanks to our contributors! 🤗 <a href="https://github.com/go-skynet/LocalAI/graphs/contributors"> <img src="https://contrib.rocks/image?repo=go-skynet/LocalAI" /> </a>