🎒 local.ai

A desktop app for local, private, secure AI experimentation that works out of the box.

It's made to be used alongside https://github.com/alexanderatallah/window.ai/ as a simple way to get a local inference server up and running in no time. Together, window.ai + local.ai let any web app use AI without incurring any cost for either the developer or the user!
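For a rough idea of the web-app side, here is a minimal TypeScript sketch that calls a locally running inference server over plain HTTP. The port and endpoint below are illustrative assumptions, not guaranteed values; check the server tab inside local.ai (and the window.ai docs) for the actual address on your machine.

```ts
// Minimal sketch: calling a locally running local.ai inference server from a web app.
// Assumptions (verify against your setup): the server has been started from the app's UI,
// listens on http://localhost:8000, and exposes a /completions endpoint that accepts a
// JSON body with a `prompt` field and streams plain-text tokens back.
async function complete(prompt: string): Promise<string> {
  const res = await fetch("http://localhost:8000/completions", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ prompt })
  })
  if (!res.ok) {
    throw new Error(`Inference server returned ${res.status}`)
  }
  // Collect the streamed response into a single string.
  return await res.text()
}

complete("Write a haiku about backpacks.").then(console.log)
```

In practice, a web app would usually go through window.ai instead of hard-coding a URL, since the extension routes requests to whichever local or remote model the user has configured.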

Right now, local.ai uses the https://github.com/rustformers/llm Rust crate at its core. Check them out, they are super cool!

🚀 Install

Go to https://www.localai.app/ and click the button for your machine's architecture. You can also find the builds manually on the GitHub releases page.

Windows and macOS binaries are signed under Plasmo Corp., a company owned by the author of this project (@louisgv).

You may also build from source!

📺 Demo

<video src="https://github.com/louisgv/local.ai/assets/6723574/ba4a04dc-5087-4725-b619-165ad774aedd" controls="controls" style="max-width: 470px;"></video>

🧵 Development

Here's how to run the project locally:

Prerequisites

  1. node >= 18
  2. rust >= 1.69
  3. pnpm >= 8

Workflow

git submodule update --init --recursive
pnpm i
pnpm dev

🪪 License

local.ai is licensed under GPLv3. See the trivia below for the reasoning.

🤔 Trivia

Why the backpack?

It ties into the "bring your own model" concept. -- Alex from window.ai

Why GPLv3?

Anything AI-related, including derivatives, should be open source for all to inspect. GPLv3 enforces this chain of open-source licensing.

Is there a community?

Where should I ask questions?

I made something with local.ai, where should I post it?

I have some nice things to say about local.ai, where should I post it?

The naming seems close to LocalAI?

Do you accept contributions/PRs?

Absolutely! Please note that any contribution to this repo will be relicensed under GPLv3. There are many ways to contribute, code and otherwise.