<h1 align="center"> <br> <img height="300" src="https://github.com/mudler/LocalAGI/assets/2420543/b69817ce-2361-4234-a575-8f578e159f33"> <br> LocalAGI <br> </h1>

AutoGPT, babyAGI, ... and now LocalAGI!

LocalAGI is a small 🤖 virtual assistant that you can run locally, made by the LocalAI author and powered by it.

The goal is to keep it simple, hackable, and fully local: powered by LocalAI, with no API keys needed.

Note: Be warned! It was hacked together in a weekend, and it's just an experiment to see what can be done with local LLMs.


## 🚀 Features

- 🧠 Intent detection and function invocation, powered by LocalAI functions
- 🌐 Internet search
- 📝 Planning of complex, multi-step tasks
- 💾 Memory of the conversation
- 🤖 Interactive and batch modes

### Demo

#### Search on the internet (interactive mode)

https://github.com/mudler/LocalAGI/assets/2420543/23199ca3-7380-4efc-9fac-a6bc2b52bdb3

#### Plan a road trip (batch mode)

https://github.com/mudler/LocalAGI/assets/2420543/9ba43b82-dec5-432a-bdb9-8318e7db59a4

Note: The demos were recorded with a GPU and 30b-size models.

## :book: Quick start

No frills, just run docker-compose and start chatting with your virtual assistant:

```bash
# Modify the configuration
# vim .env
# First run (also pulls the container)
docker-compose up
# Next runs
docker-compose run -i --rm localagi
```

## How to use it

By default, `localagi` starts in interactive mode. Pass `--prompt` to run a single request in batch mode instead (see the examples below).

### Examples

Plan a road trip, limiting internet searches to 3 results:

```bash
docker-compose run -i --rm localagi \
  --skip-avatar \
  --subtask-context \
  --postprocess \
  --search-results 3 \
  --prompt "prepare a plan for my roadtrip to san francisco"
```

Limit the plan to 3 steps:

```bash
docker-compose run -i --rm localagi \
  --skip-avatar \
  --subtask-context \
  --postprocess \
  --search-results 1 \
  --prompt "do a plan for my roadtrip to san francisco" \
  --plan-message "The assistant replies with a plan of 3 steps to answer the request with a list of subtasks with logical steps. The reasoning includes a self-contained, detailed and descriptive instruction to fulfill the task."
```

## Advanced

`localagi` has several CLI options to tweak the experience:
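
The flags exercised in this README's examples are collected below. The full set may be larger, and the inline descriptions are assumptions inferred from the flag names:

```bash
docker-compose run -i --rm localagi \
  --skip-avatar \             # skip the avatar step (assumed from the name)
  --subtask-context \         # share context between subtasks (assumed)
  --postprocess \             # post-process the final answer (assumed)
  --search-results 3 \        # cap the number of internet search results
  --plan-message "..." \      # override the default planning prompt (see the example above)
  --prompt "plan my roadtrip" # run in batch mode with this prompt
```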

## Customize

To use a different model, see the examples in the `config` folder. To select a model, modify the `.env` file and change the `PRELOAD_MODELS_CONFIG` variable to point to a different configuration file.
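
For instance, here is a minimal sketch of the relevant `.env` line (the YAML file name below is a placeholder, not a shipped file; pick a real one from the `config` folder):

```bash
# .env: select which model configuration LocalAI preloads
# NOTE: "config/my-model.yaml" is a placeholder, not an actual shipped file
PRELOAD_MODELS_CONFIG=config/my-model.yaml
```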

## Caveats

The "goodness" of a model has a big impact on how LocalAGI works. Currently 13b models are powerful enough to actually able to perform multi-step tasks or do more actions. However, it is quite slow when running on CPU (no big surprise here).

The context size is a limitation: the `config` folder includes examples for running with the SuperHOT 8k context size, but the resulting quality is not good enough to perform complex tasks.

## What is LocalAGI?

It is a dead simple experiment to show how to tie the various LocalAI functionalities together to create a virtual assistant that can perform tasks. It is simple on purpose: it tries to be minimal and easy to understand and customize for everyone.

It is different from babyAGI or AutoGPT as it uses LocalAI functions: it is a from-scratch attempt built on purpose to run locally with LocalAI (no API keys needed!) instead of on expensive cloud services. It sets itself apart from other projects by striving to stay small and easy to fork.

## How does it work?

LocalAGI does just the bare minimum around LocalAI functions to create a virtual assistant that can do generic tasks. It works as an endless loop of intent detection, function invocation, self-evaluation, and reply generation (if it decides to reply! :)). The agent can plan complex tasks by invoking multiple functions, and it remembers things from the conversation.

In a nutshell, it goes like this:

1. Detect the intent of the user's request.
2. Invoke the appropriate function (for example, an internet search or a planning step).
3. Self-evaluate the result.
4. Generate a reply, if it decides one is needed.
5. Repeat.

Under the hood, LocalAI converts functions to llama.cpp BNF grammars. While OpenAI fine-tuned a model to reply with function calls, LocalAI constrains the LLM to follow a grammar. This is a much more efficient approach, and it is also more flexible, as you can define your own functions and grammars. To learn more, check out the LocalAI documentation and my tweet explaining how it works under the hood: https://twitter.com/mudler_it/status/1675524071457533953.
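
To make this concrete, here is a rough sketch of the kind of request that ends up at LocalAI's OpenAI-compatible endpoint. The function name, schema, and model name are illustrative (not LocalAGI's actual ones), and the URL assumes LocalAI is listening on its default port:

```bash
# Illustrative only: LocalAI turns the "functions" schema into a llama.cpp
# grammar, so the model's output is constrained to a valid function call.
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "my-model",
    "messages": [
      {"role": "user", "content": "prepare a plan for my roadtrip to san francisco"}
    ],
    "functions": [{
      "name": "search_internet",
      "description": "Search the internet for a query",
      "parameters": {
        "type": "object",
        "properties": {
          "query": {"type": "string"}
        },
        "required": ["query"]
      }
    }]
  }'
```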

## Agent functions

The intention of this project is to keep the agent minimal, so it can be built upon or forked. The agent is capable of the following functions:

- searching the internet
- planning and executing subtasks
- remembering things from the conversation
- replying to the user

## Roadmap

## Development

Run docker-compose with your local `main.py` mounted into the container, so changes are picked up on the next run:

```bash
docker-compose run -v "$PWD/main.py:/app/main.py" -i --rm localagi
```
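
If you need a shell inside the container for debugging, something along these lines should work (a sketch; it assumes the image ships `/bin/bash`):

```bash
docker-compose run --rm --entrypoint /bin/bash localagi
```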

## Notes