
PAR LLAMA


Runs on Linux | MacOS | Windows (x86-64 | ARM | AppleSilicon)

About

PAR LLAMA is a TUI application for easy management and use of Ollama-based LLMs. It is built with Textual and Rich and runs on all major operating systems, including Windows, Windows WSL, macOS, and Linux.

"Buy Me A Coffee"

Screenshots

Supports Dark and Light mode as well as custom themes.

Local Models Dark

Model View Dark

Site Models Dark

Chat Dark

Custom Prompt Dark

Options Dark

Local Models Light

Videos

V0.3.5 demo

Prerequisites for running

Prerequisites for dev

Prerequisites for Hugging Face model quantization

If you want to quantize custom models from Hugging Face, download the following tool from the releases area: HuggingFaceModelDownloader

Install Docker Desktop

Pull the docker image ollama/quantize

docker pull ollama/quantize
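
Once pulled, a typical quantization run mounts a model directory into the container and passes a quantization level; the q4_0 level and current-directory mount here are illustrative:

docker run --rm -v .:/model ollama/quantize -q q4_0 /model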

Using uv

Installing uv

If you don't have uv installed you can run the following:

curl -LsSf https://astral.sh/uv/install.sh | sh
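
On Windows without a POSIX shell, uv's documented PowerShell installer works as well:

powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"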

PyPI install

uv tool install parllama

To upgrade an existing uv installation use the -U --force flags:

uv tool install parllama -U --force

Installing / running using uvx

uvx runs parllama from a temporary environment without installing it permanently:

uvx parllama

Source install from GitHub

uv tool install git+https://github.com/paulrobello/parllama

To upgrade an existing installation use the -U --force flags:

uv tool install git+https://github.com/paulrobello/parllama -U --force

pipx

Installing

If you don't have pipx installed you can run the following:

pip install pipx
pipx ensurepath

PyPI install

pipx install parllama

To upgrade an existing pipx installation use the --force flag:

pipx install parllama --force

Source install from GitHub

pipx install git+https://github.com/paulrobello/parllama

To upgrade an existing installation use the --force flag:

pipx install git+https://github.com/paulrobello/parllama --force

Installing for dev mode

Clone the repo and run the setup make target. Note: uv is required for this.

git clone https://github.com/paulrobello/parllama
cd parllama
make setup

Command line arguments

usage: parllama [-h] [-v] [-d DATA_DIR] [-u OLLAMA_URL] [-t THEME_NAME] [-m {dark,light}]
                [-s {local,site,chat,prompts,tools,create,options,logs}] [--use-last-tab-on-startup {0,1}] [-p PS_POLL] [-a {0,1}]
                [--restore-defaults] [--purge-cache] [--purge-chats] [--purge-prompts] [--no-save] [--no-chat-save]

PAR LLAMA -- Ollama TUI.

options:
  -h, --help            show this help message and exit
  -v, --version         Show version information.
  -d DATA_DIR, --data-dir DATA_DIR
                        Data Directory. Defaults to ~/.parllama
  -u OLLAMA_URL, --ollama-url OLLAMA_URL
                        URL of your Ollama instance. Defaults to http://localhost:11434
  -t THEME_NAME, --theme-name THEME_NAME
                        Theme name. Defaults to par
  -m {dark,light}, --theme-mode {dark,light}
                        Dark / Light mode. Defaults to dark
  -s {local,site,chat,prompts,tools,create,options,logs}, --starting-tab {local,site,chat,prompts,tools,create,options,logs}
                        Starting tab. Defaults to local
  --use-last-tab-on-startup {0,1}
                        Use last tab on startup. Defaults to 1
  -p PS_POLL, --ps-poll PS_POLL
                        Interval in seconds to poll ollama ps command. 0 = disable. Defaults to 3
  -a {0,1}, --auto-name-session {0,1}
                        Auto name session using LLM. Defaults to 0
  --restore-defaults    Restore default settings and theme
  --purge-cache         Purge cached data
  --purge-chats         Purge all chat history
  --purge-prompts       Purge all custom prompts
  --no-save             Prevent saving settings for this session
  --no-chat-save        Prevent saving chats for this session

Unless you specify "--no-save", most flags such as -u, -t, -m, and -s are sticky and will be reused the next time you start PAR_LLAMA.
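
For example, a first run that pins a remote Ollama URL, light mode, and the chat tab (the host address here is illustrative):

parllama -u "http://192.168.1.20:11434" -m light -s chat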

Environment Variables

Variables are loaded in the following order; the last one to set a variable wins.

Environment Variables for PAR LLAMA configuration

Running PAR_LLAMA

with pipx or uv tool installation

From anywhere:

parllama

with pip installation

From the parent folder of the venv:

source venv/Scripts/activate
parllama
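
On Linux and macOS the activation script lives in bin rather than Scripts:

source venv/bin/activate
parllama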

Running against a remote Ollama instance

parllama -u "http://REMOTE_HOST:11434"

Running under Windows WSL

Ollama by default only listens on localhost for connections, so you must set the environment variable OLLAMA_HOST=0.0.0.0:11434 to make it listen on all interfaces.
Note: this will allow connections to your Ollama server from other devices on any network you are connected to.
If you installed Ollama via the native Windows installer, set OLLAMA_HOST=0.0.0.0:11434 in the "System variables" section of the "Environment Variables" control panel.
If you installed Ollama under WSL, export OLLAMA_HOST=0.0.0.0:11434 before starting the Ollama server to make it listen on all interfaces. If your Ollama server is already running, stop and restart it so it picks up the new environment variable.
You can verify which interfaces the Ollama server is listening on by checking the server.log file in the Ollama config folder.
One of the first few lines should read "OLLAMA_HOST:http://0.0.0.0:11434".

Now that the server is listening on all interfaces, you must point PAR_LLAMA at a custom Ollama connection URL with the "-u" flag.
The command will look something like this:

parllama -u "http://$(hostname).local:11434"

If the above does not work due to your DNS setup, try this:

parllama -u "http://$(grep -m 1 nameserver /etc/resolv.conf | awk '{print $2}'):11434"

PAR_LLAMA will remember the -u flag so subsequent runs will not require that you specify it.
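
If you want to confirm the URL resolves before launching PAR_LLAMA, you can probe Ollama's version endpoint with curl (assuming curl is installed in your WSL distro):

curl "http://$(hostname).local:11434/api/version"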

Dev mode

From repo root:

make dev

Quick start chat workflow

Custom Prompts

You can create a library of custom prompts to quickly start new chats.
You can set up system prompts and user messages to prime conversations, with the option of sending them to the LLM immediately when the prompt is loaded.
Currently, importing prompts from the popular Fabric project is supported, with more sources on the way.

Themes

Themes are JSON files stored in the themes folder in the data directory, which defaults to ~/.parllama/themes

The default theme is "par", so it can be found at ~/.parllama/themes/par.json

Themes have a dark and a light mode and use the following format:

{
  "dark": {
    "primary": "#e49500",
    "secondary": "#6e4800",
    "warning": "#ffa62b",
    "error": "#ba3c5b",
    "success": "#4EBF71",
    "accent": "#6e4800",
    "panel": "#111",
    "surface": "#1e1e1e",
    "background": "#121212",
    "dark": true
  },
  "light": {
    "primary": "#004578",
    "secondary": "#ffa62b",
    "warning": "#ffa62b",
    "error": "#ba3c5b",
    "success": "#4EBF71",
    "accent": "#0178D4",
    "background": "#efefef",
    "surface": "#f5f5f5",
    "dark": false
  }
}

You must specify at least one of light or dark for the theme to be usable.

The theme can be changed on the command line with the --theme-name (-t) option.
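
For example, after saving a custom theme as ~/.parllama/themes/mytheme.json (theme name illustrative), select it with:

parllama -t mytheme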

Contributing

Start by following the instructions in the section Installing for dev mode.

Please ensure that all pull requests are formatted with black and pass mypy and pylint with 10/10 checks.
You can run the make target pre-commit to ensure the pipeline will pass with your changes.
There is also a pre-commit config that will assist with formatting and checks.
The easiest way to set up your environment for smooth pull requests is:

With uv installed:

uv tool install pre-commit

With pipx installed:

pipx install pre-commit

From repo root run the following:

pre-commit install
pre-commit run --all-files

After running the above, all future commits will automatically run pre-commit. pre-commit will fix what it can and show what, if anything, remains to be fixed before the commit is allowed.

FAQ

Roadmap

Where we are

Where we're going

What's new

v0.3.8

v0.3.7

v0.3.6

v0.3.5

v0.3.4

v0.3.3

v0.3.2

v0.3.1

v0.3.0

v0.2.51

v0.2.5