oterm

the text-based terminal client for Ollama.

Features

Installation

Using Homebrew on macOS:

brew tap ggozad/formulas
brew install ggozad/formulas/oterm

Using pip:

pip install oterm

Using oterm

To use oterm you will need to have the Ollama server running. By default it expects to find the Ollama API on http://0.0.0.0:11434. If you are running Ollama inside Docker or on a different host/port, use the OLLAMA_HOST environment variable to customize the host/port. Alternatively, you can use OLLAMA_URL to specify the full http(s) URL. Setting OTERM_VERIFY_SSL to False will disable SSL verification.

OLLAMA_URL=http://host:port/api
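
For example, to point oterm at a remote Ollama instance over HTTPS and skip certificate verification, you might launch it like this (the hostname is only illustrative):

OLLAMA_URL=https://ollama.example.com/api OTERM_VERIFY_SSL=False oterm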

The following keyboard shortcuts are supported:

In multiline mode, you can press <kbd>Enter</kbd> to send the message, or <kbd>Shift</kbd>+<kbd>Enter</kbd> to add a new line at the cursor.

While Ollama is inferring the next message, you can press <kbd>Esc</kbd> to cancel the inference.

Note that some of the shortcuts may not work in certain contexts, for example pressing <kbd></kbd> while the prompt is in multi-line mode.

Copy / Paste

It is difficult to properly support copy/paste in terminal applications. You can copy blocks to your clipboard as follows:

For most terminals, there is a key modifier you can hold while clicking and dragging to manually select text. For example:

Customizing models

When creating a new chat, you may not only select the model, but also customize the system instruction as well as the parameters (such as context length, seed, temperature, etc.) passed to the model. For a list of all supported parameters, refer to the Ollama documentation. Checking the JSON output checkbox will force the model to reply in JSON format. Please note that oterm will not (yet) pull models for you; use ollama to do that. All the models you have pulled or created will be available to oterm.
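
For example, you might pull a model with the ollama CLI before starting a chat (the model name below is only illustrative):

ollama pull llama3.1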

You can also "edit" the chat to change the system prompt, parameters, or format. Note that the model cannot be changed once the chat has started. In addition, whatever "context" the chat had (an embedding of the previous messages) will be kept.

Chat session storage

All your chat sessions are stored locally in a SQLite database. You can customize the directory where the database is stored by setting the OTERM_DATA_DIR environment variable.

You can find the location of the database by running oterm --db.
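
For example, to keep the database under a custom directory and then confirm its location (the path below is just an illustration):

export OTERM_DATA_DIR=~/.oterm
oterm --db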

Screenshots

Screenshots of the chat, model selection, and image selection screens.

License

This project is licensed under the MIT License.