<div align="center"> <picture> <img alt="logo" height="128px" src="assets/icon@dark.png"> </picture> <h1 align="center">Raycast Ollama</h1> </div>

Use Ollama for local LLM inference from Raycast. This application is not directly affiliated with Ollama.ai.

Requirements

Ollama installed and running on your Mac. At least one model needs to be installed, either through the Ollama CLI or with the 'Manage Models' command. You can find all available models here.
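For example, a model can be pulled from the terminal with the Ollama CLI. This is a minimal sketch: `llama3.2` is just an example model name, and the guard is only there so the snippet degrades gracefully when Ollama is not installed.

```shell
# Pull an example model with the Ollama CLI, assuming Ollama is installed.
# 'llama3.2' is an illustrative model name; pick any model from the library.
if command -v ollama >/dev/null 2>&1; then
  ollama pull llama3.2   # download the model
  ollama list            # confirm it appears among the installed models
else
  echo "ollama CLI not found - install Ollama first"
fi
```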

How to Use

Command: Manage Models

View, add, and remove models that are installed locally or on a configured remote Ollama Server. To manage and utilize models from the remote server, use the Add Server action.
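Under the hood, installed models can be listed through Ollama's REST API. The sketch below uses the documented `/api/tags` endpoint; the default local address `http://localhost:11434` is an assumption and would differ for a remote Ollama server, and `list_models` is an illustrative helper, not part of the extension's code.

```python
# Hedged sketch: list installed models via Ollama's /api/tags endpoint.
# The host below is Ollama's default local address; a remote server
# configured in 'Manage Models' would use its own host and port.
import json
import urllib.request


def list_models(host="http://localhost:11434"):
    """Return installed model names, or None if the server is unreachable."""
    try:
        with urllib.request.urlopen(f"{host}/api/tags", timeout=2) as resp:
            data = json.load(resp)
        return [m["name"] for m in data.get("models", [])]
    except OSError:
        return None  # server not running or host unreachable


models = list_models()
print(models if models is not None else "Ollama server not reachable")
```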

Command: Chat With Ollama

Chat with your preferred model directly from Raycast.

From the extension preferences you can choose how many messages to use as memory. By default it uses the last 20 messages.
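The memory window described above amounts to trimming the conversation history before it is sent to the model. A minimal sketch, assuming the chat history is a list of role/content messages; `MEMORY_SIZE` and `trim_history` are illustrative names, not the extension's actual code.

```python
# Sketch of a chat memory window: only the most recent messages are
# kept as context for the next request. MEMORY_SIZE mirrors the
# extension's default of 20 messages.
MEMORY_SIZE = 20


def trim_history(messages, memory_size=MEMORY_SIZE):
    """Keep only the last `memory_size` messages as conversation memory."""
    return messages[-memory_size:]


# Example: a 50-message conversation is trimmed to the last 20 messages.
history = [{"role": "user", "content": f"message {i}"} for i in range(50)]
context = trim_history(history)
print(len(context))           # 20
print(context[0]["content"])  # message 30
```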

Command: Create Custom Commands

All preconfigured commands are crafted for general use. This command allows you to create a custom command for your specific needs.

Prompts use the Raycast Prompt Explorer format, with the following tags supported: