Minimalistic Interface for Local Language Models (LLMs) (Powered by Ollama)

Introduction

Watch the video

This minimalistic UI is designed to act as a simple interface for Ollama models, allowing you to chat with your models, save conversations, and switch between them easily. The tool is built using React, Next.js, and Tailwind CSS, with LangchainJs and Ollama providing the magic behind the scenes.
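
The README doesn't spell out the wiring, but the flow is roughly: the Next.js frontend hands the chat messages to LangchainJs, which forwards them to the local Ollama server. Here is a minimal sketch of that interaction, assuming the @langchain/ollama package and a locally pulled model (llama3 is just an example, not necessarily what this project uses):

```ts
import { ChatOllama } from "@langchain/ollama";

// Point LangchainJs at the local Ollama server; the model name must match
// one you have pulled, e.g. via `ollama run llama3`.
const chat = new ChatOllama({
  baseUrl: "http://localhost:11434", // default Ollama address
  model: "llama3",                   // example model, swap in your own
});

async function main() {
  // invoke() sends a single prompt and resolves to an AIMessage.
  const reply = await chat.invoke("Explain what Ollama does in one sentence.");
  console.log(reply.content);
}

main().catch(console.error);
```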

Features

Technical Details

Getting Started

  1. Download and run Ollama on your machine with ollama serve or ollama run <model-name> (the server listens at http://localhost:11434/)

  2. Open a new terminal and navigate to the root of this project.

  3. Install the dependencies by running npm install in your terminal.

  4. Also check your Node.js version by running:

node -v

If it is lower than 14.0.1, you can update it as follows:

npm install -g n

Use n to install a specific Node.js version:

n 20.0.9

Verify the Node.js version:

node -v
  5. Optional: If running Ollama on a different host/device, customize the Ollama API base URL by copying .env.example to .env.local and setting the environment variable NEXT_PUBLIC_OLLAMA_BASEURL. If not set, the base URL will default to http://localhost:11434 (see the sketch after this list).

  6. Start the tool by running npm run dev (it should be available in your web browser at http://localhost:3000)
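
For reference, this is a minimal sketch of how the base URL resolution described in step 5 can work, together with a quick connectivity check against Ollama's REST API (GET /api/tags lists the locally pulled models). The listModels helper is hypothetical, not code from this project, and it assumes Node 18+ where fetch is built in:

```ts
// Resolve the Ollama base URL the way the README describes:
// use NEXT_PUBLIC_OLLAMA_BASEURL if set, otherwise fall back to localhost.
const baseUrl =
  process.env.NEXT_PUBLIC_OLLAMA_BASEURL ?? "http://localhost:11434";

// Hypothetical helper: ask Ollama which models are available locally.
// GET /api/tags is part of Ollama's documented REST API.
async function listModels(): Promise<string[]> {
  const res = await fetch(`${baseUrl}/api/tags`);
  if (!res.ok) {
    throw new Error(`Ollama not reachable at ${baseUrl} (HTTP ${res.status})`);
  }
  const data = (await res.json()) as { models: { name: string }[] };
  return data.models.map((m) => m.name);
}

listModels().then(console.log).catch(console.error);
```

If the check fails, make sure the ollama serve process from step 1 is still running.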

To-do

Troubleshooting

If you encounter any issues, feel free to reach out!

License

This project is licensed under the MIT License. See the LICENSE file for details.