

Deployed to GitHub Pages

LLM X

Chrome Extension | Web/Mobile app

<img src="https://raw.githubusercontent.com/mrdjohnson/llm-X/main/public/LLMX.png" alt="drawing" width="200"/>

Privacy statement:

LLM X does not make any external API calls. (Go ahead, check your network tab and see the Fetch section.) Your chats and image generations are 100% private. This site/app works completely offline.

Issues:

LLM X (web app) will not connect to a server that is not secure. This means you can use LLM X on localhost (considered a secure context), but if you're trying to use llm-x over a network, the server needs to be served over HTTPS or it will not work.
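One common workaround (not specific to LLM X, just a general pattern) is to put an HTTPS reverse proxy in front of the local model server. As a hedged sketch, assuming Ollama on its default port 11434 and a hostname you control with Caddy installed, a minimal Caddyfile might look like:

```
# Hypothetical hostname -- replace with your own.
# Caddy provisions a TLS certificate automatically for public hostnames.
my-ollama.example.com {
    reverse_proxy localhost:11434
}
```

With something like this in place, the browser sees an https:// origin and the secure-context restriction no longer applies. Any TLS-terminating proxy (nginx, Traefik, etc.) works the same way.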

Recent additions:

How To Use:

Prerequisites for application

How to use web client (no install):

Prerequisites for web client


Prerequisites for chrome extension

How to use offline:

How to use from project source:

Prerequisites for project source


Vite preview mode

Docker

Chrome Extension

Goals / Features

Screenshots:

Showing Chrome extension mode with Google's on-device Gemini Nano
Showing Chrome extension mode with Ollama's llama3.2-vision
Showing ability to run Ollama and LM Studio at the same time
Conversation about logo
Image generation example!
Showing off omnibar and code
Showing off code and light theme
Responding about a cat
LaTeX support!
Another logo response

What is this? A ChatGPT-style UI for the niche group of folks who run Ollama (think of it like an offline ChatGPT server) locally. Supports sending and receiving images and text! WORKS OFFLINE through PWA (Progressive Web App) standards (it's not dead!)

Why do this? I have been interested in LLM UIs for a while now, and this seemed like a good intro application. I've been introduced to a lot of modern technologies thanks to this project as well; it's been fun!

Why so many buzz words? I couldn't help but bee cool 😎

Tech Stack (thank you's):

Logic helpers:

UI Helpers:

Project setup helpers:

Inspiration: the ollama-ui project, which allows users to connect to Ollama via a web app

Perplexity.ai: Perplexity has made some amazing UI advancements in the LLM UI space, and I have been very interested in reaching that point. Hopefully this starter project lets me get closer to doing something similar!

Getting started with local development

(please note the minimum engine requirements in the package.json)

Clone the project, and run yarn in the root directory

yarn dev starts a local instance and opens a browser tab under https:// (for PWA reasons)
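Put together, the steps above look something like this (a sketch, assuming git and yarn are installed; the repository URL matches the image link earlier in this README):

```shell
# Clone the project and enter the root directory
git clone https://github.com/mrdjohnson/llm-X.git
cd llm-X

# Install dependencies (check the engine requirements in package.json first)
yarn

# Start the local dev instance; opens a browser tab under https:// (for PWA reasons)
yarn dev
```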

MISC