
🔍 Copilot for Obsidian


Copilot for Obsidian is a free and open-source ChatGPT interface right inside Obsidian. It has a minimalistic design and is straightforward to use.

My goal is to make this AI assistant local-first and privacy-focused. It has a local vector store and can work with local models for chat and QA completely offline! More features are under construction. Stay tuned!

<img src="./images/ui.png" alt="UI">

If you enjoy Copilot for Obsidian, please consider sponsoring this project, or donate by clicking the button below. It will help me keep this project going to build toward a privacy-focused AI experience. Thank you!

<a href="https://www.buymeacoffee.com/logancyang" target="_blank"><img src="https://cdn.buymeacoffee.com/buttons/v2/default-yellow.png" alt="Buy Me A Coffee" style="height: 40px !important;width: 150px !important;" ></a>

SPECIAL THANKS TO OUR SPONSORS: @Arlorean, @dashinja, @emaynard, @scmarinelli, @borthwick

🎉 HIGHLY ANTICIPATED v2.5.0: Vault QA (BETA) mode (with local embedding support)! Claude 3! 🎉🎉🎉

<a href="https://youtu.be/NSoKRYNlOls" target="_blank"><img src="./images/thumbnail-vault-qa.png" width="700" /></a>

The biggest, most highly anticipated update yet is here!

The brand new Vault QA (BETA) mode allows you to chat with your whole vault, powered by a local vector store. Ask questions and get answers with cited sources!

What's more, with Ollama local embeddings and local chat models, this mode works completely offline! This is a huge step toward truly private and local AI assistance inside Obsidian!
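A vault QA flow of this kind typically embeds each note into a vector, stores those vectors locally, and retrieves the most similar notes to ground the answer. Below is a minimal, illustrative sketch of that retrieval step using toy hand-made embeddings — this is the general technique, not the plugin's actual implementation (real embeddings come from a model such as an Ollama embedding model):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def retrieve(query_vec, store, k=2):
    """Return the k note titles whose embeddings are most similar to the query."""
    ranked = sorted(store.items(),
                    key=lambda item: cosine_similarity(query_vec, item[1]),
                    reverse=True)
    return [title for title, _ in ranked[:k]]

# Toy "vector store": note title -> embedding (titles and vectors are made up).
store = {
    "Gardening.md": [0.9, 0.1, 0.0],
    "Finances.md":  [0.1, 0.9, 0.1],
    "Travel.md":    [0.0, 0.2, 0.9],
}

print(retrieve([0.8, 0.2, 0.1], store, k=1))  # ['Gardening.md']
```

The retrieved notes are then passed to the chat model as context, which is what makes cited sources possible.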

Since the Claude 3 models were announced today (3/4/2024), I managed to add them in this release as well. Get your API key from Anthropic's site; you can now enter it in the settings.

(Huge shoutout to @AntoineDao for working with me on Vault QA mode!)

FREE Models

OpenRouter.ai hosts some of the best open-source models available at the moment, such as MistralAI's new models. Check out their website for all the good stuff they have!

LM Studio and Ollama are the two best choices for running local models on your own machine. Please check out the super simple setup guide here. Don't forget to flex your creativity in custom prompts using local models!
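For reference, an Ollama setup usually boils down to two commands. The model name below (`mistral`) is just an example, and the `OLLAMA_ORIGINS` value is my understanding of what lets the Obsidian desktop app reach the local server — please confirm both against the setup guide:

```shell
# One-time download of a local model (model name is an example)
ollama pull mistral

# Start the local server; OLLAMA_ORIGINS allows requests from the
# Obsidian desktop app (origin value is an assumption -- see the guide)
OLLAMA_ORIGINS=app://obsidian.md* ollama serve
```

Once the server is running, point Copilot's model settings at Ollama and everything stays on your machine.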

🛠️ Features

🎬 Demos

🤗 New to Copilot? Quick Guide for Beginners:

<a href="https://www.youtube.com/watch?v=jRCDAg2sck8" target="_blank"><img src="./images/thumbnail.png" width="700" /></a>

To use Copilot, you need API keys from one of the LLM providers such as OpenAI, Azure OpenAI, Gemini, OpenRouter (Free!). You can also use it offline with LM Studio or Ollama!

Once you put your valid API key in the Copilot setting, don't forget to click Save and Reload. If you are a new user and have trouble setting it up, please open an issue and describe it in detail.

💬 User Custom Prompt: Create as Many Copilot Commands as You Like!

You can add, apply, edit, and delete your own custom Copilot commands, all persisted in your local Obsidian environment! Check out the demo video below!

<a href="https://www.youtube.com/watch?v=apuV1Jz6ObE" target="_blank"><img src="./images/thumbnail2.png" width="700" /></a>

🧠 Advanced Custom Prompt! Unleash your creativity and fully leverage the long context window!

<a href="https://youtu.be/VPNlXeCsH74?si=eYjJhO2cZtU7VrQz" target="_blank"><img src="./images/thumbnail-adv-prompt-tutorial.png" width="700" /></a>

This video shows how Advanced Custom Prompt works. This form of templating opens up many more possibilities with long-context-window models. If you have your own creative use cases, don't hesitate to share them in the Discussions or in the YouTube comment section!
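As a flavor of what such a template can look like, here is a sketch of a prompt that pulls in notes as variables. The variable syntax (`{activeNote}` and `{[[Note Title]]}`) and the note title `Weekly Review` are my assumptions for illustration — verify the exact syntax against the tutorial video:

```md
Summarize the key points of {activeNote} in five bullet points,
then compare them with the arguments in {[[Weekly Review]]}.
```

With a long-context model, templates like this can stitch several full notes into a single prompt.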

🔧 Copilot Settings

The settings page lets you set your own temperature, max tokens, and conversation context based on your needs.

New models will be added as I get access.

You can also use your own system prompt and choose between different embedding providers, such as OpenAI, CohereAI (their trial API is free and quite stable!), and the Hugging Face Inference API (free but sometimes times out).

⚙️ Installation

Copilot for Obsidian is now available in the Obsidian Community Plugins directory!

Once installed, you will see the chat icon in the left ribbon; clicking it opens the chat panel on the right. Don't forget to check out the Copilot commands available in the command palette!

⛓️ Manual Installation
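If you prefer a manual install, the standard Obsidian approach is to copy the release files into your vault's plugin folder. The vault path and the plugin folder name `copilot` below are assumptions — check the release page and your vault layout:

```shell
# Download main.js, manifest.json, and styles.css from the latest GitHub
# release, then copy them into your vault (paths are placeholders):
mkdir -p /path/to/vault/.obsidian/plugins/copilot
cp main.js manifest.json styles.css /path/to/vault/.obsidian/plugins/copilot/
```

Then enable Copilot under Settings → Community plugins and reload Obsidian.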

🔔 Note

📣 Again, please always be mindful of the API cost if you use GPT-4 with a long context!

🤔 FAQ (please read before submitting an issue)

<details>
<summary>"You do not have access to this model"</summary>
</details>
<details>
<summary>It's not using my note as context</summary>
</details>
<details>
<summary>Unresponsive QA when using Huggingface as the Embedding Provider</summary>
</details>
<details>
<summary>"insufficient_quota"</summary>
</details>
<details>
<summary>"context_length_exceeded"</summary>
</details>
<details>
<summary>Azure issue</summary>
</details>

When opening an issue, please include relevant console logs. You can go to Copilot's settings and turn on "Debug mode" at the bottom for more console messages!

📝 Planned features (feedback welcome)

🙏 Thank You

Did you know that even the timer on Alexa needs internet access? In this era of corporate-dominated internet, I still believe there's room for powerful tech that's focused on privacy. A great local AI agent in Obsidian is the ultimate form of this plugin. If you share my vision, please consider sponsoring this project or buying me a coffee!

<a href="https://www.buymeacoffee.com/logancyang" target="_blank"><img src="https://cdn.buymeacoffee.com/buttons/v2/default-yellow.png" alt="Buy Me A Coffee" style="height: 40px !important;width: 150px !important;" ></a>

Please also help spread the word by sharing about the Copilot for Obsidian Plugin on Twitter, Reddit, or any other social media platform you use.

You can find me on Twitter/X @logancyang.