Learning Ollama

Just one of the things I'm learning. https://github.com/hchiam/learning

Ollama makes it easy to run LLMs offline/locally/privately on your computer.

Setup

  1. Download the app from the ollama.com website to install the `ollama` command.
  2. `ollama run llama2` automatically downloads the llama2 model if needed, and lets you talk with the model offline directly in the terminal, with the Ollama app running in the background (see the example session after this list).
    • (Otherwise, to run Ollama in the background without the app, run `ollama serve` in another terminal window.)
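
For reference, here's a minimal sketch of what the whole flow can look like in the terminal. The llama2 model name comes from the steps above; the curl call uses Ollama's local REST API, which listens on port 11434 by default (the prompt text is just an illustration):

```sh
# Terminal 1: run the Ollama server in the background
# (skip this if the Ollama desktop app is already running)
ollama serve

# Terminal 2: download llama2 if needed, then chat with it offline
ollama run llama2

# Optional: with the server running, the same model is also reachable
# through Ollama's local REST API (default port 11434):
curl http://localhost:11434/api/generate -d '{
  "model": "llama2",
  "prompt": "Why is the sky blue?",
  "stream": false
}'
```

Setting "stream": false returns one complete JSON response instead of a stream of partial tokens.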

Extra notes

Further resources

Example uses