# Ollama Copilot

A proxy that lets you use Ollama as a code-completion assistant, similar to GitHub Copilot.
## Installation

### Ollama

Ensure Ollama is installed:

```shell
curl -fsSL https://ollama.com/install.sh | sh
```

Or follow the manual installation instructions.
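To confirm the install put `ollama` on your `PATH`, a quick check (a sketch; the fallback message is only illustrative):

```shell
# Check whether the ollama binary is on PATH after installation.
# Prints its resolved path when found, or a hint otherwise.
if command -v ollama >/dev/null 2>&1; then
  command -v ollama
else
  echo "ollama not found on PATH; try the manual install"
fi
```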
### Models

Pull the default model expected by ollama-copilot:

```shell
ollama pull codellama:code
```
### ollama-copilot

```shell
go install github.com/bernardo-bruning/ollama-copilot@latest
```
## Running

Ensure your `$PATH` includes `$HOME/go/bin` or `$GOPATH/bin`. For example, in `~/.bashrc` or `~/.zshrc`:

```shell
export PATH="$HOME/go/bin:$GOPATH/bin:$PATH"
```
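If you are unsure whether the change took effect, this snippet reports whether `$HOME/go/bin` is already on your `PATH` (a sketch; re-source your rc file if it prints "missing"):

```shell
# Report whether $HOME/go/bin is currently on PATH.
case ":$PATH:" in
  *":$HOME/go/bin:"*) echo "found" ;;
  *) echo "missing" ;;
esac
```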
Then start the proxy:

```shell
ollama-copilot
```
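Once the proxy is running, you can sanity-check that it is listening on port 11435, the proxy port used in the IDE configuration (a sketch, assuming `curl` is available; the exact response depends on the proxy):

```shell
# Smoke-test: check that something is listening on the proxy port.
# Prints an HTTP status code on success, or a warning when unreachable.
curl -s -o /dev/null -w "%{http_code}\n" http://localhost:11435/ \
  || echo "proxy not reachable on :11435"
```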
## Configure IDE

### Neovim

- Install copilot.vim
- Configure the proxy variables:

```vim
let g:copilot_proxy = 'http://localhost:11435'
let g:copilot_proxy_strict_ssl = v:false
```
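If your Neovim configuration uses `init.lua`, the same globals can be set in Lua (a sketch; copilot.vim reads the same `g:` variables):

```lua
-- init.lua equivalent of the vimscript settings above
vim.g.copilot_proxy = 'http://localhost:11435'
vim.g.copilot_proxy_strict_ssl = false
```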
### VS Code

- Install the GitHub Copilot extension
- Sign in or sign up with your GitHub account
- Open your settings configuration (`settings.json`) and insert:

```json
{
  "github.copilot.advanced": {
    "debug.overrideProxyUrl": "http://localhost:11437"
  },
  "http.proxy": "http://localhost:11435",
  "http.proxyStrictSSL": false
}
```
## Roadmap

- Enable completions API usage; fill in the middle.
- Enable flexible model configuration (currently only `codellama:code` is supported).
- Create self-installing functionality.
- Windows setup.
- Documentation on how to use.