LlamaChat banner

<h3 align="center">Chat with your favourite LLaMA models, right on your Mac</h3> <hr />

LlamaChat is a macOS app that lets you chat with LLaMA, Alpaca, and GPT4All models, all running locally on your Mac.

<img src="https://github.com/alexrozanski/LlamaChat/raw/main/Resources/screenshot.png" width="100%">

🚀 Getting Started

LlamaChat requires macOS 13 Ventura, and either an Intel or Apple Silicon processor.

Direct Download

Download a .dmg containing the latest version 👉 here 👈.

Building from Source

git clone https://github.com/alexrozanski/LlamaChat.git
cd LlamaChat
open LlamaChat.xcodeproj

NOTE: LlamaChat includes Sparkle for auto-updates; Sparkle will fail to load if LlamaChat is not signed, so make sure to use a valid signing certificate when building and running LlamaChat.

NOTE: Model inference runs very slowly in Debug builds, so if you're building from source make sure the Build Configuration under LlamaChat > Edit Scheme... > Run is set to Release.
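
If you prefer the command line, an equivalent Release build can be produced with xcodebuild. This is a minimal sketch rather than an official build recipe: the LlamaChat scheme name is assumed from the project name, and your usual code-signing settings still apply.

```bash
# Build the Release configuration from the command line (scheme name assumed).
xcodebuild -project LlamaChat.xcodeproj \
           -scheme LlamaChat \
           -configuration Release \
           build
```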

✨ Features

🔮 Models

NOTE: LlamaChat doesn't ship with any model files; you must obtain these from their respective sources, in accordance with the terms and conditions under which they are distributed.

For example, a raw LLaMA model directory (here containing a 13B checkpoint) is laid out like this:

.
│   ...
├── 13B
│   ├── checklist.chk.txt
│   ├── consolidated.00.pth
│   ├── consolidated.01.pth
│   └── params.json
│   ...
└── tokenizer.model
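
As a quick sanity check before adding a model in LlamaChat, you can confirm that the key files from the layout above are present. This is only an illustrative sketch; the ~/models/LLaMA path is an assumption, so substitute your own model directory.

```bash
# Hypothetical model directory; replace with wherever your LLaMA files live.
MODEL_DIR=~/models/LLaMA

# These files should exist for a 13B checkpoint laid out as shown above.
ls "$MODEL_DIR/tokenizer.model" \
   "$MODEL_DIR/13B/params.json" \
   "$MODEL_DIR/13B/consolidated.00.pth" \
   "$MODEL_DIR/13B/consolidated.01.pth"
```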

๐Ÿ‘ฉโ€๐Ÿ’ป Contributing

Pull Requests and Issues are welcome and much appreciated. Please make sure to adhere to the Code of Conduct at all times.

LlamaChat is built entirely in Swift and SwiftUI, and uses llama.swift under the hood to run inference and perform model operations.

The project is mostly built using MVVM and makes heavy use of Combine and Swift Concurrency.

โš–๏ธ License

LlamaChat is licensed under the MIT license.