LLAMA.CPP DART

A high-performance Dart binding for llama.cpp that brings local LLM text generation to Dart and Flutter applications, with integration options ranging from raw FFI to a Flutter-friendly isolate API.

Overview

This library provides three levels of abstraction for integrating llama.cpp into your Dart/Flutter projects, allowing you to choose the right balance between control and convenience:

  1. Low-Level FFI Bindings: Direct access to llama.cpp functions
  2. High-Level Wrapper: Simplified, object-oriented API
  3. Managed Isolate: Flutter-friendly, non-blocking implementation

Usage Examples

Low-Level FFI Bindings

Direct llama.cpp integration with maximum control:

import 'dart:ffi';

import 'package:llama_cpp_dart/src/llama_cpp.dart';

void main() {
  final lib = llama_cpp(DynamicLibrary.open("libllama.dylib"));
  // Initialize model, context, and sampling parameters
  // See examples/low_level.dart for complete example
}

High-Level Wrapper

Simplified API for common use cases:

import 'package:llama_cpp_dart/llama_cpp_dart.dart';

void main() {
  Llama.libraryPath = "libllama.dylib";
  final llama = Llama("path/to/model.gguf");
  
  llama.setPrompt("2 * 2 = ?");
  while (true) {
    var (token, done) = llama.getNext();
    print(token);
    if (done) break;
  }
  llama.dispose();
}

Managed Isolate

Perfect for Flutter applications:

import 'package:llama_cpp_dart/llama_cpp_dart.dart';

void main() async {
  final loadCommand = LlamaLoad(
    path: "path/to/model.gguf",
    modelParams: ModelParams(),
    contextParams: ContextParams(),
    samplingParams: SamplerParams(),
    format: ChatMLFormat(),
  );

  final llamaParent = LlamaParent(loadCommand);
  await llamaParent.init();

  llamaParent.stream.listen((response) => print(response));
  llamaParent.sendPrompt("2 * 2 = ?");
}

Getting Started

Prerequisites

Building llama.cpp Library

  1. Clone the llama.cpp repository:
git clone https://github.com/ggerganov/llama.cpp
  2. Compile it into a shared library (e.g. libllama.dylib on macOS, libllama.so on Linux, llama.dll on Windows)
  3. Place the compiled library in a directory accessible to your project
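As a sketch of the compile step, llama.cpp can be built as a shared library with its CMake build system. This assumes CMake and a C/C++ toolchain are installed; the exact output path varies by platform and generator:

```shell
cd llama.cpp

# Configure a shared-library build (Release for performance)
cmake -B build -DBUILD_SHARED_LIBS=ON -DCMAKE_BUILD_TYPE=Release

# Compile; the shared library is typically written to build/bin/ or build/
cmake --build build --config Release
```

Hardware-acceleration flags (e.g. Metal, CUDA) can be added at the configure step; see llama.cpp's own build documentation for the options available on your platform.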

Installation

Add to your pubspec.yaml:

dependencies:
  llama_cpp_dart: ^latest_version
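Alternatively, the dependency can be added from the command line, which records the current published version for you:

```shell
# For a pure Dart project
dart pub add llama_cpp_dart

# For a Flutter project
flutter pub add llama_cpp_dart
```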

License

This project is licensed under the MIT License - see the LICENSE.md file for details.