Ingest

Ingest is a tool I've written to make my life easier when preparing content for LLMs.

It parses directories of plain text files, such as source code, into a single markdown file suitable for ingestion by AI/LLMs.

[Image: ingest]

Ingest can also pass the prompt directly to an LLM such as Ollama for processing.

[Image: ingest with --llm]

And ingest web URLs.

[Image: ingest with --web]

Features

  - Parses directories of plain text files (such as source code) into a single markdown prompt
  - Approximate token counting, clipboard copy, and output to file
  - Include/exclude glob patterns and customisable output templates
  - Git diff inclusion
  - VRAM estimation and model compatibility checks
  - LLM integration with Ollama and other OpenAI-compatible APIs
  - Web crawling and ingestion
  - Shell completions for Bash, Zsh, and Fish

Ingest Intro ("Podcast" Episode):

<audio src="https://github.com/sammcj/smcleod_files/raw/refs/heads/master/audio/podcast-ep-sw/Podcast%20Episode%20-%20Ingest.mp3" controls preload></audio>

Installation

go install (recommended)

Make sure you have Go installed on your system, then run:

go install github.com/sammcj/ingest@HEAD

curl

I don't recommend this method as it's not as easy to update, but you can use the following command:

curl -sL https://raw.githubusercontent.com/sammcj/ingest/refs/heads/main/scripts/install.sh | bash

Manual install

  1. Download the latest release from the releases page (https://github.com/sammcj/ingest/releases)
  2. Move the binary to a directory in your PATH, e.g. mv ingest* /usr/local/bin/ingest
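
For example, after downloading and extracting a release into your current directory, the remaining steps might look like this (a sketch; the exact binary name depends on the release asset you grab):

chmod +x ./ingest*   # make the downloaded binary executable
mv ./ingest* /usr/local/bin/ingest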

Usage

Basic usage:

ingest [flags] <paths>

ingest will default to the current working directory if no path is provided, e.g.:

$ ingest

⠋ Traversing directory and building tree...  [0s]
[ℹ️] Tokens (Approximate): 15,945
[✅] Copied to clipboard successfully.

The first time ingest runs, it will download a small tokeniser called 'cl100k_base.tiktoken', which is used for tokenisation.

Generate a prompt from a directory, including only Python files:

ingest -i "**/*.py" /path/to/project

Generate a prompt with git diff and copy to clipboard:

ingest -d /path/to/project

Generate a prompt for multiple files/directories:

ingest /path/to/project /path/to/other/project

Generate a prompt and save to a file:

ingest -o output.md /path/to/project

You can also provide individual files or multiple paths:

ingest /path/to/file /path/to/directory

Save output to ~/ingest/<directory_name>.md:

ingest --save /path/to/project

VRAM Estimation and Model Compatibility

Ingest includes a feature to estimate VRAM requirements and check model compatibility using Gollama's vramestimator package. This helps you determine whether your generated content will fit within the specified model, VRAM, and quantisation constraints.

To use this feature, add the following flags to your ingest command:

ingest --vram --model <model_id> [--memory <memory_in_gb>] [--quant <quantisation>] [--context <context_length>] [--kvcache <kv_cache_quant>] [--quanttype <quant_type>] [other flags] <paths>

Examples:

Estimate VRAM usage for a specific context:

ingest --vram --model NousResearch/Hermes-2-Theta-Llama-3-8B --quant q4_k_m --context 2048 --kvcache q4_0 .
# Estimated VRAM usage: 5.35 GB

Calculate maximum context for a given memory constraint:

ingest --vram --model NousResearch/Hermes-2-Theta-Llama-3-8B --quant q4_k_m --memory 6 --kvcache q8_0 .
# Maximum context for 6.00 GB of memory: 5069

Find the best BPW (Bits Per Weight):

ingest --vram --model NousResearch/Hermes-2-Theta-Llama-3-8B --memory 6 --quanttype gguf .
# Best BPW for 6.00 GB of memory: IQ3_S

The tool also works for exl2 (ExllamaV2) models:

ingest --vram --model NousResearch/Hermes-2-Theta-Llama-3-8B --quant 5.0 --context 2048 --kvcache q4_0 . # For exl2 models
ingest --vram --model NousResearch/Hermes-2-Theta-Llama-3-8B --quant 5.0 --memory 6 --kvcache q8_0 . # For exl2 models

When using the VRAM estimation feature along with content generation, ingest will provide information about the generated content's compatibility with the specified constraints:

ingest --vram --model NousResearch/Hermes-2-Theta-Llama-3-8B --memory 8 --quant q4_0 .
⠋ Traversing directory and building tree... [0s]
[ℹ️] 14,702 Tokens (Approximate)
[ℹ️] Maximum context for 8.00 GB of memory: 10240
[✅] Generated content (14,702 tokens) fits within maximum context.
Top 5 largest files (by estimated token count):
1. /Users/samm/git/sammcj/ingest/main.go (4,682 tokens)
2. /Users/samm/git/sammcj/ingest/filesystem/filesystem.go (2,694 tokens)
3. /Users/samm/git/sammcj/ingest/README.md (1,895 tokens)
4. /Users/samm/git/sammcj/ingest/utils/utils.go (948 tokens)
5. /Users/samm/git/sammcj/ingest/config/config.go (884 tokens)
[✅] Copied to clipboard successfully.

Available flags for VRAM estimation:

  --vram: enable VRAM estimation and model compatibility checks
  --model: the model ID to estimate against (e.g. NousResearch/Hermes-2-Theta-Llama-3-8B)
  --memory: available (V)RAM in GB
  --quant: quantisation level (e.g. q4_k_m), or bits per weight for exl2 models (e.g. 5.0)
  --context: context length to estimate for
  --kvcache: KV cache quantisation (e.g. q4_0, q8_0)
  --quanttype: quantisation type (e.g. gguf)

Ingest will provide appropriate output based on the combination of flags used, such as estimating VRAM usage, calculating maximum context, or finding the best BPW. If the generated content fits within the specified constraints, you'll see a success message. Otherwise, you'll receive a warning that the content may not fit.

LLM Integration

Ingest can pass the generated prompt to LLMs that have an OpenAI-compatible API, such as Ollama, for processing.

ingest --llm /path/to/project

By default this will use any prompt suffix from your configuration file:

./ingest utils.go --llm
⠋ Traversing directory and building tree...  [0s]
This is Go code for a file named `utils.go`. It contains various utility functions for
handling terminal output, clipboard operations, and configuration directories.
...

You can provide a prompt suffix to append to the generated prompt:

ingest --llm -p "explain this code" /path/to/project

Web Crawling & Ingestion

Crawl with explicit web mode

ingest --web https://example.com

Auto-detect URL and crawl

ingest https://example.com

Crawl with domain restriction

ingest --web --web-domains example.com https://example.com

Crawl deeper with more concurrency

ingest --web --web-depth 3 --web-concurrent 10 https://example.com

Exclude a path from the crawl

ingest --web https://example.com -e '/posts/**'

Shell Completions

Ingest includes shell completions for Bash, Zsh, and Fish.

source <(ingest completion zsh)
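
The Bash and Fish equivalents follow the usual pattern for Cobra-based CLIs; the two lines below are an assumption rather than something documented here, so check ingest completion -h if they don't work as expected:

source <(ingest completion bash)
ingest completion fish | source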

See ingest completion -h for more information.

Configuration

Ingest uses a configuration file located at ~/.config/ingest/ingest.json.

You can make Ollama processing run without prompting by setting "llm_auto_run": true in the config file.
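
For example, a minimal ~/.config/ingest/ingest.json enabling this might look like the following (only the llm_auto_run key is shown; keep whatever other keys your generated config already contains):

{
  "llm_auto_run": true
}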

The config file also contains settings such as the prompt suffix appended when using --llm.

Ingest uses the following directories for user-specific configuration:

  1. ~/.config/ingest/patterns/exclude for custom exclusion globs (see Excludes below)
  2. ~/.config/ingest/patterns/templates for custom output templates (see Templates below)

These directories will be created automatically on first run, along with README files explaining their purpose.

Flags

Run ingest -h for the full list of available flags; the most commonly used ones are shown in the examples throughout this page.

Excludes

You can get a list of the default excludes by passing --print-default-excludes to ingest. These are defined in defaultExcludes.go.

To override the default excludes, create a default.glob file in ~/.config/ingest/patterns/exclude with the patterns you want to exclude.
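
For example, one way to create that file from the shell (the glob patterns shown are purely illustrative, assuming one pattern per line, and are not the shipped defaults):

mkdir -p ~/.config/ingest/patterns/exclude
cat > ~/.config/ingest/patterns/exclude/default.glob <<'EOF'
**/node_modules/**
**/dist/**
**/*.lock
EOF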

Templates

Templates are written in standard Go templating syntax.

You can get a list of the default templates by passing --print-default-template to ingest. These are defined in template.go.

To override the default templates, create a default.tmpl file in ~/.config/ingest/patterns/templates with the template you want to use by default.
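
One way to start customising (assuming --print-default-template writes the raw template text to stdout) is to dump the built-in template into that location and edit it:

mkdir -p ~/.config/ingest/patterns/templates
# Assumes the flag prints the raw template to stdout
ingest --print-default-template > ~/.config/ingest/patterns/templates/default.tmpl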

Contributing

Contributions are welcome. Please feel free to submit a Pull Request.

License

Acknowledgements
