<p align="center"> <img src="./assets/bricks-logo.png" width="150" /> </p>

# BricksLLM: AI Gateway For Putting LLMs In Production

<p align="center"> <a href='https://www.ycombinator.com/'><img alt='YCombinator S22' src='https://img.shields.io/badge/Y%20Combinator-2022-orange'/></a> <a href='http://makeapullrequest.com'><img alt='PRs Welcome' src='https://img.shields.io/badge/PRs-welcome-43AF11.svg?style=shields'/></a> <a href="https://discord.gg/dFvdt4wqWh"><img src="https://img.shields.io/badge/discord-BricksLLM-blue?logo=discord&labelColor=2EB67D" alt="Join BricksLLM on Discord"></a> <a href="https://github.com/bricks-cloud/bricks/blob/main/LICENSE"><img src="https://img.shields.io/badge/license-MIT-red" alt="License"></a> </p>

> [!TIP]
> A managed version of BricksLLM is also available! It is production ready, and comes with a dashboard to make interacting with BricksLLM easier. Try us out for free today!

BricksLLM is a cloud-native AI gateway written in Go. It currently provides native support for OpenAI, Anthropic, Azure OpenAI, and vLLM. BricksLLM aims to provide enterprise-level infrastructure that can power any LLM production use case.

## Features

## Getting Started

The easiest way to get started with BricksLLM is through BricksLLM-Docker.

### Step 1 - Clone BricksLLM-Docker repository

```bash
git clone https://github.com/bricks-cloud/BricksLLM-Docker
```

### Step 2 - Change to BricksLLM-Docker directory

```bash
cd BricksLLM-Docker
```

### Step 3 - Deploy BricksLLM locally with Postgresql and Redis

```bash
docker compose up
```

You can run this in detached mode using the `-d` flag: `docker compose up -d`

### Step 4 - Create a provider setting

```bash
curl -X PUT http://localhost:8001/api/provider-settings \
   -H "Content-Type: application/json" \
   -d '{
          "provider":"openai",
          "setting": {
             "apikey": "YOUR_OPENAI_KEY"
          }
       }'
```

Copy the `id` from the response.
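If you prefer scripting this step, here is a minimal Python sketch of the same call (stdlib only). It assumes, per the instruction above, that the response body is a JSON object with an `id` field; the endpoint and payload mirror the curl command:

```python
import json
import urllib.request

def extract_setting_id(body: str) -> str:
    """Pull the id out of the provider-setting response JSON."""
    return json.loads(body)["id"]

def create_provider_setting(api_key: str, base: str = "http://localhost:8001") -> str:
    """PUT a provider setting (mirrors the curl call above) and return its id."""
    payload = json.dumps({"provider": "openai", "setting": {"apikey": api_key}}).encode()
    req = urllib.request.Request(
        f"{base}/api/provider-settings",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="PUT",
    )
    with urllib.request.urlopen(req) as resp:
        return extract_setting_id(resp.read().decode())

# Usage (requires the gateway running locally):
#   setting_id = create_provider_setting("YOUR_OPENAI_KEY")
```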

### Step 5 - Create a Bricks API key

Use the `id` from the previous step as a `settingId` to create a key with a rate limit of 2 req/min and a spend limit of 25 cents.

```bash
curl -X PUT http://localhost:8001/api/key-management/keys \
   -H "Content-Type: application/json" \
   -d '{
          "name": "My Secret Key",
          "key": "my-secret-key",
          "tags": ["mykey"],
          "settingIds": ["ID_FROM_STEP_FOUR"],
          "rateLimitOverTime": 2,
          "rateLimitUnit": "m",
          "costLimitInUsd": 0.25
       }'
```
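The key-creation body can also be built programmatically. A small Python sketch — the field names are exactly those in the call above, and `0.25` USD encodes the 25-cent limit:

```python
import json

def key_payload(setting_id: str) -> bytes:
    """Build the key-creation body used above: 2 req/min, $0.25 spend limit."""
    return json.dumps({
        "name": "My Secret Key",
        "key": "my-secret-key",
        "tags": ["mykey"],
        "settingIds": [setting_id],  # the provider-setting id from step 4
        "rateLimitOverTime": 2,
        "rateLimitUnit": "m",        # "m" = per-minute rate-limit window
        "costLimitInUsd": 0.25,      # 25 cents
    }).encode()
```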

Congratulations, you are done!

Then, just redirect your requests to us and use OpenAI as you would normally. For example:

```bash
curl -X POST http://localhost:8002/api/providers/openai/v1/chat/completions \
   -H "Authorization: Bearer my-secret-key" \
   -H "Content-Type: application/json" \
   -d '{
          "model": "gpt-3.5-turbo",
          "messages": [
              {
                  "role": "system",
                  "content": "hi"
              }
          ]
      }'
```
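The same proxied chat completion can be issued from Python (stdlib only). This is a sketch; the URL, header, and body mirror the curl command above:

```python
import json
import urllib.request

PROXY = "http://localhost:8002/api/providers/openai/v1"

def chat_request(key: str, model: str, content: str) -> urllib.request.Request:
    """Build the proxied chat-completion request shown above."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "system", "content": content}],
    }).encode()
    return urllib.request.Request(
        f"{PROXY}/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {key}",  # the Bricks key, not your OpenAI key
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Usage (requires the gateway running locally):
#   with urllib.request.urlopen(chat_request("my-secret-key", "gpt-3.5-turbo", "hi")) as resp:
#       print(json.loads(resp.read()))
```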

Or if you're using an SDK, you could change its baseURL to point to us. For example:

```js
// OpenAI Node SDK v4
import OpenAI from 'openai';

const openai = new OpenAI({
  apiKey: "my-secret-key", // the Bricks key created earlier
  baseURL: "http://localhost:8002/api/providers/openai/v1", // redirect to us
});
```

## How to Update

To update to the latest version:

```bash
docker pull luyuanxin1995/bricksllm:latest
```

To update to a particular version:

```bash
docker pull luyuanxin1995/bricksllm:1.4.0
```

## Documentation

### Environment variables

| Name | Type | Description | Default |
| ---- | ---- | ----------- | ------- |
| POSTGRESQL_HOSTS | required | Hosts for Postgresql DB. Separated by `,` | `localhost` |
| POSTGRESQL_DB_NAME | optional | Name for Postgresql DB. | |
| POSTGRESQL_USERNAME | required | Postgresql DB username | |
| POSTGRESQL_PASSWORD | required | Postgresql DB password | |
| POSTGRESQL_SSL_MODE | optional | Postgresql SSL mode | `disable` |
| POSTGRESQL_PORT | optional | The port that Postgresql DB runs on | `5432` |
| POSTGRESQL_READ_TIME_OUT | optional | Timeout for Postgresql read operations | `2m` |
| POSTGRESQL_WRITE_TIME_OUT | optional | Timeout for Postgresql write operations | `5s` |
| REDIS_HOSTS | required | Host for Redis. Separated by `,` | `localhost` |
| REDIS_PASSWORD | optional | Redis password | |
| REDIS_PORT | optional | The port that Redis runs on | `6379` |
| REDIS_READ_TIME_OUT | optional | Timeout for Redis read operations | `1s` |
| REDIS_WRITE_TIME_OUT | optional | Timeout for Redis write operations | `500ms` |
| IN_MEMORY_DB_UPDATE_INTERVAL | optional | The interval at which the BricksLLM API gateway polls Postgresql for the latest key configurations | `1s` |
| STATS_PROVIDER | optional | `datadog` or `Host:Port` (e.g. `127.0.0.1:8125`) for statsd | |
| PROXY_TIMEOUT | optional | Timeout for proxy HTTP requests | `600s` |
| NUMBER_OF_EVENT_MESSAGE_CONSUMERS | optional | Number of event message consumers that handle counting tokens and inserting events into the DB | `3` |
| AWS_SECRET_ACCESS_KEY | optional | Used by the PII detection feature | |
| AWS_ACCESS_KEY_ID | optional | Used by the PII detection feature | |
| AMAZON_REGION | optional | Region for AWS | `us-west-2` |
| AMAZON_REQUEST_TIMEOUT | optional | Timeout for Amazon requests | `5s` |
| AMAZON_CONNECTION_TIMEOUT | optional | Timeout for Amazon connections | `10s` |
| ADMIN_PASS | optional | Simple password for the admin server | |
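For example, these variables are typically supplied through the compose file's `environment` block. This is only a sketch: the service names and credential values below are placeholders, not part of the official setup.

```yaml
services:
  bricksllm:
    image: luyuanxin1995/bricksllm:latest
    environment:
      - POSTGRESQL_HOSTS=postgresql   # comma-separated if more than one host
      - POSTGRESQL_USERNAME=postgres
      - POSTGRESQL_PASSWORD=postgres
      - REDIS_HOSTS=redis
      - PROXY_TIMEOUT=600s
```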

### Admin Server

Swagger Doc

### Proxy Server

Swagger Doc