<h1 align="center">AgentMark</h1> <p align="center"> <a href="https://github.com/puzzlet-ai"> <picture> <source media="(prefers-color-scheme: light)" srcset="https://i.imgur.com/xwq74He.png"> <source media="(prefers-color-scheme: dark)" srcset="https://i.imgur.com/JN9seOy.png"> <img src="https://i.imgur.com/xwq74He.png" alt="AgentMark Logo" width="200"> </picture> </a> </p> <p align="center"> <strong>The Prompt Engineer's Markdown</strong> </p> <p align="center"> <a href="https://discord.gg/P2NeMDtXar">Discord</a> | <a href="https://docs.puzzlet.ai/agentmark/">Docs</a> | <a href="https://marketplace.visualstudio.com/items?itemName=puzzlet.agentmark">VSCode</a> | <a href="https://github.com/puzzlet-ai/templatedx">TemplateDX</a> | <a href="https://puzzlet.ai">Puzzlet</a> </p>

Overview

AgentMark is a declarative, extensible, and composable approach for developing LLM applications using Markdown and JSX. AgentMark files enhance readability by displaying the exact inputs sent to the LLM, while providing lightweight abstractions for developers.

AgentMark is built on top of the templating language TemplateDX and inspired by MDX.

Getting Started

Below is a basic example to help you get started with AgentMark:

example.prompt.mdx

```mdx
---
name: basic-prompt
metadata:
  model:
    name: gpt-4o-mini
test_settings:
  props:
    num: 3
---

<System>You are a math expert</System>

<User>What's 2 + {props.num}?</User>
```
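Conceptually, AgentMark renders the prompt body with the supplied props before anything is sent to the model, so `{props.num}` becomes `3` in the user message above. Below is a minimal, purely illustrative sketch of that interpolation step in plain TypeScript; it is not AgentMark's actual renderer, which is TemplateDX-based and supports full JSX, loops, and filters:

```typescript
// Illustrative only: a naive {props.key} substitution, NOT AgentMark's
// real TemplateDX-based renderer.
type Props = Record<string, string | number>;

function interpolate(template: string, props: Props): string {
  // Replace each {props.key} placeholder with the matching prop value;
  // leave unknown placeholders untouched.
  return template.replace(/\{props\.(\w+)\}/g, (match, key) =>
    key in props ? String(props[key]) : match
  );
}

console.log(interpolate("What's 2 + {props.num}?", { num: 3 }));
// → "What's 2 + 3?"
```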

Features

AgentMark supports:

  1. Markdown: 📝
  2. JSX components, props, & plugins: 🧩
  3. Unified model config: 🔗
  4. Custom Models: 🛠️
  5. Streaming: 🌊
  6. Loops, Conditionals, and Filter Functions: ♻️
  7. Type Safety: 🛡️
  8. Agents: 🕵️
  9. Observability: 👀

Read our docs to learn more.

Models

By default, AgentMark doesn't bundle any model providers. Instead, model support is added through plugins. Here's a list of currently supported plugins you can start using.

Built-In Model Plugins

| Provider | Model | Supported | `@puzzlet/all-models` |
|----------|-------|-----------|-----------------------|
| OpenAI | gpt-4o | ✅ Supported | ✅ |
| OpenAI | gpt-4o-mini | ✅ Supported | ✅ |
| OpenAI | gpt-4-turbo | ✅ Supported | ✅ |
| OpenAI | gpt-4 | ✅ Supported | ✅ |
| OpenAI | o1-mini | ✅ Supported | ✅ |
| OpenAI | o1-preview | ✅ Supported | ✅ |
| OpenAI | gpt-3.5-turbo | ✅ Supported | ✅ |
| Anthropic | claude-3-5-haiku-latest | ✅ Supported | ✅ |
| Anthropic | claude-3-5-sonnet-latest | ✅ Supported | ✅ |
| Anthropic | claude-3-opus-latest | ✅ Supported | ✅ |
| Meta | ALL | ✅ Supported | 🧩 Only |
| Custom | any | ✅ Supported | 🧩 Only |
| Google | ALL | ⚠️ Coming Soon | N/A |
| Grok | ALL | ⚠️ Coming Soon | N/A |

Want to add support for another model? Open an issue.

Custom Model Plugins

Refer to our docs to learn how to add custom model support.
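Custom plugins follow a registry pattern: a plugin declares which models it serves, and the registry routes inference calls to it. The sketch below illustrates that pattern only; the interface and method names here are assumptions, not AgentMark's real API, so consult the docs for the actual types:

```typescript
// Illustrative registry pattern only -- the real plugin interface lives in
// @puzzlet/agentmark; these names are assumptions for demonstration.
interface ModelPlugin {
  provider: string;
  models: string[]; // model names this plugin can serve
  runInference(prompt: string): Promise<string>;
}

class PluginRegistry {
  private plugins = new Map<string, ModelPlugin>();

  register(plugin: ModelPlugin): void {
    // Index the plugin under each model name it supports.
    for (const model of plugin.models) this.plugins.set(model, plugin);
  }

  getPlugin(model: string): ModelPlugin | undefined {
    return this.plugins.get(model);
  }
}

// A hypothetical custom provider:
const myPlugin: ModelPlugin = {
  provider: "my-provider",
  models: ["my-model-v1"],
  runInference: async (prompt) => `echo: ${prompt}`,
};

const registry = new PluginRegistry();
registry.register(myPlugin);
console.log(registry.getPlugin("my-model-v1")?.provider); // → "my-provider"
```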

Language Support

We plan on providing support for AgentMark across a variety of languages.

| Language | Support Status |
|----------|----------------|
| TypeScript | ✅ Supported |
| Python | ⚠️ Coming Soon |
| Java | ⚠️ Coming Soon |
| Others | Need something else? Open an issue |

Running AgentMark

You can run AgentMark using one of the following methods:

1. VSCode Extension

Run .prompt.mdx files directly within your VSCode editor. Note: the extension runs your prompt with the test_settings props defined in its frontmatter.

Download the VSCode Extension

2. Node.js

Run AgentMark directly in your Node.js environment. Below is a sample implementation:

```ts
import { runInference, ModelPluginRegistry, load } from "@puzzlet/agentmark";
import AllModelPlugins from '@puzzlet/all-models';

// Note: Registering all latest models for demo/development purposes.
// In production, you'll likely want to selectively load these, and pin models.
ModelPluginRegistry.registerAll(AllModelPlugins);

const run = async () => {
  // Props must match what the prompt expects (example.prompt.mdx uses `num`).
  const props = { num: 3 };
  const Prompt = await load('./example.prompt.mdx');
  const result = await runInference(Prompt, props);
  console.log(result);
};
run();
```

3. Webpack Loader

Integrate AgentMark with your webpack workflow using our loader.

AgentMark Webpack Loader

```ts
import { runInference, ModelPluginRegistry } from "@puzzlet/agentmark";
import AllModelPlugins from '@puzzlet/all-models';
import MyPrompt from './example.prompt.mdx';

// Note: Registering all latest models for demo/development purposes.
// In production, you'll likely want to selectively load these, and pin models.
ModelPluginRegistry.registerAll(AllModelPlugins);

const run = async () => {
  // Props must match what the prompt expects (example.prompt.mdx uses `num`).
  const props = { num: 3 };
  const result = await runInference(MyPrompt, props);
  console.log(result);
};
run();
```

Contributing

We welcome contributions! Please check out our contribution guidelines for more information.

Community

Join our community to collaborate, ask questions, and stay updated:

- [Discord](https://discord.gg/P2NeMDtXar)
- [Docs](https://docs.puzzlet.ai/agentmark/)

License

This project is licensed under the MIT License.