MCPHost 🤖

A CLI host application that enables Large Language Models (LLMs) to interact with external tools through the Model Context Protocol (MCP). Currently supports Claude 3.5 Sonnet, OpenAI models, and local models served by Ollama.

Overview 🌟

MCPHost acts as the host in the MCP client-server architecture: it runs MCP clients that maintain connections to MCP servers and exposes the tools those servers provide to the language model. This lets the model discover and invoke external capabilities, such as databases and filesystems, through a standard protocol.

Features ✨

Requirements 📋

Environment Setup 🔧

  1. Anthropic API Key (for Claude):
     export ANTHROPIC_API_KEY='your-api-key'
  2. Ollama Setup:
     ollama pull mistral
     ollama serve

Installation 📦

go install github.com/mark3labs/mcphost@latest

Configuration ⚙️

MCPHost will automatically create a configuration file at ~/.mcp.json if it doesn't exist. You can also specify a custom location using the --config flag:

{
  "mcpServers": {
    "sqlite": {
      "command": "uvx",
      "args": [
        "mcp-server-sqlite",
        "--db-path",
        "/tmp/foo.db"
      ]
    },
    "filesystem": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-filesystem",
        "/tmp"
      ]
    }
  }
}

Each MCP server entry requires a command (the executable to launch) and an args array of arguments passed to it, as in the example above.

Usage 🚀

MCPHost is a CLI tool that lets you interact with different AI models through a unified interface. It supports tools exposed by MCP servers and streams model responses as they are generated.

Available Models

Models are specified with the --model (-m) flag in the format provider:model:

Examples

# Use Ollama with Qwen model
mcphost -m ollama:qwen2.5:3b

# Use OpenAI's GPT-4
mcphost -m openai:gpt-4
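Note that Ollama model tags can themselves contain colons (e.g. `qwen2.5:3b`), so the provider prefix is separated at the first colon only. A sketch of that parsing under this assumption; `splitModelFlag` is a hypothetical helper, not MCPHost's actual code:

```go
package main

import (
	"fmt"
	"strings"
)

// splitModelFlag splits a --model value of the form "provider:model" at
// the first colon, so model names containing colons (such as Ollama
// tags like "qwen2.5:3b") stay intact.
func splitModelFlag(v string) (provider, model string) {
	parts := strings.SplitN(v, ":", 2)
	if len(parts) < 2 {
		return parts[0], ""
	}
	return parts[0], parts[1]
}

func main() {
	p, m := splitModelFlag("ollama:qwen2.5:3b")
	fmt.Println(p, m)
}
```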

Flags

Interactive Commands

While chatting, you can use:

Global Flags

MCP Server Compatibility 🔌

MCPHost can work with any MCP-compliant server. For examples and reference implementations, see the MCP Servers Repository.

Contributing 🤝

Contributions are welcome! Feel free to:

Please ensure your contributions follow good coding practices and include appropriate tests.

License 📄

This project is licensed under the MIT License - see the LICENSE file for details.

Acknowledgments 🙏