# Awesome MCP REST API and CLI Client
A simple REST API and CLI client to interact with Model Context Protocol (MCP) servers.
## Key Features
### 1. MCP-Compatible Servers
- Supports any MCP-compatible server.
- Pre-configured default servers:
  - SQLite (`test.db` is provided with sample product data)
  - Brave Search
- Additional MCP servers can be added in the `mcp-server-config.json` file, as sketched below.
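As an illustration, an extra server entry might look like the sketch below. The `mcpServers` / `command` / `args` layout is an assumption based on the config format most MCP clients share; check the bundled `mcp-server-config.json` for this project's exact schema.

```json
{
  "_comment": "illustrative only - verify keys against the bundled mcp-server-config.json",
  "mcpServers": {
    "sqlite": {
      "command": "uvx",
      "args": ["mcp-server-sqlite", "--db-path", "test.db"]
    }
  }
}
```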
### 2. Integrated with LangChain
- Leverages LangChain to execute LLM prompts.
- Enables multiple MCP servers to collaborate on answering a single query.
### 3. LLM Provider Support
- Compatible with any LLM provider whose API supports function (tool) calling.
- Examples:
  - OpenAI
  - Claude
  - Gemini
  - AWS Nova
  - Groq
  - Ollama
- Essentially all LLM providers are supported, as long as they expose a function-calling API. Please refer to the LangChain documentation for more details.
## Setup
1. Clone the repository:

   ```bash
   git clone https://github.com/rakesh-eltropy/mcp-client.git
   ```
2. Navigate to the project directory:

   ```bash
   cd mcp-client
   ```
3. Set the `OPENAI_API_KEY` environment variable:

   ```bash
   export OPENAI_API_KEY=your-openai-api-key
   ```

   You can also set the `OPENAI_API_KEY` in the `mcp-server-config.json` file, along with the `provider` and `model`, e.g. `provider` can be `ollama` and `model` can be `llama3.2:3b`.
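   As a sketch only (the `llm`, `provider`, and `model` key names are assumptions inferred from the text above; verify them against the bundled config), an Ollama setup might look like:

   ```json
   {
     "_comment": "illustrative only - key names are assumptions",
     "llm": {
       "provider": "ollama",
       "model": "llama3.2:3b"
     }
   }
   ```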
4. Set the `BRAVE_API_KEY` environment variable:

   ```bash
   export BRAVE_API_KEY=your-brave-api-key
   ```

   You can also set the `BRAVE_API_KEY` in the `mcp-server-config.json` file. You can get a free `BRAVE_API_KEY` from the [Brave Search API](https://brave.com/search/api/).
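   For example, the key could be passed to the Brave Search server through an `env` block. The entry below is a sketch: the `npx` package name and the config keys are assumptions based on the commonly published MCP Brave Search server, not taken from this repo:

   ```json
   {
     "_comment": "illustrative only - package name and keys are assumptions",
     "mcpServers": {
       "brave-search": {
         "command": "npx",
         "args": ["-y", "@modelcontextprotocol/server-brave-search"],
         "env": { "BRAVE_API_KEY": "your-brave-api-key" }
       }
     }
   }
   ```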
5. Run from the CLI:

   ```bash
   uv run cli.py
   ```

   To explore the available commands, use the `help` option. You can chat with the LLM using the `chat` command. Sample prompts:

   - What is the capital city of India?
   - Search the most expensive product from the database and find more details about it from Amazon.
6. Run from the REST API:

   ```bash
   uvicorn app:app --reload
   ```

   You can use the following curl command to chat with the LLM:

   ```bash
   curl -X POST -H "Content-Type: application/json" -d '{"message": "list all the products from my local database?"}' http://localhost:8000/chat
   ```

   You can use the following curl command to chat with the LLM with streaming:

   ```bash
   curl -X POST -H "Content-Type: application/json" -d '{"message": "list all the products from my local database?", "streaming": true}' http://localhost:8000/chat
   ```
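   If you prefer consuming the streaming endpoint from Python, here is a minimal sketch using `requests`. It assumes the `/chat` endpoint streams the reply as plain-text chunks in the response body:

   ```python
   import requests

   # Minimal streaming client sketch; assumes /chat streams plain-text chunks.
   resp = requests.post(
       "http://localhost:8000/chat",
       json={"message": "list all the products from my local database?", "streaming": True},
       stream=True,
   )
   resp.raise_for_status()
   for chunk in resp.iter_content(chunk_size=None, decode_unicode=True):
       print(chunk, end="", flush=True)
   ```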
## Contributing
Feel free to submit issues and pull requests for improvements or bug fixes.