what is MCP REST API and CLI Client?
MCP REST API and CLI Client is a simple client designed to interact with Model Context Protocol (MCP) servers, allowing users to send requests to and retrieve responses from these servers seamlessly.
how to use MCP REST API and CLI Client?
To use the client, you need to clone the repository, set up the necessary environment variables for API keys, and either run the client from the command line interface (CLI) or as a REST API.
key features of MCP REST API and CLI Client?
- Compatibility with any MCP-compatible servers and support for pre-configured default servers.
- Integration with LangChain to execute LLM prompts across multiple servers.
- Support for various LLM providers through function-based API compatibility.
use cases of MCP REST API and CLI Client?
- Sending queries to local databases and retrieving product details.
- Chatting with LLMs using specific commands.
- Searching data using the Brave Search API integrated into the client.
FAQ from MCP REST API and CLI Client?
- What APIs does the client support?
The client supports multiple MCP-compatible servers, including SQLite and Brave Search, among others.
- How do I run the client?
You can either run it from the CLI with `uv run cli.py` or start the REST API with `uvicorn app:app --reload`.
- Do I need an API key to use it?
Yes, you must set environment variables for OPENAI_API_KEY and BRAVE_API_KEY to use the associated features.
# MCP REST API and CLI Client
A simple REST API and CLI client to interact with Model Context Protocol (MCP) servers.
## Key Features
1. MCP-Compatible Servers
- Supports any MCP-compatible server.
- Pre-configured default servers:
- SQLite (a test.db with sample product data is provided)
- Brave Search
- Additional MCP servers can be added in the mcp-server-config.json file
2. Integrated with LangChain
- Leverages LangChain to execute LLM prompts.
- Enables multiple MCP servers to collaborate and respond to a specific query simultaneously.
3. LLM Provider Support
- Compatible with any LLM provider that supports APIs with function capabilities.
- Examples:
- OpenAI
- Claude
- Gemini
- AWS Nova
- Groq
- Ollama
- Essentially all LLM providers are supported, as long as they expose a function-calling API. Refer to the LangChain documentation for more details.
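To make the configuration concrete, here is a sketch of what `mcp-server-config.json` might look like. The exact schema depends on the repository; the key names (`llm`, `mcpServers`), commands, and package names below follow common MCP conventions and are illustrative assumptions, not the file's confirmed format:

```json
{
  "llm": {
    "provider": "openai",
    "model": "gpt-4o-mini",
    "api_key": "your-openai-api-key"
  },
  "mcpServers": {
    "sqlite": {
      "command": "uvx",
      "args": ["mcp-server-sqlite", "--db-path", "test.db"]
    },
    "brave-search": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-brave-search"],
      "env": { "BRAVE_API_KEY": "your-brave-api-key" }
    }
  }
}
```

Swapping the `provider` to `ollama` and the `model` to something like `llama3.2:3b` would route prompts to a local model instead, as described in the setup steps below.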
## Setup
1. Clone the repository:

   ```bash
   git clone https://github.com/rakesh-eltropy/mcp-client.git
   ```

2. Navigate to the project directory:

   ```bash
   cd mcp-client
   ```
3. Set the OPENAI_API_KEY environment variable:

   ```bash
   export OPENAI_API_KEY=your-openai-api-key
   ```

   You can also set the `OPENAI_API_KEY` in the mcp-server-config.json file. You can likewise set the `provider` and `model` there, e.g. `provider` can be `ollama` and `model` can be `llama3.2:3b`.
4. Set the BRAVE_API_KEY environment variable:

   ```bash
   export BRAVE_API_KEY=your-brave-api-key
   ```

   You can also set the `BRAVE_API_KEY` in the mcp-server-config.json file. You can get a free `BRAVE_API_KEY` from the Brave Search API.
5. Run from the CLI:

   ```bash
   uv run cli.py
   ```

   To explore the available commands, use the `help` option. You can chat with the LLM using the `chat` command. Sample prompts:

   - What is the capital city of India?
   - Search the most expensive product from the database and find more details about it from Amazon.
6. Run as a REST API:

   ```bash
   uvicorn app:app --reload
   ```

   You can use the following curl command to chat with the LLM:

   ```bash
   curl -X POST -H "Content-Type: application/json" -d '{"message": "list all the products from my local database?"}' http://localhost:8000/chat
   ```

   You can use the following curl command to chat with the LLM with streaming:

   ```bash
   curl -X POST -H "Content-Type: application/json" -d '{"message": "list all the products from my local database?", "streaming": true}' http://localhost:8000/chat
   ```
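The same `/chat` requests can be made programmatically. The sketch below is an equivalent of the curl calls using only Python's standard library; it assumes the default uvicorn address from the commands above, and the shape of the response body is whatever the API returns (not specified here):

```python
import json
import urllib.request

BASE_URL = "http://localhost:8000"  # default address used by `uvicorn app:app --reload`


def build_chat_payload(message: str, streaming: bool = False) -> dict:
    """Build the JSON body the /chat endpoint expects, per the curl examples."""
    payload = {"message": message}
    if streaming:
        payload["streaming"] = True
    return payload


def chat(message: str, streaming: bool = False) -> bytes:
    """POST a chat message to the running REST API and return the raw response body."""
    data = json.dumps(build_chat_payload(message, streaming)).encode("utf-8")
    req = urllib.request.Request(
        f"{BASE_URL}/chat",
        data=data,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read()


if __name__ == "__main__":
    # Requires the REST API to be running locally (step 6 above).
    print(chat("list all the products from my local database?"))
```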
## Contributing
Feel free to submit issues and pull requests for improvements or bug fixes.