Model Context Protocol CLI

By shahshrey
Overview

What is Universal MCP UI?

Universal MCP UI is a command-line interface (CLI) designed to interact with a Model Context Protocol (MCP) server, allowing users to send commands, query data, and manage resources effectively.

How do I use Universal MCP UI?

To use the Universal MCP UI, clone the repository, install the required dependencies, and run the client with the appropriate server configuration and provider settings.

What are the key features of Universal MCP UI?

  • Protocol-level communication with the MCP Server.
  • Dynamic exploration of tools and resources.
  • Support for multiple providers (OpenAI, Ollama) and models (e.g., gpt-4o, qwen2.5-coder).
  • Interactive command execution and chat mode.

What are the use cases of Universal MCP UI?

  1. Interacting with AI models for data processing.
  2. Querying and managing resources in a server environment.
  3. Executing commands dynamically in an interactive mode.

FAQ about Universal MCP UI

  • What are the prerequisites for using Universal MCP UI?

You need Python 3.8 or higher and the required dependencies installed.

  • How do I set up the OpenAI API key?

Set the OPENAI_API_KEY environment variable before running the client.

  • Can I use different providers?

Yes, you can choose between OpenAI and Ollama as providers.

Content

Model Context Protocol CLI

This repository contains a protocol-level CLI designed to interact with a Model Context Protocol server. The client allows users to send commands, query data, and interact with various resources provided by the server.

Features

  • Protocol-level communication with the MCP Server.
  • Dynamic tool and resource exploration.
  • Support for multiple providers and models:
    • Providers: OpenAI, Ollama.
    • Default models: gpt-4o for OpenAI, llama3.2 for Ollama.

Prerequisites

  • Python 3.8 or higher.
  • Required dependencies (see Installation).
  • If using Ollama, make sure Ollama is installed and running (see the example commands after this list).
  • If using OpenAI, set an API key in your environment variables (OPENAI_API_KEY=yourkey).
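
For example, a typical Ollama setup pulls the model before starting the client. The llama3.2 model below matches the default named elsewhere in this README, and ollama serve is only needed if Ollama is not already running as a background service:

ollama pull llama3.2
ollama serve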

Installation

  1. Clone the repository:
git clone https://github.com/chrishayuk/mcp-cli
cd mcp-cli
  2. Install UV:
pip install uv
  3. Resynchronize dependencies:
uv sync --reinstall

Usage

To start the client and interact with the SQLite server, run the following command:

uv run mcp-cli --server sqlite

Command-line Arguments

  • --server: Specifies the server configuration to use. Required.
  • --config-file: (Optional) Path to the JSON configuration file. Defaults to server_config.json (see the example configuration after this list).
  • --provider: (Optional) Specifies the provider to use (openai or ollama). Defaults to openai.
  • --model: (Optional) Specifies the model to use. Defaults depend on the provider:
    • gpt-4o for OpenAI.
    • llama3.2 for Ollama.
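
As a sketch, a minimal server_config.json for the SQLite example might look like the following. The exact command and arguments depend on which MCP server you run; mcp-server-sqlite launched via uvx with a test.db path is an assumption for illustration, not something this repository prescribes:

{
  "mcpServers": {
    "sqlite": {
      "command": "uvx",
      "args": ["mcp-server-sqlite", "--db-path", "test.db"]
    }
  }
}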

Examples

Run the client with the default OpenAI provider and model:

uv run mcp-cli --server sqlite

Run the client with a specific configuration and Ollama provider:

uv run mcp-cli --server sqlite --provider ollama --model llama3.2

Interactive Mode

The client supports interactive mode, allowing you to execute commands dynamically. Type help for a list of available commands or quit to exit the program.

Supported Commands

  • ping: Check if the server is responsive.
  • list-tools: Display available tools.
  • list-resources: Display available resources.
  • list-prompts: Display available prompts.
  • chat: Enter interactive chat mode.
  • clear: Clear the terminal screen.
  • help: Show a list of supported commands.
  • quit/exit: Exit the client.
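
A short illustrative session (the prompt character and output formatting depend on the client version and the connected server):

> ping            (check that the server responds)
> list-tools      (show the tools the server exposes)
> quit            (exit the client)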

Chat Mode

To enter chat mode and interact with the server:

uv run mcp-cli --server sqlite

In chat mode, you can use tools and query the server interactively. The provider and model used are specified during startup and displayed as follows:

Entering chat mode using provider 'ollama' and model 'llama3.2'...
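
For example, to launch with an explicit provider and model and then switch into chat mode from the interactive prompt (prompt rendering may differ):

uv run mcp-cli --server sqlite --provider ollama --model llama3.2
> chat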

Using OpenAI Provider:

If you wish to use OpenAI models, set the OPENAI_API_KEY environment variable before running the client, either in a .env file or as an environment variable.
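
For example, as a shell environment variable (the key value is a placeholder):

export OPENAI_API_KEY=yourkey

or as a line in a .env file:

OPENAI_API_KEY=yourkey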

Contributing

Contributions are welcome! Please open an issue or submit a pull request with your proposed changes.

License

This project is licensed under the MIT License.
