What is Deepin MCP?
Deepin MCP is a client designed for connecting to Machine Conversation Protocol (MCP) servers, enabling interaction with them using OpenAI language models.
How to use Deepin MCP?
To use Deepin MCP, install the required dependencies, set up your environment with an OpenAI API key, and run the client with the desired MCP server script.
Key features of Deepin MCP?
- Connects to MCP servers using Python or JavaScript.
- Allows listing and using available tools from the server.
- Supports natural language queries for interaction.
- Integrates with OpenAI's language models for processing responses.
- Includes a task planning system for executing complex tasks automatically.
Use cases of Deepin MCP?
- Automating file operations through natural language commands.
- Querying weather information using a dedicated weather server.
- Executing Linux bash commands via the bash server.
- Planning and executing multi-step tasks automatically.
FAQ about Deepin MCP
- What are the prerequisites for using Deepin MCP?
You need Python 3.12 or higher, an OpenAI API key, and an MCP server to connect to.
- Is there a recommended package installer?
Yes, UV is recommended for installing dependencies and managing the environment.
- Can I run Deepin MCP without Python installed?
Yes, you can build a standalone executable that does not require Python installation.
Deepin MCP
A client for connecting to MCP (Machine Conversation Protocol) servers and interacting with them using OpenAI language models.
Overview
This MCP client allows you to:
- Connect to an MCP server (Python or JavaScript)
- List and use available tools from the server
- Interact with the server using natural language queries
- Process responses with OpenAI's language models
- Plan and execute complex tasks automatically
Setup
Prerequisites
- Python 3.12 or higher
- OpenAI API key
- MCP server to connect to
Installing UV (Recommended)
UV is a fast, reliable Python package installer and resolver. It's recommended for this project:
- Install UV:
# Unix (macOS, Linux)
curl -LsSf https://astral.sh/uv/install.sh | bash
# Windows (PowerShell)
irm https://astral.sh/uv/install.ps1 | iex
- Verify the installation:
uv --version
For more installation options, visit UV's official documentation.
Installation
- Clone this repository
- Create a virtual environment:
python -m venv .venv
source .venv/bin/activate  # On Windows: .venv\Scripts\activate
- Install dependencies:
pip install -e .
Or using UV (recommended):
uv venv
uv pip install -e .
- Create a .env file with your API keys and configuration:
# OpenAI API Configuration
OPENAI_API_KEY=your_openai_api_key
BASE_URL=https://api.openai.com/v1  # Optional: use your own OpenAI API proxy
MODEL=gpt-3.5-turbo  # Optional: specify the model to use
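The .env file uses simple KEY=value lines, with # starting a comment. The project itself loads these variables with python-dotenv; the stand-alone parser below is only a sketch illustrating the format, not the project's code:

```python
# Illustrative parser for the KEY=value .env format.
# The project itself uses python-dotenv for this.
def parse_env(text):
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blank lines and comments
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip()
    return env

sample = """\
# OpenAI API Configuration
OPENAI_API_KEY=your_openai_api_key
BASE_URL=https://api.openai.com/v1
MODEL=gpt-3.5-turbo
"""
config = parse_env(sample)
```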
- Install required dependencies:
Using UV (recommended):
# Install core dependencies
uv pip install -e .
# Or install individual packages
uv add openai          # OpenAI API client
uv add python-dotenv   # For loading environment variables
uv add mcp             # Machine Conversation Protocol library
Using standard pip:
pip install openai python-dotenv mcp
Each server type may require additional dependencies. JavaScript servers run under Node.js, so their dependencies are installed with npm rather than pip:
# If connecting to JavaScript MCP servers
npm install node-fetch
Note that asyncio and pathlib are part of the Python standard library and do not need to be installed separately.
Usage
Quickstart with UV
Get up and running quickly with UV:
# Clone the repository
git clone https://github.com/yourusername/deepin-mcp-python.git
cd deepin-mcp-python
# Install UV
curl -LsSf https://astral.sh/uv/install.sh | bash
# Create a virtual environment and install dependencies
uv venv
uv pip install -e .
# Create .env file (replace with your actual API key)
echo "OPENAI_API_KEY=your_openai_api_key" > .env
echo "MODEL=gpt-3.5-turbo" >> .env
# Run the client with bash server
uv run python client.py servers/bash_server.py
# Or run the task planning system
uv run python main.py
Basic Usage
Run the client by specifying the path to your MCP server script:
python client.py path/to/your/server_script.py
Or using UV:
uv run python client.py path/to/your/server_script.py
For JavaScript servers:
python client.py path/to/your/server_script.js
Or using UV:
uv run python client.py path/to/your/server_script.js
Task Planning System
The project includes a powerful task planning system that can:
- Break down complex user requests into sequential tasks
- Execute tasks automatically using the MCP server
- Provide detailed execution summaries
To use the task planning system:
python main.py
Or using UV (recommended):
uv run python main.py
You can also specify a custom server path:
python main.py --server path/to/your/server_script.py
Or check the version information:
python main.py --version
Main Entry Point
The project provides main.py as the centralized entry point for the application:
python main.py
Or using UV (recommended):
uv run python main.py
The main entry point:
- Initializes the task planning system
- Provides version information and system description
- Handles exceptions gracefully
- Makes the application more user-friendly
- Allows specifying custom server paths with the --server option
Example usage (console output translated from the original Chinese):
$ python main.py
====== Deepin MCP Task Planning System ======
Version: 1.0.0
Description: A task planning and execution system based on the MCP protocol
======================================
Welcome to the task planning and execution system
This system breaks your request into multiple tasks and executes them in sequence
Connected to server: /home/user/deepin-mcp-python/servers/bash_server.py
Enter your request (type 'quit' to exit):
Example interaction (translated):
Enter your request (type 'quit' to exit): Create a new directory called 'projects' and copy all .txt files from 'documents' to it
Analyzing your request...
Your request has been broken down into 3 tasks:
1. Create a new directory called 'projects'
2. List all .txt files in the 'documents' directory
3. Copy all .txt files from 'documents' to 'projects'
Execute these tasks? (y/n): y
[Task execution progress will be shown here]
Execution summary:
[Detailed summary of the executed tasks will be shown here]
The task planning system is particularly useful for:
- Complex file operations
- Multi-step system configurations
- Automated workflow execution
- Batch processing tasks
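The decomposition step shown in the example interaction produces a numbered task list from the model. A minimal sketch of turning such output into individual tasks (parse_tasks is a hypothetical helper, not the project's actual implementation):

```python
import re

def parse_tasks(plan_text):
    """Extract tasks from a numbered plan like '1. ...' / '2. ...'."""
    tasks = []
    for line in plan_text.splitlines():
        match = re.match(r"\s*\d+\.\s+(.*)", line)
        if match:
            tasks.append(match.group(1).strip())
    return tasks

plan = """1. Create a new directory called 'projects'
2. List all .txt files in the 'documents' directory
3. Copy all .txt files from 'documents' to 'projects'"""
tasks = parse_tasks(plan)
```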
Interactive Mode
Once connected, you can enter queries in the interactive prompt. The client will:
- Send your query to the language model
- Process any tool calls requested by the model
- Return the final response
Type quit to exit the interactive mode.
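That loop can be sketched as follows. This is a hedged outline only: process_query stands in for the client's model-backed query handler, and the prompt text is an assumption:

```python
def interactive_loop(process_query, read=input, write=print):
    # Read queries until the user types 'quit'; hand each one to the
    # query processor and print its answer.
    while True:
        query = read("Query: ").strip()
        if query.lower() == "quit":
            break
        write(process_query(query))
```

Injecting read and write makes the loop easy to test without a terminal.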
Available Servers
Weather Server
The weather_server.py script provides an MCP tool to query weather information for a specified city.
python client.py weather_server.py
Or using UV:
uv run python client.py weather_server.py
Example query: "查询北京的天气" ("Query the weather in Beijing")
Dependencies:
uv add requests # For HTTP requests to weather APIs
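The heart of such a server is a tool function the MCP library exposes to the model. A hedged sketch of what the core of a weather tool might look like, with stubbed data instead of a live API call (query_weather and its data are illustrative, not the real server's code):

```python
def query_weather(city: str) -> str:
    # Stub: the real server would call a weather API with requests here
    fake_data = {"北京": ("sunny", 25)}
    if city not in fake_data:
        return f"No weather data for {city}"
    condition, temp_c = fake_data[city]
    return f"{city}: {condition}, {temp_c}°C"
```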
File Server
The file_server.py script provides MCP tools for file operations such as open, copy, move, rename, delete, and create.
python client.py file_server.py
Or using UV:
uv run python client.py file_server.py
Example query: "创建一个名为test.txt的文件" ("Create a file named test.txt")
Dependencies: none beyond the core install; pathlib is part of the Python standard library and does not need to be installed separately.
Bash Server
The bash_server.py script provides MCP tools for executing Linux bash commands.
python client.py bash_server.py
Or using UV:
uv run python client.py bash_server.py
Example query: "列出当前目录下的所有文件" ("List all files in the current directory")
Dependencies: none beyond the core install; shlex is part of the Python standard library and does not need to be installed separately.
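A hedged sketch of the core a bash-execution tool might wrap (run_command is illustrative only; the real server adds MCP registration and likely more safeguards):

```python
import subprocess

def run_command(command: str) -> str:
    # Execute a shell command and return its combined output;
    # the timeout guards against commands that hang.
    result = subprocess.run(command, shell=True, capture_output=True,
                            text=True, timeout=30)
    return result.stdout + result.stderr
```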
Development
To extend or modify this client:
- The MCPClient class handles the main functionality
- The process_query method processes queries using the LLM
- The TaskPlanner class breaks down complex requests into tasks
- The main entry point (main.py) provides a user-friendly interface
- Error handling is implemented throughout for robustness
Troubleshooting
If you encounter issues:
- Check your API keys in .env
- Verify the server script path is correct
- Ensure the server implements the MCP protocol correctly
- Check console output for error messages
- Make sure you have required dependencies installed
License
This project is available under the MIT License.
Building Executable Binary
You can build a standalone binary executable that can be distributed without requiring Python installation:
Using the Build Script
The project includes a build script that simplifies the process:
# Make the build script executable
chmod +x build.sh
# Run the build script
./build.sh
The executable will be created in the dist directory as deepin-mcp.
Manual Build Process
If you prefer to build manually:
- Install PyInstaller using UV:
uv pip install pyinstaller
- Build the executable:
pyinstaller --clean deepin-mcp.spec
- The executable will be available at dist/deepin-mcp
Running the Executable
Once built, you can run the executable directly:
# Run with default settings
./dist/deepin-mcp
# Check version
./dist/deepin-mcp --version
# Specify a custom server
./dist/deepin-mcp --server path/to/your/server_script.py
The executable includes all necessary dependencies and can be distributed to systems without Python installed.
Server Discovery and Selection
The application automatically discovers and loads available server scripts:
- When starting the application, it searches for server scripts in these locations:
  - The current working directory
  - The servers subdirectory
  - The application's executable directory
  - The servers subdirectory within the executable directory
- Available servers can be listed with:
./deepin-mcp --list-servers
- A specific server can be selected at startup:
./deepin-mcp --server bash
# or by path
./deepin-mcp --server /path/to/custom_server.py
- Servers can be switched during runtime by typing switch at the prompt.
Server Deployment
When the application is packaged as a binary executable:
- Server scripts are automatically deployed to the servers directory next to the executable
- The application automatically finds and loads these servers
- Each server is accompanied by a wrapper script (e.g., bash_server.wrapper.py) that ensures proper dependency loading
- A dedicated Python environment with all necessary dependencies (including the mcp module) is created in the server_env directory
- Custom servers can be added by placing them in the servers directory with a filename containing "server" (e.g., custom_server.py)
- The run_server.sh helper script can be used to directly run any server
Running Servers Directly
You can run servers directly using the provided helper script:
# Run a server by name
./dist/deepin-mcp/run_server.sh bash
# Run a server by file name
./dist/deepin-mcp/run_server.sh bash_server.py
# Run a server by full path
./dist/deepin-mcp/run_server.sh /path/to/your/custom_server.py
This is particularly useful for:
- Testing server functionality independently
- Running servers in separate terminals
- Using custom server scripts with the application
Server Environment Structure
The packaged application includes a complete environment for the servers:
dist/deepin-mcp/
├── deepin-mcp # Main executable
├── run_server.sh # Server helper script
├── server_env/ # Python environment for servers
│ ├── mcp/ # MCP module and dependencies
│ ├── openai/ # OpenAI module
│ └── ... # Other dependencies
└── servers/
├── bash_server.py # Original server script
├── bash_server.wrapper.py # Wrapper that configures path
├── file_server.py # Original server script
├── file_server.wrapper.py # Wrapper that configures path
└── ...
Adding Custom Servers
To add custom servers to a packaged application:
- Create your server script following the MCP protocol
- Place it in the servers directory of the packaged application
- Create a wrapper script that follows the same pattern as the included wrappers, or run the following command inside the dist/deepin-mcp directory:
cat > servers/myserver_server.wrapper.py << 'EOF'
#!/usr/bin/env python
# Wrapper script for myserver_server.py
import os
import sys

# Add the server_env to the Python path
server_env_path = os.path.abspath(os.path.join(os.path.dirname(__file__), '..', 'server_env'))
sys.path.insert(0, server_env_path)

# Import the actual server code
server_file = os.path.abspath(os.path.join(os.path.dirname(__file__), 'myserver_server.py'))
with open(server_file, 'r') as f:
    server_code = f.read()

# Execute the server code with the modified path
exec(server_code)
EOF
chmod +x servers/myserver_server.wrapper.py
- Run your custom server with the helper script or use it directly from the main application