What is Deepseek R1 MCP Server?
Deepseek R1 MCP Server is an implementation of a Model Context Protocol (MCP) server designed for the Deepseek R1 language model, which is optimized for reasoning tasks and supports a context window of 8192 tokens.
How to use Deepseek R1 MCP Server?
To use the server, you can install it via Smithery or manually clone the repository, set up your environment, and run the server with your API key.
Key features of Deepseek R1 MCP Server?
- Advanced text generation capabilities with a large context window.
- Configurable parameters for text generation.
- Robust error handling with detailed messages.
- Full support for the MCP protocol.
- Integration with Claude Desktop.
- Support for multiple models (DeepSeek-R1 and DeepSeek-V3).
Use cases of Deepseek R1 MCP Server?
- Generating complex text outputs for applications.
- Assisting in coding and mathematical calculations.
- Data cleaning and analysis tasks.
- Creative writing and poetry generation.
FAQ from Deepseek R1 MCP Server?
- What are the prerequisites for using the server?
You need Node.js (v18 or higher), npm, Claude Desktop, and a Deepseek API key.
- Can I use different models with this server?
Yes, you can switch between DeepSeek-R1 and DeepSeek-V3 by modifying the model name in the configuration.
- How does the temperature parameter affect output?
The temperature controls the randomness of the output; lower values are better for coding and math, while higher values are suited for creative tasks.
Deepseek R1 MCP Server
A Model Context Protocol (MCP) server implementation for the Deepseek R1 language model. Deepseek R1 is a powerful language model optimized for reasoning tasks with a context window of 8192 tokens.
Why Node.js? This implementation uses Node.js/TypeScript as it provides the most stable integration with MCP servers. The Node.js SDK offers better type safety, error handling, and compatibility with Claude Desktop.
Quick Start
Installing via Smithery
To install Deepseek R1 for Claude Desktop automatically via Smithery:
npx -y @smithery/cli install @66julienmartin/mcp-server-deepseek_r1 --client claude
Installing manually
# Clone and install
git clone https://github.com/66julienmartin/MCP-server-Deepseek_R1.git deepseek-r1-mcp
cd deepseek-r1-mcp
npm install
# Set up environment
cp .env.example .env # Then add your API key
# Build and run
npm run build
node build/index.js # or let Claude Desktop launch it (see Configuration)
Prerequisites
- Node.js (v18 or higher)
- npm
- Claude Desktop
- Deepseek API key
Model Selection
By default, this server uses the DeepSeek-R1 model. If you want to use DeepSeek-V3 instead, modify the model name in src/index.ts:
// For DeepSeek-R1 (default)
model: "deepseek-reasoner"
// For DeepSeek-V3
model: "deepseek-chat"
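DeepSeek's API is OpenAI-compatible, so the model name above is ultimately just a field in the request payload. The following sketch shows how the request body might be assembled; the helper name and types are illustrative, not code taken from src/index.ts:

```typescript
// Sketch: build a chat-completion payload for DeepSeek's OpenAI-compatible API.
// buildRequestBody is an illustrative helper; the server's actual internals may differ.
type DeepseekModel = "deepseek-reasoner" | "deepseek-chat";

interface ChatRequestBody {
  model: DeepseekModel;
  messages: { role: "user"; content: string }[];
  max_tokens: number;
  temperature: number;
}

function buildRequestBody(
  prompt: string,
  model: DeepseekModel = "deepseek-reasoner", // DeepSeek-R1 by default
  maxTokens = 8192,
  temperature = 0.2
): ChatRequestBody {
  return {
    model,
    messages: [{ role: "user", content: prompt }],
    max_tokens: maxTokens,
    temperature,
  };
}
```

Switching models is then a one-word change: pass `"deepseek-chat"` instead of the default `"deepseek-reasoner"`.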
Project Structure
deepseek-r1-mcp/
├── src/
│   └── index.ts          # Main server implementation
├── build/                # Compiled files
│   └── index.js
├── LICENSE
├── README.md
├── package.json
├── package-lock.json
└── tsconfig.json
Configuration
- Create a .env file:
DEEPSEEK_API_KEY=your-api-key-here
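A server like this typically fails fast at startup when the key is missing rather than failing later on the first request. A minimal sketch of such a check (the helper name is illustrative, not code from this repository):

```typescript
// Sketch: fail fast if DEEPSEEK_API_KEY is not configured.
// requireApiKey is an illustrative helper; call it with process.env at startup.
function requireApiKey(env: Record<string, string | undefined>): string {
  const key = env.DEEPSEEK_API_KEY;
  if (!key) {
    throw new Error("DEEPSEEK_API_KEY is not set; add it to your .env file");
  }
  return key;
}
```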
- Update Claude Desktop configuration:
{
  "mcpServers": {
    "deepseek_r1": {
      "command": "node",
      "args": ["/path/to/deepseek-r1-mcp/build/index.js"],
      "env": {
        "DEEPSEEK_API_KEY": "your-api-key"
      }
    }
  }
}
Development
npm run dev # Watch mode
npm run build # Build for production
Features
- Advanced text generation with Deepseek R1 (8192 token context window)
- Configurable parameters (max_tokens, temperature)
- Robust error handling with detailed error messages
- Full MCP protocol support
- Claude Desktop integration
- Support for both DeepSeek-R1 and DeepSeek-V3 models
API Usage
{
  "name": "deepseek_r1",
  "arguments": {
    "prompt": "Your prompt here",
    "max_tokens": 8192,   // Maximum tokens to generate
    "temperature": 0.2    // Controls randomness
  }
}
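A server receiving these arguments would typically validate them before forwarding the request, clamping against the 8192-token context window. A sketch of such a check, with illustrative names and a generous 0–2 temperature range (an assumption based on OpenAI-compatible APIs, not this repository's code):

```typescript
// Sketch: validate incoming tool-call arguments against the model's limits.
// validateArgs and its bounds are illustrative; src/index.ts may differ.
interface ToolArgs {
  prompt: string;
  max_tokens?: number;
  temperature?: number;
}

function validateArgs(args: ToolArgs): Required<ToolArgs> {
  if (!args.prompt || typeof args.prompt !== "string") {
    throw new Error("prompt must be a non-empty string");
  }
  const maxTokens = args.max_tokens ?? 8192;
  if (maxTokens < 1 || maxTokens > 8192) {
    throw new Error("max_tokens must be between 1 and 8192");
  }
  const temperature = args.temperature ?? 0.2;
  if (temperature < 0 || temperature > 2) {
    throw new Error("temperature must be between 0 and 2");
  }
  return { prompt: args.prompt, max_tokens: maxTokens, temperature };
}
```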
The Temperature Parameter
The default value of temperature is 0.2. Deepseek recommends setting the temperature according to your specific use case:
| Use Case | Temperature | Example |
|---|---|---|
| Coding / Math | 0.0 | Code generation, mathematical calculations |
| Data Cleaning / Data Analysis | 1.0 | Data processing tasks |
| General Conversation | 1.3 | Chat and dialogue |
| Translation | 1.3 | Language translation |
| Creative Writing / Poetry | 1.5 | Story writing, poetry generation |
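If a client wants to apply these recommendations programmatically, the table reduces to a small lookup. The temperature values come from the table above; the map keys and function are illustrative, not part of this server:

```typescript
// Sketch: pick a temperature from Deepseek's recommendations in the table above.
// The key names and temperatureFor helper are illustrative.
const RECOMMENDED_TEMPERATURE: Record<string, number> = {
  "coding-math": 0.0,
  "data-analysis": 1.0,
  "conversation": 1.3,
  "translation": 1.3,
  "creative-writing": 1.5,
};

function temperatureFor(useCase: string, fallback = 0.2): number {
  // Fall back to the server's 0.2 default for unrecognized use cases.
  return RECOMMENDED_TEMPERATURE[useCase] ?? fallback;
}
```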
Error Handling
The server provides detailed error messages for common issues:
- API authentication errors
- Invalid parameters
- Rate limiting
- Network issues
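One plausible way to surface these categories is to map HTTP status codes (or a failed connection) to readable messages. This is a sketch of that pattern; the function name, status-code mapping, and message text are illustrative, not taken from this repository:

```typescript
// Sketch: map common failure modes to the error categories listed above.
// describeError is illustrative; pass null when the request never reached the API.
function describeError(status: number | null): string {
  if (status === null) {
    return "Network issue: could not reach the Deepseek API";
  }
  switch (status) {
    case 401:
      return "API authentication error: check DEEPSEEK_API_KEY";
    case 400:
      return "Invalid parameters: check prompt, max_tokens, and temperature";
    case 429:
      return "Rate limited: retry after a short delay";
    default:
      return `Unexpected API error (HTTP ${status})`;
  }
}
```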
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
License
MIT