Deepseek Thinker MCP Server

By ruixingshi

An MCP provider that delivers Deepseek reasoning content to MCP-enabled AI clients, like Claude Desktop. Supports access to Deepseek's chain-of-thought (CoT) from the Deepseek API service or a local Ollama server.

Overview

What is Deepseek Thinker MCP?

Deepseek Thinker MCP is a Model Context Protocol (MCP) provider that delivers Deepseek reasoning content to MCP-enabled AI clients, such as Claude Desktop. It allows access to Deepseek's thought processes via the Deepseek API service or a local Ollama server.

How do I use Deepseek Thinker MCP?

To use Deepseek Thinker MCP, integrate it with an AI client by configuring the claude_desktop_config.json file with the necessary command and environment variables. You can also run it in Ollama mode or configure it for local server use.

What are the key features of Deepseek Thinker MCP?

  • Dual Mode Support: OpenAI API mode and Ollama local mode.
  • Focused Reasoning: Captures and provides reasoning output from Deepseek's thinking process.

What are the use cases of Deepseek Thinker MCP?

  1. Enhancing AI client capabilities with Deepseek's reasoning.
  2. Supporting complex reasoning tasks in AI applications.
  3. Facilitating local AI model interactions through Ollama.

FAQ for Deepseek Thinker MCP

  • What should I do if I encounter "MCP error -32001: Request timed out"?

This error indicates that the Deepseek API response is slow or the reasoning output is too lengthy, causing a timeout.

  • Is there a specific tech stack used for this project?

Yes, the project uses TypeScript, OpenAI API, Ollama, and Zod for parameter validation.

Content

Deepseek Thinker MCP Server

An MCP (Model Context Protocol) provider that delivers Deepseek reasoning content to MCP-enabled AI clients, like Claude Desktop. Supports access to Deepseek's thought processes from the Deepseek API service or from a local Ollama server.

Core Features

  • 🤖 Dual Mode Support

    • OpenAI API mode support
    • Ollama local mode support
  • 🎯 Focused Reasoning

    • Captures Deepseek's thinking process
    • Provides reasoning output

Available Tools

get-deepseek-thinker

  • Description: Perform reasoning using the Deepseek model
  • Input Parameters:
    • originPrompt (string): User's original prompt
  • Returns: Structured text response containing the reasoning process
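The parameter contract above can be sketched in TypeScript. The project lists Zod for validation; the helper below is a dependency-free illustration of the same check, and the function and type names are hypothetical, not the server's real API:

```typescript
// Hypothetical sketch of the input check for get-deepseek-thinker.
// The real project uses Zod; this mirrors the same constraint without deps.
interface GetDeepseekThinkerArgs {
  originPrompt: string; // user's original prompt
}

function validateArgs(input: unknown): GetDeepseekThinkerArgs {
  if (
    typeof input !== "object" ||
    input === null ||
    typeof (input as { originPrompt?: unknown }).originPrompt !== "string"
  ) {
    throw new Error("originPrompt (string) is required");
  }
  return input as GetDeepseekThinkerArgs;
}

// Example: a well-formed tool call payload passes through unchanged.
const args = validateArgs({ originPrompt: "Why is the sky blue?" });
console.log(args.originPrompt);
```

A malformed payload (missing or non-string `originPrompt`) throws instead of reaching the model.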

Environment Configuration

OpenAI API Mode

Set the following environment variables:

API_KEY=<Your OpenAI API Key>
BASE_URL=<API Base URL>

Ollama Mode

Set the following environment variable:

USE_OLLAMA=true
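The two modes above are mutually exclusive: `USE_OLLAMA=true` selects the local Ollama backend, otherwise `API_KEY` and `BASE_URL` are needed for OpenAI API mode. A minimal sketch of that selection logic, with an illustrative function name (not the server's actual internals):

```typescript
// Hypothetical sketch of backend selection from the environment variables
// documented above. Names here are illustrative, not the real implementation.
type Mode =
  | { kind: "ollama" }
  | { kind: "openai"; apiKey: string; baseUrl: string };

function resolveMode(env: Record<string, string | undefined>): Mode {
  if (env.USE_OLLAMA === "true") {
    return { kind: "ollama" };
  }
  const { API_KEY, BASE_URL } = env;
  if (!API_KEY || !BASE_URL) {
    throw new Error("API_KEY and BASE_URL are required in OpenAI API mode");
  }
  return { kind: "openai", apiKey: API_KEY, baseUrl: BASE_URL };
}

// Example: Ollama mode wins when USE_OLLAMA is set.
const mode = resolveMode({ USE_OLLAMA: "true" });
console.log(mode.kind);
```

Failing fast when neither configuration is complete surfaces a misconfigured client before any tool call is attempted.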

Usage

Integration with an AI client, such as Claude Desktop

Add the following configuration to your claude_desktop_config.json:

{
  "mcpServers": {
    "deepseek-thinker": {
      "command": "npx",
      "args": [
        "-y",
        "deepseek-thinker-mcp"
      ],
      "env": {
        "API_KEY": "<Your API Key>",
        "BASE_URL": "<Your Base URL>"
      }
    }
  }
}

Using Ollama Mode

{
  "mcpServers": {
    "deepseek-thinker": {
      "command": "npx",
      "args": [
        "-y",
        "deepseek-thinker-mcp"
      ],
      "env": {
        "USE_OLLAMA": "true"
      }
    }
  }
}

Local Server Configuration

{
  "mcpServers": {
    "deepseek-thinker": {
      "command": "node",
      "args": [
        "/your-path/deepseek-thinker-mcp/build/index.js"
      ],
      "env": {
        "API_KEY": "<Your API Key>",
        "BASE_URL": "<Your Base URL>"
      }
    }
  }
}

Development Setup

# Install dependencies
npm install

# Build project
npm run build

# Run service
node build/index.js

FAQ

What if I see "MCP error -32001: Request timed out"?

This error occurs when the Deepseek API response is too slow or when the reasoning content output is too long, causing the MCP server to timeout.

Tech Stack

  • TypeScript
  • @modelcontextprotocol/sdk
  • OpenAI API
  • Ollama
  • Zod (parameter validation)

License

This project is licensed under the MIT License. See the LICENSE file for details.