RAT MCP Server (Retrieval Augmented Thinking)

By newideas99

🧠 MCP server implementing RAT (Retrieval Augmented Thinking) - combines DeepSeek's reasoning with GPT-4/Claude/Mistral responses, maintaining conversation context between interactions.

Overview

What is RAT MCP Server?

RAT MCP Server (Retrieval Augmented Thinking) is a server that implements a two-stage reasoning process, combining DeepSeek's reasoning capabilities with various response models like GPT-4 and Claude, while maintaining conversation context.

How to use RAT MCP Server?

To use the RAT MCP Server, clone the repository, install dependencies, configure your API keys in a .env file, and build the server. You can then integrate it with Cline for generating responses.

What are the key features of RAT MCP Server?

  • Two-stage processing using DeepSeek for reasoning and multiple models for response generation.
  • Maintains conversation context and history.
  • Supports various models including Claude and OpenRouter models.

What are the use cases of RAT MCP Server?

  1. Enhancing AI responses through structured reasoning.
  2. Providing context-aware answers in conversational AI applications.
  3. Integrating with development tools for AI-assisted coding.

FAQ about RAT MCP Server

  • What models does RAT MCP Server support?

It supports DeepSeek for reasoning, Claude via Anthropic, and any OpenRouter model, such as GPT-4.

  • Is there a license for RAT MCP Server?

Yes, it is released under the MIT License.

  • How do I maintain conversation context?

The server automatically maintains conversation history and includes it in the reasoning process.

Content

A Model Context Protocol (MCP) server that implements RAT's two-stage reasoning process, combining DeepSeek's reasoning capabilities with various response models.

Features

  • Two-Stage Processing:

    • Uses DeepSeek for detailed reasoning and analysis
    • Supports multiple models for final response generation
    • Maintains conversation context between interactions
  • Supported Models:

    • DeepSeek Reasoner (for thinking process)
    • Claude 3.5 Sonnet (via Anthropic)
    • Any OpenRouter model (GPT-4, Gemini, etc.)
  • Context Management:

    • Maintains conversation history
    • Includes previous Q&A in reasoning process
    • Supports context clearing when needed
    • Configurable context size limit
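The context-management behavior described above can be sketched in TypeScript. This is an illustrative sketch, not the server's actual implementation: the names `ConversationContext`, `Turn`, and `MAX_CONTEXT_TURNS` are assumptions, standing in for however the server stores history, trims it to the configured limit, and renders prior Q&A into the reasoning prompt.

```typescript
// Illustrative sketch of conversation-context handling (names are assumed,
// not taken from the server's source).
interface Turn {
  question: string;
  answer: string;
}

const MAX_CONTEXT_TURNS = 10; // stands in for the configurable context size limit

class ConversationContext {
  private history: Turn[] = [];

  // Record a completed Q&A turn, dropping the oldest once over the limit
  add(turn: Turn): void {
    this.history.push(turn);
    if (this.history.length > MAX_CONTEXT_TURNS) {
      this.history.shift();
    }
  }

  // Corresponds to the clearContext option on the tool
  clear(): void {
    this.history = [];
  }

  // Render prior Q&A so it can be prepended to DeepSeek's reasoning prompt
  render(): string {
    return this.history
      .map((t) => `Q: ${t.question}\nA: ${t.answer}`)
      .join("\n");
  }
}
```

In this sketch the rendered history would be passed to the reasoning stage first, and the reasoning output then handed to the response model along with the original prompt.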

Installation

  1. Clone the repository:

```bash
git clone https://github.com/newideas99/RAT-retrieval-augmented-thinking-MCP.git
cd rat-mcp-server
```

  2. Install dependencies:

```bash
npm install
```

  3. Create a .env file with your API keys and model configuration:

```bash
# Required: DeepSeek API key for reasoning stage
DEEPSEEK_API_KEY=your_deepseek_api_key_here

# Required: OpenRouter API key for non-Claude models
OPENROUTER_API_KEY=your_openrouter_api_key_here

# Optional: Anthropic API key for Claude model
ANTHROPIC_API_KEY=your_anthropic_api_key_here

# Optional: Model configuration
DEFAULT_MODEL=claude-3-5-sonnet-20241022  # or any OpenRouter model ID
OPENROUTER_MODEL=openai/gpt-4  # default OpenRouter model if not using Claude
```

  4. Build the server:

```bash
npm run build
```

Usage with Cline

Add to your Cline MCP settings (usually in ~/.vscode/globalStorage/saoudrizwan.claude-dev/settings/cline_mcp_settings.json):

```json
{
  "mcpServers": {
    "rat": {
      "command": "/path/to/node",
      "args": ["/path/to/rat-mcp-server/build/index.js"],
      "env": {
        "DEEPSEEK_API_KEY": "your_key_here",
        "OPENROUTER_API_KEY": "your_key_here",
        "ANTHROPIC_API_KEY": "your_key_here",
        "DEFAULT_MODEL": "claude-3-5-sonnet-20241022",
        "OPENROUTER_MODEL": "openai/gpt-4"
      },
      "disabled": false,
      "autoApprove": []
    }
  }
}
```

Tool Usage

The server provides a single tool generate_response with the following parameters:

```typescript
{
  "prompt": string,          // Required: The question or prompt
  "showReasoning"?: boolean, // Optional: Show DeepSeek's reasoning process
  "clearContext"?: boolean   // Optional: Clear conversation history
}
```

Example usage in Cline:

```typescript
use_mcp_tool({
  server_name: "rat",
  tool_name: "generate_response",
  arguments: {
    prompt: "What is Python?",
    showReasoning: true
  }
});
```
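The clearContext parameter is documented above but not demonstrated; a sketch in the same call style (the prompt is illustrative) would reset the conversation history before answering:

```typescript
use_mcp_tool({
  server_name: "rat",
  tool_name: "generate_response",
  arguments: {
    prompt: "Let's start over. What is Rust?",
    clearContext: true
  }
});
```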

Development

For development with auto-rebuild:

```bash
npm run watch
```

License

MIT License - See LICENSE file for details.

Credits

Based on the RAT (Retrieval Augmented Thinking) concept by Skirano, which enhances AI responses through structured reasoning and knowledge retrieval.
