🚀 Nchan MCP Transport

By ConechoAI

The best way to deploy an MCP server: a high-performance WebSocket/SSE transport layer and gateway for Anthropic's MCP (Model Context Protocol), powered by Nginx, Nchan, and FastAPI.

Overview

What is Nchan MCP Transport?

Nchan MCP Transport is a high-performance WebSocket/SSE transport layer and gateway designed for Anthropic's Model Context Protocol (MCP), enabling real-time, scalable AI integrations with Claude and other LLM agents.

How to use Nchan MCP Transport?

To use Nchan MCP Transport, install the server SDK with pip install httmcp, run the demo in Docker, and define your tools using Python decorators.

What are the key features of Nchan MCP Transport?

  • Dual Protocol Support: Supports WebSocket and SSE with automatic detection.
  • High Performance Pub/Sub: Built on Nginx + Nchan, handling thousands of concurrent connections.
  • MCP-Compliant Transport: Fully implements Model Context Protocol (JSON-RPC 2.0).
  • OpenAPI Integration: Auto-generate MCP tools from any OpenAPI spec.
  • Tool / Resource System: Register tools and resources using Python decorators.
  • Asynchronous Execution: Background task queue with live progress updates.
  • Dockerized Deployment: Easily deployable with Docker Compose.

What are the use cases of Nchan MCP Transport?

  1. Building Claude plugin servers over WebSocket/SSE.
  2. Creating real-time LLM agent backends.
  3. Connecting Claude to internal APIs via OpenAPI.
  4. Serving as a high-performance tool/service bridge for MCP.

FAQ about Nchan MCP Transport

  • What are the requirements for Nchan MCP Transport?

Requires Nginx with Nchan module, Python 3.9+, and Docker/Docker Compose.

  • Is it easy to deploy?

Yes! It can be easily deployed using Docker Compose.

Content

🚀 Nchan MCP Transport

A high-performance WebSocket/SSE transport layer & gateway for Anthropic's MCP (Model Context Protocol) — powered by Nginx, Nchan, and FastAPI.
For building real-time, scalable AI integrations with Claude and other LLM agents.


✨ What is this?

Nchan MCP Transport provides a real-time API gateway for MCP clients (like Claude) to talk to your tools and services over:

  • 🧵 WebSocket or Server-Sent Events (SSE)
  • ⚡️ Streamable HTTP compatible
  • 🧠 Powered by Nginx + Nchan for low-latency pub/sub
  • 🛠 Integrates with FastAPI for backend logic and OpenAPI tooling

✅ Ideal for AI developers building Claude plugins, LLM agents, or integrating external APIs into Claude via MCP.


🧩 Key Features

  • 🔄 Dual Protocol Support: Seamlessly supports WebSocket and SSE with automatic detection
  • 🚀 High Performance Pub/Sub: Built on Nginx + Nchan, handles thousands of concurrent connections
  • 🔌 MCP-Compliant Transport: Fully implements the Model Context Protocol (JSON-RPC 2.0)
  • 🧰 OpenAPI Integration: Auto-generate MCP tools from any OpenAPI spec
  • 🪝 Tool / Resource System: Use Python decorators to register tools and resources
  • 📡 Asynchronous Execution: Background task queue + live progress updates via push notifications
  • 🧱 Dockerized Deployment: Easily spin up with Docker Compose
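
Since the transport carries JSON-RPC 2.0, a tool invocation is just a small JSON envelope on the wire. The sketch below illustrates the shape of such a message pair; the `tools/call` method and field names follow common MCP convention, and the exact payload shape should be treated as an assumption, not this project's normative format:

```python
import json

# A minimal JSON-RPC 2.0 request/response pair for an MCP tool call.
# Method and field names follow the common MCP "tools/call" convention;
# treat the exact shape as an assumption.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {"name": "search_docs", "arguments": {"query": "nchan"}},
}

response = {
    "jsonrpc": "2.0",
    "id": 1,  # a response must echo the request id
    "result": {"content": [{"type": "text", "text": "Searching for nchan..."}]},
}

def is_valid_jsonrpc(msg: dict) -> bool:
    """Check the JSON-RPC 2.0 envelope: the version tag, plus either a
    method (request) or exactly one of result/error (response)."""
    if msg.get("jsonrpc") != "2.0":
        return False
    if "method" in msg:
        return True
    return ("result" in msg) ^ ("error" in msg)

wire = json.dumps(request)  # what actually travels over WebSocket/SSE
assert is_valid_jsonrpc(json.loads(wire))
assert is_valid_jsonrpc(response)
```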

🧠 Why Use This?

MCP lets AI assistants like Claude talk to external tools. But:

  • Native MCP is HTTP+SSE, which struggles with long tasks, network instability, and high concurrency
  • WebSockets aren’t natively supported by Claude — this project bridges the gap
  • Server-side logic in pure Python (like FastMCP) may not scale under load

Nchan MCP Transport gives you:

  • Web-scale performance (Nginx/Nchan)
  • FastAPI-powered backend for tools
  • Real-time event delivery to Claude clients
  • Plug-and-play OpenAPI to Claude integration
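
The "automatic detection" between WebSocket and SSE usually comes down to inspecting the client's handshake headers. A minimal sketch of that idea (hypothetical logic, not the gateway's actual code):

```python
def pick_transport(headers: dict) -> str:
    """Choose a transport from HTTP request headers (case-insensitive).

    Hypothetical detection logic: a WebSocket handshake carries
    'Upgrade: websocket'; an EventSource client sends
    'Accept: text/event-stream'; otherwise fall back to plain HTTP.
    """
    h = {k.lower(): v.lower() for k, v in headers.items()}
    if h.get("upgrade") == "websocket":
        return "websocket"
    if "text/event-stream" in h.get("accept", ""):
        return "sse"
    return "http"

assert pick_transport({"Upgrade": "websocket", "Connection": "Upgrade"}) == "websocket"
assert pick_transport({"Accept": "text/event-stream"}) == "sse"
```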

🚀 Quickstart

📦 1. Install server SDK

pip install httmcp

🧪 2. Run demo in Docker

git clone https://github.com/yourusername/nchan-mcp-transport.git
cd nchan-mcp-transport
docker-compose up -d

🛠 3. Define your tool

# `server` is your HTTMCP server instance (see the SDK docs)
@server.tool()
async def search_docs(query: str) -> str:
    return f"Searching for {query}..."
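
Under the hood, a decorator-based tool system amounts to a registry keyed by function name, which a dispatcher can later call with JSON arguments. A toy sketch of the idea (not httmcp's actual internals):

```python
import asyncio
import inspect

class ToolRegistry:
    """Toy stand-in for an MCP server's tool system: @tool() records the
    coroutine so a dispatcher can later invoke it by name."""

    def __init__(self):
        self._tools = {}

    def tool(self, name=None):
        def decorator(fn):
            self._tools[name or fn.__name__] = fn
            return fn  # leave the function usable directly
        return decorator

    async def call(self, name: str, arguments: dict):
        fn = self._tools[name]
        result = fn(**arguments)
        return await result if inspect.isawaitable(result) else result

server = ToolRegistry()

@server.tool()
async def search_docs(query: str) -> str:
    return f"Searching for {query}..."

print(asyncio.run(server.call("search_docs", {"query": "nchan"})))
# -> Searching for nchan...
```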

🧬 4. Expose OpenAPI service (optional)

openapi_server = await OpenAPIMCP.from_openapi("https://example.com/openapi.json", publish_server="http://nchan:80")
app.include_router(openapi_server.router)
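
Conceptually, OpenAPI-to-MCP generation walks the spec's paths and turns each operation into a callable tool. A simplified sketch of that mapping (hypothetical, not OpenAPIMCP's implementation):

```python
def openapi_to_tools(spec: dict) -> list[dict]:
    """Turn each OpenAPI operation into a tool descriptor.

    Simplified: uses operationId as the tool name and the operation
    summary as its description. Real generators also map parameters
    and request bodies to a JSON Schema input specification.
    """
    tools = []
    for path, ops in spec.get("paths", {}).items():
        for method, op in ops.items():
            tools.append({
                "name": op.get("operationId", f"{method}_{path}"),
                "description": op.get("summary", ""),
                "method": method.upper(),
                "path": path,
            })
    return tools

spec = {
    "paths": {
        "/search": {
            "get": {"operationId": "searchDocs", "summary": "Search the docs"},
        }
    }
}
print(openapi_to_tools(spec))
```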

🖥️ 5. One-Click GPTs Actions to MCP Deployment

HTTMCP provides a powerful CLI for instant deployment of GPTs Actions to MCP servers:

# Installation
pip install httmcp[cli]

# One-click deployment from GPTs Actions OpenAPI spec
python -m httmcp -f gpt_actions_openapi.json -p http://nchan:80

📚 Use Cases

  • Claude plugin server over WebSocket/SSE
  • Real-time LLM agent backend (LangChain/AutoGen style)
  • Connect Claude to internal APIs (via OpenAPI)
  • High-performance tool/service bridge for MCP

🔒 Requirements

  • Nginx with Nchan module (pre-installed in Docker image)
  • Python 3.9+
  • Docker / Docker Compose

🛠 Tech Stack

  • 🧩 Nginx + Nchan – persistent connection management & pub/sub
  • ⚙️ FastAPI – backend logic & JSON-RPC routing
  • 🐍 HTTMCP SDK – full MCP protocol implementation
  • 🐳 Docker – deployment ready

📎 Keywords

mcp transport, nchan websocket, sse for anthropic, mcp jsonrpc gateway, claude plugin backend, streamable http, real-time ai api gateway, fastapi websocket mcp, mcp pubsub, mcp openapi bridge


🤝 Contributing

Pull requests are welcome! File issues if you’d like to help improve:

  • Performance
  • Deployment
  • SDK integrations

📄 License

MIT License
