OpenAPI to Model Context Protocol (MCP)

By gujord

The OpenAPI to Model Context Protocol (MCP) proxy server bridges the gap between AI agents and external APIs by dynamically translating OpenAPI specifications into standardized MCP tools. This simplifies the integration process, significantly reducing development time and complexity associated with custom API wrappers.

Overview

What is OpenAPI-MCP?

OpenAPI-MCP is a proxy server that bridges AI agents and external APIs by translating OpenAPI specifications into standardized Model Context Protocol (MCP) tools, simplifying integration and reducing development complexity.

How to use OpenAPI-MCP?

To use OpenAPI-MCP, clone the repository, install the required packages, configure the environment variables, and run the server using the provided commands.

Key features of OpenAPI-MCP?

  • Dynamic tool generation from OpenAPI endpoints.
  • Supports multiple transport methods including stdio and Server-Sent Events (SSE).
  • OAuth2 support for secure interactions.
  • Dry run mode for safe API simulations.
  • JSON-RPC 2.0 compliance for robust communication.
  • Integration with popular AI orchestrators like Cursor and Windsurf.

Use cases of OpenAPI-MCP?

  1. Seamless integration of AI agents with various APIs.
  2. Rapid development of applications requiring API interactions.
  3. Simplifying the process of creating custom API wrappers.

FAQ from OpenAPI-MCP?

  • What is the purpose of OpenAPI-MCP?

It simplifies the integration of AI agents with external APIs by standardizing communication through MCP.

  • Is OpenAPI-MCP easy to set up?

Yes! It requires cloning the repository and configuring a few environment variables.

  • Can OpenAPI-MCP handle multiple APIs?

Yes! It can dynamically generate tools for multiple OpenAPI specifications.

Content

OpenAPI-MCP

The OpenAPI to Model Context Protocol (MCP) proxy server bridges the gap between AI agents and external APIs by dynamically translating OpenAPI specifications into standardized MCP tools. This simplifies integration, eliminating the need for custom API wrappers.


Why MCP?

The Model Context Protocol (MCP), developed by Anthropic, standardizes communication between Large Language Models (LLMs) and external tools. By acting as a universal adapter, MCP enables AI agents to interface with external APIs seamlessly.


Key Features

  • OpenAPI Integration: Parses and registers OpenAPI operations as callable tools.
  • OAuth2 Support: Handles machine authentication via Client Credentials flow.
  • Dry Run Mode: Simulates API calls without execution for inspection.
  • JSON-RPC 2.0 Support: Fully compliant request/response structure.
  • Auto Metadata: Derives tool names, summaries, and schemas from OpenAPI.
  • Sanitized Tool Names: Ensures compatibility with MCP name constraints.
  • Query String Parsing: Supports direct passing of query parameters as a string.
  • Enhanced Parameter Handling: Automatically converts parameters to correct data types.
  • Extended Tool Metadata: Includes detailed parameter information for better LLM understanding.
  • FastMCP Transport: Optimized for stdio, works out-of-the-box with agents.
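
For instance, the "Sanitized Tool Names" feature can be pictured as a small normalization step. The helper below is a hypothetical sketch (not the proxy's actual code) that maps an OpenAPI operationId to a name using only lowercase letters, digits, and underscores:

```python
import re

def sanitize_tool_name(operation_id: str, max_len: int = 64) -> str:
    """Illustrative sketch: map an OpenAPI operationId to an MCP-safe
    tool name (lowercase letters, digits, underscores only). The
    proxy's actual sanitization rules may differ."""
    # Replace anything outside [a-zA-Z0-9_] with an underscore,
    # lowercase the result, and trim stray underscores at the edges.
    name = re.sub(r"[^a-zA-Z0-9_]", "_", operation_id).lower()
    return name[:max_len].strip("_")
```

This keeps generated names within typical MCP tool-name constraints even when operationIds contain slashes, braces, or dots.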

Quick Start

Installation

```bash
git clone https://github.com/gujord/OpenAPI-MCP.git
cd OpenAPI-MCP
pip install -r requirements.txt
```

Environment Configuration

| Variable | Description | Required | Default |
|----------|-------------|----------|---------|
| OPENAPI_URL | URL to the OpenAPI specification | Yes | - |
| SERVER_NAME | MCP server name | No | openapi_proxy_server |
| OAUTH_CLIENT_ID | OAuth client ID | No | - |
| OAUTH_CLIENT_SECRET | OAuth client secret | No | - |
| OAUTH_TOKEN_URL | OAuth token endpoint URL | No | - |
| OAUTH_SCOPE | OAuth scope | No | api |
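
A hypothetical sketch of how a server might read these variables at startup, applying the documented defaults (the real server's configuration handling may differ):

```python
import os

def load_config() -> dict:
    """Illustrative sketch: read the documented environment variables,
    applying the defaults from the table above. OPENAPI_URL is the
    only required setting."""
    openapi_url = os.environ.get("OPENAPI_URL")
    if not openapi_url:
        raise RuntimeError("OPENAPI_URL is required")
    return {
        "openapi_url": openapi_url,
        "server_name": os.environ.get("SERVER_NAME", "openapi_proxy_server"),
        "oauth_client_id": os.environ.get("OAUTH_CLIENT_ID"),
        "oauth_client_secret": os.environ.get("OAUTH_CLIENT_SECRET"),
        "oauth_token_url": os.environ.get("OAUTH_TOKEN_URL"),
        "oauth_scope": os.environ.get("OAUTH_SCOPE", "api"),
    }
```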

How It Works

  1. Fetches and parses the OpenAPI spec using httpx (and PyYAML for YAML specs).
  2. Extracts operations and generates MCP-compatible tools with proper names.
  3. Authenticates using OAuth2 (if credentials are present).
  4. Builds input schemas based on OpenAPI parameter definitions.
  5. Handles calls via JSON-RPC 2.0 protocol with automatic error responses.
  6. Supports extended parameter information for improved LLM understanding.
  7. Handles query string parsing for easier parameter passing.
  8. Performs automatic type conversion based on OpenAPI schema definitions.
  9. Supports dry_run to inspect outgoing requests without invoking them.
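
Step 4 above (building input schemas) can be sketched as follows; `build_input_schema` is a hypothetical helper, not the server's actual code:

```python
def build_input_schema(parameters: list[dict]) -> dict:
    """Illustrative sketch: turn a list of OpenAPI parameter
    definitions into a JSON-Schema-style input schema for an MCP
    tool. The server's real schema builder may handle more cases
    (request bodies, nested objects, enums, etc.)."""
    properties, required = {}, []
    for param in parameters:
        schema = param.get("schema", {"type": "string"})
        properties[param["name"]] = {
            "type": schema.get("type", "string"),
            "description": param.get("description", ""),
        }
        if param.get("required"):
            required.append(param["name"])
    return {"type": "object", "properties": properties, "required": required}
```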

The sequence below illustrates the end-to-end flow:

```mermaid
sequenceDiagram
    participant LLM as LLM (Claude/GPT)
    participant MCP as OpenAPI-MCP Proxy
    participant API as External API

    Note over LLM, API: Communication Process

    LLM->>MCP: 1. Initialize (initialize)
    MCP-->>LLM: Metadata and tool list

    LLM->>MCP: 2. Request tools (tools_list)
    MCP-->>LLM: Detailed tool list from OpenAPI specification

    LLM->>MCP: 3. Call tool (tools_call)

    alt With OAuth2
        MCP->>API: Request OAuth2 token
        API-->>MCP: Access Token
    end

    MCP->>API: 4. Execute API call with proper formatting
    API-->>MCP: 5. API response (JSON)

    alt Type Conversion
        MCP->>MCP: 6. Convert parameters to correct data types
    end

    MCP-->>LLM: 7. Formatted response from API

    alt Dry Run Mode
        LLM->>MCP: Call with dry_run=true
        MCP-->>LLM: Display request information without executing call
    end
```

Built-in Tools

These tools are always available:

  • initialize – Returns server metadata and protocol version.
  • tools_list – Lists all registered tools (from OpenAPI and built-in) with extended metadata.
  • tools_call – Calls any tool by name with arguments.

Advanced Usage

Query String Passing

You can pass query parameters as a single string via the kwargs argument:

```json
{
  "jsonrpc": "2.0",
  "method": "tools_call",
  "params": {
    "name": "get_pets",
    "arguments": {
      "kwargs": "status=available&limit=10"
    }
  },
  "id": 1
}
```
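
Such a kwargs string can be split with Python's standard library; a minimal sketch of what the proxy's query string parsing might look like:

```python
from urllib.parse import parse_qsl

def parse_kwargs(kwargs: str) -> dict:
    """Illustrative sketch: split a query-string-style kwargs value
    into individual tool arguments (values remain strings here; type
    conversion is a separate step)."""
    return dict(parse_qsl(kwargs))
```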

Parameter Type Conversion

The server automatically converts parameter values to the appropriate type based on the OpenAPI specification:

  • String parameters remain as strings
  • Integer parameters are converted using int()
  • Number parameters are converted using float()
  • Boolean parameters are converted from strings like "true", "1", "yes", "y" to True
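
The rules above can be sketched as a small conversion helper (illustrative only; the server's actual implementation may differ):

```python
def convert_value(value: str, openapi_type: str):
    """Illustrative sketch of the documented conversion rules:
    integers via int(), numbers via float(), booleans from common
    truthy strings, everything else passed through as a string."""
    if openapi_type == "integer":
        return int(value)
    if openapi_type == "number":
        return float(value)
    if openapi_type == "boolean":
        return value.strip().lower() in {"true", "1", "yes", "y"}
    return value  # string parameters remain strings
```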

LLM Orchestrator Configuration

Cursor (~/.cursor/mcp.json)

```json
{
  "mcpServers": {
    "petstore3": {
      "command": "full_path_to_openapi_mcp/venv/bin/python",
      "args": ["full_path_to_openapi_mcp/src/server.py"],
      "env": {
        "SERVER_NAME": "petstore3",
        "OPENAPI_URL": "https://petstore3.swagger.io/api/v3/openapi.json"
      },
      "transport": "stdio"
    }
  }
}
```


Windsurf (~/.codeium/windsurf/mcp_config.json)

```json
{
  "mcpServers": {
    "petstore3": {
      "command": "full_path_to_openapi_mcp/venv/bin/python",
      "args": ["full_path_to_openapi_mcp/src/server.py"],
      "env": {
        "SERVER_NAME": "petstore3",
        "OPENAPI_URL": "https://petstore3.swagger.io/api/v3/openapi.json"
      },
      "transport": "stdio"
    }
  }
}
```


Claude Desktop (~/Library/Application Support/Claude/claude_desktop_config.json)

```json
{
  "mcpServers": {
    "petstore3": {
      "command": "full_path_to_openapi_mcp/venv/bin/python",
      "args": ["full_path_to_openapi_mcp/src/server.py"],
      "env": {
        "SERVER_NAME": "petstore3",
        "OPENAPI_URL": "https://petstore3.swagger.io/api/v3/openapi.json"
      },
      "transport": "stdio"
    }
  }
}
```

Contributing

  • Fork this repo
  • Create a new branch
  • Submit a pull request with a clear description

License

MIT License


If you find it useful, give it a ⭐ on GitHub!
