What is MCP Perplexity Proxy Server?
MCP Perplexity Proxy Server is a Node.js server that proxies OpenAI-compatible or MCP-native requests to Perplexity AI's Sonar models, with streaming support for real-time responses.
How do I use MCP Perplexity Proxy Server?
Run the server locally with Docker and send API requests to the exposed endpoints.
Key features of MCP Perplexity Proxy Server
- Proxies requests to OpenAI-compatible and MCP-native APIs
- Supports streaming responses for real-time data
- Easy local setup with Docker
Use cases of MCP Perplexity Proxy Server
- Integrating Perplexity AI's models into applications that require real-time data processing.
- Facilitating communication between different AI models and applications.
- Testing and developing applications that utilize AI-driven responses.
FAQ for MCP Perplexity Proxy Server
- What is the purpose of this server?
The server acts as a proxy to facilitate requests to Perplexity AI's models, enabling developers to integrate AI capabilities into their applications easily.
- How do I run the server locally?
You can run the server locally with the command `docker-compose up --build`.
- What APIs are supported?
The server supports OpenAI-compatible APIs and MCP-native APIs, including endpoints for chat completions and a Swagger UI for testing.
🚀 MCP Perplexity Proxy Server
This Node.js server proxies OpenAI-compatible or MCP-native requests to Perplexity AI's Sonar models with streaming support.
🔧 Run locally
```sh
docker-compose up --build
```
🔁 Supported APIs
- `POST /v1/chat/completions` (OpenAI-compatible)
- `POST /mcp-stream` (MCP-native)
- `GET /` (Swagger UI)
✅ Usage in CLine / Cursor / RooCode
```json
{
  "provider": "openai",
  "api_base": "http://localhost:3000/v1",
  "api_key": "dummy"
}
```
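With that configuration, any OpenAI-compatible client can talk to the proxy. A minimal sketch using plain `fetch` (the endpoint and model name come from this README; the payload shape follows the standard OpenAI chat-completions format, and the `dummy` key mirrors the config above — whether the proxy checks it at all is an assumption):

```typescript
// Minimal OpenAI-compatible request against the local proxy.
// Assumes the server from `docker-compose up --build` is listening on port 3000.

const API_BASE = "http://localhost:3000/v1";

// Build the request body in the OpenAI chat-completions format.
function buildChatRequest(userContent: string) {
  return {
    model: "sonar-reasoning-pro",
    messages: [{ role: "user", content: userContent }],
  };
}

async function ask(question: string): Promise<string> {
  const res = await fetch(`${API_BASE}/chat/completions`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: "Bearer dummy", // placeholder key, matching the config above
    },
    body: JSON.stringify(buildChatRequest(question)),
  });
  if (!res.ok) throw new Error(`HTTP ${res.status}`);
  const data = await res.json();
  return data.choices[0].message.content;
}
```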
🧪 Local Testing
```sh
curl -X POST http://localhost:3000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model":"sonar-reasoning-pro","messages":[{"role":"user","content":"What is JSON.stringify?"}]}'
```
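Since the README highlights streaming, here is a sketch of consuming a streamed completion, assuming the proxy mirrors OpenAI's server-sent-events framing (`data: {...}` lines terminated by `data: [DONE]`) — that framing is an assumption, not confirmed by this README:

```typescript
// Sketch: parse OpenAI-style SSE chunks and print deltas as they arrive.
// Assumes `stream: true` makes the proxy emit "data: {...}\n\n" events.

function extractDeltas(sseChunk: string): string[] {
  const deltas: string[] = [];
  for (const line of sseChunk.split("\n")) {
    if (!line.startsWith("data: ")) continue;
    const payload = line.slice("data: ".length).trim();
    if (payload === "[DONE]") break; // end-of-stream sentinel
    const json = JSON.parse(payload);
    const delta = json.choices?.[0]?.delta?.content;
    if (delta) deltas.push(delta);
  }
  return deltas;
}

async function streamAnswer(question: string): Promise<void> {
  const res = await fetch("http://localhost:3000/v1/chat/completions", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "sonar-reasoning-pro",
      stream: true,
      messages: [{ role: "user", content: question }],
    }),
  });
  const decoder = new TextDecoder();
  // Node 18+: the response body is an async-iterable ReadableStream.
  for await (const chunk of res.body as any) {
    for (const delta of extractDeltas(decoder.decode(chunk))) {
      process.stdout.write(delta);
    }
  }
}
```

The same parsing would apply to the `/mcp-stream` endpoint only if it uses the same event format, which this README does not specify.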