What is Perplexity MCP Server?
Perplexity MCP Server is a web search server that utilizes Perplexity's API to provide intelligent search capabilities with automatic model selection based on the user's query intent.
How to use Perplexity MCP Server?
To use the server, clone the repository from GitHub, install the necessary dependencies, configure your Perplexity API key, and integrate it with the Claude Desktop App. After setup, you can ask Claude to perform searches using natural language queries.
Key features of Perplexity MCP Server?
- Automatic model selection based on query intent.
- Support for multiple models tailored for different types of queries.
- Domain and recency filtering for customized search results.
- Easy integration with Claude Desktop App.
Use cases of Perplexity MCP Server?
- Conducting in-depth research on various topics.
- Finding quick answers to general knowledge questions.
- Analyzing complex problems using advanced reasoning models.
- Customizing search results based on specific domains or time frames.
FAQ from Perplexity MCP Server?
- How do I get a Perplexity API key?
You can obtain an API key by visiting https://www.perplexity.ai/settings/api.
- Can I customize the search results?
Yes! You can use domain and recency filters to tailor your search experience.
- What models are available for use?
The server supports several models including sonar-deep-research, sonar-reasoning-pro, and sonar for various search intents.
Perplexity MCP Server
An MCP server that provides web search capabilities using Perplexity's API with automatic model selection based on query intent.
Prerequisites
- Node.js (v14 or higher)
- A Perplexity API key (get one at https://www.perplexity.ai/settings/api)
- Claude Desktop App
Installation
Installing via Git
- Clone this repository:

  ```bash
  git clone https://github.com/RossH121/perplexity-mcp.git
  cd perplexity-mcp
  ```

- Install dependencies:

  ```bash
  npm install
  ```

- Build the server:

  ```bash
  npm run build
  ```
Configuration
- Get your Perplexity API key from https://www.perplexity.ai/settings/api

- Add the server to Claude's config file at `~/Library/Application Support/Claude/claude_desktop_config.json`:

  ```json
  {
    "mcpServers": {
      "perplexity-server": {
        "command": "node",
        "args": [
          "/absolute/path/to/perplexity-mcp/build/index.js"
        ],
        "env": {
          "PERPLEXITY_API_KEY": "your-api-key-here",
          "PERPLEXITY_MODEL": "sonar"
        }
      }
    }
  }
  ```

  Replace `/absolute/path/to` with the actual path to where you cloned the repository.
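For reference, here is a minimal sketch of how a server like this might consume these environment variables at startup. The variable names match the config above, but the validation logic and fallback value are assumptions for illustration, not the project's actual code:

```typescript
// Hypothetical startup check: consume the env vars set in claude_desktop_config.json.
const apiKey = process.env.PERPLEXITY_API_KEY;
if (!apiKey) {
  // Without a key the server cannot call the Perplexity API, so fail fast.
  console.error("PERPLEXITY_API_KEY environment variable is required");
  process.exit(1);
}

// Fall back to "sonar" if no default model is configured
// (the fallback value is an assumption for this sketch).
const defaultModel = process.env.PERPLEXITY_MODEL ?? "sonar";
```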
Available Models
The server now supports automatic model selection based on query intent, but you can also specify a default model using the `PERPLEXITY_MODEL` environment variable. Available options:
- `sonar-deep-research` - Specialized for extensive research and expert-level analysis across domains
- `sonar-reasoning-pro` - Optimized for advanced logical reasoning and complex problem-solving
- `sonar-reasoning` - Designed for reasoning tasks with balanced performance
- `sonar-pro` - General-purpose model with excellent search capabilities and citation density
- `sonar` - Fast and efficient for straightforward queries
The default model (specified in the environment variable) will be used as the baseline for automatic model selection.
For up-to-date model pricing and availability, visit: https://docs.perplexity.ai/guides/pricing
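For reference, the set of valid `PERPLEXITY_MODEL` values can be captured in a small TypeScript sketch like the one below; the constant and type names are illustrative, not necessarily how the project defines them:

```typescript
// Valid values for PERPLEXITY_MODEL, per the list above.
const PERPLEXITY_MODELS = [
  "sonar-deep-research", // extensive research, expert-level analysis
  "sonar-reasoning-pro", // advanced logical reasoning, complex problem-solving
  "sonar-reasoning",     // balanced reasoning tasks
  "sonar-pro",           // general-purpose search with dense citations
  "sonar",               // fast, straightforward queries
] as const;

type PerplexityModel = (typeof PERPLEXITY_MODELS)[number];
```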
Usage
After configuring the server and restarting Claude, you can simply ask Claude to search for information. For example:
- "What's the latest news about SpaceX?"
- "Search for the best restaurants in Chicago"
- "Find information about the history of jazz music"
- "I need a deep research analysis of recent AI developments" (uses sonar-deep-research)
- "Help me reason through this complex problem" (uses sonar-reasoning-pro)
Claude will automatically use the Perplexity search tool to find and return relevant information. The server will automatically select the most appropriate model based on your query's intent.
If Claude decides not to use the search tool for some reason, you can force the issue by prepending "Search the web" to your prompt.
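Under the hood, a search tool like this typically forwards the query to Perplexity's chat completions endpoint. The sketch below shows roughly what such a call looks like; the endpoint and payload shape follow Perplexity's public API, but the helper function itself is illustrative rather than the server's actual code:

```typescript
// Illustrative helper: send a search query to Perplexity's chat completions API.
async function perplexitySearch(query: string, model = "sonar-pro"): Promise<string> {
  const response = await fetch("https://api.perplexity.ai/chat/completions", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.PERPLEXITY_API_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model,
      messages: [{ role: "user", content: query }],
    }),
  });

  if (!response.ok) {
    throw new Error(`Perplexity API error: ${response.status}`);
  }

  // The answer text is returned in the first choice's message content.
  const data = (await response.json()) as {
    choices: { message: { content: string } }[];
  };
  return data.choices[0].message.content;
}
```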
Intelligent Model Selection
The server automatically selects the most appropriate Perplexity model based on your query:
- Use research-oriented terms like "deep research," "comprehensive," or "in-depth" to trigger sonar-deep-research
- Use reasoning terms like "solve," "figure out," or "complex problem" to trigger sonar-reasoning-pro
- Use simple terms like "quick," "brief," or "basic" to trigger the lightweight sonar model
- General search terms default to sonar-pro for balanced performance
Each search response includes information about which model was used and why.
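A keyword-based heuristic along these lines could look roughly like the following. The trigger phrases come from the list above; the matching logic and function name are assumptions for illustration, not a copy of the server's implementation:

```typescript
// Illustrative intent heuristic: map trigger phrases from the README to models.
function selectModel(query: string, baseline = "sonar-pro"): string {
  const q = query.toLowerCase();

  if (/deep research|comprehensive|in-depth/.test(q)) {
    return "sonar-deep-research"; // extensive, expert-level research
  }
  if (/solve|figure out|complex problem/.test(q)) {
    return "sonar-reasoning-pro"; // advanced reasoning
  }
  if (/quick|brief|basic/.test(q)) {
    return "sonar"; // lightweight, fast answers
  }
  return baseline; // general searches fall back to the configured default
}
```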
Domain Filtering
This server supports domain filtering to customize your search experience. You can allow or block specific domains using these commands:
- Add an allowed domain: "Use the domain_filter tool to allow wikipedia.org"
- Add a blocked domain: "Use the domain_filter tool to block pinterest.com"
- View current filters: "Use the list_filters tool" (shows domain and recency filters)
- Clear all filters: "Use the clear_filters tool" (clears both domain and recency filters)
Note: The Perplexity API supports up to 3 domains in total, with priority given to allowed domains. Domain filtering requires a Perplexity API tier that supports this feature.
Example usage flow:
- "Use the domain_filter tool to allow wikipedia.org"
- "Use the domain_filter tool to allow arxiv.org"
- "Use the list_filters tool" (to verify your settings)
- "Search for quantum computing advances" (results will prioritize wikipedia.org and arxiv.org)
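On the API side, these filters correspond to Perplexity's `search_domain_filter` request parameter, where blocked domains carry a leading `-`. The translation below is a sketch of the idea; the helper name and exact logic are illustrative, not the server's actual code:

```typescript
// Illustrative translation of allow/block lists into the search_domain_filter
// field of a Perplexity chat completions request.
function buildDomainFilter(allowed: string[], blocked: string[]): string[] {
  // Allowed domains take priority; blocked domains are prefixed with "-".
  // The API accepts at most 3 domains in total, per the note above.
  return [...allowed, ...blocked.map((domain) => `-${domain}`)].slice(0, 3);
}

// Example: prioritize wikipedia.org and arxiv.org, exclude pinterest.com.
const requestExtras = {
  search_domain_filter: buildDomainFilter(
    ["wikipedia.org", "arxiv.org"],
    ["pinterest.com"],
  ),
};
```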
Recency Filtering
You can limit search results to a specific time window using the recency filter:
- Set recency filter: "Use the recency_filter tool with filter=hour" (options: hour, day, week, month)
- Disable recency filter: "Use the recency_filter tool with filter=none"
This is particularly useful for time-sensitive queries like current events or breaking news.
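These options map onto Perplexity's `search_recency_filter` request parameter. A minimal sketch, with the surrounding request shape shown only for context:

```typescript
// Illustrative: restrict results to the last hour by adding
// search_recency_filter to the chat completions request body.
type RecencyFilter = "hour" | "day" | "week" | "month";
const recency: RecencyFilter = "hour";

const requestBody = {
  model: "sonar-pro",
  messages: [{ role: "user", content: "What's the latest news about SpaceX?" }],
  search_recency_filter: recency, // omit this field entirely for "none"
};
```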
Model Selection Control
While the automatic model selection works well for most cases, you can manually control which model is used:
- View model information: "Use the model_info tool"
- Set a specific model: "Use the model_info tool with model=sonar-deep-research"
- Return to automatic selection: Set the model back to the configured default
Example usage:
- "Use the model_info tool" (to see available models and current status)
- "Use the model_info tool with model=sonar-reasoning-pro" (to force using reasoning model)
- "Search for a mathematical proof of the Pythagorean theorem" (will use sonar-reasoning-pro)
- "Use the model_info tool with model=sonar-pro" (to return to automatic selection)
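One way to implement this kind of override is a module-level setting that, when present, bypasses the automatic heuristic. The sketch below illustrates the idea; all names are hypothetical:

```typescript
// Illustrative override state for manual model selection.
let modelOverride: string | null = null; // set via the model_info tool

function resolveModel(query: string, baseline: string): string {
  // A manual override wins unless it has been set back to the default,
  // in which case automatic selection resumes.
  if (modelOverride && modelOverride !== baseline) {
    return modelOverride;
  }
  return selectModelAutomatically(query, baseline);
}

// Stand-in for the keyword heuristic sketched under "Intelligent Model Selection".
function selectModelAutomatically(query: string, baseline: string): string {
  return baseline;
}
```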
Development
To modify the server:
- Edit `src/index.ts`
- Rebuild with `npm run build`
- Restart Claude to load the changes
License
MIT