
Higress AI-Search MCP Server
An MCP server that enhances AI responses with real-time search results via Higress ai-search.
What is Higress AI-Search MCP Server?
Higress AI-Search MCP Server is a Model Context Protocol server that enhances AI responses by integrating real-time search results from various search engines through the Higress ai-search feature.
How to use Higress AI-Search MCP Server?
To use the server, configure it with the required environment variables and run it using the provided commands. You can choose between the uvx command, which installs the package automatically, and the uv command for local development.
Key features of Higress AI-Search MCP Server?
- Internet Search: Access to general web information from Google, Bing, and Quark.
- Academic Search: Retrieve scientific papers and research from Arxiv.
- Internal Knowledge Search: Search through internal knowledge bases.
Use cases of Higress AI-Search MCP Server?
- Enhancing AI model responses with up-to-date information.
- Providing academic research support for AI applications.
- Enabling internal knowledge retrieval for organizations.
FAQ from Higress AI-Search MCP Server?
- Can I use this server for any AI model?
Yes! You can configure it to work with various LLM models as per your requirements.
- Is there a demo available?
Yes! You can find demo links in the project documentation.
- What are the prerequisites for running this server?
You need to install the uv package and configure the Higress service with the ai-search and ai-proxy plugins.
Higress AI-Search MCP Server
Overview
A Model Context Protocol (MCP) server that provides an AI search tool to enhance AI model responses with real-time search results from various search engines through the Higress ai-search feature.
Demo
Cline
https://github.com/user-attachments/assets/60a06d99-a46c-40fc-b156-793e395542bb
Claude Desktop
https://github.com/user-attachments/assets/5c9e639f-c21c-4738-ad71-1a88cc0bcb46
Features
- Internet Search: Google, Bing, Quark - for general web information
- Academic Search: Arxiv - for scientific papers and research
- Internal Knowledge Search: internal knowledge bases (e.g. employee handbooks, company policies, internal process documents) - for organization-specific information
Prerequisites
- uv for package installation and running the server
- A Higress service configured with the ai-search and ai-proxy plugins
Configuration
The server can be configured using environment variables:
- HIGRESS_URL (optional): URL for the Higress service (default: http://localhost:8080/v1/chat/completions).
- MODEL (required): LLM model to use for generating responses.
- INTERNAL_KNOWLEDGE_BASES (optional): Description of internal knowledge bases.
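As a sketch of how these variables might be consumed, the snippet below reads them with the documented default for HIGRESS_URL and requires MODEL to be set. The helper name load_config is illustrative only, not part of the package:

```python
import os

def load_config(env=os.environ):
    """Collect the server settings from environment variables,
    applying the documented default for HIGRESS_URL."""
    model = env.get("MODEL")
    if not model:
        raise ValueError("MODEL is required")
    return {
        "higress_url": env.get(
            "HIGRESS_URL", "http://localhost:8080/v1/chat/completions"
        ),
        "model": model,
        "internal_knowledge_bases": env.get("INTERNAL_KNOWLEDGE_BASES", ""),
    }

# With only MODEL set, HIGRESS_URL falls back to the default endpoint.
config = load_config({"MODEL": "qwen-turbo"})
print(config["higress_url"])
```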
Option 1: Using uvx
Using uvx automatically installs the package from PyPI, so there is no need to clone the repository locally.
{
  "mcpServers": {
    "higress-ai-search-mcp-server": {
      "command": "uvx",
      "args": [
        "higress-ai-search-mcp-server"
      ],
      "env": {
        "HIGRESS_URL": "http://localhost:8080/v1/chat/completions",
        "MODEL": "qwen-turbo",
        "INTERNAL_KNOWLEDGE_BASES": "Employee handbook, company policies, internal process documents"
      }
    }
  }
}
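The MCP client reads this JSON, launches the program named in command with args, and injects env into the child process. A minimal illustrative sketch of that resolution step (client internals vary; this is not the actual client code):

```python
import json

# The same config an MCP client (e.g. Claude Desktop) would read.
config = json.loads("""
{
  "mcpServers": {
    "higress-ai-search-mcp-server": {
      "command": "uvx",
      "args": ["higress-ai-search-mcp-server"],
      "env": {"MODEL": "qwen-turbo"}
    }
  }
}
""")

server = config["mcpServers"]["higress-ai-search-mcp-server"]
# The client spawns this command line with server["env"] in its environment.
cmdline = [server["command"], *server["args"]]
print(" ".join(cmdline))  # uvx higress-ai-search-mcp-server
```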
Option 2: Using uv for local development
Using uv requires cloning the repository locally and specifying the path to the source code.
{
  "mcpServers": {
    "higress-ai-search-mcp-server": {
      "command": "uv",
      "args": [
        "--directory",
        "path/to/src/higress-ai-search-mcp-server",
        "run",
        "higress-ai-search-mcp-server"
      ],
      "env": {
        "HIGRESS_URL": "http://localhost:8080/v1/chat/completions",
        "MODEL": "qwen-turbo",
        "INTERNAL_KNOWLEDGE_BASES": "Employee handbook, company policies, internal process documents"
      }
    }
  }
}
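Either way, the running server forwards search-augmented queries to HIGRESS_URL, which speaks the OpenAI-compatible chat-completions protocol; the Higress ai-search plugin enriches such requests with search results transparently. The sketch below shows what a request body of that shape looks like; build_payload is an illustrative helper, not the package's actual API:

```python
import json

def build_payload(query: str, model: str) -> dict:
    """Build an OpenAI-compatible chat-completions body of the kind
    sent to the Higress endpoint (the ai-search plugin augments it
    server-side with real-time search results)."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": query}],
    }

# Serialized request body as it would be POSTed to HIGRESS_URL.
body = json.dumps(build_payload("Summarize today's AI news", "qwen-turbo"))
print(body)
```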
License
This project is licensed under the MIT License - see the LICENSE file for details.