MCP Server with RAG and Multi-Search


By KunjShah95 on GitHub

A custom MCP server with RAG capabilities and multiple search providers (Gemini 2.0 and Linkup)

Overview

MCP Server with RAG and Multi-Search is a custom Model Context Protocol (MCP) server that integrates Retrieval-Augmented Generation (RAG) capabilities with multiple search providers, including Google's Gemini 2.0 and Linkup.

How to use the MCP Server

To use the MCP Server, clone the repository, install the required dependencies, set up your environment variables with the necessary API keys, and run the server using Python.

Key features of the MCP Server

  • RAG workflow utilizing local documents for enhanced search results.
  • Integration with Google's Gemini 2.0 for advanced AI-powered search capabilities.
  • Traditional web search functionality through Linkup.
  • Built with FastMCP for efficient server implementation.
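
The feature list above can be sketched as a small multi-search layer that combines local-document retrieval with a web-search provider. The sketch below is illustrative only: the function names are hypothetical, and the naive substring match stands in for the llama-index embedding retrieval the real server would use.

```python
# Illustrative sketch of the RAG + multi-search flow described above.
# Names are assumptions, not the project's actual API.

def rag_search(query: str, documents: list[str]) -> list[str]:
    """Naive local-document retrieval: return documents mentioning the query.
    The real server would use llama-index embeddings rather than substring match."""
    return [d for d in documents if query.lower() in d.lower()]

def multi_search(query: str, documents: list[str], web_search) -> dict:
    """Combine local RAG hits with results from a web-search provider
    (e.g. Gemini or Linkup), passed in here as a plain callable."""
    return {
        "local": rag_search(query, documents),
        "web": web_search(query),
    }
```

In the actual server, each of these would be exposed as an MCP tool via FastMCP, so a connected client can call them individually or rely on the combined flow.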

Use cases of the MCP Server

  1. Enhancing search results by combining local document queries with web searches.
  2. Utilizing AI-powered search for more relevant results in applications.
  3. Supporting research and data retrieval tasks with RAG capabilities.

FAQ about the MCP Server

  • What are the prerequisites for running the MCP Server?

You need Python 3.8 or higher, Ollama installed with DeepSeek models, and API keys for Gemini and Linkup.
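
A simple startup check for the required API keys might look like the sketch below. The environment variable names are assumptions; check the repository's README for the exact keys the server expects.

```python
import os

# Hypothetical environment variable names -- consult the project's
# README for the exact keys its configuration uses.
REQUIRED_KEYS = ["GEMINI_API_KEY", "LINKUP_API_KEY"]

def missing_keys(env=os.environ) -> list[str]:
    """Return the names of required API keys absent from the environment."""
    return [k for k in REQUIRED_KEYS if not env.get(k)]
```

Running this check before starting the server gives a clear error message instead of a failure deep inside a search call.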

  • How do I install the MCP Server?

Clone the repository, install dependencies, set up environment variables, and run the server.

  • What libraries does the MCP Server use?

It uses llama-index, Ollama, the Google Generative AI SDK, the Linkup SDK, and FastMCP, among other libraries.
