
🧠 Advanced MCP Server Setup with uv, llama-index, ollama, and Cursor IDE

What is Advanced MCP Server Setup?

Advanced MCP Server Setup is a project that integrates tools such as `uv`, `llama-index`, `ollama`, and Cursor IDE to create a powerful server environment for managing AI agents and local LLMs.
How to use Advanced MCP Server Setup?
To use this setup, follow the installation steps to configure your environment, set up the project directory, and run the server using the provided commands in the documentation.
Key features of Advanced MCP Server Setup?

- Easy project initialization with `uv`
- Integration with `ollama` for running local LLMs
- Configuration of MCP servers in Cursor IDE
- Support for managing dependencies and virtual environments
Use cases of Advanced MCP Server Setup?
- Setting up a local server for AI agent management.
- Running and testing LLMs in a controlled environment.
- Developing and deploying AI applications using Cursor IDE.
FAQ from Advanced MCP Server Setup?
- What are the prerequisites for this setup?
  You need Python 3.10+, `uv`, Ollama, and Cursor IDE installed.
- Can I use this setup for any AI project?
Yes! This setup is flexible and can be adapted for various AI projects involving local LLMs and agent orchestration.
- Is there support for troubleshooting?
Yes, the documentation provides detailed steps for setup and common issues.
✅ Prerequisites
- Python 3.10+ installed
- uv (by Astral) installed globally (`pip install uv`)
- Ollama installed and running locally
- Cursor IDE installed
🛠 Step 1: Project Setup
1.1 Create a New Project Directory
```bash
uv init mcp-server
cd mcp-server
```
1.2 Create and Activate Virtual Environment
```bash
uv venv
.venv\Scripts\activate     # On Windows
# OR
source .venv/bin/activate  # On Linux/Mac
```
🔐 Step 2: Environment Configuration
Create a `.env` file in the root of your project and add your API key:

```env
LINKUP_API_KEY=your_api_key_here
```
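If you want to read this key from Python, the common choice is the `python-dotenv` package; as a dependency-free alternative, a minimal loader might look like this (a sketch, not part of any SDK; function and file names are illustrative):

```python
# load_env.py — minimal .env loader (stdlib only; python-dotenv is the
# more common choice in real projects).
import os


def load_env(path: str = ".env") -> dict[str, str]:
    """Parse simple KEY=VALUE lines and export them to os.environ."""
    env: dict[str, str] = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue  # skip blanks and comments
            key, _, value = line.partition("=")
            env[key.strip()] = value.strip()
    os.environ.update(env)
    return env
```

After calling `load_env()`, the key is available as `os.environ["LINKUP_API_KEY"]`.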
📦 Step 3: Install Required Dependencies
Run these commands one by one inside your virtual environment:
```bash
# Core MCP CLI and HTTP utilities (quoted so the brackets survive zsh globbing)
uv add "mcp[cli]" httpx

# Linkup SDK for orchestrating agents
uv add linkup-sdk

# LlamaIndex integrations
uv add llama-index
uv add llama-index-embeddings-huggingface
uv add llama-index-llms-ollama

# Optional: for using notebooks
uv add ipykernel
```
🧪 Step 4: Confirm Installation
After installation, check your `uv`-managed `pyproject.toml`. `uv add` records packages under `[project] dependencies` (the exact version constraints it pins will vary), so you should see something like this:

```toml
[project]
dependencies = [
    "httpx",
    "ipykernel",
    "linkup-sdk",
    "llama-index",
    "llama-index-embeddings-huggingface",
    "llama-index-llms-ollama",
    "mcp[cli]",
]
```
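As an extra sanity check, you can confirm the key packages are importable from inside the virtual environment. This is a small illustrative sketch (not an official tool); note the import names below differ from the PyPI package names:

```python
# check_deps.py — verify that installed packages can be found
# without actually importing them.
from importlib.util import find_spec


def missing_modules(modules: list[str]) -> list[str]:
    """Return the subset of module names that cannot be resolved."""
    return [m for m in modules if find_spec(m) is None]


if __name__ == "__main__":
    gaps = missing_modules(["mcp", "httpx", "llama_index"])
    print("All good!" if not gaps else f"Missing: {gaps}")
```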
⚙️ Step 5: Create a Minimal Server Entry Point
Create a `server.py` file inside the project root:

```python
# server.py
from mcp.cli import app

if __name__ == "__main__":
    app()
```

You can later replace this with your own FastMCP or Agent orchestrator script.
🧠 Step 6: Run Ollama Locally
Make sure Ollama is installed and running:

```bash
ollama run llama3.2   # or any model you want
```

This starts the LLM backend at `http://localhost:11434`.
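Before wiring up the IDE, it can help to confirm the backend actually answers. A quick stdlib-only probe (an illustrative sketch) can hit Ollama's `/api/tags` endpoint, which lists locally pulled models:

```python
# check_ollama.py — probe the local Ollama server (stdlib only).
import json
import urllib.error
import urllib.request


def is_ollama_up(base_url: str = "http://localhost:11434") -> bool:
    """Return True if an Ollama server answers at base_url."""
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=2) as resp:
            models = json.load(resp).get("models", [])
            print(f"Ollama is up with {len(models)} model(s) pulled.")
            return True
    except (urllib.error.URLError, OSError):
        return False


if __name__ == "__main__":
    if not is_ollama_up():
        print("Ollama is not reachable — start it with `ollama serve` first.")
```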
🖥️ Step 7: Configure MCP Server in Cursor IDE
7.1 Open Cursor Settings

- Open Settings → go to the MCP section.
- Click "Add New Global MCP Server".
7.2 Fill Out the Configuration

Replace the paths with your actual machine paths. You can get the full path to `uv` by running:

```bash
where uv    # Windows
which uv    # Linux/Mac
```

Now add this to your Cursor IDE settings:
```json
{
  "mcpServers": {
    "weather": {
      "command": "C:\\Users\\SIDHYA\\AppData\\Roaming\\Python\\Python311\\Scripts\\uv.exe",
      "args": [
        "--directory",
        "C:\\Users\\SIDHYA\\Development\\Ai\\mcp-server",
        "run",
        "server.py"
      ]
    }
  }
}
```

Replace the `command` value with your actual `uv` path (strict JSON does not allow inline comments).
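Windows paths in JSON need doubled backslashes, which is easy to get wrong by hand. If you prefer, a small helper (purely illustrative, not a Cursor tool) can generate a correctly escaped entry:

```python
# make_cursor_config.py — emit an mcpServers entry with properly
# escaped Windows paths (illustrative helper; paths are examples).
import json


def mcp_server_entry(name: str, uv_path: str, project_dir: str) -> str:
    """Build the JSON snippet for one MCP server definition."""
    config = {
        "mcpServers": {
            name: {
                "command": uv_path,
                "args": ["--directory", project_dir, "run", "server.py"],
            }
        }
    }
    return json.dumps(config, indent=2)


if __name__ == "__main__":
    print(mcp_server_entry(
        "weather",
        r"C:\Users\SIDHYA\AppData\Roaming\Python\Python311\Scripts\uv.exe",
        r"C:\Users\SIDHYA\Development\Ai\mcp-server",
    ))
```

`json.dumps` handles the backslash escaping, so you can paste raw paths as-is.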
🧪 Step 8: Test the Integration
- Open any `.py` file in Cursor.
- Use the MCP tools (usually accessible via `⌘K` or `Ctrl+K`) to run the "weather" MCP server.
- You should see the server spin up using your `server.py`.
📘 Suggested Directory Structure
```
mcp-server/
├── .env
├── pyproject.toml
├── server.py
└── rag.py
```
🔁 Keep Things Updated
To update dependencies inside the virtual environment:

```bash
uv pip install --upgrade llama-index
uv pip install --upgrade linkup-sdk
```

Note that `uv pip install --upgrade` only updates the environment; to also bump the versions recorded in `pyproject.toml`, re-run `uv add` for the package.
✍️ Author
👋 Hey, I'm Asutosh Sidhya
🌐 Connect with Me
- 📧 Email: sidhyaasutosh@gmail.com
- 🧑‍💻 GitHub: @asutosh7
- 💼 LinkedIn: linkedin.com/in/asutosh-sidhya
If you're building something around AI agents, local LLMs, or automated RAG pipelines—I'd love to connect or collaborate!