What is Vercel AI Docs?
Vercel AI Docs is a Model Context Protocol (MCP) server that provides AI-powered search and querying capabilities for the Vercel AI SDK documentation, enabling developers to ask questions and receive accurate, contextualized responses.
How to use Vercel AI Docs?
To use Vercel AI Docs, clone the repository, install dependencies, set up environment variables, and start the MCP server. You can then integrate it with AI assistants like Claude Desktop or Cursor.
Key features of Vercel AI Docs?
- Direct documentation search using similarity search
- AI-powered agent for natural language questions
- Session management for conversation context
- Automated indexing of the latest documentation
Use cases of Vercel AI Docs?
- Developers querying the Vercel AI SDK documentation for specific functions.
- Integrating with AI assistants for enhanced documentation interaction.
- Maintaining context in conversations about SDK usage.
FAQ about Vercel AI Docs?
- Can I use Vercel AI Docs with any MCP client?
Yes! It is compatible with any client that implements the Model Context Protocol.
- What are the prerequisites for running Vercel AI Docs?
You need Node.js 18+, npm, and a Google API key for the Gemini model.
- How do I troubleshoot common issues?
Common issues include index not found errors and API rate limits; refer to the documentation for solutions.
Vercel AI SDK Documentation MCP Agent
A Model Context Protocol (MCP) server that provides AI-powered search and querying capabilities for the Vercel AI SDK documentation. This project enables developers to ask questions about the Vercel AI SDK and receive accurate, contextualized responses based on the official documentation.
Features
- Direct Documentation Search: Query the Vercel AI SDK documentation index directly using similarity search
- AI-Powered Agent: Ask natural language questions about the Vercel AI SDK and receive comprehensive answers
- Session Management: Maintain conversation context across multiple queries
- Automated Indexing: Includes tools to fetch, process, and index the latest Vercel AI SDK documentation
Architecture
This system consists of several key components:
- MCP Server: Exposes tools via the Model Context Protocol for integration with AI assistants
- DocumentFetcher: Crawls and processes the Vercel AI SDK documentation
- VectorStoreManager: Creates and manages the FAISS vector index for semantic search
- AgentService: Provides AI-powered answers to questions using the Google Gemini model
- DirectQueryService: Offers direct semantic search of the documentation
Setup Instructions
Prerequisites
- Node.js 18+
- npm
- A Google API key for Gemini model access
Environment Variables
Create a .env file in the project root with the following variable:
GOOGLE_GENERATIVE_AI_API_KEY=your-google-api-key-here
You'll need to obtain a Google Gemini API key from Google AI Studio.
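As a quick sanity check before starting the server, a snippet like the following (illustrative only, assuming dotenv loads the .env file; the project's actual bootstrap may differ) fails fast when the key is missing:

```typescript
// Illustrative startup check for the required API key.
import "dotenv/config";

if (!process.env.GOOGLE_GENERATIVE_AI_API_KEY) {
  throw new Error(
    "GOOGLE_GENERATIVE_AI_API_KEY is not set. Add it to the .env file in the project root."
  );
}
```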
Installation
1. Clone the repository:
   git clone https://github.com/IvanAmador/vercel-ai-docs-mcp.git
   cd vercel-ai-docs-mcp-agent
2. Install dependencies:
   npm install
3. Build the project:
   npm run build
4. Build the documentation index:
   npm run build:index
5. Start the MCP server:
   npm run start
Integration with Claude Desktop
Claude Desktop is a powerful AI assistant that supports MCP servers. To connect the Vercel AI SDK Documentation MCP agent with Claude Desktop:
1. First, install Claude Desktop if you don't have it already.
2. Open Claude Desktop settings (via the application menu, not within the chat interface).
3. Navigate to the "Developer" tab and click "Edit Config".
4. Add the Vercel AI Docs MCP server to your configuration:
{
"mcpServers": {
"vercel-ai-docs": {
"command": "node",
"args": ["ABSOLUTE_PATH_TO_PROJECT/dist/main.js"],
"env": {
"GOOGLE_GENERATIVE_AI_API_KEY": "your-google-api-key-here"
}
}
}
}
Make sure to replace:
- ABSOLUTE_PATH_TO_PROJECT with the actual path to your project folder
- your-google-api-key-here with your Google Gemini API key
5. Save the config file and restart Claude Desktop.
6. To verify the server is connected, look for the hammer 🔨 icon in the Claude chat interface.
For more detailed information about setting up MCP servers with Claude Desktop, visit the MCP Quickstart Guide.
Integration with Other MCP Clients
This MCP server is compatible with any client that implements the Model Context Protocol. Here are a few examples:
Cursor
Cursor is an AI-powered code editor that supports MCP servers. To integrate with Cursor:
1. Add a .cursor/mcp.json file to your project directory (for project-specific configuration) or a ~/.cursor/mcp.json file in your home directory (for global configuration).
2. Add the following to your configuration file:
{
"mcpServers": {
"vercel-ai-docs": {
"command": "node",
"args": ["ABSOLUTE_PATH_TO_PROJECT/dist/main.js"],
"env": {
"GOOGLE_GENERATIVE_AI_API_KEY": "your-google-api-key-here"
}
}
}
}
For more information about using MCP with Cursor, refer to the Cursor MCP documentation.
Usage
The MCP server exposes three primary tools:
1. agent-query
Query the Vercel AI SDK documentation using an AI agent that can search and synthesize information.
{
"name": "agent-query",
"arguments": {
"query": "How do I use the streamText function?",
"sessionId": "unique-session-id"
}
}
2. direct-query
Perform a direct similarity search against the Vercel AI SDK documentation index.
{
"name": "direct-query",
"arguments": {
"query": "streamText usage",
"limit": 5
}
}
3. clear-memory
Clears the conversation memory for a specific session or all sessions.
{
"name": "clear-memory",
"arguments": {
"sessionId": "unique-session-id"
}
}
To clear all sessions, omit the sessionId parameter.
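If you are building your own MCP client instead of using Claude Desktop or Cursor, the sketch below shows one way to invoke these tools over stdio with the official @modelcontextprotocol/sdk TypeScript client; the project path, API key, and session id are placeholders.

```typescript
// Illustrative MCP client that spawns the server over stdio and calls its tools.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

const transport = new StdioClientTransport({
  command: "node",
  args: ["ABSOLUTE_PATH_TO_PROJECT/dist/main.js"],
  env: { GOOGLE_GENERATIVE_AI_API_KEY: "your-google-api-key-here" },
});

const client = new Client({ name: "example-client", version: "1.0.0" });
await client.connect(transport);

// Ask the AI agent a question, keeping conversation context under a session id.
const answer = await client.callTool({
  name: "agent-query",
  arguments: {
    query: "How do I use the streamText function?",
    sessionId: "unique-session-id",
  },
});
console.log(answer.content);

// Run a plain similarity search against the documentation index.
const hits = await client.callTool({
  name: "direct-query",
  arguments: { query: "streamText usage", limit: 5 },
});
console.log(hits.content);

// Clear that session's memory when finished.
await client.callTool({
  name: "clear-memory",
  arguments: { sessionId: "unique-session-id" },
});

await client.close();
```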
Development
Project Structure
├── config/ # Configuration settings
├── core/ # Core functionality
│ ├── indexing/ # Document indexing and vector store
│ └── query/ # Query services (agent and direct)
├── files/ # Storage directories
│ ├── docs/ # Processed documentation
│ ├── faiss_index/ # Vector index files
│ └── sessions/ # Session data
├── mcp/ # MCP server and tools
│ ├── server.ts # MCP server implementation
│ └── tools/ # MCP tool definitions
├── scripts/ # Build and utility scripts
└── utils/ # Helper utilities
Build Scripts
- npm run build: Compile TypeScript files
- npm run build:index: Build the documentation index
- npm run dev:index: Build and index in development mode
- npm run dev: Build and start in development mode
Troubleshooting
Common Issues
- Index not found or failed to load: Run npm run build:index to create the index before starting the server.
- API rate limits: When Google API rate limits are exceeded, the agent service may return errors. Implement an appropriate backoff strategy; a simple retry sketch follows this list.
- Model connection issues: Ensure your Google API key is valid and has access to the specified Gemini model.
- Claude Desktop not showing MCP server:
  - Check your configuration file for syntax errors.
  - Make sure the path to the server is correct and absolute.
  - Check Claude Desktop logs for errors.
  - Restart Claude Desktop after making configuration changes.
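For the rate-limit issue above, a simple retry wrapper along these lines (illustrative only, not part of the project) can smooth over transient errors from the Google API:

```typescript
// Retry a call with exponential backoff plus jitter; rethrow once retries are exhausted.
async function withBackoff<T>(fn: () => Promise<T>, maxRetries = 5): Promise<T> {
  let delayMs = 500;
  for (let attempt = 0; ; attempt++) {
    try {
      return await fn();
    } catch (err) {
      if (attempt >= maxRetries) throw err;
      // Wait, then retry with a doubled delay plus a little jitter.
      await new Promise((resolve) => setTimeout(resolve, delayMs + Math.random() * 250));
      delayMs *= 2;
    }
  }
}

// Usage: const result = await withBackoff(() => someCallThatMayBeRateLimited());
```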
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
License
MIT