What is n8n-nodes-mcp-client?
The n8n-nodes-mcp-client is a community node for the n8n workflow automation platform that allows users to interact with Model Context Protocol (MCP) servers within their workflows. MCP enables AI models to communicate with external tools and data sources in a standardized manner.
How to use n8n-nodes-mcp-client?
To use the n8n-nodes-mcp-client, follow the installation guide provided in the n8n community nodes documentation. Configure the MCP Client node with the necessary credentials and environment variables to connect to your MCP server, and then create workflows that utilize the node's capabilities.
Key features of n8n-nodes-mcp-client
- Connects to MCP servers using various transport methods (STDIO and SSE).
- Supports multiple operations such as executing tools, listing prompts, and accessing resources.
- Allows integration of community nodes as tools in AI Agent workflows.
Use cases of n8n-nodes-mcp-client
- Automating data retrieval from multiple AI tools in a single workflow.
- Executing complex operations across different MCP servers.
- Integrating AI capabilities into business processes using n8n workflows.
FAQ about n8n-nodes-mcp-client
- What is the Model Context Protocol (MCP)?
MCP is a protocol that standardizes the interaction between AI models and external tools/data sources.
- Is n8n-nodes-mcp-client free to use?
Yes! It is open-source and free to use under the MIT license.
- What versions of n8n and MCP are required?
The node requires n8n version 1.0.0 or later and MCP Protocol version 1.0.0 or later.
n8n-nodes-mcp-client
This is an n8n community node that lets you interact with Model Context Protocol (MCP) servers in your n8n workflows.
MCP is a protocol that enables AI models to interact with external tools and data sources in a standardized way. This node allows you to connect to MCP servers, access resources, execute tools, and use prompts.
n8n is a fair-code licensed workflow automation platform.
- Installation
- Credentials
- Environment Variables
- Operations
- Using as a Tool
- Compatibility
- Resources
Installation
Follow the installation guide in the n8n community nodes documentation.
Credentials
The MCP Client node supports two types of credentials to connect to an MCP server:
Command-line Based Transport (STDIO)
- Command: The command to start the MCP server
- Arguments: Optional arguments to pass to the server command
- Environment Variables: Variables to pass to the server in NAME=VALUE format
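For example, a typical STDIO credential for an npx-installed server might look like this (the package name and API key shown are placeholders, not required values):

```
Command: npx
Arguments: -y @modelcontextprotocol/server-brave-search
Environment Variables: BRAVE_API_KEY=your-api-key
```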
Server-Sent Events (SSE) Transport
- SSE URL: The URL of the SSE endpoint (default: http://localhost:3001/sse)
- Messages Post Endpoint: Optional custom endpoint for posting messages if different from the SSE URL
- Additional Headers: Optional headers to send with requests (format: name:value, one per line)
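For example, if your server expects a bearer token (the header names and values here are purely illustrative):

```
Authorization: Bearer your-token
X-Custom-Header: some-value
```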
Environment Variables
The MCP Client node supports passing environment variables to MCP servers using the command-line based transport in two ways:
1. Using the Credentials UI
You can add environment variables directly in the credentials configuration:
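Entries use the NAME=VALUE format, one per line; for example (the variable names below are placeholders for whatever your MCP server expects):

```
BRAVE_API_KEY=your-api-key
OPENAI_API_KEY=your-openai-key
```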
This method is useful for individual setups and testing. The values are stored securely as credentials in n8n.
2. Using Docker Environment Variables
For Docker deployments, you can pass environment variables directly to your MCP servers by prefixing them with MCP_:

```yaml
version: '3'
services:
  n8n:
    image: n8nio/n8n
    environment:
      - MCP_BRAVE_API_KEY=your-api-key-here
      - MCP_OPENAI_API_KEY=your-openai-key-here
      - MCP_CUSTOM_SETTING=some-value
      # other configuration...
```
These environment variables will be automatically passed to your MCP servers when they are executed.
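If you run n8n with plain `docker run` instead of Compose, the same prefix convention applies; a minimal sketch (the keys shown are placeholders):

```bash
docker run -it --rm \
  -p 5678:5678 \
  -v ~/.n8n:/home/node/.n8n \
  -e MCP_BRAVE_API_KEY=your-api-key-here \
  -e MCP_OPENAI_API_KEY=your-openai-key-here \
  n8nio/n8n
```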
Example: Using Brave Search MCP Server
This example shows how to set up and use the Brave Search MCP server:
1. Install the Brave Search MCP server:

   ```bash
   npm install -g @modelcontextprotocol/server-brave-search
   ```

2. Configure MCP Client credentials:
   - Command: `npx`
   - Arguments: `-y @modelcontextprotocol/server-brave-search`
   - Environment Variables: `BRAVE_API_KEY=your-api-key` (variables can be space-, comma-, or newline-separated)

3. Create a workflow that uses the MCP Client node:
   - Add an MCP Client node
   - Select the "List Tools" operation to see available search tools
   - Add another MCP Client node
   - Select the "Execute Tool" operation
   - Choose the "brave_search" tool
   - Set Parameters to:

     ```json
     {"query": "latest AI news"}
     ```
The node will execute the search and return the results in the output.
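The exact payload depends on the server, but MCP tool calls generally return their output wrapped in a content array, so the node output should look roughly like this (the result text is illustrative, not real search data):

```json
{
  "content": [
    {
      "type": "text",
      "text": "1. Example headline (example.com) ..."
    }
  ]
}
```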
Example: Multi-Server Setup with AI Agent
This example demonstrates how to set up multiple MCP servers in a production environment and use them with an AI agent:
1. Configure your docker-compose.yml file:

   ```yaml
   version: '3'
   services:
     n8n:
       image: n8nio/n8n
       environment:
         # MCP server environment variables
         - MCP_BRAVE_API_KEY=your-brave-api-key
         - MCP_OPENAI_API_KEY=your-openai-key
         - MCP_SERPER_API_KEY=your-serper-key
         - MCP_WEATHER_API_KEY=your-weather-api-key
         # Enable community nodes as tools
         - N8N_COMMUNITY_PACKAGES_ALLOW_TOOL_USAGE=true
       ports:
         - "5678:5678"
       volumes:
         - ~/.n8n:/home/node/.n8n
   ```

2. Create multiple MCP Client credentials in n8n:

   Brave Search Credentials:
   - Command: `npx`
   - Arguments: `-y @modelcontextprotocol/server-brave-search`

   OpenAI Tools Credentials:
   - Command: `npx`
   - Arguments: `-y @modelcontextprotocol/server-openai`

   Web Search Credentials:
   - Command: `npx`
   - Arguments: `-y @modelcontextprotocol/server-serper`

   Weather API Credentials:
   - Command: `npx`
   - Arguments: `-y @modelcontextprotocol/server-weather`

3. Create an AI Agent workflow:
   - Add an AI Agent node
   - Enable MCP Client as a tool
   - Configure different MCP Client nodes with different credentials
   - Create a prompt that uses multiple data sources
Example AI Agent prompt:
I need you to help me plan a trip. First, search for popular destinations in {destination_country}.
Then, check the current weather in the top 3 cities.
Finally, find some recent news about travel restrictions for these places.
With this setup, the AI agent can use multiple MCP tools across different servers, all using environment variables configured in your Docker deployment.
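If you prefer to keep the API keys out of the Compose file itself, standard Docker Compose variable substitution works here too; a minimal sketch, assuming the keys are defined in a .env file next to docker-compose.yml:

```yaml
services:
  n8n:
    image: n8nio/n8n
    environment:
      # values are read from the .env file at deploy time
      - MCP_BRAVE_API_KEY=${BRAVE_API_KEY}
      - MCP_SERPER_API_KEY=${SERPER_API_KEY}
      - MCP_WEATHER_API_KEY=${WEATHER_API_KEY}
      - N8N_COMMUNITY_PACKAGES_ALLOW_TOOL_USAGE=true
```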
Example: Using a Local MCP Server with SSE
This example shows how to connect to a locally running MCP server using Server-Sent Events (SSE):
1. Start a local MCP server that supports SSE:

   ```bash
   npx @modelcontextprotocol/server-example-sse
   ```

   Or run your own custom MCP server with SSE support on port 3001 (a quick way to verify the endpoint is shown after this section).

2. Configure MCP Client credentials:
   - In the node settings, select Connection Type: Server-Sent Events (SSE)
   - Create new credentials of type MCP Client (SSE) API
   - Set SSE URL: http://localhost:3001/sse
   - Add any required headers if your server needs authentication

3. Create a workflow that uses the MCP Client node:
   - Add an MCP Client node
   - Set the Connection Type to Server-Sent Events (SSE)
   - Select your SSE credentials
   - Select the "List Tools" operation to see available tools
   - Execute the workflow to see the results
This method is particularly useful when:
- Your MCP server is running as a standalone service
- You're connecting to a remote MCP server
- Your server requires special authentication headers
- You need to separate the transport channel from the message channel
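As a quick connectivity check before configuring the node, you can open the SSE endpoint with curl; with a server listening on port 3001 the connection should stay open and stream events (the exact events depend on the server implementation):

```bash
# -N disables buffering so events are printed as they arrive
curl -N http://localhost:3001/sse
```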
Operations
The MCP Client node supports the following operations:
- Execute Tool - Execute a specific tool with parameters
- Get Prompt - Get a specific prompt template
- List Prompts - Get a list of available prompts
- List Resources - Get a list of available resources from the MCP server
- List Tools - Get a list of available tools
- Read Resource - Read a specific resource by URI
Example: List Tools Operation
The List Tools operation returns all available tools from the MCP server, including their names, descriptions, and parameter schemas.
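The response follows the MCP tools/list format, so each entry carries a name, a description, and a JSON Schema for its parameters; a sketch of what one entry might look like (the tool shown is illustrative):

```json
{
  "tools": [
    {
      "name": "brave_search",
      "description": "Search the web with the Brave Search API",
      "inputSchema": {
        "type": "object",
        "properties": {
          "query": { "type": "string" }
        },
        "required": ["query"]
      }
    }
  ]
}
```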
Example: Execute Tool Operation
The Execute Tool operation allows you to execute a specific tool with parameters. Make sure to select the tool you want to execute from the dropdown menu.
Using as a Tool
This node can be used as a tool in n8n AI Agents. To enable community nodes as tools, you need to set the `N8N_COMMUNITY_PACKAGES_ALLOW_TOOL_USAGE` environment variable to `true`.
Setting the Environment Variable
If you're using a bash/zsh shell:

```bash
export N8N_COMMUNITY_PACKAGES_ALLOW_TOOL_USAGE=true
n8n start
```

If you're using Docker, add to your docker-compose.yml file:

```yaml
environment:
  - N8N_COMMUNITY_PACKAGES_ALLOW_TOOL_USAGE=true
```

If you're using the desktop app, create a `.env` file in the n8n directory:

```
N8N_COMMUNITY_PACKAGES_ALLOW_TOOL_USAGE=true
```

If you want to set it permanently on Mac/Linux, add to your `~/.zshrc` or `~/.bash_profile`:

```bash
export N8N_COMMUNITY_PACKAGES_ALLOW_TOOL_USAGE=true
```
Example of AI Agent workflow results:
After setting this environment variable and restarting n8n, your MCP Client node will be available as a tool in AI Agent nodes.
Compatibility
- Requires n8n version 1.0.0 or later
- Compatible with MCP Protocol version 1.0.0 or later
- Supports both STDIO and SSE transports for connecting to MCP servers
- SSE transport requires a server that implements the MCP Server-Sent Events specification