Mattermost MCP Host

By jagan-shanmugam

A Mattermost integration that connects to Model Context Protocol (MCP) servers, leveraging a LangGraph-based Agent.

Overview

What is Mattermost MCP Host?

Mattermost MCP Host is an integration that connects Mattermost with Model Context Protocol (MCP) servers, utilizing AI language models to create an intelligent interface for managing and executing tools within Mattermost.

How to use Mattermost MCP Host?

To use Mattermost MCP Host, install the package via pip, configure your environment with the necessary Mattermost and AI provider credentials, and start the integration using Python.

Key features of Mattermost MCP Host?

  • AI-Powered Assistance with multiple AI providers (Azure OpenAI, OpenAI, Anthropic Claude, Google Gemini)
  • MCP Server Integration for connecting to any Model Context Protocol server
  • Tool Management for accessing and executing tools from connected servers
  • Thread-Based Conversations to maintain context within Mattermost threads
  • Tool Chaining to allow AI to call multiple tools in sequence
  • Resource Discovery to list available tools and resources from MCP servers
  • Multiple Provider Support for easy configuration changes

Use cases of Mattermost MCP Host?

  1. Managing AI tools and resources directly within Mattermost.
  2. Executing complex tasks by chaining multiple AI tools.
  3. Facilitating team collaboration through intelligent tool management.

FAQ about Mattermost MCP Host

  • What are the prerequisites for using Mattermost MCP Host?

You need Python 3.13.1+, a Mattermost server, a bot account with permissions, and access to at least one LLM API.

  • How do I start the integration?

After installation and configuration, run `python -m mattermost_mcp_host` to start the integration.

  • Can I use different AI providers?

Yes! You can choose your preferred AI provider by changing the configuration.

Content

Mattermost MCP Host

A Mattermost integration that connects to Model Context Protocol (MCP) servers, leveraging a LangGraph-based AI agent to provide an intelligent interface for interacting with users and executing tools directly within Mattermost.


Demo

1. GitHub agent in a support channel - searches existing issues and PRs and creates a new issue if none is found


2. Search the internet and post to a channel using Mattermost-MCP-server


See below for the full demo on YouTube.

Features

  • 🤖 LangGraph Agent Integration: Uses a LangGraph agent to understand user requests and orchestrate responses.
  • 🔌 MCP Server Integration: Connects to multiple MCP servers defined in mcp-servers.json.
  • 🛠️ Dynamic Tool Loading: Automatically discovers tools from connected MCP servers and makes them available to the AI agent. Converts MCP tools to LangChain structured tools.
  • 💬 Thread-Aware Conversations: Maintains conversational context within Mattermost threads for coherent interactions.
  • 🔄 Intelligent Tool Use: The AI agent can decide when to use available tools (including chaining multiple calls) to fulfill user requests.
  • 🔍 MCP Capability Discovery: Allows users to list available servers, tools, resources, and prompts via direct commands.
  • #️⃣ Direct Command Interface: Interact directly with MCP servers using a command prefix (default: #).
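The dynamic tool-loading idea can be sketched in plain Python. This is a simplified illustration, not the project's actual code: `MCPToolSpec` and `make_tool_callable` are hypothetical names, standing in for the step where a tool definition discovered from an MCP server is wrapped into a callable the agent can invoke.

```python
from dataclasses import dataclass, field
from typing import Any, Callable

@dataclass
class MCPToolSpec:
    """Minimal stand-in for a tool definition returned by an MCP server."""
    name: str
    description: str
    input_schema: dict = field(default_factory=dict)

def make_tool_callable(
    spec: MCPToolSpec, call_tool: Callable[[str, dict], Any]
) -> Callable[..., Any]:
    """Wrap an MCP tool spec in a plain callable that validates required
    arguments before forwarding them to the server's call function."""
    required = set(spec.input_schema.get("required", []))

    def tool(**kwargs: Any) -> Any:
        missing = required - kwargs.keys()
        if missing:
            raise ValueError(f"{spec.name}: missing arguments {sorted(missing)}")
        return call_tool(spec.name, kwargs)

    tool.__name__ = spec.name
    tool.__doc__ = spec.description
    return tool
```

In the real integration, such callables would additionally be wrapped into LangChain structured tools so the agent can select them by name and schema.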

Overview

The integration works as follows:

  1. Mattermost Connection (mattermost_client.py): Connects to the Mattermost server via API and WebSocket to listen for messages in a specified channel.
  2. MCP Connections (mcp_client.py): Establishes connections (primarily stdio) to each MCP server defined in src/mattermost_mcp_host/mcp-servers.json. It discovers available tools on each server.
  3. Agent Initialization (agent/llm_agent.py): A LangGraphAgent is created, configured with the chosen LLM provider and the dynamically loaded tools from all connected MCP servers.
  4. Message Handling (main.py):
    • If a message starts with the command prefix (#), it's parsed as a direct command to list servers/tools or call a specific tool via the corresponding MCPClient.
    • Otherwise, the message (along with thread history) is passed to the LangGraphAgent.
  5. Agent Execution: The agent processes the request, potentially calling one or more MCP tools via the MCPClient instances, and generates a response.
  6. Response Delivery: The final response from the agent or command execution is posted back to the appropriate Mattermost channel/thread.
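The routing decision in step 4 can be sketched as a small function. This is an illustrative sketch, not the project's actual implementation; `route_message` is a hypothetical name.

```python
COMMAND_PREFIX = "#"  # matches the project's default prefix

def route_message(text: str) -> tuple[str, str]:
    """Decide how an incoming Mattermost message is handled (step 4 above):
    messages starting with the prefix become direct MCP commands,
    everything else goes to the LangGraph agent."""
    if text.startswith(COMMAND_PREFIX):
        return ("command", text[len(COMMAND_PREFIX):].strip())
    return ("agent", text)
```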

Setup

  1. Clone the repository:

    git clone <repository-url>
    cd mattermost-mcp-host
    
  2. Install:

    • Using uv (recommended):
      # Install uv if you don't have it yet
      # curl -LsSf https://astral.sh/uv/install.sh | sh

      # Create and activate a virtual environment
      uv venv
      source .venv/bin/activate

      # Install the package with uv
      uv sync

      # To install dev dependencies
      uv sync --dev --all-extras
      
  3. Configure Environment (.env file): Copy .env.example and fill in the values, or create a .env file in the project root (or set the corresponding environment variables):

    # Mattermost Details
    MATTERMOST_URL=http://your-mattermost-url
    MATTERMOST_TOKEN=your-bot-token # Needs permissions to post, read channel, etc.
    MATTERMOST_TEAM_NAME=your-team-name
    MATTERMOST_CHANNEL_NAME=your-channel-name # Channel for the bot to listen in
    # MATTERMOST_CHANNEL_ID= # Optional: Auto-detected if name is provided
    
    # LLM Configuration (Azure OpenAI is default)
    DEFAULT_PROVIDER=azure
    AZURE_OPENAI_ENDPOINT=your-azure-endpoint
    AZURE_OPENAI_API_KEY=your-azure-api-key
    AZURE_OPENAI_DEPLOYMENT=your-deployment-name # e.g., gpt-4o
    # AZURE_OPENAI_API_VERSION= # Optional, defaults provided
    
    # Optional: Other providers (install with `[all]` extra)
    # OPENAI_API_KEY=...
    # ANTHROPIC_API_KEY=...
    # GOOGLE_API_KEY=...
    
    # Command Prefix
    COMMAND_PREFIX=# 
    

    See .env.example for more options.

  4. Configure MCP Servers: Edit src/mattermost_mcp_host/mcp-servers.json to define the MCP servers you want to connect to (see src/mattermost_mcp_host/mcp-servers-example.json for reference). Depending on the server configuration, you may need npx, uvx, or docker installed on your system and available in your PATH.
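For reference, a configuration fragment might look like the following. The exact schema is defined by the project's mcp-servers-example.json; this fragment follows the common MCP stdio-server convention (`command`, `args`, `env`) and is illustrative only.

```json
{
  "mcpServers": {
    "github": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-github"],
      "env": { "GITHUB_PERSONAL_ACCESS_TOKEN": "your-token" }
    }
  }
}
```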

  5. Start the Integration:

    mattermost-mcp-host
    

Prerequisites

  • Python 3.13.1+
  • uv package manager
  • Mattermost server instance
  • Mattermost Bot Account with API token
  • Access to an LLM API (Azure OpenAI by default)

Optional

  • One or more MCP servers configured in mcp-servers.json
  • Tavily web search requires a TAVILY_API_KEY in the .env file

Usage in Mattermost

Once the integration is running and connected:

  1. Direct Chat: Simply chat in the configured channel or with the bot. The AI agent will respond, using tools as needed. It maintains context within message threads.
  2. Direct Commands: Use the command prefix (default #) for specific actions:
    • #help - Display help information.
    • #servers - List configured and connected MCP servers.
    • #<server_name> tools - List available tools for <server_name>.
    • #<server_name> call <tool_name> <json_arguments> - Call <tool_name> on <server_name> with arguments provided as a JSON string.
      • Example: #my-server call echo '{"message": "Hello MCP!"}'
    • #<server_name> resources - List available resources for <server_name>.
    • #<server_name> prompts - List available prompts for <server_name>.
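The direct-command syntax above can be parsed with a few lines of Python. This is a hedged sketch of how such parsing might work (the function name `parse_command` and the returned dict shape are assumptions, not the project's API); it expects the body of the message after the `#` prefix has been stripped.

```python
import json
import shlex

def parse_command(body: str) -> dict:
    """Parse a direct command body, e.g.
    'my-server call echo {"message": "Hello MCP!"}'."""
    parts = shlex.split(body)  # respects the quoted JSON argument
    if parts and parts[0] in ("help", "servers"):
        return {"action": parts[0]}
    server, action = parts[0], parts[1]
    result = {"server": server, "action": action}
    if action == "call":
        result["tool"] = parts[2]
        # everything after the tool name is the JSON arguments string
        result["args"] = json.loads(" ".join(parts[3:])) if len(parts) > 3 else {}
    return result
```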

Next Steps

  • ⚙️ Configurable LLM Backend: Supports multiple AI providers (Azure OpenAI default, OpenAI, Anthropic Claude, Google Gemini) via environment variables.

Mattermost Setup

  1. Create a Bot Account
  • Go to Integrations > Bot Accounts > Add Bot Account
  • Give it a name and description
  • Save the access token in the .env file
  2. Required Bot Permissions
  • post_all
  • create_post
  • read_channel
  • create_direct_channel
  • read_user
  3. Add Bot to Team/Channel
  • Invite the bot to your team
  • Add the bot to the desired channels

Troubleshooting

  1. Connection Issues
  • Verify the Mattermost server is running
  • Check bot token permissions
  • Ensure correct team/channel names
  2. AI Provider Issues
  • Validate API keys
  • Check API quotas and limits
  • Verify network access to API endpoints
  3. MCP Server Issues
  • Check server logs
  • Verify server configurations
  • Ensure required dependencies are installed and env variables are defined
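For the connection checks above, a quick way to verify the Mattermost URL and bot token is to hit the server's `/api/v4/users/me` endpoint, which returns the bot's own user object when the token is valid. The sketch below uses only the standard library; the function names are illustrative, not part of the project.

```python
import urllib.request

def build_me_request(base_url: str, token: str) -> urllib.request.Request:
    """Build an authenticated request for Mattermost's /api/v4/users/me."""
    return urllib.request.Request(
        f"{base_url.rstrip('/')}/api/v4/users/me",
        headers={"Authorization": f"Bearer {token}"},
    )

def check_token(base_url: str, token: str) -> bool:
    """Return True if the Mattermost server accepts the bot token."""
    try:
        with urllib.request.urlopen(build_me_request(base_url, token), timeout=5) as resp:
            return resp.status == 200
    except Exception:
        return False
```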

Demos

Create an issue via chat using the GitHub MCP server


(on YouTube)

AI Agent in Action in Mattermost

Contributing

Please feel free to open a PR.

License

This project is licensed under the MIT License - see the LICENSE file for details.
