@pinkpixel/mem0-mcp MCP Server ✨

By pinkpixel-dev

✨ mem0 MCP Server: A modern memory system for AI applications with flexible provider support (OpenAI, Ollama, etc.) and MCP protocol integration. Enables long-term memory for AI agents as a drop-in MCP server.

Overview

What is mem0 Memory System?

The mem0 Memory System is a modern memory system designed for AI applications, providing flexible support for various memory providers and enabling long-term memory capabilities for AI agents through a simple API or as a drop-in MCP server.

How to use mem0 Memory System?

To use the mem0 Memory System, you can either set it up as an MCP server for integration with compatible applications or directly integrate it into your application as a library. Installation can be done manually or using an installer script, and the server can be run using Python scripts or shell scripts.

Key features of mem0 Memory System?

  • Multi-provider support (OpenAI, Anthropic, Google, etc.)
  • Flexible embedding options (OpenAI, HuggingFace, Ollama)
  • Local storage capabilities with ChromaDB and SQLite
  • Autonomous memory extraction and retrieval
  • User isolation for multiple users
  • Configurable data directories and models

Use cases of mem0 Memory System?

  1. Enhancing AI applications with long-term memory capabilities.
  2. Storing and retrieving user-specific information for personalized interactions.
  3. Integrating with command-line applications that support MCP.

FAQ about mem0 Memory System

  • Can I use mem0 with any AI application?

Yes! mem0 can be integrated with any application that supports the Model Context Protocol (MCP) or used directly as a library.

  • Is there support for multiple users?

Yes! The system supports user isolation, allowing multiple users to have their own memory spaces.

  • How does the autonomous memory feature work?

The autonomous memory feature automatically extracts and stores relevant information from user interactions without requiring explicit commands.

Content

@pinkpixel/mem0-mcp MCP Server ✨

A Model Context Protocol (MCP) server that integrates with Mem0.ai to provide persistent memory capabilities for LLMs. It allows AI agents to store and retrieve information across sessions.

This server uses the mem0ai Node.js SDK for its core functionality.

Features 🧠

Tools

  • add_memory: Stores a piece of text content as a memory associated with a specific userId.
    • Input: content (string, required), userId (string, required), sessionId (string, optional), agentId (string, optional), metadata (object, optional)
    • Stores the provided text, enabling recall in future interactions.
  • search_memory: Searches stored memories based on a natural language query for a specific userId.
    • Input: query (string, required), userId (string, required), sessionId (string, optional), agentId (string, optional), filters (object, optional), threshold (number, optional)
    • Retrieves relevant memories based on semantic similarity.
  • delete_memory: Deletes a specific memory from storage by its ID.
    • Input: memoryId (string, required), userId (string, required), sessionId (string, optional), agentId (string, optional)
    • Permanently removes the specified memory.
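For illustration, an MCP client invokes these tools through the protocol's standard tools/call request. A sketch of such a JSON-RPC message for add_memory (all field values are placeholders):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "add_memory",
    "arguments": {
      "content": "Alice prefers dark mode",
      "userId": "user123",
      "sessionId": "session-1"
    }
  }
}
```

search_memory and delete_memory are called the same way, with their own argument shapes as listed above.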

Prerequisites 🔑

This server supports two storage modes:

  1. Cloud Storage Mode ☁️ (Recommended)

    • Requires a Mem0 API key (provided as MEM0_API_KEY environment variable)
    • Memories are persistently stored on Mem0's cloud servers
    • No local database needed
  2. Local Storage Mode 💾

    • Requires an OpenAI API key (provided as OPENAI_API_KEY environment variable)
    • Memories are stored in an in-memory vector database (non-persistent by default)
    • Data is lost when the server restarts unless configured for persistent storage
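The choice between the two modes follows from which API key is present. A hypothetical sketch of that selection logic (the server's actual startup code may differ):

```javascript
// Hypothetical sketch of storage-mode selection from environment variables.
// The real server's startup logic may differ.
function storageMode(env) {
  if (env.MEM0_API_KEY) return "cloud";   // persistent, stored on Mem0's servers
  if (env.OPENAI_API_KEY) return "local"; // in-memory vector store, non-persistent
  throw new Error("Set MEM0_API_KEY (cloud) or OPENAI_API_KEY (local)");
}

console.log(storageMode({ MEM0_API_KEY: "mk-..." })); // → cloud
```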

Installation & Configuration ⚙️

You can run this server in two main ways:

1. Using npx (Recommended)

Optionally install the package globally with npm (npx can also fetch it on demand):

npm install -g @pinkpixel/mem0-mcp

Configure your MCP client (e.g., Claude Desktop, Cursor, Cline, Roo Code, etc.) to run the server using npx:

{
  "mcpServers": {
    "mem0-mcp": {
      "command": "npx",
      "args": [
        "-y",
        "@pinkpixel/mem0-mcp"
      ],
      "env": {
        "MEM0_API_KEY": "YOUR_MEM0_API_KEY_HERE",
        "DEFAULT_USER_ID": "user123"
      },
      "disabled": false,
      "alwaysAllow": [
        "add_memory",
        "search_memory"
      ]
    }
  }
}

Note: Replace "YOUR_MEM0_API_KEY_HERE" with your actual Mem0 API key.

Local Storage Configuration (Alternative)

{
  "mcpServers": {
    "mem0-mcp": {
      "command": "npx",
      "args": [
        "-y",
        "@pinkpixel/mem0-mcp"
      ],
      "env": {
        "OPENAI_API_KEY": "YOUR_OPENAI_API_KEY_HERE",
        "DEFAULT_USER_ID": "user123"
      },
      "disabled": false,
      "alwaysAllow": [
        "add_memory",
        "search_memory"
      ]
    }
  }
}

Note: Replace "YOUR_OPENAI_API_KEY_HERE" with your actual OpenAI API key.

2. Running from Cloned Repository

Note: This method requires you to git clone the repository first.

Clone the repository, install dependencies, and build the server:

git clone https://github.com/pinkpixel-dev/mem0-mcp 
cd mem0-mcp
npm install
npm run build

Then, configure your MCP client to run the built script directly using node:

{
  "mcpServers": {
    "mem0-mcp": {
      "command": "node",
      "args": [
        "/absolute/path/to/mem0-mcp/build/index.js" 
      ],
      "env": {
        "MEM0_API_KEY": "YOUR_MEM0_API_KEY_HERE",
        "DEFAULT_USER_ID": "user123"
        // OR use "OPENAI_API_KEY": "YOUR_OPENAI_API_KEY_HERE" for local storage
      },
      "disabled": false,
      "alwaysAllow": [
        "add_memory",
        "search_memory"
      ]
    }
  }
}

Important Notes:

  1. Replace /absolute/path/to/mem0-mcp/ with the actual absolute path to your cloned repository
  2. Use the build/index.js file, not the src/index.ts file
  3. The MCP server requires clean stdout for protocol communication - any libraries or code that writes to stdout may interfere with the protocol

Default User ID (Optional Fallback)

Both the add_memory and search_memory tools require a userId argument to associate memories with a specific user.

For convenience during testing or in single-user scenarios, you can optionally set the DEFAULT_USER_ID environment variable when launching the server. If this variable is set, and the userId argument is omitted when calling the search_memory tool, the server will use the value of DEFAULT_USER_ID for the search.

Note: While this fallback exists, it's generally recommended that the calling agent (LLM) explicitly provides the correct userId for both adding and searching memories to avoid ambiguity.
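The fallback described above amounts to a simple precedence rule, sketched here as a hypothetical helper (not the server's actual code):

```javascript
// Hypothetical sketch of the DEFAULT_USER_ID fallback.
// An explicit userId argument always wins over the environment default.
function resolveUserId(args, env = process.env) {
  if (args.userId) return args.userId;
  if (env.DEFAULT_USER_ID) return env.DEFAULT_USER_ID;
  throw new Error("userId is required (or set DEFAULT_USER_ID)");
}

console.log(resolveUserId({}, { DEFAULT_USER_ID: "user123" })); // → user123
```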

Example configuration using DEFAULT_USER_ID:

{
  "mcpServers": {
    "mem0-mcp": {
      "command": "npx",
      "args": [
        "-y",
        "@pinkpixel/mem0-mcp"
      ],
      "env": {
        "MEM0_API_KEY": "YOUR_MEM0_API_KEY_HERE",
        "DEFAULT_USER_ID": "user123" 
      }
    }
  }
}

Or when running directly with node (after cloning and building as described above):

{
  "mcpServers": {
    "mem0-mcp": {
      "command": "node",
      "args": [
        "path/to/mem0-mcp/build/index.js"
      ],
      "env": {
        "OPENAI_API_KEY": "YOUR_OPENAI_API_KEY_HERE",
        "DEFAULT_USER_ID": "user123" 
      }
    }
  }
}

Cloud vs. Local Storage 🔄

Cloud Storage (Mem0 API)

  • Persistent by default - Your memories remain available across sessions and server restarts
  • No local database required - All data is stored on Mem0's servers
  • Higher retrieval quality - Uses Mem0's optimized search algorithms
  • Additional fields - Supports agent_id and threshold parameters
  • Requires - A Mem0 API key

Local Storage (OpenAI API)

  • In-memory by default - Data is stored only in RAM and is not persistent long-term. While some caching may occur, you should not rely on this for permanent storage.
  • Data loss risk - Memory data will be lost on server restart, system reboot, or if the process is terminated
  • Recommended for - Development, testing, or temporary use only
  • For persistent storage - Use the Cloud Storage option with Mem0 API if you need reliable long-term memory
  • Uses OpenAI embeddings - For vector search functionality
  • Self-contained - All data stays on your machine
  • Requires - An OpenAI API key

Development 💻

Clone the repository and install dependencies:

git clone https://github.com/pinkpixel-dev/mem0-mcp 
cd mem0-mcp
npm install

Build the server:

npm run build

For development with auto-rebuild on file changes:

npm run watch

Debugging 🐞

Since MCP servers communicate over stdio, debugging can be challenging. Here are some approaches:

  1. Use the MCP Inspector: This tool can monitor the MCP protocol communication:

npm run inspector

  2. Console Logging: When adding console logs, always use console.error() instead of console.log() to avoid interfering with the MCP protocol.

  3. Environment Files: Use a .env file for local development to simplify setting API keys and other configuration options.
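For example, a minimal .env for local development might look like this (the keys shown are the ones this server reads; the values are placeholders):

```shell
# Example .env for local development; values are placeholders
MEM0_API_KEY=YOUR_MEM0_API_KEY_HERE
DEFAULT_USER_ID=user123
```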

Technical Implementation Notes 🔧

Advanced Mem0 API Parameters

When using the Cloud Storage mode with the Mem0 API, you can leverage additional parameters for more sophisticated memory management. While not explicitly exposed in the tool schema, these can be included in the metadata object when adding memories:

Advanced Parameters for add_memory:

  • metadata (object): Store additional context about the memory (e.g., location, time, identifiers). This can be used for filtering during retrieval.
  • includes (string): Specific preferences to include in the memory.
  • excludes (string): Specific preferences to exclude from the memory.
  • infer (boolean): Whether to infer memories or directly store messages (default: true).
  • output_format (string): Format version, either v1.0 (default, deprecated) or v1.1 (recommended).
  • custom_categories (object): List of categories with names and descriptions.
  • custom_instructions (string): Project-specific guidelines for handling and organizing memories.
  • immutable (boolean): Whether the memory is immutable (default: false).
  • expiration_date (string): When the memory will expire (format: YYYY-MM-DD).
  • org_id (string): Organization ID associated with this memory.
  • project_id (string): Project ID associated with this memory.
  • version (string): Memory version (v1 is deprecated; v2 is recommended for new applications).

To use these parameters with the MCP server, include them in your metadata object when calling the add_memory tool. For example:

{
  "content": "Important information to remember",
  "userId": "user123",
  "sessionId": "project-abc",
  "metadata": {
    "includes": "important context",
    "excludes": "sensitive data",
    "immutable": true,
    "expiration_date": "2025-12-31",
    "custom_instructions": "Prioritize this memory for financial questions",
    "version": "v2"
  }
}

Advanced Parameters for search_memory:

The Mem0 v2 search API offers powerful filtering capabilities that can be utilized through the filters parameter:

  • filters (object): Complex filters with logical operators and comparison conditions.
  • top_k (integer): Number of top results to return (default: 10).
  • fields (string[]): Specific fields to include in the response.
  • rerank (boolean): Whether to rerank the memories (default: false).
  • keyword_search (boolean): Whether to search based on keywords (default: false).
  • filter_memories (boolean): Whether to filter the memories (default: false).
  • threshold (number): Minimum similarity threshold for results (default: 0.3).
  • org_id (string): Organization ID for filtering memories.
  • project_id (string): Project ID for filtering memories.

The filters parameter supports complex logical operations (AND, OR) and various comparison operators:

  • in: Matches any of the values specified
  • gte: Greater than or equal to
  • lte: Less than or equal to
  • gt: Greater than
  • lt: Less than
  • ne: Not equal to
  • icontains: Case-insensitive containment check

Example of using complex filters with the search_memory tool:

{
  "query": "What are Alice's hobbies?",
  "userId": "user123",
  "filters": {
    "AND": [
      {
        "user_id": "alice"
      },
      {
        "agent_id": {"in": ["travel-agent", "sports-agent"]}
      }
    ]
  },
  "threshold": 0.5,
  "top_k": 5
}

This would search for memories related to Alice's hobbies where the user_id is "alice" AND the agent_id is either "travel-agent" OR "sports-agent", returning at most 5 results with a similarity score of at least 0.5.
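To make the filter semantics concrete, here is an illustrative evaluator that applies the operators above to a single memory record. This is a hypothetical sketch for understanding the grammar, not Mem0's implementation:

```javascript
// Illustrative evaluator for the filter grammar above (NOT Mem0's implementation).
function matchesFilter(memory, filter) {
  if (filter.AND) return filter.AND.every((f) => matchesFilter(memory, f));
  if (filter.OR) return filter.OR.some((f) => matchesFilter(memory, f));
  const [field, cond] = Object.entries(filter)[0];
  if (cond === null || typeof cond !== "object") return memory[field] === cond; // plain equality
  if ("in" in cond) return cond.in.includes(memory[field]);
  if ("gte" in cond) return memory[field] >= cond.gte;
  if ("lte" in cond) return memory[field] <= cond.lte;
  if ("gt" in cond) return memory[field] > cond.gt;
  if ("lt" in cond) return memory[field] < cond.lt;
  if ("ne" in cond) return memory[field] !== cond.ne;
  if ("icontains" in cond)
    return String(memory[field]).toLowerCase().includes(String(cond.icontains).toLowerCase());
  return false;
}

const memory = { user_id: "alice", agent_id: "travel-agent" };
const filter = {
  AND: [{ user_id: "alice" }, { agent_id: { in: ["travel-agent", "sports-agent"] } }],
};
console.log(matchesFilter(memory, filter)); // → true
```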

For more detailed information on these parameters, refer to the Mem0 API documentation.

SafeLogger

The MCP server implements a SafeLogger class that selectively redirects console.log calls from the mem0ai library to stderr without disrupting MCP protocol:

  • Intercepts console.log calls and examines stack traces to determine source
  • Only redirects log calls from mem0ai library or our own code
  • Preserves clean stdout for MCP protocol communication
  • Automatically cleans up resources on process exit

This allows proper functioning within MCP clients while maintaining useful debug information.
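The core idea can be sketched in a few lines. This is a simplified sketch; the real SafeLogger additionally inspects stack traces so that only calls from mem0ai or the server's own code are redirected:

```javascript
// Simplified sketch of the SafeLogger idea: route console.log to stderr
// so stdout stays reserved for MCP JSON-RPC messages.
// (The real implementation also inspects stack traces to redirect selectively.)
const originalLog = console.log;
console.log = (...args) => {
  process.stderr.write(args.map(String).join(" ") + "\n");
};
process.on("exit", () => {
  console.log = originalLog; // restore on shutdown
});

console.log("debug info goes to stderr"); // does not pollute stdout
```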

Environment Variables

The server recognizes several environment variables that control its behavior:

  • MEM0_API_KEY: API key for cloud storage mode
  • OPENAI_API_KEY: API key for local storage mode (embeddings)
  • DEFAULT_USER_ID: Default user ID for memory operations

Made with ❤️ by Pink Pixel
