A-MEM MCP Server

By Titan-co on GitHub

A Model Context Protocol (MCP) server for the Agentic Memory (A-MEM) system: a flexible, dynamic memory system for LLM agents.

Overview

What is A-MEM MCP Server?

A-MEM MCP Server is a Model Context Protocol (MCP) server for the Agentic Memory (A-MEM) system, which provides a flexible, dynamic memory system for LLM (Large Language Model) agents.

How to use A-MEM MCP Server?

To use the A-MEM MCP Server, clone the repository from GitHub, install the required dependencies, and start the server using Uvicorn. You can then interact with the server through its RESTful API endpoints for memory operations.

Key features of A-MEM MCP Server

  • RESTful API for memory operations
  • Dynamic memory organization based on Zettelkasten principles
  • Intelligent indexing and linking of memories
  • Comprehensive note generation with structured attributes
  • Interconnected knowledge networks
  • Continuous memory evolution and refinement
  • Agent-driven decision making for adaptive memory management

Use cases of A-MEM MCP Server

  1. Creating and managing memory notes for LLM agents.
  2. Searching and retrieving memories based on specific queries.
  3. Updating and deleting memory notes as needed.
  4. Integrating with various LLM frameworks for enhanced memory management.

FAQ about A-MEM MCP Server

  • What is the purpose of the A-MEM MCP Server?

It serves as a memory management system for LLM agents, allowing for dynamic organization and retrieval of memories.

  • How can I access the API documentation?

Interactive API documentation is available via Swagger UI and ReDoc once the server is running (FastAPI's standard /docs and /redoc routes).

  • Is there a specific backend required for the server?

The server can be configured to use different LLM backends, including OpenAI and Ollama.

Content

A-MEM MCP Server

A Model Context Protocol (MCP) server for the Agentic Memory (A-MEM) system: a flexible, dynamic memory system for LLM agents.

Overview

The A-MEM MCP Server provides a RESTful API wrapper around the core Agentic Memory (A-MEM) system, enabling easy integration with any LLM agent framework. The server exposes endpoints for memory creation, retrieval, updating, deletion, and search operations.

A-MEM is a novel agentic memory system for LLM agents that can dynamically organize memories without predetermined operations, drawing inspiration from the Zettelkasten method of knowledge management.
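The note structure implied by the API below can be sketched as a small data class. Field names mirror the request bodies later in this README; the `links` attribute and `link_to` method are hypothetical illustrations of Zettelkasten-style interconnection, not the actual A-MEM internals:

```python
from dataclasses import dataclass, field

@dataclass
class MemoryNote:
    """Illustrative sketch of a memory note with structured attributes."""
    id: str
    content: str
    tags: list[str] = field(default_factory=list)
    category: str = ""
    context: str = ""
    keywords: list[str] = field(default_factory=list)
    links: list[str] = field(default_factory=list)  # ids of related notes

    def link_to(self, other: "MemoryNote") -> None:
        # Bidirectional linking, in the spirit of the Zettelkasten method:
        # related notes reference each other to form a knowledge network.
        if other.id not in self.links:
            self.links.append(other.id)
        if self.id not in other.links:
            other.links.append(self.id)
```

Linking notes this way is what turns a flat store into the "interconnected knowledge network" described in the features below.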

Key Features

  • 🔄 RESTful API for memory operations
  • 🧠 Dynamic memory organization based on Zettelkasten principles
  • 🔍 Intelligent indexing and linking of memories
  • 📝 Comprehensive note generation with structured attributes
  • 🌐 Interconnected knowledge networks
  • 🧬 Continuous memory evolution and refinement
  • 🤖 Agent-driven decision making for adaptive memory management

Installation

  1. Clone the repository:

     git clone https://github.com/Titan-co/amem-mcp-server.git
     cd amem-mcp-server

  2. Install dependencies:

     pip install -r requirements.txt

  3. Start the server:

     uvicorn server:app --host 0.0.0.0 --port 8000 --reload

API Endpoints

Create Memory

  • Endpoint: POST /memories
  • Description: Create a new memory note
  • Request Body:
    {
      "content": "string",
      "tags": ["string"],
      "category": "string",
      "timestamp": "string"
    }
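Assuming the server is running on the default localhost:8000 from the installation steps, a creation request can be sketched with Python's standard library. The `build_create_request` helper is illustrative, not part of the project:

```python
import json
import urllib.request

BASE_URL = "http://localhost:8000"  # assumed default uvicorn host/port

def build_create_request(content, tags=None, category="", timestamp=""):
    """Build (but do not send) a POST /memories request."""
    payload = {
        "content": content,
        "tags": tags or [],
        "category": category,
        "timestamp": timestamp,
    }
    return urllib.request.Request(
        f"{BASE_URL}/memories",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_create_request("Met with the design team about the Q3 roadmap",
                           tags=["meeting"], category="work")
# urllib.request.urlopen(req)  # send once the server is running
```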
    

Get Memory

  • Endpoint: GET /memories/{id}
  • Description: Retrieve a memory by ID

Update Memory

  • Endpoint: PUT /memories/{id}
  • Description: Update an existing memory
  • Request Body:
    {
      "content": "string",
      "tags": ["string"],
      "category": "string",
      "context": "string",
      "keywords": ["string"]
    }
    

Delete Memory

  • Endpoint: DELETE /memories/{id}
  • Description: Delete a memory by ID

Search Memories

  • Endpoint: GET /memories/search?query={query}&k={k}
  • Description: Search for memories based on query
  • Query Parameters:
    • query: Search query string
    • k: Number of results to return (default: 5)
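The remaining endpoints share simple URL shapes. A small illustrative helper for building them (the function names are my own, not from the project):

```python
import urllib.parse

BASE_URL = "http://localhost:8000"  # assumed default uvicorn host/port

def search_url(query: str, k: int = 5) -> str:
    # k defaults to 5, matching the endpoint's documented default.
    params = urllib.parse.urlencode({"query": query, "k": k})
    return f"{BASE_URL}/memories/search?{params}"

def memory_url(memory_id: str) -> str:
    # One URL shape shared by GET, PUT and DELETE /memories/{id}.
    return f"{BASE_URL}/memories/{memory_id}"

# e.g. urllib.request.urlopen(search_url("roadmap", k=3)) once the server is up
```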

Configuration

The server can be configured through environment variables:

  • OPENAI_API_KEY: API key for OpenAI services
  • LLM_BACKEND: LLM backend to use (openai or ollama, default: openai)
  • LLM_MODEL: LLM model to use (default: gpt-4)
  • EMBEDDING_MODEL: Embedding model for semantic search (default: all-MiniLM-L6-v2)
  • EVO_THRESHOLD: Number of memories before triggering evolution (default: 3)
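A sketch of how these variables and their documented defaults might be resolved; the `load_config` helper is hypothetical, and the real server's configuration code may differ:

```python
import os

def load_config(env=os.environ):
    """Resolve configuration from environment variables, with the
    defaults documented above."""
    return {
        "OPENAI_API_KEY": env.get("OPENAI_API_KEY", ""),  # required for the openai backend
        "LLM_BACKEND": env.get("LLM_BACKEND", "openai"),
        "LLM_MODEL": env.get("LLM_MODEL", "gpt-4"),
        "EMBEDDING_MODEL": env.get("EMBEDDING_MODEL", "all-MiniLM-L6-v2"),
        "EVO_THRESHOLD": int(env.get("EVO_THRESHOLD", "3")),
    }
```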

Documentation

Interactive API documentation is available once the server is running, at FastAPI's standard routes:

  • Swagger UI: http://localhost:8000/docs
  • ReDoc: http://localhost:8000/redoc

References

Based on the research paper: A-MEM: Agentic Memory for LLM Agents
