
A-MEM MCP Server
A Memory Control Protocol (MCP) server for the Agentic Memory (A-MEM) system - a flexible, dynamic memory system for LLM agents.
Overview
The A-MEM MCP Server provides a RESTful API wrapper around the core Agentic Memory (A-MEM) system, enabling easy integration with any LLM agent framework. The server exposes endpoints for memory creation, retrieval, updating, deletion, and search operations.
A-MEM is a novel agentic memory system for LLM agents that can dynamically organize memories without predetermined operations, drawing inspiration from the Zettelkasten method of knowledge management.
Key Features
- 🔄 RESTful API for memory operations
- 🧠 Dynamic memory organization based on Zettelkasten principles
- 🔍 Intelligent indexing and linking of memories
- 📝 Comprehensive note generation with structured attributes
- 🌐 Interconnected knowledge networks
- 🧬 Continuous memory evolution and refinement
- 🤖 Agent-driven decision making for adaptive memory management
Installation
- Clone the repository:

```bash
git clone https://github.com/Titan-co/amem-mcp-server.git
cd amem-mcp-server
```

- Install dependencies:

```bash
pip install -r requirements.txt
```

- Start the server:

```bash
uvicorn server:app --host 0.0.0.0 --port 8000 --reload
```
API Endpoints
Create Memory
- Endpoint: POST /memories
- Description: Create a new memory note
- Request Body:
```json
{
  "content": "string",
  "tags": ["string"],
  "category": "string",
  "timestamp": "string"
}
```
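A minimal sketch of assembling this request body in Python. Only the field names come from the schema above; the helper name, the default category, and the ISO-8601 timestamp format are illustrative assumptions:

```python
from datetime import datetime, timezone

def build_memory_payload(content, tags=None, category="general", timestamp=None):
    """Assemble the JSON body for POST /memories.

    Field names match the request schema; the "general" default
    category and ISO-8601 timestamp are assumptions, not server rules.
    """
    return {
        "content": content,
        "tags": tags or [],
        "category": category,
        "timestamp": timestamp or datetime.now(timezone.utc).isoformat(),
    }

payload = build_memory_payload("User prefers dark mode", tags=["preferences"])
```

The resulting dict can be sent with any HTTP client, e.g. `requests.post("http://localhost:8000/memories", json=payload)`.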
Get Memory
- Endpoint: GET /memories/{id}
- Description: Retrieve a memory by ID
Update Memory
- Endpoint: PUT /memories/{id}
- Description: Update an existing memory
- Request Body:
```json
{
  "content": "string",
  "tags": ["string"],
  "category": "string",
  "context": "string",
  "keywords": ["string"]
}
```
Delete Memory
- Endpoint: DELETE /memories/{id}
- Description: Delete a memory by ID
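The get, update, and delete endpoints above can be driven from any HTTP library. The sketch below only assembles method/path/body triples so it can be paired with `requests`, `httpx`, or `urllib`; the class name and method layout are assumptions, not part of the server:

```python
import json

class AMEMClient:
    """Illustrative request builder for the memory endpoints above.

    Produces (method, path, body) tuples; sending them over HTTP is
    left to whichever client library you prefer.
    """

    def __init__(self, base="/memories"):
        self.base = base

    def get(self, memory_id):
        return ("GET", f"{self.base}/{memory_id}", None)

    def update(self, memory_id, **fields):
        # Only the fields being changed need to be included in the PUT body.
        return ("PUT", f"{self.base}/{memory_id}", json.dumps(fields))

    def delete(self, memory_id):
        return ("DELETE", f"{self.base}/{memory_id}", None)

client = AMEMClient()
method, path, body = client.update("42", content="revised note")
```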
Search Memories
- Endpoint: GET /memories/search?query={query}&k={k}
- Description: Search for memories matching a query
- Query Parameters:
  - `query`: Search query string
  - `k`: Number of results to return (default: 5)
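Because `query` is passed as a URL parameter, it should be percent-encoded. A small standard-library helper (the function name is illustrative):

```python
from urllib.parse import urlencode

def search_url(query, k=5, base="/memories/search"):
    """Build the search endpoint URL with properly escaped parameters."""
    return f"{base}?{urlencode({'query': query, 'k': k})}"

search_url("project deadlines", k=3)
# -> "/memories/search?query=project+deadlines&k=3"
```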
Configuration
The server can be configured through environment variables:
- `OPENAI_API_KEY`: API key for OpenAI services
- `LLM_BACKEND`: LLM backend to use (`openai` or `ollama`, default: `openai`)
- `LLM_MODEL`: LLM model to use (default: `gpt-4`)
- `EMBEDDING_MODEL`: Embedding model for semantic search (default: `all-MiniLM-L6-v2`)
- `EVO_THRESHOLD`: Number of memories before triggering evolution (default: `3`)
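If you need the same settings in your own tooling, they can be read with `os.getenv` using the defaults listed above; the `config` dict itself is illustrative, not part of the server:

```python
import os

# Defaults mirror the variable list above; set the corresponding
# environment variables to override them.
config = {
    "llm_backend": os.getenv("LLM_BACKEND", "openai"),
    "llm_model": os.getenv("LLM_MODEL", "gpt-4"),
    "embedding_model": os.getenv("EMBEDDING_MODEL", "all-MiniLM-L6-v2"),
    "evo_threshold": int(os.getenv("EVO_THRESHOLD", "3")),
}
```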
Documentation
Interactive API documentation is available at:
- Swagger UI: http://localhost:8000/docs
- ReDoc: http://localhost:8000/redoc
References
Based on the research paper: A-MEM: Agentic Memory for LLM Agents