
# MemGPT MCP Server

A Model Context Protocol (MCP) server that provides persistent memory and multi-model LLM support.
## What is MemGPT MCP Server?

MemGPT MCP Server is a TypeScript-based Model Context Protocol (MCP) server that provides persistent memory and multi-model support for large language models (LLMs). It allows users to chat with various LLM providers while maintaining conversation history.

## How to use MemGPT MCP Server?

To use MemGPT MCP Server, install the necessary dependencies, build the server, and configure it with your LLM provider API keys. You can then interact with the server through its tools to send messages, retrieve memory, or switch providers.
## Key features of MemGPT MCP Server?

- Supports multiple LLM providers (OpenAI, Anthropic, OpenRouter, Ollama)
- Tools for sending messages and for retrieving and clearing conversation history
- Ability to switch between different models and providers
- Unlimited memory retrieval option
## Use cases of MemGPT MCP Server?

- Engaging in conversations with various LLMs while keeping track of context.
- Utilizing different models for specific tasks like customer support or content generation.
- Debugging and inspecting LLM interactions using the MCP Inspector.
## FAQ about MemGPT MCP Server

**Can I use multiple LLM providers with MemGPT?**

Yes! MemGPT supports multiple providers, including OpenAI, Anthropic, OpenRouter, and Ollama.

**How do I retrieve my conversation history?**

You can use the `get_memory` tool to retrieve your conversation history, with options for limiting the number of memories.

**Is there a way to clear my conversation history?**

Yes! You can use the `clear_memory` tool to remove all stored memories.
# MemGPT MCP Server

A TypeScript-based MCP server that implements a memory system for LLMs. It provides tools for chatting with different LLM providers while maintaining conversation history.
## Features

### Tools

- `chat` - Send a message to the current LLM provider
  - Takes a `message` parameter
  - Supports multiple providers (OpenAI, Anthropic, OpenRouter, Ollama)
- `get_memory` - Retrieve conversation history
  - Optional `limit` parameter to specify the number of memories to retrieve
  - Pass `limit: null` for unlimited memory retrieval
  - Returns memories in chronological order with timestamps
- `clear_memory` - Clear conversation history
  - Removes all stored memories
- `use_provider` - Switch between different LLM providers
  - Supports OpenAI, Anthropic, OpenRouter, and Ollama
  - Persists provider selection
- `use_model` - Switch to a different model for the current provider
  - Persists model selection
  - Supports provider-specific models:
    - Anthropic Claude models:
      - Claude 3 series:
        - `claude-3-haiku`: Fastest response times; ideal for tasks like customer support and content moderation
        - `claude-3-sonnet`: Balanced performance for general-purpose use
        - `claude-3-opus`: Advanced model for complex reasoning and high-performance tasks
      - Claude 3.5 series:
        - `claude-3.5-haiku`: Enhanced speed and cost-effectiveness
        - `claude-3.5-sonnet`: Superior performance with computer interaction capabilities
    - OpenAI: `gpt-4o`, `gpt-4o-mini`, `gpt-4-turbo`
    - OpenRouter: any model in `provider/model` format (e.g., `openai/gpt-4`, `anthropic/claude-2`)
    - Ollama: any locally available model (e.g., `llama2`, `codellama`)
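As an illustration, tool invocations carry JSON arguments. A session that switches provider and model, chats, and then reads back recent history might send arguments like the following (the `message` parameter is documented above; the `provider`, `model`, and `limit` argument names are assumptions, and the exact request envelope depends on your MCP client):

```json
[
  { "name": "use_provider", "arguments": { "provider": "anthropic" } },
  { "name": "use_model", "arguments": { "model": "claude-3.5-sonnet" } },
  { "name": "chat", "arguments": { "message": "Hello!" } },
  { "name": "get_memory", "arguments": { "limit": 5 } }
]
```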
## Development

Install dependencies:

```bash
npm install
```

Build the server:

```bash
npm run build
```

For development with auto-rebuild:

```bash
npm run watch
```
## Installation

To use with Claude Desktop, add the server config:

- On MacOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
- On Windows: `%APPDATA%/Claude/claude_desktop_config.json`

```json
{
  "mcpServers": {
    "letta-memgpt": {
      "command": "/path/to/memgpt-server/build/index.js",
      "env": {
        "OPENAI_API_KEY": "your-openai-key",
        "ANTHROPIC_API_KEY": "your-anthropic-key",
        "OPENROUTER_API_KEY": "your-openrouter-key"
      }
    }
  }
}
```
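Note that invoking the build output directly requires it to be executable (e.g. a `#!/usr/bin/env node` shebang plus execute permission). If that is not the case in your checkout, an equivalent configuration that calls `node` explicitly should work (a hedged variant, not from the upstream docs):

```json
{
  "mcpServers": {
    "letta-memgpt": {
      "command": "node",
      "args": ["/path/to/memgpt-server/build/index.js"],
      "env": {
        "OPENAI_API_KEY": "your-openai-key",
        "ANTHROPIC_API_KEY": "your-anthropic-key",
        "OPENROUTER_API_KEY": "your-openrouter-key"
      }
    }
  }
}
```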
## Environment Variables

- `OPENAI_API_KEY` - Your OpenAI API key
- `ANTHROPIC_API_KEY` - Your Anthropic API key
- `OPENROUTER_API_KEY` - Your OpenRouter API key
## Debugging

Since MCP servers communicate over stdio, debugging can be challenging. We recommend using the MCP Inspector:

```bash
npm run inspector
```

The Inspector will provide a URL to access debugging tools in your browser.
## Recent Updates

### Claude 3 and 3.5 Series Support (March 2024)

- Added support for the latest Claude models:
  - Claude 3 series (Haiku, Sonnet, Opus)
  - Claude 3.5 series (Haiku, Sonnet)

### Unlimited Memory Retrieval

- Added support for retrieving unlimited conversation history
- Use `{ "limit": null }` with the `get_memory` tool to retrieve all stored memories
- Use `{ "limit": n }` to retrieve the n most recent memories
- Default limit is 10 if not specified
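The limit semantics above can be sketched in TypeScript. This is an illustrative model of the documented behavior, not the server's actual implementation, and the `Memory` shape is assumed:

```typescript
// Sketch of get_memory's documented limit handling (assumed shape, not the real code).
interface Memory {
  role: string;
  content: string;
  timestamp: number; // memories are stored in chronological order
}

const DEFAULT_LIMIT = 10; // applied when no limit is specified

function getMemory(memories: Memory[], limit?: number | null): Memory[] {
  // limit: null -> unlimited retrieval: return the full history.
  if (limit === null) return [...memories];
  // limit: n (or omitted) -> the n most recent memories, defaulting to 10.
  const n = limit ?? DEFAULT_LIMIT;
  return memories.slice(-n);
}
```

Passing `null` is deliberately distinct from omitting the parameter: omission falls back to the default of 10, while an explicit `null` disables the cap entirely.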