# Graphiti MCP Server 🧠

🌟 A powerful knowledge graph server for AI agents, built with Neo4j and integrated with the Model Context Protocol (MCP).
## 🚀 Features
- 🔄 Dynamic knowledge graph management with Neo4j
- 🤖 Seamless integration with OpenAI models
- 🔌 MCP (Model Context Protocol) support
- 🐳 Docker-ready deployment
- 🎯 Custom entity extraction capabilities
- 🔍 Advanced semantic search functionality
## 🛠️ Installation

### Prerequisites
- Docker and Docker Compose
- Python 3.10 or higher
- OpenAI API key
### Quick Start 🚀
- Clone the repository:

```bash
git clone https://github.com/gifflet/graphiti-mcp-server.git
cd graphiti-mcp-server
```

- Set up environment variables:

```bash
cp .env.sample .env
```

- Edit `.env` with your configuration:

```env
# Required for LLM operations
OPENAI_API_KEY=your_openai_api_key_here
MODEL_NAME=gpt-4o
```

- Start the services:

```bash
docker compose up
```
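Once the containers are up, a quick reachability check can confirm that both services are listening. This is only a sketch: it assumes the compose file publishes Neo4j's Bolt port (7687) and the MCP server's SSE port (8000, the same port used in the Cursor configuration below) to localhost.

```python
import socket

# Smoke test after `docker compose up` (assumes the compose file publishes
# Neo4j Bolt on localhost:7687 and the MCP SSE endpoint on localhost:8000).
for name, port in [("neo4j (bolt)", 7687), ("graphiti-mcp (sse)", 8000)]:
    with socket.create_connection(("localhost", port), timeout=5):
        print(f"{name}: port {port} is accepting connections")
```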
## 🔧 Configuration

### Neo4j Settings 🗄️

Default configuration for Neo4j:
- Username: `neo4j`
- Password: `demodemo`
- URI: `bolt://neo4j:7687` (within Docker network)
- Memory settings optimized for development
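To verify these defaults, the snippet below connects with the official `neo4j` Python driver and counts the nodes in the graph. It is a sketch only: it assumes the Bolt port is published to the host as `localhost:7687` (inside the Docker network you would use `bolt://neo4j:7687` as listed above).

```python
from neo4j import GraphDatabase  # pip install neo4j

# Connect using the default credentials listed above. From the host, the Bolt
# port is assumed to be published as localhost:7687; inside the Docker network
# the URI is bolt://neo4j:7687.
driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "demodemo"))
with driver.session() as session:
    nodes = session.run("MATCH (n) RETURN count(n) AS nodes").single()["nodes"]
    print(f"Nodes currently in the graph: {nodes}")
driver.close()
```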
### Docker Environment Variables 🐳

You can also pass the environment variables directly on the command line:

```bash
OPENAI_API_KEY=your_key MODEL_NAME=gpt-4o docker compose up
```
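If you prefer to script the launch, the same one-liner can be reproduced from Python. This is a sketch, not part of the repository; it simply injects the two variables into the process environment before invoking Docker Compose.

```python
import os
import subprocess

# Equivalent to the shell one-liner above: set the variables for this process
# and launch the compose stack (replace the placeholder key with a real one).
env = {**os.environ, "OPENAI_API_KEY": "your_key", "MODEL_NAME": "gpt-4o"}
subprocess.run(["docker", "compose", "up"], env=env, check=True)
```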
## 🔌 Integration

### Cursor IDE Integration 🖥️

- Configure Cursor to connect to Graphiti:
```json
{
  "mcpServers": {
    "Graphiti": {
      "url": "http://localhost:8000/sse"
    }
  }
}
```
- Add Graphiti rules to Cursor's User Rules (see `graphiti_cursor_rules.md`)
- Start an agent session in Cursor
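Cursor is not the only possible client: anything that speaks MCP over SSE can use the same endpoint. The sketch below uses the MCP Python SDK (the `mcp` package) to connect to the URL from the configuration above and list the tools the server exposes; the endpoint URL is the only detail carried over from that config.

```python
import asyncio

from mcp import ClientSession  # pip install mcp
from mcp.client.sse import sse_client


async def main() -> None:
    # Connect to the same SSE endpoint Cursor uses (see the config above).
    async with sse_client("http://localhost:8000/sse") as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print("Available tools:", [tool.name for tool in tools.tools])


asyncio.run(main())
```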
## 🏗️ Architecture

The server consists of two main components:
- Neo4j database for graph storage
- Graphiti MCP server for API and LLM operations
## 🤝 Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
## 📝 License
This project is licensed under the MIT License - see the LICENSE file for details.
## 🙏 Acknowledgments
- Neo4j team for the amazing graph database
- OpenAI for their powerful language models
- MCP community for the protocol specification