Graphiti MCP Server 🧠

By gifflet

Overview

What is Graphiti MCP Server?

Graphiti MCP Server is a powerful knowledge graph server designed for AI agents, built using Neo4j and integrated with the Model Context Protocol (MCP).

How to use Graphiti MCP Server?

To use Graphiti MCP Server, clone the repository, set up your environment variables, and start the services using Docker.

Key features of Graphiti MCP Server?

  • Dynamic knowledge graph management with Neo4j
  • Seamless integration with OpenAI models
  • Support for Model Context Protocol (MCP)
  • Docker-ready deployment
  • Custom entity extraction capabilities
  • Advanced semantic search functionality

Use cases of Graphiti MCP Server?

  1. Managing complex knowledge graphs for AI applications.
  2. Integrating AI models for enhanced data processing.
  3. Utilizing semantic search for improved information retrieval.

FAQ about Graphiti MCP Server?

  • What are the prerequisites for using Graphiti MCP Server?

You need Docker, Docker Compose, Python 3.10 or higher, and an OpenAI API key.

  • Is Graphiti MCP Server open-source?

Yes! It is licensed under the MIT License.

  • Can I contribute to the project?

Absolutely! Contributions are welcome, and you can submit a Pull Request.

Content

Graphiti MCP Server 🧠


🌟 A powerful knowledge graph server for AI agents, built with Neo4j and integrated with Model Context Protocol (MCP).

🚀 Features

  • 🔄 Dynamic knowledge graph management with Neo4j
  • 🤖 Seamless integration with OpenAI models
  • 🔌 MCP (Model Context Protocol) support
  • 🐳 Docker-ready deployment
  • 🎯 Custom entity extraction capabilities
  • 🔍 Advanced semantic search functionality

🛠️ Installation

Prerequisites

  • Docker and Docker Compose
  • Python 3.10 or higher
  • OpenAI API key

Quick Start 🚀

  1. Clone the repository:

```shell
git clone https://github.com/gifflet/graphiti-mcp-server.git
cd graphiti-mcp-server
```

  2. Set up environment variables:

```shell
cp .env.sample .env
```

  3. Edit .env with your configuration:

```shell
# Required for LLM operations
OPENAI_API_KEY=your_openai_api_key_here
MODEL_NAME=gpt-4o
```

  4. Start the services:

```shell
docker compose up
```

🔧 Configuration

Neo4j Settings 🗄️

Default configuration for Neo4j:

  • Username: neo4j
  • Password: demodemo
  • URI: bolt://neo4j:7687 (within Docker network)
  • Memory settings optimized for development
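These defaults can be overridden through environment variables. A minimal sketch of that resolution in Python (the variable names here are assumptions for illustration; check the repository's .env.sample for the exact keys the server reads):

```python
import os

# Default Neo4j settings from the Docker Compose setup. bolt://neo4j:7687
# resolves inside the Docker network; from the host, use bolt://localhost:7687.
DEFAULTS = {
    "NEO4J_URI": "bolt://neo4j:7687",
    "NEO4J_USER": "neo4j",
    "NEO4J_PASSWORD": "demodemo",
}


def neo4j_settings(env=None):
    """Merge environment-variable overrides onto the documented defaults.

    The key names are illustrative assumptions, not taken from the source.
    """
    env = os.environ if env is None else env
    return {key: env.get(key, default) for key, default in DEFAULTS.items()}
```

With no overrides, `neo4j_settings({})` simply returns the defaults listed above; any variable set in the environment (or in .env via Docker Compose) takes precedence.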

Docker Environment Variables 🐳

Alternatively, you can pass the environment variables directly on the command line:

```shell
OPENAI_API_KEY=your_key MODEL_NAME=gpt-4o docker compose up
```

🔌 Integration

Cursor IDE Integration 🖥️

  1. Configure Cursor to connect to Graphiti:

```json
{
  "mcpServers": {
    "Graphiti": {
      "url": "http://localhost:8000/sse"
    }
  }
}
```

  2. Add Graphiti rules to Cursor's User Rules (see graphiti_cursor_rules.md)
  3. Start an agent session in Cursor
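The configuration above is plain JSON, so a quick way to catch typos before restarting Cursor is to parse it programmatically:

```python
import json

# The Cursor MCP configuration shown above, verbatim
config_text = """
{
  "mcpServers": {
    "Graphiti": {
      "url": "http://localhost:8000/sse"
    }
  }
}
"""

config = json.loads(config_text)  # raises ValueError on malformed JSON
url = config["mcpServers"]["Graphiti"]["url"]
print("Graphiti MCP endpoint:", url)  # prints the SSE URL Cursor will use
```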

🏗️ Architecture

The server consists of two main components:

  • Neo4j database for graph storage
  • Graphiti MCP server for API and LLM operations
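A compose file wiring these two components together might look roughly like the following sketch (service names, image tag, and port mappings are illustrative assumptions; the repository's actual docker-compose.yml is authoritative):

```yaml
services:
  neo4j:
    image: neo4j:5                  # graph storage
    environment:
      - NEO4J_AUTH=neo4j/demodemo   # default credentials from the docs above
    ports:
      - "7474:7474"                 # HTTP browser UI
      - "7687:7687"                 # Bolt protocol

  graphiti-mcp:
    build: .                        # MCP server: API and LLM operations
    environment:
      - OPENAI_API_KEY=${OPENAI_API_KEY}
      - MODEL_NAME=${MODEL_NAME:-gpt-4o}
      - NEO4J_URI=bolt://neo4j:7687 # reaches Neo4j over the Docker network
    ports:
      - "8000:8000"                 # SSE endpoint consumed by MCP clients
    depends_on:
      - neo4j
```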

🤝 Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

📝 License

This project is licensed under the MIT License - see the LICENSE file for details.

🙏 Acknowledgments

  • Neo4j team for the amazing graph database
  • OpenAI for their powerful language models
  • MCP community for the protocol specification
