cognee-mcp-server

By topoteretes

Overview

What is cognee-mcp-server?

cognee-mcp-server is a Model Context Protocol (MCP) server for cognee, an AI memory engine that builds a knowledge graph from input text and lets you search it.

How to use cognee-mcp-server?

Add an entry for the server to your claude_desktop_config.json, point its command at your local installation path, and set the required environment variables (LLM API key and database providers). The Configuration section below shows a complete example.

Key features of cognee-mcp-server?

  • Builds a knowledge graph from input text.
  • Searches the generated knowledge graph.
  • Supports custom Pydantic graph models through configuration.

Use cases of cognee-mcp-server?

  1. Giving AI systems memory by building contextual knowledge graphs.
  2. Searching across complex datasets based on user queries.
  3. Supporting applications that need structured data retrieval and organization.

FAQ about cognee-mcp-server

  • What is required to run cognee-mcp-server?

You need an LLM API key and your local installation path; the server supports several graph, vector, and relational database providers.

  • Can I use custom models with cognee-mcp-server?

Yes, you can use your own Pydantic graph model implementations by specifying the filenames and class names in the configuration.
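
As an illustration only: a custom graph model is an ordinary Pydantic schema describing the nodes and edges cognee should extract. The file name programming_model.py and class name ProgrammingLanguages below are made up, and the exact field names and base classes cognee expects may vary by version, so treat this as a sketch rather than the required API.

# programming_model.py — hypothetical custom graph model (sketch)
from typing import List
from pydantic import BaseModel

class Node(BaseModel):
    id: str
    name: str
    type: str                 # e.g. "language", "paradigm"
    description: str

class Edge(BaseModel):
    source_node_id: str
    target_node_id: str
    relationship_name: str    # e.g. "supports", "implemented_in"

class ProgrammingLanguages(BaseModel):
    nodes: List[Node] = []
    edges: List[Edge] = []

You would then pass graph_model_file="programming_model.py" and graph_model_name="ProgrammingLanguages" as the optional inputs of the Cognify_and_search tool described below.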

  • Is cognee-mcp-server compatible with existing AI frameworks?

Yes. It speaks the Model Context Protocol, so it works with MCP hosts such as Claude Desktop and can be adapted for other AI frameworks.

Content

cognee-mcp-server

An MCP server for cognee, an AI memory engine.

Tools

  • Cognify_and_search: builds a knowledge graph from the input text and searches it (see the client sketch after this list).
    • Inputs:
      • text (String): Context for knowledge graph construction
      • search_query (String): Query for retrieval
      • graph_model_file (String, optional): Filename of a custom Pydantic graph model implementation
      • graph_model_name (String, optional): Class name of a custom Pydantic graph model implementation
    • Output:
      • Retrieved edges of the knowledge graph
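
For a concrete picture of how this tool is invoked, here is a minimal client sketch; it assumes the official MCP Python SDK (the mcp package) and the launch command shown in the Configuration section below, and the text and query values are placeholders.

# call_cognify.py — sketch of calling the Cognify_and_search tool over stdio
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server = StdioServerParameters(
    command="uv",
    args=["--directory", "/path/to/your/cognee-mcp-server", "run", "mcpcognee"],
    env={
        "ENV": "local",
        "TOKENIZERS_PARALLELISM": "false",
        "LLM_API_KEY": "your llm api key",
        "GRAPH_DATABASE_PROVIDER": "networkx",
        "VECTOR_DB_PROVIDER": "lancedb",
        "DB_PROVIDER": "sqlite",
        "DB_NAME": "cognee_db",
    },
)

async def main():
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            result = await session.call_tool(
                "Cognify_and_search",
                arguments={
                    "text": "Cognee builds knowledge graphs from text.",
                    "search_query": "What does cognee build?",
                },
            )
            print(result.content)  # retrieved edges of the knowledge graph

asyncio.run(main())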

Configuration

Usage with Claude Desktop

Add this entry under the "mcpServers" key of your claude_desktop_config.json:

Using uv
"mcpcognee": {
  "command": "uv",
  "args": [
    "--directory",
    "/path/to/your/cognee-mcp-server",
    "run",
    "mcpcognee"
  ],
  "env": {
    "ENV": "local",
    "TOKENIZERS_PARALLELISM": "false",
    "LLM_API_KEY": "your llm api key",
    "GRAPH_DATABASE_PROVIDER": "networkx",
    "VECTOR_DB_PROVIDER": "lancedb",
    "DB_PROVIDER": "sqlite",
    "DB_NAME": "cognee_db"
  }
}
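
With these values cognee runs entirely locally: networkx keeps the knowledge graph in process, lancedb stores embeddings in a local vector database, and sqlite holds cognee's relational metadata. LLM_API_KEY is the key for the LLM provider cognee uses to extract entities and relations from the input text.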