what is cognee-mcp-server?
The cognee-mcp-server is an MCP server designed for Cognee, an AI memory engine that builds knowledge graphs from input text and performs searches within them.
how to use cognee-mcp-server?
To use the cognee-mcp-server, install the necessary dependencies, configure your environment variables, and register the server with an MCP client such as Claude Desktop; the client can then call it to build and search knowledge graphs.
key features of cognee-mcp-server?
- Builds knowledge graphs from input text.
- Performs efficient searches within the constructed knowledge graphs.
- Configurable for different environments and use cases.
use cases of cognee-mcp-server?
- Creating knowledge graphs for research data.
- Enhancing AI applications with memory capabilities.
- Performing complex queries on structured data.
FAQ from cognee-mcp-server?
- What is the purpose of the cognee-mcp-server?
The cognee-mcp-server is designed to facilitate the construction and querying of knowledge graphs for AI applications.
- How do I configure the server?
Configuration involves setting up environment variables and specifying command-line arguments in the configuration file.
- Can I use custom graph models?
Yes, the server allows for the use of custom pydantic graph models for advanced use cases.
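For illustration, a custom graph model might be an ordinary pydantic class hierarchy like the sketch below. The file name (my_graph_model.py), the class names, and the assumption that a plain pydantic BaseModel is sufficient are all illustrative; check the cognee documentation for the exact base class it expects.

# my_graph_model.py - hypothetical file referenced via graph_model_file
from typing import List, Optional
from pydantic import BaseModel

class Person(BaseModel):
    name: str
    role: Optional[str] = None

class Organization(BaseModel):
    name: str
    members: List[Person] = []

# Top-level model referenced via graph_model_name
class OrgChart(BaseModel):
    organizations: List[Organization] = []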
cognee-mcp-server
An MCP server for cognee, an AI memory engine.
Tools
Cognify_and_search
Builds a knowledge graph from the input text and performs a search in it.
- Inputs:
  - text (String): Context for knowledge graph construction
  - search_query (String): Query for retrieval
  - graph_model_file (String, optional): Filename of a custom pydantic graph model implementation
  - graph_model_name (String, optional): Class name of a custom pydantic graph model implementation
- Output:
  - Retrieved edges of the knowledge graph
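As a sketch of how a client might invoke this tool, the snippet below assumes an already-initialized mcp.ClientSession named session (see the client example after the Configuration section); the text and query values are placeholders.

# Hypothetical call to Cognify_and_search over an existing MCP ClientSession.
result = await session.call_tool(
    "Cognify_and_search",
    arguments={
        "text": "Ada Lovelace worked with Charles Babbage on the Analytical Engine.",
        "search_query": "Who did Ada Lovelace work with?",
        # Optional: use a custom pydantic graph model
        # "graph_model_file": "my_graph_model.py",
        # "graph_model_name": "OrgChart",
    },
)
print(result.content)  # the retrieved knowledge graph edges returned by the server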
Configuration
Usage with Claude Desktop
Install uv with Homebrew (brew install uv).
Add this to the "mcpServers" section of your claude_desktop_config.json:
Using uv
"mcpcognee": {
"command": "uv",
"args": [
"--directory",
"/path/to/your/cognee-mcp-server",
"run",
"mcpcognee"
],
"env": {
"ENV": "local",
"TOKENIZERS_PARALLELISM": "false",
"LLM_API_KEY": “your llm api key”,
"GRAPH_DATABASE_PROVIDER": “networkx”,
"VECTOR_DB_PROVIDER": "lancedb",
"DB_PROVIDER": "sqlite",
"DB_NAME": “cognee_db”
}
}
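Outside Claude Desktop, the same configuration can be exercised from a small Python client built on the official mcp package. This is a minimal sketch assuming the placeholder path and API key above are replaced with real values:

import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Mirrors the claude_desktop_config.json entry above; replace the
# placeholder path and API key with real values.
server_params = StdioServerParameters(
    command="uv",
    args=["--directory", "/path/to/your/cognee-mcp-server", "run", "mcpcognee"],
    env={
        "ENV": "local",
        "TOKENIZERS_PARALLELISM": "false",
        "LLM_API_KEY": "your llm api key",
        "GRAPH_DATABASE_PROVIDER": "networkx",
        "VECTOR_DB_PROVIDER": "lancedb",
        "DB_PROVIDER": "sqlite",
        "DB_NAME": "cognee_db",
    },
)

async def main():
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])  # should include Cognify_and_search

asyncio.run(main())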