GibsonAI MCP Server

By GibsonAI

AI-powered cloud databases: build, migrate, and deploy database instances with AI

database sql
Overview

What is GibsonAI MCP Server? GibsonAI MCP Server is a Model Context Protocol implementation that connects AI language models to GibsonAI's database management platform, enabling natural language interaction with database systems directly within development environments.

How to use GibsonAI MCP Server To use the server, configure your IDE (such as Cursor) by adding the MCP server configuration, authenticate with the Gibson CLI, and begin interacting with your GibsonAI database through your IDE's chat interface.

Key features of GibsonAI MCP Server

  • Conversational database schema design and management
  • Automated code generation for database interfaces
  • Natural language schema modifications
  • Context-aware AI assistance for database development
  • Cloud database management capabilities

Use cases of GibsonAI MCP Server

  1. Creating and updating database projects through natural language
  2. Generating API endpoints that correctly interface with your backend
  3. Troubleshooting database performance issues
  4. Refactoring schemas while preserving existing data
  5. Developing database monitoring solutions

FAQ from GibsonAI MCP Server

  • How do I authenticate with the GibsonAI CLI? Use the command uvx --from gibson-cli@latest gibson auth login to authenticate.
  • What development environments are supported? Any IDE or client that supports the Model Context Protocol works with GibsonAI, including Cursor, Windsurf, and Claude Code.
  • Is the MCP Server compatible with all GibsonAI accounts? Yes, all GibsonAI accounts can use the MCP Server functionality.
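Putting the FAQ answers together, a typical first-time setup runs the authentication command before any IDE configuration. A minimal sketch, assuming uvx (from the uv toolchain) is already installed:

```shell
# Authenticate with GibsonAI (taken verbatim from the FAQ above)
uvx --from gibson-cli@latest gibson auth login

# Optionally verify the MCP server starts; this is the same command
# the Cursor configuration later in this document invokes
uvx --from gibson-cli@latest gibson mcp run
```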
Content

Introduction

GibsonAI's Model Context Protocol (MCP) server integrates the GibsonAI database management platform directly into development workflows. The server lets developers interact with GibsonAI's database tools through natural language inside their IDE.

Available Methods

The GibsonAI MCP server enables an LLM to perform multiple operations on behalf of the user through their GibsonAI account.

  1. get_projects - View the list of projects available in the GibsonAI account
  2. create_project - Create a new database project in GibsonAI
  3. get_project_details - Retrieve the project details such as IDs, schema, tables, and entities
  4. get_project_hosted_api_details - Fetch the auto-generated API spec and available methods for the project
  5. update_project - Update the name or other information for a project
  6. submit_data_modeling_request - Send a natural language or SQL request to GibsonAI to create or update a model
  7. deploy_project - Set your project live
  8. get_project_schema - Fetch just the schema for a given project
  9. get_deployed_schema - Fetch the currently deployed live version of a project's schema
  10. query_database - Send a SQL request directly to the DB
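Under MCP, each of the methods above is exposed as a tool that the client invokes with a JSON-RPC 2.0 `tools/call` request. A minimal sketch of what such a request looks like; the tool name comes from the list above, but the `uuid` argument name is an illustrative assumption, not the server's documented parameter schema:

```python
import json

def make_tool_call(request_id, tool_name, arguments):
    """Build a JSON-RPC 2.0 'tools/call' request as MCP clients send it."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }

# Hypothetical call: fetch details for one project.
# The argument name "uuid" is an assumption for illustration only.
request = make_tool_call(1, "get_project_details", {"uuid": "proj-123"})
print(json.dumps(request, indent=2))
```

The IDE constructs messages like this on the user's behalf; the developer only types a natural language request in the chat interface.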

Technical Implementation

The GibsonAI MCP Server works by:

  1. Providing context about your database configuration to the AI
  2. Processing natural language requests through your IDE's chat interface
  3. Converting those requests into appropriate database operations
  4. Returning results in a developer-friendly format
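The steps above map onto the standard MCP message flow: the client initializes a session, discovers the server's tools, and then issues tool calls whose results come back in a structured format. A sketch of that ordering, with message shapes following the MCP specification (capability objects elided, and the protocol version shown is one published revision of the spec):

```python
# Order of JSON-RPC messages an MCP client exchanges with a server.
# Fields not shown are elided for brevity.
session_flow = [
    {"jsonrpc": "2.0", "id": 1, "method": "initialize",
     "params": {"protocolVersion": "2024-11-05", "capabilities": {},
                "clientInfo": {"name": "example-ide", "version": "0.1"}}},
    {"jsonrpc": "2.0", "method": "notifications/initialized"},  # no id: a notification
    {"jsonrpc": "2.0", "id": 2, "method": "tools/list"},        # discover available tools
    {"jsonrpc": "2.0", "id": 3, "method": "tools/call",
     "params": {"name": "get_projects", "arguments": {}}},      # run an operation
]

# Requests carry an id and expect a response; notifications do not.
print([m["method"] for m in session_flow if "id" in m])
# → ['initialize', 'tools/list', 'tools/call']
```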

Configuration in Cursor IDE

```json
{
  "mcpServers": {
    "gibson": {
      "command": "uvx",
      "args": ["--from", "gibson-cli@latest", "gibson", "mcp", "run"]
    }
  }
}
```
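To see what the IDE actually executes, the config entry can be expanded into a command line. A small sketch using only the Python standard library:

```python
import json
import shlex

# The Cursor configuration shown above, as a string for illustration.
config = json.loads("""
{ "mcpServers": { "gibson": {
    "command": "uvx",
    "args": ["--from", "gibson-cli@latest", "gibson", "mcp", "run"] } } }
""")

server = config["mcpServers"]["gibson"]
argv = [server["command"], *server["args"]]
print(shlex.join(argv))
# → uvx --from gibson-cli@latest gibson mcp run
```

The IDE spawns this command as a subprocess and speaks the MCP protocol over the process's stdin and stdout.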