
MCP Connector: Integrating AI agent with Data Warehouse in Microsoft Fabric
MCP client and server apps demonstrating the integration of an Azure OpenAI-based AI agent with a Data Warehouse exposed through GraphQL in Microsoft Fabric.
What is MCP Connector?
MCP Connector is a project that integrates an AI agent powered by Azure OpenAI with a Microsoft Fabric data warehouse, utilizing GraphQL for data access.
How to use MCP Connector?
To use MCP Connector, set up a Microsoft Fabric data warehouse, configure the local client environment with necessary API keys, and run the MCP client to connect to the GraphQL API endpoint.
Key features of MCP Connector?
- Integration of Azure OpenAI AI agent with Microsoft Fabric data warehouse.
- Utilization of GraphQL for bidirectional data access.
- Dynamic discovery of tools and data resources through the Model Context Protocol (MCP).
Use cases of MCP Connector?
- Enabling AI agents to access and manipulate enterprise data.
- Facilitating data queries and mutations through GraphQL.
- Demonstrating AI capabilities in data-driven applications.
FAQ from MCP Connector?
- What is the Model Context Protocol (MCP)?
MCP is an open integration standard for AI agents that allows for dynamic discovery of tools and data resources.
- Is there a demo available for MCP Connector?
Yes! A practical demo can be found on YouTube.
- What programming language is used in this project?
The project is implemented in Python.
This repo demonstrates the integration of an Azure OpenAI-powered AI agent with a Microsoft Fabric data warehouse using the Model Context Protocol (MCP), an open integration standard for AI agents developed by Anthropic.
MCP enables dynamic discovery of tools, data resources and prompt templates (with more coming soon), unifying their integration with AI agents. GraphQL provides an abstraction layer for universal data connection. Below, you will find detailed steps on how to combine MCP and GraphQL to enable bidirectional access to enterprise data for your AI agent.
NOTE
In the MCP server's script, some query parameter values are hard-coded for the sake of this example. In a real-world scenario, these values would be dynamically generated or retrieved.
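For illustration, a minimal MCP server tool wrapping such a hard-coded query might look like the sketch below. This is not the repo's actual server script: the server and tool names, the trips/items field names and the token scope are assumptions made for the example.

```python
# Minimal sketch of an MCP tool exposing a hard-coded GraphQL query.
# Not the repo's server script: names, fields and the token scope are assumptions.
import os
import requests
from azure.identity import DefaultAzureCredential
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("fabric-graphql-connector")
ENDPOINT = os.environ["AZURE_FABRIC_GRAPHQL_ENDPOINT"]

@mcp.tool()
def list_sample_trips() -> dict:
    """Return a small, fixed sample of rows from the Trip table."""
    # Query parameters (here: the first 5 rows) are hard-coded for the demo.
    query = "query { trips(first: 5) { items { TripDistanceMiles TotalAmount } } }"
    token = DefaultAzureCredential().get_token("https://api.fabric.microsoft.com/.default").token
    response = requests.post(ENDPOINT, json={"query": query},
                             headers={"Authorization": f"Bearer {token}"})
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    mcp.run()  # stdio transport; the MCP client spawns this script and discovers the tool
```

Because the client discovers tools like this at runtime via MCP, the agent needs no hard-wired knowledge of the GraphQL schema.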
Table of contents:
- Part 1: Configuring Microsoft Fabric Backend
- Part 2: Configuring Local Client Environment
- Part 3: User Experience - Gradio UI
- Part 4: Demo video on YouTube
Part 1: Configuring Microsoft Fabric Backend
- In Microsoft Fabric, create a new data warehouse pre-populated with sample data by clicking New item -> Sample warehouse:
- Next, create a GraphQL API endpoint by clicking New item -> API for GraphQL:
- In the GraphQL API's data configuration, select the Trip (dbo.Trip) table:
- Copy the endpoint URL of your GraphQL API:
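Before wiring up the agent, you can sanity-check the new endpoint, for example in the GraphQL API's query editor in the Fabric portal. The field names below are assumptions based on the sample warehouse's Trip table; adjust them to the schema generated for you:

```python
# Sample query for the newly created endpoint; field names are assumptions
# based on the sample Trip table.
SAMPLE_QUERY = """
query {
  trips(first: 3) {
    items {
      TripDistanceMiles
      TotalAmount
    }
  }
}
"""
# Expected response shape (values illustrative):
# {"data": {"trips": {"items": [{"TripDistanceMiles": 2.1, "TotalAmount": 11.3}, ...]}}}
```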
Part 2: Configuring Local Client Environment
- Install the required Python packages, listed in the provided requirements.txt:
pip install -r requirements.txt
- Configure environment variables for the MCP client:
| Variable | Description |
| --- | --- |
| AOAI_API_BASE | Base URL of the Azure OpenAI endpoint |
| AOAI_API_VERSION | API version of the Azure OpenAI endpoint |
| AOAI_DEPLOYMENT | Deployment name of the Azure OpenAI model |
- Set the AZURE_FABRIC_GRAPHQL_ENDPOINT variable to the GraphQL endpoint URL from Step 1.4 above. It will be utilised by the MCP server script to establish connectivity with Microsoft Fabric:
| Variable | Description |
| --- | --- |
| AZURE_FABRIC_GRAPHQL_ENDPOINT | Microsoft Fabric's GraphQL API endpoint |
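As a rough sketch, assuming the client authenticates to Azure OpenAI with a key held in an AOAI_API_KEY variable (the repo may use Entra ID authentication instead), the settings above are consumed along these lines:

```python
# Sketch of how the client-side settings could be consumed; AOAI_API_KEY is an
# assumption and the repo's actual wiring may differ.
import os
from openai import AzureOpenAI

aoai_client = AzureOpenAI(
    azure_endpoint=os.environ["AOAI_API_BASE"],    # base URL of the Azure OpenAI endpoint
    api_version=os.environ["AOAI_API_VERSION"],    # e.g. "2024-06-01"
    api_key=os.environ["AOAI_API_KEY"],            # assumed key variable
)
deployment = os.environ["AOAI_DEPLOYMENT"]         # model deployment name

# The GraphQL endpoint is read by the MCP server script rather than by the agent:
fabric_endpoint = os.environ["AZURE_FABRIC_GRAPHQL_ENDPOINT"]

response = aoai_client.chat.completions.create(
    model=deployment,
    messages=[{"role": "user", "content": "ping"}],
)
print(response.choices[0].message.content)
```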
Part 3: User Experience - Gradio UI
- Launch the MCP client in your command prompt:
python MCP_Client_Gradio.py
- Click the Initialise System button to start the MCP server and connect your AI agent to Microsoft Fabric's GraphQL API endpoint:
- You can now pull data from and push data to your data warehouse using the GraphQL queries and mutations enabled by this MCP connector:
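Pull operations use queries like the sample shown in Part 1; a push might look roughly like the mutation below. The createTrip name, its arguments and the selection set are assumptions about the schema Fabric auto-generates for the Trip table:

```python
# Illustrative mutation the agent might issue through the connector; the exact
# mutation name and signature depend on the generated schema.
PUSH_MUTATION = """
mutation {
  createTrip(item: { TripDistanceMiles: 1.8, TotalAmount: 12.5 }) {
    result
  }
}
"""
```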
Part 4: Demo video on YouTube
A practical demo of the provided MCP connector can be found in this YouTube video.