AWS Resources MCP Server

By MCP-Mirror

Overview

What is AWS Resources MCP Server?

AWS Resources MCP Server is a Model Context Protocol (MCP) server implementation that allows users to run generated Python code to query AWS resources using the boto3 library.

How to use AWS Resources MCP Server?

To use the server, you need to set up AWS credentials and run the server via Docker. You can execute Python code snippets to query AWS resources directly.

Key features of AWS Resources MCP Server?

  • Executes Python code to query AWS resources using boto3.
  • Supports dynamic resource querying through a Docker container.
  • Allows management operations beyond read-only access, depending on user permissions.

Use cases of AWS Resources MCP Server?

  1. Querying S3 buckets and their contents (see the sketch after this list).
  2. Retrieving the latest deployment from AWS CodePipeline.
  3. Managing AWS resources programmatically with Python.
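
As a quick illustration of the first use case, here is a minimal sketch of the kind of code the server can run. The bucket name is a placeholder, and session is assumed to be the boto3 session the server provides (as in the examples further below):

# Minimal sketch: list the first few objects in one bucket.
# 'my-example-bucket' is a placeholder; `session` is assumed to be provided by the server.
s3 = session.client('s3')
response = s3.list_objects_v2(Bucket='my-example-bucket', MaxKeys=10)
result = [obj['Key'] for obj in response.get('Contents', [])]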

FAQ from AWS Resources MCP Server?

  • What permissions do I need to use this server?

You need AWS credentials with appropriate permissions to query the resources you want.

  • Is there a limit on the number of queries?

No, the server does not impose a limit, but your AWS user role dictates what you are permitted to do.

  • Can I run this server locally?

Yes, you can build the Docker image locally or pull it from Docker Hub.

Content

AWS Resources MCP Server


Overview

A Model Context Protocol (MCP) server implementation that runs generated Python code to query any AWS resource through boto3.

At your own risk: operations are not limited to read-only, so that careful Ops people can also use this tool for management operations. Your AWS user role dictates what you are permitted to do.


Demo: Fix DynamoDB Permission Error

https://github.com/user-attachments/assets/de88688d-d7a0-45e1-94eb-3f5d71e9a7c7

Why Another AWS MCP Server?

I tried AWS Chatbot with Developer Access. The Free Tier is limited to 25 resource queries per month, and the next tier is $19/month for a feature set I mostly don't use. On top of that, the results come back as JSON and with a lot of restrictions.

I tried using aws-mcp but ran into a few issues:

  1. Setup Hassle: Had to clone a git repo and deal with local setup
  2. Stability Issues: Wasn't stable enough on my Mac
  3. Node.js Stack: As a Python developer, I couldn't effectively contribute back to the Node.js codebase

So I created this new approach that:

  • Runs directly from a Docker image - no git clone needed
  • Uses Python and boto3 for better stability
  • Makes it easy for Python folks to contribute
  • Includes proper sandboxing for code execution
  • Keeps everything containerized and clean

For more information about the Model Context Protocol and how it works, see Anthropic's MCP documentation.

Components

Resources

The server exposes the following resource:

  • aws://query_resources: A dynamic resource that provides access to AWS resources through boto3 queries

Example Queries

Here are some example queries you can execute:

  1. List S3 buckets:
s3 = session.client('s3')
result = s3.list_buckets()

  2. Get latest CodePipeline deployment:
from operator import itemgetter

def get_latest_deployment(pipeline_name):
    codepipeline = session.client('codepipeline')

    result = codepipeline.list_pipeline_executions(
        pipelineName=pipeline_name,
        maxResults=5
    )

    if result['pipelineExecutionSummaries']:
        latest_execution = max(
            [e for e in result['pipelineExecutionSummaries']
             if e['status'] == 'Succeeded'],
            key=itemgetter('startTime'),
            default=None
        )

        if latest_execution:
            result = codepipeline.get_pipeline_execution(
                pipelineName=pipeline_name,
                pipelineExecutionId=latest_execution['pipelineExecutionId']
            )
        else:
            result = None
    else:
        result = None

    return result

result = get_latest_deployment("your-pipeline-name")

Tools

The server offers a tool for executing AWS queries:

  • query_aws_resources
    • Execute a boto3 code snippet to query AWS resources
    • Input:
      • code_snippet (string): Python code using boto3 to query AWS resources
      • The code must set a result variable with the query output (a minimal sketch follows this list)
    • Allowed imports:
      • boto3
      • operator
      • json
      • datetime
      • pytz
    • Available built-in functions:
      • Basic types: dict, list, tuple, set, str, int, float, bool
      • Operations: len, max, min, sorted, filter, map, sum, any, all
      • Object handling: hasattr, getattr, isinstance
      • Other: print, import
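
To make the input contract concrete, here is a minimal sketch of a valid code_snippet: it stays within the allowed imports and built-ins, uses the session object shown in the examples above, and assigns the required result variable. The service and field names are illustrative, not part of the tool's specification:

# Minimal sketch of a code_snippet: only allowed imports/built-ins,
# and the output is assigned to the required `result` variable.
import json
import datetime

ec2 = session.client('ec2')
regions = ec2.describe_regions()['Regions']

# Report the region names, sorted, together with a timestamp.
result = json.dumps({
    'checked_at': datetime.datetime.utcnow().isoformat(),
    'regions': sorted(r['RegionName'] for r in regions),
})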

Setup

Prerequisites

You'll need AWS credentials with appropriate permissions to query AWS resources. You can obtain these by:

  1. Creating an IAM user in your AWS account
  2. Generating access keys for programmatic access
  3. Ensuring the IAM user has necessary permissions for the AWS services you want to query

The following environment variables are required:

  • AWS_ACCESS_KEY_ID: Your AWS access key
  • AWS_SECRET_ACCESS_KEY: Your AWS secret key
  • AWS_SESSION_TOKEN: (Optional) AWS session token if using temporary credentials
  • AWS_DEFAULT_REGION: AWS region (defaults to 'us-east-1' if not set)

You can also use a profile stored in the ~/.aws/credentials file. To do this, set the AWS_PROFILE environment variable to the profile name.
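
For reference, these variables follow boto3's standard credential resolution. A rough sketch of how a session could be constructed from them (illustrative only, not necessarily the server's actual implementation):

import os
import boto3

# boto3 picks up AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY / AWS_SESSION_TOKEN /
# AWS_DEFAULT_REGION from the environment automatically; a profile can be
# selected explicitly when AWS_PROFILE is set.
profile = os.environ.get('AWS_PROFILE')
session = boto3.Session(profile_name=profile) if profile else boto3.Session()

# Quick sanity check that the credentials resolve.
print(session.client('sts').get_caller_identity()['Arn'])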

Note: Keep your AWS credentials secure and never commit them to version control.

Installing via Smithery

To install AWS Resources MCP Server for Claude Desktop automatically via Smithery:

npx -y @smithery/cli install mcp-server-aws-resources-python --client claude

Docker Installation

You can either build the image locally or pull it from Docker Hub. The image is built for the Linux platform.

Supported Platforms

  • Linux/amd64
  • Linux/arm64
  • Linux/arm/v7

Option 1: Pull from Docker Hub

docker pull buryhuang/mcp-server-aws-resources:latest

Option 2: Build Locally

docker build -t mcp-server-aws-resources .

Run the container:

docker run \
  -e AWS_ACCESS_KEY_ID=your_access_key_id_here \
  -e AWS_SECRET_ACCESS_KEY=your_secret_access_key_here \
  -e AWS_DEFAULT_REGION=your_default_region_here \
  buryhuang/mcp-server-aws-resources:latest

Or using stored credentials and a profile:

docker run \
  -e AWS_PROFILE=[AWS_PROFILE_NAME] \
  -v ~/.aws:/root/.aws \
  buryhuang/mcp-server-aws-resources:latest

Cross-Platform Publishing

To publish the Docker image for multiple platforms, you can use the docker buildx command. Follow these steps:

  1. Create a new builder instance (if you haven't already):

    docker buildx create --use
    
  2. Build and push the image for multiple platforms:

    docker buildx build --platform linux/amd64,linux/arm64,linux/arm/v7 -t buryhuang/mcp-server-aws-resources:latest --push .
    
  3. Verify the image is available for the specified platforms:

    docker buildx imagetools inspect buryhuang/mcp-server-aws-resources:latest
    

Usage with Claude Desktop

Running with Docker

Example using ACCESS_KEY_ID and SECRET_ACCESS_KEY

{
  "mcpServers": {
    "aws-resources": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "-e",
        "AWS_ACCESS_KEY_ID=your_access_key_id_here",
        "-e",
        "AWS_SECRET_ACCESS_KEY=your_secret_access_key_here",
        "-e",
        "AWS_DEFAULT_REGION=us-east-1",
        "buryhuang/mcp-server-aws-resources:latest"
      ]
    }
  }
}

Example using PROFILE and mounting local AWS credentials

{
  "mcpServers": {
    "aws-resources": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "-e",
        "AWS_PROFILE=default",
        "-v",
        "~/.aws:/root/.aws",
        "buryhuang/mcp-server-aws-resources:latest"
      ]
    }
  }
}

Running with Git clone

Example running with git clone and profile

{
  "mcpServers": {
    "aws": {
      "command": "/Users/gmr/.local/bin/uv",
      "args": [
        "--directory",
        "/<your-path>/mcp-server-aws-resources-python",
        "run",
        "src/mcp_server_aws_resources/server.py",
        "--profile",
        "testing"
      ]
    }
  }
}