Allora MCP Server

By allora-network

Overview

What is Allora MCP Server?

The Allora MCP Server is a Model Context Protocol (MCP) server implementation that allows AI systems to fetch machine learning inferences and prediction market data from the Allora Network.

How to use Allora MCP Server?

To use the Allora MCP Server, clone the repository, install the dependencies, set up your environment variables, and run the server. You can interact with it using any MCP client.

What are the key features of Allora MCP Server?

  • Access to Allora prediction market data through a standardized protocol.
  • Server-Sent Events (SSE) connection for real-time communication.
  • Multiple API endpoints for fetching topics and inference data.

What are some use cases of Allora MCP Server?

  1. Integrating prediction market data into AI workflows.
  2. Fetching real-time predictions for various topics.
  3. Enhancing AI applications with market insights.

FAQ for Allora MCP Server

  • What is the Model Context Protocol (MCP)?

MCP is a standardized protocol for communication between AI systems and data sources.

  • Do I need an API key to use the server?

Yes, you need to sign up at alloralabs.com to obtain an API key.

  • Is the Allora MCP Server open source?

Yes, it is available on GitHub under the MIT License.

Content

Allora MCP Server

This is a Model Context Protocol (MCP) server implementation that provides access to machine learning inferences and prediction market data from the Allora Network over MCP.

License: MIT

Overview

The Allora MCP server allows AI systems and applications to access Allora prediction market data through the standardized Model Context Protocol (MCP), enabling seamless integration of that data into AI workflows. The server provides direct access to Allora topics, market predictions, and inference data.

Prerequisites

  • Node.js and npm
  • An Allora API key (see the FAQ above)

Setup

  1. Clone the repository
git clone https://github.com/your-username/allora-mcp.git
cd allora-mcp
  2. Install dependencies
npm install
  3. Set up environment variables

Create a .env file in the project root (or copy from .env.example):

PORT=3001
ALLORA_API_KEY=your_api_key
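
For context, here is a minimal sketch of how these variables might be consumed at startup. It assumes the server loads them with the dotenv package and uses the names shown above; neither detail is stated in this README.

// Hypothetical startup snippet (assumes the dotenv package is used).
import "dotenv/config";

const port = Number(process.env.PORT ?? 3001);      // falls back to the default port
const alloraApiKey = process.env.ALLORA_API_KEY;

if (!alloraApiKey) {
  // Fail fast so a missing key is caught before any Allora API call is made.
  throw new Error("ALLORA_API_KEY is not set; add it to your .env file");
}

console.log(`MCP server will listen on port ${port}`);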

Development

npm run dev

Building

npm run build

Running

npm start

Docker

Building the Docker Image

docker build -t allora-mcp .

Running with Docker

# Run the container
docker run -p 3001:3001 -e PORT=3001 -e ALLORA_API_KEY=your_api_key allora-mcp

# Or with environment variables in a file
docker run -p 3001:3001 --env-file .env allora-mcp

Docker Compose (optional)

Create a docker-compose.yml file:

version: '3'
services:
  allora-mcp:
    build: .
    ports:
      - "3001:3001"
    environment:
      - PORT=3001
      - ALLORA_API_KEY=your_api_key

Then run:

docker-compose up

API Usage

Once the server is running, you can interact with it using any MCP client. The server exposes the following endpoints:

  • GET /sse - SSE connection endpoint for MCP communications
  • POST /messages - Message endpoint for MCP communications
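
For reference, the following is a minimal client-side sketch that connects to the SSE endpoint using the MCP TypeScript SDK (@modelcontextprotocol/sdk). The import paths, client name, and localhost URL are assumptions for a default local setup, not details taken from this README.

// Hypothetical MCP client connecting to the server's SSE endpoint.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { SSEClientTransport } from "@modelcontextprotocol/sdk/client/sse.js";

// Assumes the server is running locally on the default port from .env.
const transport = new SSEClientTransport(new URL("http://localhost:3001/sse"));
const client = new Client({ name: "allora-example-client", version: "0.1.0" });

await client.connect(transport);

// List the tools the server advertises over MCP.
const { tools } = await client.listTools();
console.log(tools.map((tool) => tool.name));

await client.close();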

Available Tools

Tool Name                  | Description                                | Parameters
list_all_topics            | Fetch a list of all Allora topics          | None
get_inference_by_topic_id  | Fetch inference data for a specific topic  | topicID: number
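
As an illustration only, the sketch below calls both tools with the MCP TypeScript SDK. The connection details repeat the assumptions above, and topicID 1 is a placeholder rather than a known Allora topic.

// Hypothetical tool calls against a locally running Allora MCP server.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { SSEClientTransport } from "@modelcontextprotocol/sdk/client/sse.js";

const client = new Client({ name: "allora-tools-example", version: "0.1.0" });
await client.connect(new SSEClientTransport(new URL("http://localhost:3001/sse")));

// Fetch every available topic first.
const topics = await client.callTool({ name: "list_all_topics", arguments: {} });
console.log(topics.content);

// Then fetch inference data for one topic; replace 1 with an ID from the list above.
const inference = await client.callTool({
  name: "get_inference_by_topic_id",
  arguments: { topicID: 1 },
});
console.log(inference.content);

await client.close();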

Example Usage with Claude

When connected to Claude or other MCP-compatible AI systems, you can access Allora data with prompts such as:

What topics are available in Allora?

Or get specific inference data:

What is the current prediction for BTC price in 8 hours?

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

  1. Fork the repository
  2. Create your feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add some amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

License

This project is licensed under the MIT License - see the LICENSE file for details.
