What is Allora MCP Server?
The Allora MCP Server is a Model Context Protocol (MCP) server implementation that allows AI systems to fetch machine learning inferences and access prediction market data from the Allora Network.
How to use Allora MCP Server?
To use the Allora MCP Server, clone the repository, install the dependencies, set up your environment variables, and run the server. You can interact with it using any MCP client.
Key features of Allora MCP Server?
- Access to Allora prediction market data through a standardized protocol.
- Server-Sent Events (SSE) connection for real-time communication.
- Multiple API endpoints for fetching topics and inference data.
Use cases of Allora MCP Server?
- Integrating prediction market data into AI workflows.
- Fetching real-time predictions for various topics.
- Enhancing AI applications with market insights.
FAQ for Allora MCP Server?
- What is the Model Context Protocol (MCP)?
MCP is a standardized protocol for communication between AI systems and data sources.
- Do I need an API key to use the server?
Yes, you need to sign up at alloralabs.com to obtain an API key.
- Is the Allora MCP Server open source?
Yes, it is available on GitHub under the MIT License.
Allora MCP Server
This is a Model Context Protocol (MCP) server implementation for fetching machine learning inferences from the Allora Network, giving MCP clients access to Allora's prediction market data.
Overview
The Allora MCP server allows AI systems and applications to access Allora prediction market data through the standardized Model Context Protocol (MCP), enabling seamless integration of prediction market data into AI workflows. The server provides direct access to Allora topics, market predictions, and inference data.
Prerequisites
- Node.js 18 or higher
- An Allora API key (sign up at alloralabs.com)
Setup
- Clone the repository
git clone https://github.com/your-username/allora-mcp.git
cd allora-mcp
- Install dependencies
npm install
- Set up environment variables
Create a .env file in the project root (or copy from .env.example):
PORT=3001
ALLORA_API_KEY=your_api_key
Development
npm run dev
Building
npm run build
Running
npm start
Docker
Building the Docker Image
docker build -t allora-mcp .
Running with Docker
# Run the container
docker run -p 3001:3001 -e PORT=3001 -e ALLORA_API_KEY=your_api_key allora-mcp
# Or with environment variables in a file
docker run -p 3001:3001 --env-file .env allora-mcp
Docker Compose (optional)
Create a docker-compose.yml file:
version: '3'
services:
  allora-mcp:
    build: .
    ports:
      - "3001:3001"
    environment:
      - PORT=3001
      - ALLORA_API_KEY=your_api_key
Then run:
docker-compose up
API Usage
Once the server is running, you can interact with it using any MCP client. The server exposes the following endpoints:
- GET /sse - SSE connection endpoint for MCP communications
- POST /messages - Message endpoint for MCP communications
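With the server running, any MCP client that supports the SSE transport can connect to these endpoints. The sketch below uses the TypeScript client from the @modelcontextprotocol/sdk package; it is a minimal sketch rather than code from this repository, the URL assumes the default PORT=3001 from the .env example above, and the client name is an arbitrary placeholder.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { SSEClientTransport } from "@modelcontextprotocol/sdk/client/sse.js";

// Connect to the locally running Allora MCP server over SSE
// (assumes PORT=3001 as in the .env example above).
const transport = new SSEClientTransport(new URL("http://localhost:3001/sse"));
const client = new Client(
  { name: "allora-example-client", version: "0.1.0" }, // arbitrary client identity
  { capabilities: {} }
);

await client.connect(transport);

// Ask the server which tools it advertises (see the table below).
const { tools } = await client.listTools();
console.log(tools.map((tool) => tool.name));
```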
Available Tools
| Tool Name | Description | Parameters |
| --- | --- | --- |
| list_all_topics | Fetch a list of all Allora topics | None |
| get_inference_by_topic_id | Fetch inference data for a specific topic | topicID: number |
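Continuing from the client sketch above, the tools can be invoked with callTool. The topic ID passed to get_inference_by_topic_id is purely illustrative; use an ID returned by list_all_topics.

```typescript
// Fetch the list of all Allora topics.
const topics = await client.callTool({ name: "list_all_topics", arguments: {} });
console.log(topics.content);

// Fetch inference data for a single topic (topic ID 1 is a hypothetical example).
const inference = await client.callTool({
  name: "get_inference_by_topic_id",
  arguments: { topicID: 1 },
});
console.log(inference.content);
```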
Example Usage with Claude
When connected to Claude or other MCP-compatible AI systems, you can access Allora data with:
What topics are available in Allora?
Or get specific inference data:
What is the current prediction for BTC price in 8 hours?
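How the server is attached to a given assistant depends on that client's configuration. As one hypothetical example (not part of this repository), Claude Desktop registers MCP servers in its claude_desktop_config.json and launches them over stdio, so an SSE endpoint such as this one is typically reached through a bridge like the mcp-remote package:

```json
{
  "mcpServers": {
    "allora": {
      "command": "npx",
      "args": ["mcp-remote", "http://localhost:3001/sse"]
    }
  }
}
```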
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
- Fork the repository
- Create your feature branch (git checkout -b feature/amazing-feature)
- Commit your changes (git commit -m 'Add some amazing feature')
- Push to the branch (git push origin feature/amazing-feature)
- Open a Pull Request
License
This project is licensed under the MIT License - see the LICENSE file for details.