
WolframAlpha LLM MCP Server
An MCP server for WolframAlpha's LLM API, able to return structured knowledge and solve math problems.
What is WolframAlpha LLM MCP Server?
WolframAlpha LLM MCP Server is a Model Context Protocol (MCP) server that provides access to WolframAlpha's LLM API, enabling users to retrieve structured knowledge and solve complex mathematical problems.
How to use WolframAlpha LLM MCP Server?
To use the server, clone the repository, install the necessary packages, and configure your WolframAlpha API key in your MCP settings file. You can then query the server with natural-language questions.
Key features of WolframAlpha LLM MCP Server?
- Query WolframAlpha's LLM API with natural language questions
- Answer complicated mathematical questions
- Retrieve facts about various subjects including science, physics, history, and geography
- Get structured responses optimized for LLM consumption
- Support for both simplified and detailed answers
Use cases of WolframAlpha LLM MCP Server?
- Solving advanced mathematical equations
- Retrieving factual information for research purposes
- Assisting in educational settings for students needing help with math and science
FAQ about WolframAlpha LLM MCP Server
- How do I get my WolframAlpha API key?
  You can obtain your API key from the WolframAlpha developer portal at developer.wolframalpha.com.
- Is there a limit to the number of queries I can make?
  Yes, the number of queries may be limited based on your API plan with WolframAlpha.
- Can I use this server for commercial purposes?
  Please refer to the WolframAlpha API terms of service for details on commercial usage.
WolframAlpha LLM MCP Server

A Model Context Protocol (MCP) server that provides access to WolframAlpha's LLM API. https://products.wolframalpha.com/llm-api/documentation


Features
- Query WolframAlpha's LLM API with natural language questions
- Answer complicated mathematical questions
- Query facts about science, physics, history, geography, and more
- Get structured responses optimized for LLM consumption
- Support for simplified answers and detailed responses with sections
Available Tools
- ask_llm: Ask WolframAlpha a question and get a structured, LLM-friendly response
- get_simple_answer: Get a simplified answer
- validate_key: Validate the WolframAlpha API key
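This README does not spell out the tools' input schemas, so the snippet below is only a rough, hypothetical sketch of how an MCP client could launch the built server over stdio and call ask_llm using the official TypeScript SDK (@modelcontextprotocol/sdk); the query argument name and the build path are assumptions to verify against the server's source.

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Spawn the built server as a child process and speak MCP over stdio.
  const transport = new StdioClientTransport({
    command: "node",
    args: ["/path/to/wolframalpha-llm-mcp/build/index.js"], // assumed build output path
    env: {
      ...(process.env as Record<string, string>),
      WOLFRAM_LLM_APP_ID: "your-api-key-here", // same env var as in the Configuration section below
    },
  });

  const client = new Client({ name: "example-client", version: "1.0.0" });
  await client.connect(transport);

  // List the tools first to see the real input schemas.
  const tools = await client.listTools();
  console.log(tools.tools.map((t) => t.name)); // expect ask_llm, get_simple_answer, validate_key

  // "query" is an assumed argument name, not confirmed by this README.
  const result = await client.callTool({
    name: "ask_llm",
    arguments: { query: "What is the integral of x^2 * sin(x)?" },
  });
  console.log(JSON.stringify(result, null, 2));

  await client.close();
}

main().catch(console.error);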
Installation
git clone https://github.com/Garoth/wolframalpha-llm-mcp.git
cd wolframalpha-llm-mcp
npm install
Configuration
1. Get your WolframAlpha API key from developer.wolframalpha.com
2. Add it to your Cline MCP settings file inside VSCode's settings (e.g. ~/.config/Code/User/globalStorage/saoudrizwan.claude-dev/settings/cline_mcp_settings.json):
{
  "mcpServers": {
    "wolframalpha": {
      "command": "node",
      "args": ["/path/to/wolframalpha-mcp-server/build/index.js"],
      "env": {
        "WOLFRAM_LLM_APP_ID": "your-api-key-here"
      },
      "disabled": false,
      "autoApprove": [
        "ask_llm",
        "get_simple_answer",
        "validate_key"
      ]
    }
  }
}
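The autoApprove list lets Cline run those tools without prompting for confirmation each time. If you want to sanity-check the key outside of Cline before wiring up the server, a standalone check along the following lines is possible; the endpoint and parameter names are assumptions based on the LLM API documentation linked above, so verify them there.

// Hypothetical standalone key check (not part of this repository).
// Endpoint and query parameters are assumed from the WolframAlpha LLM API docs.
async function checkKey(): Promise<void> {
  const appId = process.env.WOLFRAM_LLM_APP_ID;
  if (!appId) throw new Error("WOLFRAM_LLM_APP_ID is not set");

  const url = new URL("https://www.wolframalpha.com/api/v1/llm-api");
  url.searchParams.set("appid", appId);
  url.searchParams.set("input", "2+2");

  const res = await fetch(url); // fetch is built into Node 18+
  console.log(res.ok ? await res.text() : `Key check failed: HTTP ${res.status}`);
}

checkKey().catch(console.error);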
Development
Setting Up Tests
The tests use real API calls to ensure accurate responses. To run the tests:
1. Copy the example environment file:
   cp .env.example .env
2. Edit .env and add your WolframAlpha API key:
   WOLFRAM_LLM_APP_ID=your-api-key-here
   Note: The .env file is gitignored to prevent committing sensitive information.
3. Run the tests:
   npm test
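The repository's own test files are not reproduced here; as a purely illustrative sketch of the setup such live-API tests rely on (Jest-style globals and dotenv are assumptions about the tooling), a minimal check might look like:

// Illustrative sketch only, not the repository's actual tests.
// Assumes Jest-style globals and dotenv for loading .env.
import "dotenv/config";

describe("WolframAlpha API key", () => {
  it("is loaded from .env", () => {
    expect(process.env.WOLFRAM_LLM_APP_ID).toBeDefined();
    expect(process.env.WOLFRAM_LLM_APP_ID).not.toEqual("your-api-key-here");
  });
});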
Building
npm run build
License
MIT