Waldzell MCP Servers

By waldzellai

Waldzell AI's monorepo of MCP servers. Use in Claude Desktop, Cline, Roo Code, and more!

Overview

What is Waldzell MCP?

Waldzell MCP is a collection of Model Context Protocol (MCP) servers designed to enhance AI assistants with advanced problem-solving capabilities and decision-making algorithms.

How to use Waldzell MCP?

To use Waldzell MCP, developers can integrate the MCP servers into their AI applications by installing the necessary packages and utilizing the provided APIs for enhanced reasoning and decision-making.
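For example, an MCP client such as Claude Desktop, Cline, or Roo Code typically starts a stdio-based server as a subprocess and talks to it over the Model Context Protocol. A minimal sketch of such a launch command, assuming the servers are published under the @waldzellai npm scope (the exact package name is an assumption, so check the individual package's README):

# command an MCP client would run to start one of the servers over stdio
# (@waldzellai/clear-thought is an assumed package name, not confirmed by this page)
npx -y @waldzellai/clear-thought

Registering that command in the client's MCP server configuration makes the server's tools available to the assistant.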

Key features of Waldzell MCP?

  • Advanced problem-solving through sequential thinking and dynamic thought evolution.
  • Stochastic algorithms for improved decision-making, including Markov Decision Processes and Monte Carlo Tree Search.
  • Modular architecture allowing for easy addition of new capabilities.

Use cases of Waldzell MCP?

  1. Enhancing AI assistants to solve complex mathematical problems.
  2. Implementing decision-making strategies in AI applications.
  3. Providing structured reasoning capabilities for various AI tasks.

FAQ about Waldzell MCP

  • What is the purpose of the MCP servers?

The MCP servers are designed to enhance AI models by providing specialized reasoning tools and algorithms.

  • How can I contribute to the project?

You can contribute by creating new packages or improving existing ones within the monorepo structure.

  • Is there documentation available?

Yes, comprehensive documentation is available on the project's GitHub page.

Content

Waldzell MCP Servers

This is a Turborepo-powered monorepo containing MCP (Model Context Protocol) servers for various AI assistant integrations.

What's inside?

Packages

Each MCP server lives in its own package under the packages/ directory; the servers referenced by the deploy scripts below include clear-thought, stochastic, typestyle, and yelp-fusion.

Utilities

This monorepo uses Turborepo with Yarn 4 Workspaces.

  • Turborepo — High-performance build system for monorepos
  • Yarn 4 — Modern package management with PnP support
  • Changesets — Managing versioning and changelogs
  • GitHub Actions — Automated workflows
  • Smithery — Deployment platform for MCP servers
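Put together, the workspace layout looks roughly like this. The package directory names below are inferred from the Smithery deploy scripts later in this README, so treat them as illustrative rather than exact:

.
├── package.json        # root workspace manifest (Yarn 4 workspaces)
├── turbo.json          # Turborepo pipeline configuration
├── .changeset/         # Changesets config and pending changesets
└── packages/
    ├── clear-thought/
    ├── stochastic/
    ├── typestyle/
    └── yelp-fusion/    # each MCP server is its own workspace package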

Getting Started

Prerequisites

  • Node.js 18 or higher
  • Corepack enabled (corepack enable)

Installation

Clone the repository and install dependencies:

git clone https://github.com/waldzellai/mcp-servers.git
cd mcp-servers
yarn install

Development

To develop all packages:

yarn dev

Building

To build all packages:

yarn build

The build output will be in each package's dist/ directory.

Testing

yarn test

Linting

yarn lint

Deploying to Smithery

This repo is set up to easily deploy packages to Smithery:

# Deploy all packages
yarn deploy

# Deploy specific packages
yarn smithery:yelp-fusion
yarn smithery:typestyle
yarn smithery:stochastic
yarn smithery:clear-thought

Workflow

Adding a new feature

  1. Create a new branch
  2. Make your changes
  3. Add a changeset (documents what's changed for version bumping):
    yarn changeset
    
  4. Push your changes (see the worked example below)
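Putting these steps together, a typical end-to-end flow looks like this (the branch name and commit message are illustrative):

git checkout -b feat/my-change
# ...edit code...
yarn changeset                       # interactive: pick the affected packages, choose a bump type, write a summary
git add .
git commit -m "feat: describe my change"
git push -u origin feat/my-change    # then open a pull request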

Releasing new versions

We use Changesets to manage versions. Open a PR that includes your changes and a changeset; Changesets will then open a release PR, and merging that release PR publishes the new versions.

For manual releases:

yarn publish-packages
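Before publishing manually, any pending changesets are usually applied first. A minimal sketch of the full manual flow, assuming publish-packages is the root script that builds and publishes the workspace packages:

yarn changeset version    # consume pending changesets: bump versions and update changelogs
yarn install              # refresh the lockfile after the version bumps
yarn publish-packages     # build and publish the bumped packages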

Adding a New Package

  1. Create a new directory in the packages directory
  2. Initialize the package with yarn init
  3. Add your source code
  4. Update turbo.json pipeline if needed
  5. Add a smithery.yaml file if you want to deploy to Smithery
  6. Run yarn install at the root to update workspaces (see the command sketch below)
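Expressed as commands, those steps look roughly like this (my-new-server is a placeholder name):

mkdir -p packages/my-new-server
cd packages/my-new-server
yarn init                  # scaffold the new package's package.json
# add your source code, a smithery.yaml if it should deploy to Smithery,
# and extend the turbo.json pipeline if the package needs extra tasks
cd ../..
yarn install               # re-resolve workspaces so the new package is linked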

Turborepo

Remote Caching

Turborepo can use a remote cache to share build artifacts across machines. To enable Remote Caching:

yarn dlx turbo login
yarn dlx turbo link

MCP Server Documentation

Each MCP server package in this monorepo has its own README with detailed documentation; see the README in each package's directory under packages/.

License

All packages in this monorepo are licensed under the MIT License - see each package's LICENSE file for details.

Contributing

Contributions are welcome! Please feel free to submit a pull request.
