🤖 mcp-ollama-beeai

By tamdilip

A minimal agentic app to interact with OLLAMA models leveraging multiple MCP server tools using BeeAI framework.

Overview

What is mcp-ollama-beeai?

mcp-ollama-beeai is a minimal client application designed to interact with local OLLAMA models by leveraging multiple MCP server tools through the BeeAI framework.

How to use mcp-ollama-beeai?

To use mcp-ollama-beeai, you need to set up a local OLLAMA server and configure your MCP agents in the mcp-servers.json file. After that, clone the repository, install the dependencies, and start the application to access it via your browser.

Key features of mcp-ollama-beeai?

  • Interaction with local OLLAMA models.
  • Configuration of multiple MCP agents for enhanced functionality.
  • User-friendly interface for selecting agents and tools.
  • Integration with the BeeAI framework for easy setup of ReAct agents.

Use cases of mcp-ollama-beeai?

  1. Building AI-driven applications that require model interactions.
  2. Utilizing various MCP agents for different tasks like database operations and data fetching.
  3. Experimenting with local AI models in a controlled environment.

FAQ about mcp-ollama-beeai

  • What are the prerequisites for using mcp-ollama-beeai?

You need to have a local OLLAMA server running and sufficient memory (at least 16GB RAM) for the models to perform effectively.

  • Can I use remote servers instead of local?

Yes, you can configure the application to use remote servers for model interactions.

  • How do I configure MCP agents?

You can add your MCP agents in the mcp-servers.json file located in the root folder of the application.

Content

🤖 mcp-ollama-beeai

A minimal client app to interact with local OLLAMA models leveraging multiple MCP agent tools using BeeAI framework.

Below is a sample visual of this client app's chat interface, showing a Postgres database operation along with the thinking steps the AI took to pick the right MCP agent and transform the request and response with the LLM: demo-pic

Usage

📋 Pre-requisite

1. Local ollama server

Install and serve Ollama on your local machine with the following commands.

  • Make sure your machine has enough memory available, at least 16 GB of RAM, for the models to perform well.
  • Skip this local installation if you're going to use a remote server for the model.
        $ curl -fsSL https://ollama.com/install.sh | sh
        $ ollama serve
        $ ollama pull llama3.1

2. MCP servers list configuration

Add your MCP agents to the mcp-servers.json file in the root folder so the app can pick them up and use them alongside the LLM.
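The README doesn't show the file's schema. As a sketch, assuming the app follows the common MCP client convention of an mcpServers map of command-plus-args entries, a hypothetical entry for a Postgres server (as in the demo above) might look like:

```json
{
  "mcpServers": {
    "postgres": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-postgres",
        "postgresql://localhost:5432/mydb"
      ]
    }
  }
}
```

The server name, command, and connection string here are illustrative; check the mcp-servers.json bundled with the repository for the exact format the app expects.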

3. .env

If you want to use a different LLM model or server, override the properties below before running npm start:

        OLLAMA_CHAT_MODEL=llama3.1
        OLLAMA_BASE_URL=http://localhost:11434/api
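To illustrate what these overrides do, here is a minimal sketch (not the app's actual code; resolveLlmConfig is a hypothetical helper) of resolving the settings with the documented defaults:

```javascript
// Hypothetical helper: resolve LLM settings from the environment,
// falling back to the defaults documented above when a variable is unset.
function resolveLlmConfig(env = process.env) {
  return {
    model: env.OLLAMA_CHAT_MODEL ?? "llama3.1",
    baseUrl: env.OLLAMA_BASE_URL ?? "http://localhost:11434/api",
  };
}

// No overrides set, so the defaults apply.
console.log(resolveLlmConfig({}));
// Overriding the chat model only.
console.log(resolveLlmConfig({ OLLAMA_CHAT_MODEL: "mistral" }));
```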

🎮 Boot up your app

        $ git clone https://github.com/tamdilip/mcp-ollama-beeai.git
        $ cd mcp-ollama-beeai
        $ npm i
        $ npm start

Once the app is up and running, open http://localhost:3000 in your browser.

Additional Context:

  • By default, on landing, no MCP agent is consulted for questions.
  • The MCP agent to use for a question can be selected from the Server & tools dropdown in the UI.
  • The BeeAI framework is used for easy setup of a ReAct (Reason and Act) agent with MCP tools.
  • The Markdown JS library is used to render responses in a readable visual format.
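The ReAct loop that BeeAI sets up can be sketched in rough pseudocode (an illustration of the pattern, not the framework's actual API):

```
loop:
    thought     ← LLM reasons over the question and the tools exposed by the selected MCP server
    action      ← LLM picks a tool and its arguments, or decides it can answer directly
    if action is a final answer:
        return it to the chat UI
    observation ← invoke the MCP tool and capture its result
    feed (thought, action, observation) back to the LLM for the next iteration
```

These thought/action/observation steps are what the chat interface surfaces as the "thinking steps" in the demo above.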

Happy coding :) !!
