Ollama MCP Client

By georgi-terziyski on GitHub

Overview

What is Ollama MCP Client?

Ollama MCP Client is a Model Context Protocol (MCP) client designed for interacting with various language models served through Ollama.

How do I use Ollama MCP Client?

To use the Ollama MCP Client, clone the repository, install the package with pip, and then run natural language queries against language models and databases from the command line (see the sketch below).
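
A minimal end-to-end sketch of that flow, based on the installation and usage sections later on this page (the checkout directory name is hypothetical):

# Clone and install the client
git clone <repository-url>
cd ollama-mcp-client   # hypothetical directory name
pip install -e .

# Ask a question against a running Database MCP Server
mcp-ollama --model llama3 --server http://localhost:8000 "List all tables in my database"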

What are the key features of Ollama MCP Client?

  • Support for multiple models: Qwen, Llama 3, Mistral, and Gemini.
  • Real-time streaming of language model responses.
  • Seamless integration with the Database MCP Server for database operations.
  • Natural language interface for querying databases.

What are the use cases of Ollama MCP Client?

  1. Querying database tables and data using natural language.
  2. Creating and managing database schemas through simple commands.
  3. Interacting with various language models for different applications.

FAQ about Ollama MCP Client

  • What models does the Ollama MCP Client support?

It supports Qwen, Llama 3, Mistral, and Gemini models.

  • How do I install the Ollama MCP Client?

You can install it by cloning the repository and running pip install -e . in the project root.

  • Can I use natural language to query my database?

Yes! The client allows you to interact with your database using natural language queries.

Content

Ollama MCP Client

A Model Context Protocol (MCP) client for language models served through Ollama.

Features

  • Multiple Model Support: Works with Qwen, Llama 3, Mistral, and Gemini models served via Ollama
  • Streaming Responses: Real-time streaming of LLM responses
  • Database Operations: Seamless integration with the Database MCP Server
  • Natural Language Interface: Interact with databases using natural language

Installation

# Clone the repository
git clone <repository-url>

# Install the package
pip install -e .
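
If the install succeeds, an mcp-ollama entry point should be available on your PATH. A quick sanity check, assuming the package registers a standard console script with a --help option:

# Verify the CLI is installed and list its options
mcp-ollama --help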

Quick Start

# Basic usage
mcp-ollama --model llama3 --server http://localhost:8000 "List all tables in my database"

# Specify a different Ollama server
mcp-ollama --model qwen --ollama-api http://localhost:11434 --server http://localhost:8000 "Show me the schema for the users table"
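
The model named with --model must already be present in your local Ollama instance. If it is not, pull it first using Ollama's own CLI (these are standard Ollama commands, independent of this client):

# Download the model into Ollama and make sure the server is running
ollama pull llama3
ollama serve   # listens on http://localhost:11434 by default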

Configuration

The client can be configured using command line arguments or a configuration file:

Command Line Arguments

  • --model: The Ollama model to use (e.g., llama3, qwen, mistral, gemini)
  • --server: URL of the Database MCP server
  • --ollama-api: URL of the Ollama API server (default: http://localhost:11434)
  • --stream: Enable/disable streaming responses (default: true)
  • --config: Path to configuration file
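
For example, several of these flags can be combined in one invocation. This is a sketch only; in particular, whether --stream takes an explicit true/false value or acts as a bare switch is an assumption:

# Query a non-default Ollama host with streaming disabled (value syntax assumed)
mcp-ollama --model mistral --ollama-api http://localhost:11434 --server http://localhost:8000 --stream false "Show me the schema for the users table"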

Configuration File

Create a config.json file in your home directory:

{
  "model": "llama3",
  "server": "http://localhost:8000",
  "ollama_api": "http://localhost:11434",
  "stream": true
}
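
Whether the client picks up ~/config.json automatically is not stated here, so passing the file explicitly with the documented --config flag is the safe option:

# Point the client at the configuration file; remaining settings come from the file
mcp-ollama --config ~/config.json "What tables do I have in my database?"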

Example Usage

Natural Language Database Queries

# List tables
mcp-ollama "What tables do I have in my database?"

# Query data
mcp-ollama "Show me all users who registered in the last month"

# Create a table
mcp-ollama "Create a new table called products with columns for id, name, price, and description"

License

MIT
