Dive AI Agent 🤿 🤖

By OpenAgentPlatform

Dive is an open-source MCP Host Desktop Application that seamlessly integrates with any LLMs supporting function calling capabilities. ✨

Overview

What is Dive?

Dive is an open-source AI agent desktop application that integrates any tool-call-capable LLM with MCP servers through a desktop frontend, as part of the Open Agent Platform initiative.

How to use Dive?

To use Dive, download the appropriate version for your operating system (Windows, MacOS, or Linux) and follow the installation instructions. Configure the MCP settings to enable tools like Fetch and Youtube-dl for enhanced functionality.

What are the key features of Dive?

  • 🌐 Universal LLM Support: Compatible with various models including ChatGPT and OpenAI-compatible models.
  • 💻 Cross-Platform: Available for Windows, MacOS, and Linux.
  • 🔄 Model Context Protocol: Enables seamless AI agent integration.
  • 🔌 MCP Server Integration: Allows external data access and processing.
  • 🌍 Multi-Language Support: Supports Traditional Chinese and English.
  • ⚙️ Advanced API Management: Supports multiple API keys and model switching.
  • 💡 Custom Instructions: Allows personalized system prompts.
  • 💬 Intuitive Chat Interface: User-friendly design for real-time context management.

What are the use cases of Dive?

  1. Integrating AI agents for various applications.
  2. Accessing external data through MCP for enhanced AI responses.
  3. Supporting multiple languages for diverse user bases.

FAQ about Dive

  • Can Dive support all AI models?

Dive is compatible with any LLM that supports function calling, including ChatGPT, Anthropic, Ollama, and OpenAI-compatible models.

  • Is Dive free to use?

Yes! Dive is open-source and free for everyone.

  • How do I install Dive on Linux?

Download the .AppImage version and follow the specific setup instructions for your distribution.

Content


Dive Demo

Features 🎯

  • 🌐 Universal LLM Support: Compatible with ChatGPT, Anthropic, Ollama and OpenAI-compatible models
  • 💻 Cross-Platform: Available for Windows, MacOS, and Linux
  • 🔄 Model Context Protocol: Enables seamless MCP AI agent integration in both stdio and SSE modes
  • 🌍 Multi-Language Support: Traditional Chinese, Simplified Chinese, English, Spanish with more coming soon
  • ⚙️ Advanced API Management: Multiple API keys and model switching support
  • 💡 Custom Instructions: Personalized system prompts for tailored AI behavior
  • 🔄 Auto-Update Mechanism: Automatically checks for and installs the latest application updates

Recent Updates (2025/3/14)

  • 🌍 Spanish Translation: Added Spanish language support
  • 🤖 Extended Model Support: Added integration for Google Gemini and Mistral AI models

Download and Install ⬇️

Get the latest version of Dive: Download

For Windows users: 🪟

  • Download the .exe version
  • Python and Node.js environments are pre-installed

For MacOS users: 🍎

  • Download the .dmg version
  • You need to install the Python and Node.js (including npx and uvx) environments yourself
  • Follow the installation prompts to complete setup

For Linux users: 🐧

  • Download the .AppImage version
  • You need to install the Python and Node.js (including npx and uvx) environments yourself
  • For Ubuntu/Debian users:
    • Run chmod +x on the downloaded file to make the AppImage executable
    • You may need to add the --no-sandbox parameter when launching
    • Or modify system settings to allow sandboxing
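The Linux steps above can be sketched as a short shell session. The filename is a placeholder (substitute the release asset you actually downloaded); `touch` stands in for the download so the sketch is self-contained:

```shell
# Placeholder filename: use the AppImage you downloaded from the releases page.
APPIMAGE="Dive.AppImage"
touch "$APPIMAGE"          # stand-in for the downloaded file in this sketch
chmod +x "$APPIMAGE"       # make the AppImage executable
test -x "$APPIMAGE" && echo "ready to run"
# Then launch it; on Ubuntu/Debian the sandbox may fail to start, in which case:
#   ./Dive.AppImage --no-sandbox
```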

MCP Tips

While Dive ships with a default echo MCP server, your LLM can access more powerful tools through MCP. Here's how to get started with two beginner-friendly tools: Fetch and yt-dlp.

Set MCP

Quick Setup

Add this JSON configuration to your Dive MCP settings to enable the Fetch, Filesystem, and yt-dlp tools:

 "mcpServers":{
    "fetch": {
      "command": "uvx",
      "args": [
        "mcp-server-fetch",
        "--ignore-robots-txt"
      ],
      "enabled": true
    },
    "filesystem": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-filesystem",
        "/path/to/allowed/files"
      ],
      "enabled": true
    },
    "youtubedl": {
      "command": "npx",
      "args": [
        "@kevinwatt/yt-dlp-mcp"
      ],
      "enabled": true
    }
  }

Using SSE Server for MCP

You can also connect to an external MCP server via SSE (Server-Sent Events). Add this configuration to your Dive MCP settings:

{
  "mcpServers": {
    "MCP_SERVER_NAME": {
      "enabled": true,
      "transport": "sse",
      "url": "YOUR_SSE_SERVER_URL"
    }
  }
}
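Stdio and SSE servers can sit side by side in the same `mcpServers` map. A small sketch (server names and URL are hypothetical) of how an entry's transport can be told apart: SSE entries carry `transport`/`url`, while stdio entries carry `command`:

```python
import json

# Hypothetical mixed config: one stdio server, one SSE server.
config = json.loads("""
{
  "mcpServers": {
    "fetch":  {"command": "uvx", "args": ["mcp-server-fetch"], "enabled": true},
    "remote": {"enabled": true, "transport": "sse", "url": "http://localhost:8000/sse"}
  }
}
""")

def transport_of(server: dict) -> str:
    # Entries without an explicit "transport" key default to stdio.
    return server.get("transport", "stdio")

kinds = {name: transport_of(s) for name, s in config["mcpServers"].items()}
print(kinds)  # {'fetch': 'stdio', 'remote': 'sse'}
```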

Additional Setup for yt-dlp-mcp

yt-dlp-mcp requires the yt-dlp package. Install it based on your operating system:

Windows

winget install yt-dlp

MacOS

brew install yt-dlp

Linux

pip install yt-dlp
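Whichever installer you used, a quick check confirms the binary landed on your PATH (the output depends on your system):

```shell
# Print the yt-dlp version if it is reachable, otherwise report it missing.
if command -v yt-dlp >/dev/null 2>&1; then
  yt-dlp --version
else
  echo "yt-dlp not found on PATH"
fi
```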

Build 🛠️

See BUILD.md for more details.

Connect With Us 🌐
