Deep-Co

By succlz123

A Chat Client for LLMs, written in Compose Multiplatform.
Overview

What is Deep-Co?

Deep-Co is a chat client designed for interacting with various Large Language Models (LLMs) using Compose Multiplatform. It supports multiple API providers, allowing users to chat with advanced AI models.

How to use Deep-Co?

To use Deep-Co, install the necessary dependencies for your operating system, configure your LLM API keys, and run the application to start chatting with LLMs.

What are the key features of Deep-Co?

  • Chat with multiple LLMs including DeepSeek, OpenAI, and others.
  • Maintain chat history for reference.
  • Support for Model Context Protocol (MCP).
  • Dark/Light theme options for user preference.

What are the use cases of Deep-Co?

  1. Engaging in conversations with AI for information retrieval.
  2. Utilizing LLMs for creative writing assistance.
  3. Experimenting with different AI models for research purposes.

FAQ about Deep-Co

  • What LLMs does Deep-Co support?

Deep-Co supports DeepSeek, OpenAI, Anthropic, Grok, Google Gemini, and other providers, and can connect to any OpenAI-compatible API through configuration.

  • Is Deep-Co free to use?

Yes! Deep-Co is open-source and free to use.

  • How do I install Deep-Co?

Follow the installation instructions provided in the documentation for your specific operating system.

Content

Deep-Co


A Chat Client for LLMs, written in Compose Multiplatform. It supports API providers such as OpenRouter, Anthropic, Grok, OpenAI, DeepSeek, Coze, Dify, and Google Gemini. You can also configure any OpenAI-compatible API, or use local models via LM Studio/Ollama.
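As a sketch of what "any OpenAI-compatible API" means in practice: a client only needs a base URL, an API key, and a model name to build a standard /v1/chat/completions request. The snippet below uses Python's standard library to illustrate the convention; the local Ollama port and the model name are assumptions for the example, not Deep-Co's actual code:

```python
import json
import urllib.request

def build_chat_request(base_url: str, api_key: str, model: str,
                       messages: list) -> urllib.request.Request:
    """Build a POST request for an OpenAI-compatible /chat/completions endpoint."""
    payload = json.dumps({"model": model, "messages": messages}).encode("utf-8")
    return urllib.request.Request(
        url=f"{base_url.rstrip('/')}/v1/chat/completions",
        data=payload,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )

# Example: point at a local Ollama server (Ollama's default port is 11434;
# LM Studio typically listens on http://localhost:1234 instead).
req = build_chat_request(
    "http://localhost:11434",
    "not-needed-for-local",      # local servers usually ignore the key
    "deepseek-r1",               # model name is an assumption; use whatever you pulled
    [{"role": "user", "content": "Hello!"}],
)
# urllib.request.urlopen(req) would send it; omitted here since it needs a running server.
```

Because every provider in the list above exposes this same request shape (or a close variant), one client implementation can talk to all of them by swapping the base URL and key.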

Release

v1.0.6

Features

  • Desktop Platform Support (Windows/macOS/Linux)
  • Mobile Platform Support (Android/iOS)
  • Chat (Stream & Complete) / Chat History
  • Chat Message Export / Chat Translation Server
  • Prompt Management / User-Defined Prompts
  • SillyTavern Character Adaptation (PNG & JSON)
  • DeepSeek LLM / Grok LLM / Google Gemini LLM
  • Claude LLM / OpenAI LLM / Ollama LLM
  • Online API Polling
  • MCP Support
  • MCP Server Market
  • RAG
  • TTS (Edge API)
  • i18n (Chinese/English) / App Color Theme / App Dark & Light Theme
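"Chat (Stream & Complete)" refers to the two OpenAI-style response modes: a complete response returns the whole message at once, while a streaming response arrives as server-sent events whose chunks each carry an incremental delta. A minimal sketch of parsing one such SSE line (the event shape follows the common OpenAI streaming convention; Deep-Co's internal handling may differ):

```python
import json

def parse_sse_chunk(line: str):
    """Extract the delta text from one SSE line of an OpenAI-style streaming response.

    Returns None for non-data lines (comments, keep-alives) and for the
    terminal [DONE] marker."""
    if not line.startswith("data: "):
        return None
    data = line[len("data: "):].strip()
    if data == "[DONE]":
        return None
    event = json.loads(data)
    # Each chunk carries an incremental "delta", not the full message so far.
    return event["choices"][0]["delta"].get("content", "")

sample = 'data: {"choices": [{"delta": {"content": "Hel"}}]}'
print(parse_sse_chunk(sample))  # → Hel
```

A streaming client appends each non-None result to the message as it arrives, which is what makes the reply appear token by token in the chat window.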

Screenshots (from the original README): Chat With LLMs · Config Your LLM API Key · Prompt Management · Chat With Tavern Character · User Management · Config MCP Servers · Settings

Model Context Protocol (MCP) ENV

macOS

brew install uv
brew install node

Windows

winget install --id=astral-sh.uv -e
winget install OpenJS.NodeJS.LTS
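uv and Node.js are prerequisites because MCP servers are commonly launched through `uvx` (Python-based servers) or `npx` (Node-based servers). A typical MCP client configuration using two publicly available reference servers might look like the following; whether Deep-Co uses exactly this JSON shape is an assumption based on the common MCP client convention, and the filesystem path is a placeholder:

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/allowed/dir"]
    },
    "fetch": {
      "command": "uvx",
      "args": ["mcp-server-fetch"]
    }
  }
}
```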

Build

Run desktop via Gradle

./gradlew :desktopApp:run

Building desktop distribution

./gradlew :desktopApp:packageDistributionForCurrentOS
# outputs are written to desktopApp/build/compose/binaries

Run Android via Gradle

./gradlew :androidApp:installDebug

Building Android distribution

./gradlew clean :androidApp:assembleRelease
# outputs are written to androidApp/build/outputs/apk/release

Thanks
