Tiny Chat

By to-aoki

Tiny Chat is an LLM application providing chat functionality with RAG (Retrieval-Augmented Generation), database integration, and MCP server capabilities. The UI is designed for Japanese users.

Overview

What is Tiny Chat?

Tiny Chat is an LLM application that provides chat functionality, utilizing RAG (Retrieval-Augmented Generation) and a database, designed specifically for Japanese users.

How do I use Tiny Chat?

To use Tiny Chat, either run it from source or install the package. For development, run streamlit run tiny_chat/app.py --server.address=127.0.0.1. With the package installed, simply run tiny-chat.

What are the key features of Tiny Chat?

  • Chat functionality powered by an LLM with RAG.
  • Database integration for storing and retrieving the knowledge used to ground responses.
  • A user interface tailored for Japanese users.
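The RAG flow behind the first two features can be sketched as a toy, dependency-free loop: retrieve the stored documents most similar to the user's question, then prepend them as context to the prompt sent to the LLM. The bag-of-words scoring and the helper names (`embed`, `retrieve`, `build_prompt`) are illustrative assumptions for this sketch, not Tiny Chat's actual implementation, which would use a real vector database and embedding model.

```python
import math
from collections import Counter

def embed(text):
    # Toy "embedding": a sparse bag-of-words count vector.
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse count vectors.
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, documents, top_k=2):
    # Rank stored documents by similarity to the query and keep the best few.
    q = embed(query)
    ranked = sorted(documents, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:top_k]

def build_prompt(query, documents):
    # Prepend retrieved context to the question -- the "augmented" part of RAG.
    context = "\n".join(retrieve(query, documents))
    return f"Context:\n{context}\n\nQuestion: {query}"

docs = [
    "Tiny Chat is a Streamlit LLM chat application.",
    "Run the development server with streamlit run tiny_chat/app.py.",
    "Bananas are rich in potassium.",
]
print(build_prompt("How do I run Tiny Chat?", docs))
```

In a real deployment the `documents` list would live in a database and `embed` would call an embedding model; the prompt built here would then be passed to the LLM.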

What are the use cases of Tiny Chat?

  1. Engaging in real-time conversations using AI.
  2. Providing customer support through automated chat.
  3. Enhancing user interaction in applications with chat features.

FAQ about Tiny Chat

  • Is Tiny Chat free to use?

Yes! Tiny Chat is open-source and free to use under the MIT license.

  • What programming language is Tiny Chat built with?

Tiny Chat is built using Python.

  • How can I install Tiny Chat?

You can install Tiny Chat by following the installation instructions provided in the documentation.

Content

Tiny Chat

Installation

Tested with Python 3.10 or later
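Before installing, you can confirm your interpreter meets the 3.10 requirement with a one-liner (this assumes `python` is on your PATH; use `python3` on systems that separate the two):

```shell
# Fails with an AssertionError if the interpreter is older than 3.10.
python -c 'import sys; assert sys.version_info >= (3, 10), sys.version'
```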

Development Installation

pip install -r requirements.txt

Package Installation

# Build the package
pip install build
python -m build

# Install the built package
pip install dist/*.whl

Usage

Running from source (development)

streamlit run tiny_chat/app.py --server.address=127.0.0.1

Running installed package

tiny-chat

