oterm

By ggozad

a text-based terminal client for Ollama

Overview

What is oterm?

Oterm is a text-based terminal client for the Ollama platform. It lets users chat with and manage the models served by a local or remote Ollama instance directly from their terminal.

How to use oterm?

To use oterm, make sure the Ollama server is running, then type oterm in your terminal. If the server is not on the default host and port, you can point oterm at it with environment variables, as in the example below.
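
For example, if Ollama is listening on a different host or port, a sketch like the following should work. OLLAMA_HOST is the variable that Ollama's own tooling conventionally reads; check the oterm documentation for the exact variables your version honors (it may also accept a full URL).

# Connect oterm to an Ollama server on another machine (address is hypothetical)
OLLAMA_HOST=192.168.1.50:11434 oterm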

Key features of oterm?

  • Intuitive terminal UI with no need for additional servers or frontends.
  • Supports multiple persistent chat sessions with customizable system prompts and parameters.
  • Integration with various tools for enhanced functionality, such as fetching URLs and accessing current weather.
  • Ability to customize models and their parameters easily.

Use cases of oterm?

  1. Running and managing machine learning models from the terminal.
  2. Creating and editing chat sessions for interactive model inference.
  3. Utilizing external tools to enhance model capabilities and access real-time data.

FAQ about oterm

  • Is oterm free to use?

Yes! Oterm is open-source and free to use.

  • What do I need to run oterm?

You need to have the Ollama server installed and running.

  • Can I customize the models used in oterm?

Yes! You can select and customize models, including their system prompts and parameters.

Content

oterm

the text-based terminal client for Ollama.

Features

  • intuitive and simple terminal UI; no need to run servers or frontends, just type oterm in your terminal.
  • supports Linux, macOS, and Windows, and most terminal emulators.
  • multiple persistent chat sessions, stored together with system prompt & parameter customizations in SQLite.
  • support for Model Context Protocol (MCP) tools & prompts integration (see the configuration sketch after this list).
  • can use any of the models you have pulled in Ollama, or your own custom models.
  • allows for easy customization of the model's system prompt and parameters.
  • supports tools integration for providing external information to the model.
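
To illustrate the MCP integration mentioned above, the sketch below registers the reference git MCP server in oterm's config.json. Both the config path and the "mcpServers" schema (the widely used Claude-Desktop-style format) are assumptions to verify against the MCP section of the oterm docs, and in practice you would merge the entry into your existing config rather than overwrite the file.

# Hypothetical sketch: register the reference git MCP server with oterm.
# The Linux data-directory path below is an assumption, and this overwrites
# config.json; in practice, edit the file and add the "mcpServers" entry.
cat > ~/.local/share/oterm/config.json <<'EOF'
{
  "mcpServers": {
    "git": {
      "command": "uvx",
      "args": ["mcp-server-git"]
    }
  }
}
EOF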

Quick install

uvx oterm

See Installation for more details.
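
If you would rather install oterm persistently than run it ad hoc with uvx, a sketch along these lines should work for a package published on PyPI as oterm (these commands are assumptions; the Installation page lists the officially supported methods):

# Install oterm as a standalone tool (pick one)
uv tool install oterm
pipx install oterm
# then launch it
oterm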

Documentation

oterm Documentation

What's new

  • MCP Sampling is here!
  • In-app log viewer for debugging and troubleshooting.
  • Support for sixel graphics to display images in the terminal.
  • Support for Model Context Protocol (MCP) tools & prompts!
  • Create custom commands that can be run from the terminal using oterm. Each of these commands is a chat, customized to your liking and connected to the tools of your choice.

Screenshots

Splash: The splash screen animation that greets users when they start oterm.

Chat: A view of the chat interface, showcasing the conversation between the user and the model.

Model selection: The model selection screen, allowing users to choose and customize available models.

Tool support: oterm using the git MCP server to access its own repo.

Image selection: The image selection interface, demonstrating how users can include images in their conversations.

Theme: oterm supports multiple themes, allowing users to customize the appearance of the interface.

License

This project is licensed under the MIT License.
