Console Chat GPT

By amidabuddha on GitHub

Python CLI for AI Chat APIs

Overview

What is Console Chat GPT?

Console Chat GPT is a Python command-line interface (CLI) tool that allows users to interact with various AI models, including those from OpenAI, MistralAI, Anthropic, and more, directly from their terminal.

How to use Console Chat GPT?

To use Console Chat GPT, clone the repository, install the necessary dependencies, obtain your API keys from the respective AI providers, and run the executable to start chatting with the AI models.

Key features of Console Chat GPT?

  • Supports multiple AI models for diverse interactions.
  • Customizable settings through a configuration file.
  • Role selection and temperature control for personalized conversations.
  • Conversation history and error handling for improved user experience.
  • Streaming capabilities and command handling for intuitive interaction.

Use cases of Console Chat GPT?

  1. Engaging in AI-driven conversations for information retrieval.
  2. Testing and experimenting with different AI models in a CLI environment.
  3. Automating tasks and generating responses based on user input.

FAQ about Console Chat GPT

  • Is Console Chat GPT free to use?

The tool is open-source and free to use, but you may incur costs based on the API usage of the AI models.

  • What platforms does Console Chat GPT support?

It works on Linux and macOS; on Windows, WSL is recommended.

  • Can I customize the AI model used?

Yes! You can select different models and configure settings to suit your preferences.

Content

console-chat-gpt v6

Your Ultimate CLI Companion for Chatting with AI Models

Enjoy seamless interactions with OpenAI, MistralAI, Anthropic, xAI, Google AI, DeepSeek, Alibaba, Inception or Ollama hosted models directly from your command line.
Elevate your chat experience with efficiency and ease.

Homepage | Examples

Released under the Apache license. Works with Python 3.10+.




DISCLAIMER: The intention and implementation of this code are entirely unconnected to and unaffiliated with OpenAI, MistralAI, Anthropic, xAI, Google AI, DeepSeek, Alibaba, Inception, or any of their subsidiaries or related parties, in any form.


Features

  • :star: Run Ollama-hosted models locally. Ollama must be installed and the selected models already downloaded :star:
  • :star: Anthropic prompt caching fully supported :star:
  • :star: Model Context Protocol (MCP) supported! If you already use MCP servers, just copy your claude_desktop_config.json to the root directory and rename it to mcp_config.json to start using it with any model! :star:
  • Unified chat completion function, separated out as an independent library that any application can use for a seamless cross-provider API experience. The source code is available in Python and TypeScript.
  • Streaming with all supported models; disabled by default, may be enabled in the settings menu.
  • OpenAI Assistants Beta fully supported.
  • AI Managed mode: automatically determines which model to use based on the complexity of the task.
  • Configuration file: easily customize the app's settings through the config.toml file for complete control over how the app works. Also supported in-app via the settings command.
  • Role selection: define the role of the AI in the conversation for a more personalized and interactive experience.
  • Temperature control: adjust the temperature of generated responses to control creativity and randomness in the conversation.
  • Command handling: the app responds to various commands entered by the user for easy and intuitive interaction.
  • Image input with selected models.
  • Error handling: clear and helpful error messages make issues easy to understand and resolve.
  • Conversation history: review previous interactions and save conversations for future reference, providing context and continuity.
  • Graceful exit: interruptions are handled smoothly, and conversations are saved before exiting to avoid losing progress.
  • A nice team: actively adding features, open to ideas, and fixing bugs.

Overall, this app focuses on providing a user-friendly and customizable experience with features that enhance personalization, control, and convenience.
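Enabling the MCP feature described above amounts to a single file copy. The sketch below simulates an existing Claude Desktop config so the snippet runs standalone; the real file's location varies by OS, and the `/tmp` paths here are illustrative only:

```shell
# Simulate an existing Claude Desktop MCP config (in practice this file
# already exists wherever Claude Desktop stores it on your OS):
demo=/tmp/ccg-mcp-demo
mkdir -p "$demo/repo"
echo '{"mcpServers": {}}' > "$demo/claude_desktop_config.json"

# The actual step: copy it into the console-chat-gpt root directory,
# renamed to the filename the app expects:
cp "$demo/claude_desktop_config.json" "$demo/repo/mcp_config.json"
```

In a real setup, the destination is the root of your console-chat-gpt clone rather than a temp directory.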


Installation and Usage

The script works fine on Linux and macOS terminals. On Windows, WSL is recommended.

  1. Clone the repository:

    git clone https://github.com/amidabuddha/console-chat-gpt.git
    
  2. Go inside the folder:

    cd console-chat-gpt
    
  3. Install the necessary dependencies:

    python3 -m pip install -r requirements.txt
    
  4. Get your API key from OpenAI, MistralAI, Anthropic, xAI, Google AI Studio, DeepSeek, Alibaba, or Inception, depending on your selected LLM.

  5. The config.toml.sample file is automatically copied to config.toml on first run, with a prompt to enter your API key(s). Feel free to change any other defaults that are not available in the in-app settings menu as needed.

  6. Run the executable:

    python3 main.py
    

    Pro-tip: Create an alias for the executable to run from anywhere.

  7. Use the help command within the chat to check the available options.

  8. Enjoy!
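The alias pro-tip in step 6 could look like the following; the alias name `ccg` and the clone path are assumptions, so adjust both to your setup:

```shell
# Hypothetical alias so the chat can be launched from any directory.
# Add this line to ~/.bashrc or ~/.zshrc (name and path are examples):
alias ccg='python3 "$HOME/console-chat-gpt/main.py"'
```

After reloading your shell configuration, running `ccg` starts the chat from anywhere.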


Examples

  • Prompt example:

    example_python

  • Markdown visualization example:

    example_markdown

  • Settings and help:

    example_settings

You can find more examples on our Examples page.


Contributing

Contributions are welcome! If you find any bugs, have feature requests, or want to contribute improvements, please open an issue or submit a pull request.
