Basic Information

AIChat is an all-in-one command-line interface that brings large language models to the terminal, combining chat, a REPL, shell automation, retrieval-augmented generation (RAG), tools and agent capabilities in a single package. The project lets users interact with many LLM providers through a unified CLI, run an interactive Chat-REPL with history and autocompletion, and convert plain-language tasks into shell commands via the Shell Assistant. It also supports sessions that maintain conversational context, macros that automate repetitive REPL sequences, and a lightweight built-in HTTP server that exposes chat and embeddings endpoints along with a web playground and a model-comparison arena. The repository provides installation instructions for multiple package managers as well as prebuilt binaries, and the project is dual-licensed under MIT and Apache 2.0.
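To make the built-in server concrete, the following is a minimal sketch of a client request, assuming the server has been started (for example with a command such as aichat --serve) and exposes an OpenAI-compatible chat completions endpoint; the port, path and model identifier shown here are assumptions for illustration rather than details taken from the listing above.

```python
# Minimal sketch: call a locally running AIChat server's OpenAI-compatible
# chat endpoint. The port (8000), path (/v1/chat/completions) and model name
# are assumptions for illustration; check your own aichat configuration.
import requests

payload = {
    "model": "openai:gpt-4o-mini",  # hypothetical provider:model identifier
    "messages": [
        {"role": "user", "content": "Summarize what a REPL is in one sentence."}
    ],
}

resp = requests.post(
    "http://127.0.0.1:8000/v1/chat/completions", json=payload, timeout=60
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```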

App Details

Features
AIChat integrates with more than twenty LLM providers and model backends, including OpenAI, Claude, Gemini, Ollama, Groq and others. It offers both a one-shot CMD mode and an interactive REPL with tab completion, multi-line input, history search and configurable keybindings. The Shell Assistant converts natural-language descriptions into OS-aware shell commands. Multi-form input accepts stdin, local files and directories, remote URLs and the output of external commands. Role and session management let you customize prompts and preserve conversational context. Additional features include macros that chain REPL commands, RAG for pulling external documents into context, function calling plus an AI Tools & MCP system for tool integration, CLI-based AI agents, a local server with REST endpoints, a web playground and an LLM arena for side-by-side model comparison, as well as custom themes and documentation.
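As an illustration of the server's REST endpoints, here is a minimal sketch that requests embeddings from a locally running instance, assuming it exposes an OpenAI-style embeddings route; the port, path and model name are assumptions and may differ in your configuration.

```python
# Minimal sketch: request embeddings from the local AIChat server's
# OpenAI-style embeddings endpoint. Port, path and model name are
# assumptions for illustration only.
import requests

payload = {
    "model": "openai:text-embedding-3-small",  # hypothetical embedding model
    "input": ["AIChat brings LLMs to the terminal."],
}

resp = requests.post("http://127.0.0.1:8000/v1/embeddings", json=payload, timeout=60)
resp.raise_for_status()
vector = resp.json()["data"][0]["embedding"]
print(len(vector), vector[:5])
```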
Use Cases
AIChat lets users and power users access and compare many LLMs without switching SDKs or web UIs by providing a consistent CLI and matching web endpoints. The Shell Assistant speeds up command-line workflows by translating natural-language tasks into shell commands tailored to the current OS and shell. The REPL, sessions and roles make iterative conversations and role-based prompts easy to manage. RAG and multi-form input enable context-rich interactions using local files, directories or remote URLs. Function calling, AI Tools and MCP let models trigger external actions and integrate data sources. Macros automate repetitive sequences, and the local server with its playground and arena supports lightweight deployment, API proxying and side-by-side model evaluation for testing and experimentation.
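As a sketch of the API-proxying and side-by-side evaluation use case, the snippet below points the OpenAI Python client at a locally running AIChat server and sends the same prompt to two models; the base URL, placeholder API key and model identifiers are assumptions chosen for illustration.

```python
# Minimal sketch of side-by-side model comparison through the local AIChat
# server, reusing the OpenAI Python client as a generic HTTP client.
# The base URL, placeholder API key and model names are assumptions.
from openai import OpenAI

client = OpenAI(base_url="http://127.0.0.1:8000/v1", api_key="not-needed")

prompt = "Explain retrieval-augmented generation in two sentences."
for model in ("openai:gpt-4o-mini", "claude:claude-3-5-haiku"):  # hypothetical identifiers
    reply = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
    )
    print(f"--- {model} ---")
    print(reply.choices[0].message.content)
```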
