Basic Information

Chuanhu Chat is a user-focused web application that provides a lightweight graphical interface to interact with ChatGPT and many other large language models. It aims to make multi-model conversational workflows accessible by unifying API-based models and locally deployed models under a single UI. The project includes features for running a local server or deploying to hosting services, launching the app via a Python script, and configuring API keys and server settings through a config.json file. It advertises compatibility with a wide range of providers and models, including OpenAI (GPT-5/GPT-4/GPT-3.5), Azure, Google Gemini, Claude, and local models such as ChatGLM and LLaMA. The README highlights support for file-based question answering, online search integration, an AutoGPT-like assistant, fine-tuning for GPT-3.5, mobile and PWA installation, and an emphasis on a polished, responsive UI.
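As a rough illustration of the configuration step, a minimal config.json might look like the sketch below. The field names shown (openai_api_key, server_name, server_port, share) are assumptions based on common Chuanhu Chat settings; the authoritative list of supported fields is in the project's config_example.json.

```json
{
  "openai_api_key": "sk-...",
  "server_name": "0.0.0.0",
  "server_port": 7860,
  "share": false
}
```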


App Details

Features
The repository bundles a polished web UI with a reworked small-and-beautiful theme, an adaptive mobile layout, PWA installability, and enhanced history management, including search, regex search, renaming, deletion, and automatic LLM-generated titles. It supports system prompt templates and one-click prompt loading; rich rendering of LaTeX, tables, and syntax-highlighted code; and per-message controls for copying text and viewing raw Markdown. On the backend, it offers multi-provider support, custom model integrations, local LLM deployment, a file-based knowledge base for document Q&A, online search augmentation, an AutoGPT-style assistant, and GPT-3.5 fine-tuning support. Deployment-related settings include configurable API hosts and proxies, load balancing across multiple API keys, the bind address and port, and public sharing options.
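To make the multiple-API-key load-balancing feature concrete, here is a minimal round-robin sketch. This is an illustration, not Chuanhu Chat's actual implementation; the class name `KeyBalancer` and the key strings are hypothetical.

```python
import itertools


class KeyBalancer:
    """Illustrative round-robin balancer: each request takes the next
    configured API key in turn, spreading rate-limit pressure evenly."""

    def __init__(self, api_keys):
        if not api_keys:
            raise ValueError("at least one API key is required")
        self._cycle = itertools.cycle(api_keys)

    def next_key(self):
        """Return the next key in round-robin order."""
        return next(self._cycle)


balancer = KeyBalancer(["sk-aaa", "sk-bbb", "sk-ccc"])
picked = [balancer.next_key() for _ in range(4)]
# The fourth request wraps around to the first key again.
```

A pool like this lets a single deployment serve more concurrent users than any one key's rate limit would allow.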
Use Cases
For end users and hobbyist deployers, Chuanhu Chat centralizes access to many LLMs and streamlines conversational workflows through a modern, mobile-friendly UI with features that improve productivity and traceability. It makes it easy to consult documents via the built-in knowledge base, augment model responses with web search, and run or connect to local models for privacy or offline use. Power users get fine-tuning support, adjustable model parameters, management of multiple API keys, and proxy configuration. Developers and non-technical users alike can deploy the app by cloning the repository, installing the requirements, copying config_example.json to config.json, providing API keys, and launching the Python app to serve a browser-based chat interface.
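The deployment steps above can be sketched as a short command sequence. The repository URL and entry-script name are assumptions based on the upstream project and should be verified against its README; the config.json edit must be done by hand before launch.

```shell
# Clone the project and install its Python dependencies.
git clone https://github.com/GaiZhenbiao/ChuanhuChatGPT.git
cd ChuanhuChatGPT
pip install -r requirements.txt

# Create your config from the bundled example, then add your API key(s).
cp config_example.json config.json

# Launch the app; it serves a browser-based chat interface.
python ChuanhuChatbot.py
```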
