Basic Information

LLMChat is a privacy-focused monorepo platform for building and running AI chat experiences and agentic research workflows. It provides a Next.js and TypeScript scaffold for web and desktop applications, plus shared packages for AI integrations, orchestration, UI, and utilities. The repository lets developers compose multi-step agent workflows (planning, information gathering, analysis, report generation) that coordinate LLM calls, emit typed events, and maintain a shared context. It emphasizes local-first storage: user data lives in the browser via IndexedDB (Dexie.js), so chat history never leaves the device. LLMChat supports multiple LLM providers and exposes an AI SDK and workflow builder for assembling reusable tasks and running research agents from development through local deployment.
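The workflow pattern described above can be sketched in TypeScript. This is an illustrative outline under assumed names (WorkflowEvent, Task, runWorkflow are hypothetical, not LLMChat's actual API): tasks share a context object and emit typed events as they run.

```typescript
// Hypothetical sketch of a typed-event workflow; names are illustrative,
// not LLMChat's actual workflow-builder API.
type WorkflowEvent =
  | { type: "task_start"; task: string }
  | { type: "task_done"; task: string; output: string };

interface Context {
  query: string;
  findings: string[]; // shared state that accumulates across tasks
}

type Task = {
  name: string;
  run: (ctx: Context) => string; // each task reads/writes the shared context
};

function runWorkflow(
  tasks: Task[],
  ctx: Context,
  onEvent: (e: WorkflowEvent) => void,
): Context {
  for (const task of tasks) {
    onEvent({ type: "task_start", task: task.name });
    const output = task.run(ctx);
    ctx.findings.push(output);
    onEvent({ type: "task_done", task: task.name, output });
  }
  return ctx;
}

// Example run mirroring the stages mentioned above: plan → gather → report.
const events: WorkflowEvent[] = [];
const result = runWorkflow(
  [
    { name: "plan", run: (c) => `plan for: ${c.query}` },
    { name: "gather", run: () => "3 sources collected" },
    { name: "report", run: (c) => `report from ${c.findings.length} findings` },
  ],
  { query: "solid-state batteries", findings: [] },
  (e) => events.push(e),
);
console.log(result.findings.length); // 3
console.log(events.length); // 6
```

Because each task only touches the shared context and emits events, a UI can subscribe to the event callback for real-time progress without coupling to individual tasks.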

App Details

Features
LLMChat offers advanced research modes, including Deep Research and Pro Search, that enable in-depth topic exploration and web-integrated search. It supports multiple LLM providers, such as OpenAI, Anthropic, Google, Fireworks, Together AI, and xAI, via a unified AI SDK. A modular workflow orchestration system lets developers define typed events, shared context, and task pipelines, with examples covering task planning, information gathering, analysis, and report generation. Privacy features include local storage of all user data using IndexedDB via Dexie.js, with no server-side storage by default. The monorepo contains apps for web and desktop, plus packages for AI, orchestration, actions, common utilities, UI components, and shared configuration.
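Unified multi-provider support like the above typically hinges on a small common interface that the rest of the app codes against. A minimal sketch, assuming a hypothetical `ChatProvider` interface with stub clients standing in for real provider SDKs (this is not LLMChat's actual AI SDK surface):

```typescript
// Illustrative provider abstraction; not LLMChat's actual AI SDK.
interface ChatProvider {
  name: string;
  complete(prompt: string): Promise<string>;
}

// Stubs standing in for real OpenAI / Anthropic clients.
const openai: ChatProvider = {
  name: "openai",
  complete: async (p) => `[openai] ${p}`,
};
const anthropic: ChatProvider = {
  name: "anthropic",
  complete: async (p) => `[anthropic] ${p}`,
};

// Providers register under a name; callers stay provider-agnostic.
const registry = new Map<string, ChatProvider>([
  [openai.name, openai],
  [anthropic.name, anthropic],
]);

async function ask(provider: string, prompt: string): Promise<string> {
  const p = registry.get(provider);
  if (!p) throw new Error(`unknown provider: ${provider}`);
  return p.complete(prompt);
}

ask("anthropic", "hello").then((r) => console.log(r)); // "[anthropic] hello"
```

Swapping models then becomes a one-string change at the call site, which is what makes side-by-side provider experiments cheap.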
Use Cases
For developers and teams, LLMChat provides a ready-made architecture and examples to build agentic chat tools and research assistants without shipping user data to servers. Its workflow engine makes it easier to break complex queries into discrete tasks, coordinate multi-step LLM interactions, and stream events to a UI for real-time visibility. The bundled frontend and desktop apps, shared UI components, and TypeScript types accelerate integration and iteration. Multi-provider support and an AI SDK let teams experiment with different models. Local-first storage and clear tech choices (Next.js, Turborepo, Bun, Dexie.js) reduce infrastructure overhead during prototyping and maintain user privacy by design.
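Streaming task events to a UI for real-time visibility, as described above, is commonly done with an async generator. A generic sketch of the pattern (not LLMChat's concrete implementation):

```typescript
// Generic event-streaming sketch; not LLMChat's concrete implementation.
type StreamEvent = { step: string; status: "running" | "done" };

async function* researchRun(steps: string[]): AsyncGenerator<StreamEvent> {
  for (const step of steps) {
    yield { step, status: "running" };
    // ...an LLM call or web search would happen here...
    yield { step, status: "done" };
  }
}

// A UI component would consume the stream and render progress as events arrive.
async function main(): Promise<StreamEvent[]> {
  const seen: StreamEvent[] = [];
  for await (const e of researchRun(["plan", "gather", "report"])) {
    seen.push(e);
  }
  console.log(seen.length); // 6
  return seen;
}
main();
```

The consumer pulls events at its own pace, so a slow renderer naturally back-pressures the pipeline instead of dropping updates.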
