Basic Information

Super Agent Party is a pluggable platform and desktop application that upgrades large language model (LLM) capabilities and deploys intelligent agents across multiple channels. It presents itself as a 3D AI desktop companion while providing an extensible enhancement layer for existing LLM APIs that requires no client code changes. The project integrates knowledge bases, long-term memory, real-time web access, code execution, multimodal capabilities (vision, drawing, audio, speech), and research/deep-thinking control as modular functions. It exposes OpenAI-API-compatible and MCP protocol interfaces so that developers and external systems can connect easily. The repository supports local and cloud model vendors; cross-platform deployment on Windows, macOS and Linux; Docker containerization; and rapid one-click distribution to chat UIs, messaging bots and VRM virtual pets.
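The "OpenAI-API-compatible" claim means that any client already speaking the OpenAI chat-completions request format can simply be pointed at the platform's endpoint instead. The sketch below builds such a request body without sending it; the base URL, model name, and API key are placeholder assumptions, not values documented by the project.

```python
import json

# Hypothetical local endpoint; the actual host and port depend on your deployment.
BASE_URL = "http://127.0.0.1:3456/v1"

# A standard OpenAI-style chat completion payload: an existing OpenAI client
# can reuse this shape unchanged and only swap its base URL to BASE_URL.
payload = {
    "model": "super-model",  # placeholder; the platform routes this to a local or cloud backend
    "messages": [
        {"role": "user", "content": "Summarize today's notes."}
    ],
    "stream": False,
}

body = json.dumps(payload).encode("utf-8")
headers = {
    "Content-Type": "application/json",
    "Authorization": "Bearer YOUR_API_KEY",  # placeholder credential
}

# To actually call the platform, POST `body` with `headers` to
# f"{BASE_URL}/chat/completions" using any HTTP client.
```

Because the wire format is unchanged, switching between the platform and a vendor API is a one-line configuration change on the client side.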

App Details

Features
- Modular LLM enhancement that plugs into existing model interfaces without changing client code
- Persistent memory and lorebook world-building for long-term context
- Multimodal toolchain support: vision, drawing, audio and speech
- Code execution and research/deep-thinking control modules
- One-click deployment to desktop, web, Docker and endpoints such as WeChat, QQ, Bilibili and VRM desktop companions
- Developer-friendly OpenAI-API-compatible interface and MCP protocol support
- Integration with third-party tools and workflows, including ComfyUI workflow conversion and multi-server load balancing
- Support for both locally deployed engines and cloud vendor APIs
- Asynchronous tool invocation so that agent replies are not blocked
- Installers and packages for Windows, macOS and Linux, plus Docker images for quick setup
Use Cases
The project serves both end users and developers: it provides a ready-to-use desktop AI companion while offering a framework for enhancing and orchestrating LLM capabilities at scale. Teams can add knowledge bases, persistent memory, code execution and multimodal tools to models without rewriting callers. Developers can expose the platform as an OpenAI-compatible API or an MCP server to integrate agents into existing systems and pipelines. Operators gain flexible deployment options, including native apps, Docker containers and web services, with local data caching to preserve privacy. Toolchain integrations convert existing workflows into agent tools and enable cross-platform aggregation. The dual licensing model and documentation guide usage and commercial licensing where required.