Basic Information

LocalAGI is a self-hostable AI agent platform for designing, running, and managing customizable AI assistants, automations, chatbots, and cooperative agent teams entirely on local hardware. It provides a web-based no-code interface, a Go library, and a REST API to create single agents or agent pools, configure planning and reasoning, enable short- and long-term memory via LocalRecall, and connect agents to external services using built-in connectors or MCP servers. The project is presented as a drop-in replacement for OpenAI's Responses API that avoids cloud dependencies and data leakage. LocalAGI supports CPU and multiple GPU configurations through Docker Compose, allows model customization with LocalAI and GGUF/GGML formats, and exposes endpoints for chat, observability, action execution, import/export, and agent lifecycle management.

App Details

Features
LocalAGI includes no-code agent creation via a web UI, advanced agent teaming and group generation, and a comprehensive REST API compatible with OpenAI's Responses API. It offers connectors for Discord, Slack, Telegram, GitHub Issues, and IRC, multimodal and image model support, and persistent memory powered by LocalRecall. The platform supports planning and reasoning, periodic cron-style tasks, memory management, and real-time observability via SSE. Developers can extend functionality with custom actions written in interpreted Go and integrate external tools using MCP servers. Deployment is simplified with Docker Compose profiles for CPU, NVIDIA, Intel, and AMD GPUs, plus prebuilt binaries and the option to use LocalAGI as a Go library for programmatic control.
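Hardware-specific deployment via Compose profiles might look like the following. This is a sketch, not the project's documented commands: the profile names here are assumptions based on the CPU/NVIDIA/Intel/AMD configurations described above, so check the repository's docker-compose file for the exact names.

```shell
# CPU-only deployment (default, no profile)
docker compose up

# GPU deployment via a Compose profile (profile name assumed)
docker compose --profile nvidia up
```

Compose profiles let one docker-compose file carry all hardware variants, with only the services matching the selected profile started.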
Use Cases
LocalAGI helps teams and individuals build private, offline AI workflows by running agents entirely on their own hardware so no data is sent to cloud providers. It simplifies creation and orchestration of agents with graphical and programmatic interfaces, enables persistent knowledge storage and retrieval using LocalRecall, and supports flexible model choices through LocalAI compatibility. The REST API and Go library allow integration into existing systems and automation pipelines, while connectors and MCP servers let agents interact with chat platforms and external services. Observability, agent lifecycle controls, import/export, scheduling and custom actions make it practical to deploy, monitor and iterate on multi-agent automations without requiring cloud keys or extra agentic Python libraries.