compose for agents

Basic Information

This repository provides a curated collection of runnable demos that show how to build and run AI agent systems with Docker Compose. It bundles multiple self-contained example projects demonstrating single-agent and multi-agent setups, orchestration of open-source and hosted models, and integration with Model Context Protocol (MCP) servers and external tools. The README documents prerequisites such as Docker Desktop or Docker Engine, GPU requirements (or the Docker Offload alternative), and guidance for enabling GPU support and installing Docker Compose on Linux. Each demo directory contains compose files and optional secret examples. The repository also documents how to switch from running models locally with Docker Model Runner to using OpenAI by adding a secret API-key file and applying a compose overlay file.
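The local-to-OpenAI switch described above can be sketched as a short shell workflow. The file names (secret.openai-api-key, compose.openai.yaml) follow the conventions documented elsewhere in the repo; the actual compose invocation is shown commented out because it requires Docker and a demo directory:

```shell
# Write the OpenAI API key into the secret file the OpenAI overlay expects.
# "sk-placeholder" is a stand-in; substitute a real key from your account.
printf '%s\n' "sk-placeholder" > secret.openai-api-key
chmod 600 secret.openai-api-key   # keep the key readable only by you

# Merge the base compose file with the OpenAI overlay; later -f files
# override earlier ones, so the overlay swaps the local model for OpenAI.
# docker compose -f compose.yaml -f compose.openai.yaml up --build
```

The overlay pattern keeps the base compose.yaml unchanged, so the same demo can run against a local model or OpenAI depending on which files are passed to `-f`.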

App Details

Features
The repo includes dozens of ready-to-run demos with compose.yaml files for different agent systems and model configurations. Each demo is self-contained and usually requires creating a .mcp.env file from the provided mcp.env.example and running docker compose up --build. Models can run locally via Docker Model Runner or be switched to OpenAI using a compose.openai.yaml overlay and a secret.openai-api-key file. The README lists platform prerequisites, GPU and driver requirements, and the minimum Docker Compose version. The demos cover multi-agent and single-agent examples, show integrations with MCP servers such as DuckDuckGo and GitHub, and include a matrix table enumerating the included projects, models, MCP servers, and compose file locations.
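The typical per-demo workflow amounts to copying the environment template and bringing the stack up. A minimal sketch, assuming the convention described above (the mcp.env.example created here is a stand-in for the one each demo ships, and the compose command is commented out because it needs Docker):

```shell
# Stand-in template; real demos ship their own mcp.env.example
# with the keys their MCP servers need (e.g. a GitHub token).
printf 'GITHUB_TOKEN=\n' > mcp.env.example

cp mcp.env.example .mcp.env   # each demo reads its secrets from .mcp.env
# ...edit .mcp.env to fill in real values, then:
# docker compose up --build   # requires Docker Desktop or Docker Engine
```

Keeping secrets in a gitignored .mcp.env rather than in compose.yaml means the compose files can be committed and shared without leaking tokens.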
Use Cases
This repository helps developers and researchers quickly prototype, reproduce, and compare agent deployments using Docker Compose. By providing curated example projects, preconfigured compose files, and clear prerequisites, it reduces setup friction for running local models or switching to hosted model providers. The demos illustrate multi-agent collaboration, single-agent tasks, and integrations with search and data MCP servers, so users can learn orchestration patterns, secret management (.mcp.env and secret.openai-api-key), and model-selection workflows. It also documents hardware and software requirements for GPU acceleration and points users without local GPUs to Docker Offload. The dual Apache-2.0/MIT licensing clarifies reuse and integration options.
