Features
DeerFlow provides:
- Modular multi-agent orchestration built on LangGraph: a Planner decomposes tasks, and a Research Team of specialized agents handles web search, crawling, and code execution.
- LLM integration for litellm providers, open-source models, and OpenAI-compatible APIs, with multi-tier model configurations.
- Search and retrieval via Tavily, DuckDuckGo, Brave Search, and Arxiv, plus private RAG providers such as RAGFlow.
- Tooling: Jina for crawling, advanced content extraction, a Python REPL tool for code analysis and execution, and MCP integrations for extended domain access.
- Human-in-the-loop control: review or auto-accept plans, and edit reports with a Notion-like editor.
- Output: automated report generation, PowerPoint generation, and TTS audio synthesis via a volcengine integration.
- Development and operations: LangGraph Studio debugging, LangSmith tracing, checkpointing to Postgres/MongoDB, CLI flags, and Docker/docker-compose setups.
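The planner-to-research-team handoff can be sketched in a few lines of plain Python. This is an illustrative sketch only: the `Plan`, `Step`, `plan_query`, and `run_plan` names, and the fixed agent roles, are hypothetical and do not reflect DeerFlow's actual API.

```python
from dataclasses import dataclass, field

# Hypothetical structures illustrating planner-style task decomposition.
# Names and roles are illustrative, not DeerFlow's real classes.

@dataclass
class Step:
    agent: str         # e.g. "searcher", "crawler", "coder"
    description: str
    result: str = ""

@dataclass
class Plan:
    query: str
    steps: list = field(default_factory=list)

def plan_query(query: str) -> Plan:
    """Decompose a high-level query into agent-addressed steps."""
    return Plan(query=query, steps=[
        Step("searcher", f"Find background sources for: {query}"),
        Step("crawler", "Extract full text from the top sources"),
        Step("coder", "Validate quantitative claims in a REPL"),
    ])

def run_plan(plan: Plan) -> str:
    """Dispatch each step to its agent and aggregate a report."""
    for step in plan.steps:
        # A real system would invoke the named agent here.
        step.result = f"[{step.agent}] completed: {step.description}"
    return "\n".join(s.result for s in plan.steps)

plan = plan_query("impact of quantization on LLM accuracy")
report = run_plan(plan)
print(report)
```

In the real system each step would be a LangGraph node with the human-in-the-loop review gate sitting between planning and execution; the sketch collapses that into a single dispatch loop.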
Use Cases
DeerFlow helps researchers, engineers, and teams automate complex research tasks while preserving control and reproducibility. It decomposes high-level queries into executable plans, orchestrates specialized agents to collect and validate evidence from web and private sources, and aggregates the results into structured reports that humans can edit and polish. Built-in code execution enables technical validation and reproduction of findings, and integrations with RAG providers and private knowledge bases make it possible to leverage proprietary documents. Debugging with LangGraph Studio and tracing support simplify workflow inspection and troubleshooting, while checkpointing and chat-stream replay aid traceability and auditability. Export options for presentations and text-to-speech let teams produce shareable deliverables such as slides and podcasts. The stack supports local development, containerized deployment, and cloud marketplace deployment for production use.
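The traceability-via-checkpointing idea can be illustrated with a minimal append-only event log and replay function. This is a sketch under assumptions: the JSONL schema and the `save_checkpoint`/`replay` names are hypothetical, standing in for DeerFlow's actual Postgres/MongoDB checkpointing layer.

```python
import json
import os
import tempfile

# Illustrative checkpoint store: one JSON line per agent event,
# keyed by thread id so a run can be replayed for audit.

def save_checkpoint(path: str, thread_id: str, events: list) -> None:
    """Append each agent event so a research run can be replayed later."""
    with open(path, "a", encoding="utf-8") as f:
        for event in events:
            f.write(json.dumps({"thread": thread_id, **event}) + "\n")

def replay(path: str, thread_id: str) -> list:
    """Reload one thread's events, in order, for inspection or audit."""
    out = []
    with open(path, encoding="utf-8") as f:
        for line in f:
            rec = json.loads(line)
            if rec["thread"] == thread_id:
                out.append(rec)
    return out

path = os.path.join(tempfile.mkdtemp(), "checkpoints.jsonl")
save_checkpoint(path, "run-1", [
    {"step": 1, "agent": "planner", "output": "3-step plan"},
    {"step": 2, "agent": "searcher", "output": "5 sources"},
])
events = replay(path, "run-1")
```

Replaying the log reproduces the exact sequence of agent decisions, which is what makes a run auditable even after the live chat stream is gone.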