Features
Dynamiq exposes composable primitives and examples for building LLM-driven systems. Key features shown in the README include predefined node types for OpenAI LLMs, agent implementations such as ReAct, Reflection, and SimpleAgent, and tools such as an E2B code interpreter and search integrations. Workflow primitives let users add nodes, set dependencies, map inputs and outputs, and run flows both synchronously and asynchronously. RAG support is provided via document converters, splitters, embedders, and Pinecone writers and retrievers. Orchestration options include a Workflow/Flow system, an AdaptiveOrchestrator with an agent manager, and a GraphOrchestrator with custom state routing. The library also offers a Memory module with backends. Example patterns cover chatbots, asynchronous agent execution, and integrations with services such as OpenAI, Pinecone, ScaleSerp, and E2B. Documentation and examples accompany the codebase.
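A minimal sketch of what a single-node workflow can look like, based on the OpenAI node example in the repository's README; the import paths, node parameters, and result shape below follow that example and may differ in the current API, so treat it as illustrative rather than definitive.

```python
import os

from dynamiq import Workflow
from dynamiq.connections import OpenAI as OpenAIConnection
from dynamiq.nodes.llms.openai import OpenAI
from dynamiq.prompts import Message, Prompt

# Prompt template; the {{ text }} placeholder is filled from input_data at run time.
prompt = Prompt(messages=[Message(content="Summarize this text: {{ text }}", role="user")])

# Predefined OpenAI LLM node (parameter values here are placeholders; adjust as needed).
llm = OpenAI(
    id="llm",
    connection=OpenAIConnection(api_key=os.environ["OPENAI_API_KEY"]),
    model="gpt-4o",
    temperature=0.3,
    max_tokens=1000,
    prompt=prompt,
)

# Assemble the workflow, add the node, and run it synchronously.
workflow = Workflow()
workflow.flow.add_nodes(llm)
result = workflow.run(input_data={"text": "Dynamiq assembles LLM nodes, tools, and agents into workflows."})

# Inspect per-node outputs; check the repository's examples for the exact result structure.
print(result.output)
```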
Use Cases
Dynamiq helps developers accelerate the delivery of production-grade LLM applications by providing reusable components and orchestration patterns. Instead of wiring model calls, embedding pipelines, vector stores, and custom logic from scratch, engineers can assemble nodes, tools, and agents into workflows that handle dependencies, input transformation, and parallel or sequential execution. The RAG examples illustrate indexing and retrieval flows for PDFs and other documents, simplifying document search and answer generation. Memory backends and agent roles enable stateful chatbots and iterative refinement loops. Orchestrators and agent managers help coordinate multiple specialist agents for planning, coding, and search tasks. The repository includes examples, installation instructions, and API patterns that reduce the effort of integrating external services, so teams can prototype, test, and deploy LLM-driven systems faster.
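As one concrete agent-oriented use case, a coding assistant can be assembled from an LLM node, a ReAct agent, and the E2B interpreter tool. The sketch below follows the agent examples in the repository; class names such as ReActAgent and E2BInterpreterTool and their parameters are taken from those examples, the role and task strings are hypothetical, and signatures should be verified against the current codebase.

```python
import os

from dynamiq.connections import E2B as E2BConnection
from dynamiq.connections import OpenAI as OpenAIConnection
from dynamiq.nodes.agents.react import ReActAgent
from dynamiq.nodes.llms.openai import OpenAI
from dynamiq.nodes.tools.e2b_sandbox import E2BInterpreterTool

# Tool that lets the agent execute generated Python code in an isolated E2B sandbox.
e2b_tool = E2BInterpreterTool(connection=E2BConnection(api_key=os.environ["E2B_API_KEY"]))

# LLM node that drives the agent's reasoning loop.
llm = OpenAI(
    connection=OpenAIConnection(api_key=os.environ["OPENAI_API_KEY"]),
    model="gpt-4o",
    temperature=0.3,
)

# ReAct agent: plans, calls the interpreter tool, observes results, and iterates until done.
agent = ReActAgent(
    name="coding-agent",
    llm=llm,
    tools=[e2b_tool],
    role="Senior data analyst who writes and runs Python to answer questions",
    max_loops=8,
)

result = agent.run(input_data={"input": "Compute the sum of the first 100 prime numbers."})
print(result.output)
```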