comfyui_LLM_party
Basic Information
ComfyUI LLM Party provides a comprehensive set of custom ComfyUI nodes and example workflows for building, orchestrating, and running LLM- and VLM-driven pipelines inside the ComfyUI visual frontend. The project targets users who want to integrate large language models, multimodal models, retrieval-augmented generation, and tool plugins into image-generation and streaming workflows. It enables rapid assembly of single-agent assistants, multi-agent topologies (radial and ring interaction modes), and locally managed, industry-specific knowledge bases. The repo supports both API-based and local model usage (including GGUF, llama.cpp, and ollama), offers MCP integration for exposing external tools to models, provides image hosting and TTS support, and includes installation and configuration guidance for ComfyUI environments. Prebuilt workflows and node documentation help users connect models, tools, and social-app connectors for real-world use cases.
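Outside the node graph, the API-based model path described above typically reduces to an OpenAI-compatible chat-completions call. The sketch below is a minimal standalone illustration, not the project's actual node code; the base URL (a local ollama server) and model name are assumptions chosen for the example.

```python
import json
import urllib.request

# Assumed endpoint: a local ollama server exposing the OpenAI-compatible
# chat-completions API. Both BASE_URL and MODEL are placeholders.
BASE_URL = "http://localhost:11434/v1"
MODEL = "llama3"

def build_chat_request(prompt: str) -> urllib.request.Request:
    """Assemble an OpenAI-style chat-completions request (not yet sent)."""
    payload = {
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("Describe this image-generation workflow.")
print(req.full_url)
```

Sending the request (e.g. via `urllib.request.urlopen(req)`) would return a JSON body whose `choices[0].message.content` holds the model's reply; a local-model node and an API node can share this interface because ollama and llama.cpp servers both emulate it.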