Basic Information

npcpy is a Python framework and developer toolkit for building and deploying LLM-driven agents, multi-agent teams, and NLP pipelines. The library provides an NPC abstraction for creating persona-driven agents, a Team abstraction for orchestrating multiple NPCs under a coordinating forenpc, and Jinx templates for composing multi-step workflows that mix natural-language prompts with executable Python steps. It supports text, image, and video generation, integrates external data sources such as CSVs, PDFs, and images into agent responses, and enables tool calling and structured output extraction. The project includes client functions for calling LLMs directly, utilities for streaming responses, and optional local and enterprise inference integrations through liteLLM-compatible providers such as Ollama, OpenAI, Anthropic, and Gemini.
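To make the persona-driven agent idea concrete, here is a minimal stdlib-only sketch of the pattern: a persona that shapes every prompt sent to a pluggable model callable. The `Persona`, `make_agent`, and `echo_model` names are illustrative assumptions, not npcpy's actual API.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Persona:
    name: str
    directive: str  # system-style instruction that shapes every response

def make_agent(persona: Persona, model: Callable[[str], str]) -> Callable[[str], str]:
    """Wrap a pluggable model callable so each prompt carries the persona."""
    def respond(user_prompt: str) -> str:
        full_prompt = f"[{persona.name}: {persona.directive}]\n{user_prompt}"
        return model(full_prompt)
    return respond

# Stub model so the sketch runs without any provider configured.
echo_model = lambda prompt: f"model saw: {prompt}"

analyst = Persona(name="analyst", directive="You are a concise data analyst.")
agent = make_agent(analyst, echo_model)
print(agent("Summarize Q3 revenue."))
```

In npcpy the same role is played by the NPC abstraction, with the model call routed through a configured provider rather than a stub.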

Features
Provides NPC, Team, and Jinx abstractions for building single agents, coordinated teams, and reusable workflow templates.
Supports tool calling with auto-generated tool schemas and a structured tool call/result response format.
Includes convenience functions such as get_llm_response, gen_image, and gen_video for multimodal generation, with support for attachments and streaming outputs.
Offers a built-in Flask server with REST API endpoints (/api/execute, /api/stream, /api/models, /api/npc_team_global, /api/jinxs/global) for serving teams and individual NPCs.
Enables Jinxs (Jinja execution templates) that combine prompt-driven and code-driven steps.
Integrates with many model providers via liteLLM and exposes install extras for local, lite, and full feature sets.
Ships examples for frontend and Python clients, a CLI shell, and references to a GUI studio.
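The auto-generated tool schemas mentioned above can be sketched with the standard library alone: a function's signature and docstring are introspected into a JSON-style schema a model can reason about. The `tool_schema` and `lookup_weather` names are hypothetical examples, not npcpy's implementation.

```python
import inspect

def tool_schema(fn):
    """Derive a JSON-style tool schema from a plain Python function,
    in the spirit of frameworks that auto-generate tool schemas."""
    sig = inspect.signature(fn)
    params = {
        name: {"type": getattr(p.annotation, "__name__", "any")}
        for name, p in sig.parameters.items()
    }
    return {
        "name": fn.__name__,
        "description": (fn.__doc__ or "").strip(),
        "parameters": params,
    }

def lookup_weather(city: str, units: str = "metric") -> str:
    """Return current weather for a city."""
    return f"{city}: 21 degrees ({units})"

schema = tool_schema(lookup_weather)
print(schema["name"], sorted(schema["parameters"]))
```

Registering a tool then amounts to handing the framework a plain function; the schema the model sees is derived rather than hand-written.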
Use Cases
npcpy accelerates development of agentic and NLP applications by providing ready-made building blocks for persona creation, team orchestration, and reproducible workflows. Developers can register tools, attach data sources, and let models orchestrate tool calls while receiving structured tool results, which simplifies verification and automation. Jinxs let teams encode complex pipelines mixing Python and natural-language steps so researchers can prototype data analysis, multimedia content pipelines, and scheduled automation without wiring low-level orchestration. The included Flask server and documented API endpoints make it straightforward to deploy agents for frontends or production services. Support for streaming, attachments, multiple providers, and local models eases experimentation, testing at the edge, and integration into research or production stacks.
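The structured tool call/result flow described above can be illustrated with a small stdlib-only sketch: the model emits a JSON tool call, the orchestrator executes the registered function, and the result comes back as a structured record. The `run_tool_call` helper and the JSON shape are assumptions for illustration, not npcpy's wire format.

```python
import json

def run_tool_call(tools: dict, model_output: str) -> dict:
    """Execute a model-issued tool call and wrap the outcome in a
    structured record, mirroring the call/result pattern described above."""
    call = json.loads(model_output)  # e.g. {"tool": "add", "args": {"a": 2, "b": 3}}
    fn = tools[call["tool"]]
    result = fn(**call["args"])
    return {"tool": call["tool"], "args": call["args"], "result": result}

tools = {"add": lambda a, b: a + b}
record = run_tool_call(tools, '{"tool": "add", "args": {"a": 2, "b": 3}}')
print(record)  # → {'tool': 'add', 'args': {'a': 2, 'b': 3}, 'result': 5}
```

Returning a structured record rather than free text is what makes downstream verification and automation straightforward: callers can assert on `result` directly instead of parsing prose.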