Basic Information

Sim is a lightweight, user-friendly, open-source platform for building, running, and self-hosting AI agent workflows. The repository provides a full-stack application composed of a Next.js frontend, a realtime socket server, and tooling for creating visual flows and agent pipelines. It supports both a cloud-hosted offering and several self-hosting methods: an NPM launcher, Docker Compose setups, dev containers, and manual Bun-based development. The system targets teams and developers who want to prototype, run local models, or deploy production-ready agent workflows, with support for vector embeddings, knowledge bases, and semantic search backed by PostgreSQL with the pgvector extension. The README documents setup, requirements, and recommended configurations for local-GPU and CPU-only environments.
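Two of the self-hosting paths described above can be sketched as a quickstart. This is illustrative only: the command names follow the README's description of the NPM launcher and Docker Compose options, but exact flags and compose file names are assumptions and should be checked against the repository.

```shell
# Quickstart sketch — two of the self-hosting paths described above.
# Treat these commands as illustrative; verify against the README.

# Option 1: the NPM launcher (assumes a running Docker daemon).
npx simstudio

# Option 2: a full self-hosted stack via Docker Compose,
# including PostgreSQL with the pgvector extension.
git clone https://github.com/simstudioai/sim.git
cd sim
docker compose up -d
```

Option 1 is the fastest way to try the platform; option 2 gives full control over the database, socket server, and any local-model containers.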

App Details

Features
Sim includes a visual flow editor built with ReactFlow for composing agent workflows and integrations. The stack uses Next.js on the Bun runtime, Drizzle ORM for PostgreSQL, and pgvector for the embeddings behind semantic search and knowledge bases. It offers realtime communication via Socket.io, background job orchestration with Trigger.dev, authentication via Better Auth, and UI components built on Shadcn and Tailwind CSS. Deployment options include a single NPM launch command, Docker Compose manifests (including Ollama support for local models), and dev container workflows for development. The repo is a monorepo managed with Turborepo and includes Zustand for state management, migration tooling, and documented commands for running migrations and pulling local models.
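The documented maintenance commands mentioned above — running migrations and pulling local models — might look like the following. The specific script invocations and the model name are assumptions for illustration; the repository's README and package scripts are authoritative.

```shell
# Sketch of the documented maintenance tasks (exact invocations assumed).

# Apply PostgreSQL schema migrations via Drizzle's migration tooling.
bunx drizzle-kit migrate

# Pull a local model so Ollama-backed workflows can run without
# external APIs (model name is just an example).
ollama pull llama3
```

Migrations should be run after pulling new versions of the repo; model pulls are one-time downloads cached by Ollama.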
Use Cases
Sim helps developers and teams accelerate the creation and deployment of AI-driven agents by combining a visual editor, embeddings-backed knowledge features, and flexible hosting options. Users can prototype quickly with the NPM package or run fully self-hosted stacks using Docker Compose or dev containers, and they can run local LLMs via Ollama to avoid external APIs. The platform's use of pgvector and Drizzle ORM enables semantic search and persistent knowledge bases, while realtime sockets and background job support provide interactive and production-capable workflows. The documented setup steps, migration commands, and recommended requirements make it practical to adopt for on-premise, privacy-sensitive, or experimental AI agent projects.