Basic Information

Open Assistant API is an open-source, self-hosted API for running AI assistants and building LLM applications. It implements an assistant API compatible with the official OpenAI interface so developers can use the OpenAI Python client and other OpenAI-compatible tools to create, configure, and run assistants. The project is targeted at teams or developers who want a locally deployable assistant backend that supports multiple LLMs via One API, integrates retrieval-augmented generation (RAG) engines such as R2R, and exposes REST/OpenAPI endpoints and example scripts. The README emphasizes easy startup with Docker Compose, configurable environment variables for API keys and RAG endpoints, and an examples directory demonstrating assistant creation, streaming, tools, and retrieval usage.
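The Docker Compose startup described in the README can be sketched roughly as follows. This is a hypothetical flow only: the environment-variable names, file names, and port are illustrative placeholders, not the project's documented settings — consult the repository's README for the real configuration keys.

```shell
# Hypothetical startup sketch; all variable names and values below are
# placeholders, not the project's actual configuration.

# 1. Configure API keys and the RAG endpoint via environment variables,
#    e.g. in a .env file next to docker-compose.yml.
cat > .env <<'EOF'
LLM_API_KEY=sk-placeholder          # key for the backing LLM / One API
RAG_ENDPOINT=http://localhost:7272  # placeholder R2R endpoint
EOF

# 2. Start the stack in the background.
docker compose up -d

# 3. Confirm the API container is up and follow its logs.
docker compose ps
docker compose logs -f
```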

App Details

Features
- Compatibility with the OpenAI API surface and the OpenAI Python client, so existing clients can point to a self-hosted base_url.
- Integration with One API for multi-model management.
- An R2R RAG engine for retrieval; supported file types are txt, html, markdown, pdf, docx, pptx, xlsx, png, mp3, and mp4.
- Message streaming output and multimodal inputs.
- Internet search tools (e.g., Bing) and extendable custom tools defined via OpenAPI/Swagger.
- Simplified deployment through docker-compose, with configuration via environment variables.
- README documentation of token-based authentication, admin token configuration, the API base URL and Swagger docs, plus test case references and example scripts for tool and auth-enabled tool usage.
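Because the service mirrors the OpenAI assistants interface, creating an assistant reduces to an authenticated POST against the self-hosted base URL. The sketch below uses only the Python standard library; the host, port, token, and model name are assumptions standing in for your own deployment's values.

```python
# Minimal sketch of calling a self-hosted, OpenAI-compatible assistants
# endpoint with the standard library. BASE_URL, API_TOKEN, and the model
# name are placeholders -- substitute your deployment's actual values.
import json
import urllib.request

BASE_URL = "http://localhost:8086/api/v1"  # hypothetical self-hosted base_url
API_TOKEN = "your-admin-token"             # token-based auth per the README


def build_request(path: str, payload: dict) -> urllib.request.Request:
    """Build an authenticated JSON POST request against the assistant API."""
    return urllib.request.Request(
        url=f"{BASE_URL}{path}",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {API_TOKEN}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


# Payload mirroring the OpenAI assistants interface: name, model, instructions.
assistant_payload = {
    "name": "doc-helper",
    "model": "gpt-3.5-turbo",  # any model exposed through One API
    "instructions": "Answer questions using the retrieved documents.",
}
req = build_request("/assistants", assistant_payload)
# urllib.request.urlopen(req) would send this against a running deployment.
```

Since the interface is OpenAI-compatible, the same call can instead be made with the official OpenAI Python client by passing the self-hosted address as `base_url` when constructing the client.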
Use Cases
This project helps developers and teams host an assistant API that mirrors the OpenAI assistant interface so they can run LLM-driven services without relying on a closed cloud provider. It enables local deployment for data control and extensibility, supports a broader set of LLMs through One API, and improves LLM responses with a configurable RAG backend for document retrieval across common file types. Built-in support for streaming responses and tool integrations lets applications provide richer interactions such as web search and authenticated external actions. The examples, API documentation, and simple Docker Compose startup lower the barrier to testing and integrating the assistant into existing applications or pipelines.
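The streaming responses mentioned above would typically arrive as server-sent events. Below is a minimal parser sketch assuming OpenAI-style `data: {...}` framing terminated by `data: [DONE]`; treat the exact event shape as an assumption about this deployment rather than documented behavior.

```python
# Hedged sketch of consuming a streamed response. The "data: {...}" /
# "data: [DONE]" framing follows the OpenAI-style server-sent-events
# convention; the exact event payloads are an assumption here.
import json


def iter_sse_events(lines):
    """Yield decoded JSON payloads from an SSE line stream, stopping at [DONE]."""
    for raw in lines:
        line = raw.decode("utf-8") if isinstance(raw, bytes) else raw
        line = line.strip()
        if not line.startswith("data:"):
            continue  # skip blank keep-alive lines and SSE comments
        data = line[len("data:"):].strip()
        if data == "[DONE]":
            break
        yield json.loads(data)


# With a live server you would iterate the HTTP response line by line:
# for event in iter_sse_events(response):
#     print(event)
```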