Basic Information

Bolna is an end-to-end, open-source, production-ready framework for quickly building LLM-based, voice-first conversational agents. The repository provides the server, example payloads, and a dockerized local setup for creating agents that can initiate phone calls, stream bi-directional audio to and from telephony providers, transcribe speech, process conversations with LLMs, and synthesize spoken responses back to callers. It is aimed at developers who want to define agents and toolchains via JSON payloads and run them locally or extend the provider integrations. The README documents how to create agents via an HTTP API, how to initiate calls, and how to configure provider credentials and environment variables for the ASR, TTS, LLM, and telephony services.
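As a rough illustration of the JSON-payload-driven workflow, the sketch below builds an agent definition and shows how it might be posted to the agent-creation endpoint. The field names and the endpoint path are illustrative assumptions, not the exact Bolna schema; consult the repository's example payloads for the real keys.

```python
import json

# Hypothetical agent payload; key names are illustrative stand-ins for the
# JSON structure the README's examples use.
agent_payload = {
    "agent_name": "demo-agent",
    "tasks": [
        {
            "task_type": "conversation",
            "tools_config": {
                "transcriber": {"provider": "deepgram", "stream": True},
                "llm": {"provider": "openai", "model": "gpt-3.5-turbo"},
                "synthesizer": {"provider": "polly", "voice": "Joanna"},
            },
        }
    ],
}

# The agent would then be registered over HTTP, e.g. (endpoint assumed):
#   curl -X POST http://localhost:5001/agent -H 'Content-Type: application/json' -d @agent.json
body = json.dumps(agent_payload)
```

Once created, a separate call-initiation request would reference the returned agent id and a destination phone number.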

App Details

Features
Bolna includes a configurable task-and-toolchain model in which pipelines chain transcriber -> LLM -> synthesizer stages with streaming or parallel execution. It supports multiple telephony providers, such as Twilio and Plivo, and multiple ASR, LLM, and TTS providers, including Deepgram, LiteLLM/vLLM-based LLMs, OpenAI, AWS Polly, and ElevenLabs. The repository provides a docker-compose-based local setup with four containers (telephony server, Bolna server, ngrok, and redis), example telephony server implementations, example agent-creation and call-initiation APIs, environment-variable-driven provider configuration, and extension points for adding new input and output handler modules under input_handlers and output_handlers.
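The chained toolchain can be pictured as a stream flowing through the three stages. The minimal sketch below uses plain Python generators as stand-ins; the real framework chains asynchronous handlers over live audio, so this only illustrates the composition pattern, not Bolna's actual API.

```python
# Each stage consumes the previous stage's stream and yields its own output,
# mirroring the transcriber -> LLM -> synthesizer chaining described above.
def transcriber(audio_chunks):
    for chunk in audio_chunks:
        yield f"text({chunk})"     # audio frame -> transcript fragment

def llm(transcripts):
    for text in transcripts:
        yield f"reply({text})"     # transcript -> LLM response

def synthesizer(replies):
    for reply in replies:
        yield f"audio({reply})"    # response text -> synthesized audio

# Compose the stages; items move through lazily, one at a time (streaming).
pipeline = synthesizer(llm(transcriber(["c1", "c2"])))
out = list(pipeline)
# out == ["audio(reply(text(c1)))", "audio(reply(text(c2)))"]
```

Because each stage is a generator, a downstream stage can begin work on the first item before upstream stages have finished the whole stream, which is the property that makes streaming (rather than batch) execution possible.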
Use Cases
Bolna lowers the effort required to build voice-first conversational applications by offering a ready-made orchestration framework that handles call initiation, real-time transcription, LLM-driven dialog, and TTS playback. Developers can reuse the provider integrations, run a local dockerized demo, and customize agents via JSON without building low-level telephony or streaming plumbing. The plug-in handler pattern and documented environment variables let teams add new telephony, ASR, or TTS providers and experiment with hosted or self-hosted LLMs. The project is open source and MIT-licensed, includes contribution guidance and examples, and offers the option to engage the maintainers for managed or customized hosted offerings.
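Since provider selection and credentials are driven by environment variables, a startup check like the sketch below can surface misconfiguration early. The variable names here are assumptions for illustration; check the repository's sample environment file for the actual names it expects.

```python
import os

# Illustrative provider selection from the environment, with defaults;
# these variable names are hypothetical, not Bolna's documented ones.
config = {
    "transcriber": os.environ.get("ASR_PROVIDER", "deepgram"),
    "llm": os.environ.get("LLM_PROVIDER", "openai"),
    "synthesizer": os.environ.get("TTS_PROVIDER", "polly"),
}

# Warn about absent credentials before accepting any calls.
required_keys = ["DEEPGRAM_AUTH_TOKEN", "OPENAI_API_KEY"]  # assumed names
missing = [k for k in required_keys if not os.environ.get(k)]
if missing:
    print("missing provider credentials:", ", ".join(missing))
```

Failing fast on missing credentials is preferable to discovering them mid-call, when a transcription or synthesis request errors out against a live caller.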