
Basic Information

Bosquet is a Clojure LLMOps toolkit for building production and experimental applications on top of large language models. It provides primitives and conventions for managing LLM and tool services, composing and templating prompts, chaining prompt logic, and defining agents that interact with external APIs. The project addresses common LLM application concerns such as prompt template complexity, limited model context windows that require memory handling, and the need to combine model calls with external tools. The README covers CLI usage, programmatic generation and chat flows, configuration via config.edn and secrets.edn, and integrations with services such as OpenAI and Ollama. The library is aimed at developers and teams building LLM-based systems rather than at end users.
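
The snippet below is a minimal sketch of the programmatic generation flow described in the README. It assumes the generate function from the bosquet.llm.generator namespace and a backend already configured through config.edn and secrets.edn; exact arguments and option names may differ between versions.

;; Minimal programmatic generation sketch; assumes an LLM backend
;; (for example OpenAI) has been configured via config.edn / secrets.edn.
(require '[bosquet.llm.generator :refer [generate]])

;; A plain string prompt is sent to the configured model. Richer result
;; maps with usage and timing metadata are available through the fuller
;; generation API described in the README.
(generate "Explain the difference between a vector and a list in Clojure in one sentence.")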

App Details

Features
Bosquet bundles a set of practical features for LLM application development. LLM and tool service management lets multiple backends be configured and selected. Prompt templating builds on the Selmer templating library to manage complex templates, and prompt chaining and composition are handled through the Pathom graph engine, which orchestrates multi-step generation workflows. The project provides abstractions for defining agents and tools that call external APIs, along with examples of tool registration and invocation against OpenAI and Ollama backends. It also includes memory handling for LLM conversations, caching of call responses, CLI utilities for configuring services and running demos, and generator APIs that return the full conversation together with completion metadata such as token usage and timing.
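
As an illustration of how templating and chaining fit together, the sketch below combines Selmer template slots with a generation node resolved by Pathom, following the shape of the README's examples. The llm helper appears in those examples; the service keyword, prompt content, and omitted model parameters here are illustrative and may not match the library's current option names.

(require '[bosquet.llm.generator :refer [generate llm]])

;; A generation graph: plain strings are Selmer templates, and (llm ...)
;; nodes are completions produced by the configured backend. Pathom
;; resolves the dependencies between the keys, so {{answer}} is filled
;; in by the :answer node.
(generate
 {:question-answer "Question: {{question}}  Answer: {{answer}}"
  :answer          (llm :openai)}
 {:question "What is the capital of Lithuania?"})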
Use Cases
Bosquet reduces the engineering overhead of building LLM-driven applications by providing structured components for recurring problems. Developers get a consistent way to define prompts, compose multi-turn interactions, and connect model outputs to external tool functions, which cuts down on ad hoc glue code. Memory handling and response caching help manage context window limits and avoid redundant calls, improving cost and performance. The Pathom integration makes it easier to express complex prompt chains and data flows, while CLI commands and documented examples enable quick prototyping and reproducible runs. Overall, it speeds up development of chat applications, agent-based workflows, and LLM pipelines for teams using Clojure and common LLM backends.
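
The sketch below suggests how a multi-turn chat flow might look, loosely following the chat-style examples in the README: the conversation is a vector of role/content tuples, user turns can contain Selmer slots, and assistant turns are produced by llm nodes. The roles, prompts, and service keyword are illustrative, and additional well-known option keys (for example a variable name binding the generated turn) may be required in practice.

(require '[bosquet.llm.generator :refer [generate llm]])

;; Chat-style generation: role/content pairs form the conversation and
;; the assistant turn is generated by the configured backend.
(generate
 [[:system "You are a concise travel assistant."]
  [:user "Suggest three places to visit in {{country}}."]
  [:assistant (llm :openai)]]
 {:country "Japan"})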
