OpenInference


Basic Information

OpenInference provides a specification, semantic conventions, and language-specific plugins for instrumenting and tracing AI applications, designed to complement OpenTelemetry. The repo defines a transport-agnostic spec and semantic conventions that capture LLM invocations, retrieval steps, external tool calls, and application context. It hosts instrumentation libraries and reusable utilities for Python, JavaScript, and Java, with packaged modules for many SDKs and frameworks. The project includes example applications and integrations for common stacks such as LangChain, LlamaIndex, the OpenAI SDKs, VertexAI, Bedrock, and multiple agent frameworks. It is intended for developers and observability teams who need structured tracing data from LLM-powered services, and it supports exporting spans to Arize-Phoenix, Arize, or any OpenTelemetry-compatible collector.
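The shape of the data these conventions describe can be illustrated with a plain-Python sketch that builds the flat attribute dictionary an LLM span might carry. The attribute keys shown (`openinference.span.kind`, `llm.model_name`, `input.value`, `output.value`) follow the project's published semantic conventions, but treat the exact key names as an assumption to verify against the spec directory.

```python
# Illustrative sketch only -- no OpenInference packages required.
# Attribute keys are assumed from the OpenInference semantic conventions;
# confirm exact names against the spec directory in the repo.

def make_llm_span_attributes(model: str, prompt: str, completion: str) -> dict:
    """Build a flat attribute dict in the style OpenInference attaches to spans."""
    return {
        "openinference.span.kind": "LLM",   # span kind distinguishes LLM calls
        "llm.model_name": model,            # which model was invoked
        "input.value": prompt,              # the prompt sent to the model
        "output.value": completion,         # the completion returned
    }

attrs = make_llm_span_attributes("gpt-4", "What is tracing?", "Tracing records...")
print(attrs["openinference.span.kind"])  # LLM
```

Because the attributes are plain key-value pairs, they ride on ordinary OpenTelemetry spans and survive any OTEL-compatible transport unchanged.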


App Details

Features
- A transport- and file-format-agnostic specification, stored in a spec directory, defining semantic conventions for LLM apps.
- Language libraries for Python, JavaScript, and Java that include core utilities, semantic conventions, and instrumentation modules.
- Prebuilt instrumentations for many SDKs and frameworks, including OpenAI, LangChain, LlamaIndex, VertexAI, Bedrock, MistralAI, Anthropic, and LangChain4j.
- Published packages available via PyPI, NPM, and Maven Central.
- Numerous beginner-to-intermediate examples covering RAG, chatbots, LangChain, FastAPI backends, Next.js frontends, and agents.
- Support for exporting spans to Arize-Phoenix, Arize, or any OTEL-compatible collector.
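As a sketch of how one of these prebuilt instrumentations is typically wired into an OpenTelemetry tracer provider, the snippet below uses the Python OpenAI instrumentor. The package and class names match the `openinference-instrumentation-openai` package on PyPI, but treat the exact API surface as an assumption to check against that package's README.

```python
# Assumed setup fragment for the Python OpenAI instrumentor; names taken from
# the openinference-instrumentation-openai package -- verify against its README.
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import SimpleSpanProcessor, ConsoleSpanExporter
from openinference.instrumentation.openai import OpenAIInstrumentor

# Standard OTEL SDK setup: a provider with a processor/exporter pair.
tracer_provider = TracerProvider()
tracer_provider.add_span_processor(SimpleSpanProcessor(ConsoleSpanExporter()))

# After this call, OpenAI SDK invocations emit OpenInference-conformant spans.
OpenAIInstrumentor().instrument(tracer_provider=tracer_provider)
```

Swapping `ConsoleSpanExporter` for an OTLP exporter pointed at a collector or Phoenix instance is the usual production configuration.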
Use Cases
OpenInference standardizes how AI model calls and related application events are traced so teams can observe, debug, and analyze LLM behavior in production. By providing semantic conventions and ready-made instrumentations, it reduces integration effort and ensures consistent span data across different SDKs and languages. The examples and utilities accelerate adoption in common architectures like RAG pipelines, agentic frameworks, and web backends. Because it is compatible with OpenTelemetry collectors and Arize destinations, organizations can route trace data into existing observability platforms for monitoring, performance analysis, and incident investigation without rearchitecting their telemetry stack.
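Because the spans are standard OpenTelemetry data, routing them into an existing observability platform is usually a matter of pointing the SDK at a collector via the standard OTEL environment variables. The variable names below come from the OpenTelemetry specification; the endpoint value is a placeholder for your own collector or Phoenix instance.

```shell
# Standard OpenTelemetry SDK environment variables (per the OTEL spec).
# The endpoint is a placeholder -- substitute your collector or Phoenix URL.
export OTEL_EXPORTER_OTLP_ENDPOINT="http://localhost:4317"
export OTEL_EXPORTER_OTLP_PROTOCOL="grpc"
export OTEL_SERVICE_NAME="my-llm-service"
```

Since only the exporter destination changes, the application code and its OpenInference instrumentation stay untouched when trace data is redirected between backends.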
