openinference
Basic Information
OpenInference provides a specification, semantic conventions, and language-specific plugins for instrumenting and tracing AI applications, designed to complement OpenTelemetry. The repository defines a transport-agnostic spec and conventions for capturing LLM invocations, retrieval steps, external tool calls, and surrounding application context. It hosts instrumentation libraries and reusable utilities for Python, JavaScript, and Java, with packaged modules for many popular SDKs and frameworks. The project also includes example applications and integrations for common stacks such as LangChain, LlamaIndex, the OpenAI SDKs, VertexAI, Bedrock, and several agent frameworks. It is aimed at developers and observability teams who need structured trace data from LLM-powered services, and it supports exporting spans to Arize-Phoenix, Arize, or any OpenTelemetry-compatible collector.
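To make the conventions concrete, the sketch below builds the kind of flat attribute dictionary an OpenTelemetry span would carry for a single LLM call. The helper function and sample values are hypothetical; the attribute keys follow the general shape of the OpenInference semantic conventions (e.g. "openinference.span.kind"), but exact key names should be verified against the published spec.

```python
# Hypothetical sketch: OpenInference-style span attributes for one LLM call.
# Key names approximate the semantic conventions; check the spec for exact keys.

def llm_span_attributes(model, prompt, completion, prompt_tokens, completion_tokens):
    """Build a flat attribute dict of the kind attached to an OTel span."""
    return {
        "openinference.span.kind": "LLM",          # span kind: LLM, RETRIEVER, TOOL, ...
        "llm.model_name": model,                   # which model served the call
        "input.value": prompt,                     # raw input to the model
        "output.value": completion,                # raw output from the model
        "llm.token_count.prompt": prompt_tokens,
        "llm.token_count.completion": completion_tokens,
        "llm.token_count.total": prompt_tokens + completion_tokens,
    }

attrs = llm_span_attributes(
    model="gpt-4o-mini",
    prompt="What is tracing?",
    completion="Tracing records the path of a request through a system.",
    prompt_tokens=12,
    completion_tokens=48,
)
print(attrs["openinference.span.kind"])  # LLM
print(attrs["llm.token_count.total"])    # 60
```

Because the attributes are plain key-value pairs, any OpenTelemetry-compatible exporter or collector can forward them unchanged; backends that understand the conventions can then reconstruct the LLM call semantics.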