<p align="center"> <img alt="OpenInference" src="https://raw.githubusercontent.com/Arize-ai/phoenix-assets/main/logos/OpenInference/Full%20color/OI-full-horiz.svg" width="40%" height="100%" /> </p> <p align="center"> <a href="https://arize-ai.github.io/openinference/"> <img src="https://img.shields.io/static/v1?message=Spec&labelColor=grey&color=blue&logoColor=white&label=%20"/> </a> <a target="_blank" href="https://join.slack.com/t/arize-ai/shared_invite/zt-1px8dcmlf-fmThhDFD_V_48oU7ALan4Q"> <img src="https://img.shields.io/static/v1?message=Community&logo=slack&labelColor=grey&color=blue&logoColor=white&label=%20"/> </a> </p>

OpenInference is a set of conventions and plugins that complements OpenTelemetry to enable tracing of AI applications. OpenInference is natively supported by arize-phoenix, but can be used with any OpenTelemetry-compatible backend as well.

Specification

The OpenInference specification is edited in markdown files found in the spec directory. It is designed to provide insight into the invocation of LLMs and the surrounding application context, such as retrieval from vector stores and the use of external tools such as search engines or APIs. The specification is transport- and file-format agnostic and is intended to be used in conjunction with formats such as JSON, Protobuf, and DataFrames.
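As an illustrative sketch, an LLM span serialized as JSON might carry OpenInference semantic-convention attributes like the following. The attribute keys follow the published conventions; the span name and all values here are hypothetical:

```json
{
  "name": "ChatCompletion",
  "attributes": {
    "openinference.span.kind": "LLM",
    "llm.model_name": "gpt-4",
    "input.value": "What is OpenInference?",
    "output.value": "A set of tracing conventions for AI applications.",
    "llm.token_count.prompt": 12,
    "llm.token_count.completion": 25
  }
}
```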

Instrumentation

OpenInference provides a set of instrumentations for popular machine learning SDKs and frameworks in a variety of languages.

Python

Libraries

| Package | Description | Version |
| --- | --- | --- |
| openinference-semantic-conventions | Semantic conventions for tracing of LLM apps. | PyPI Version |
| openinference-instrumentation-openai | OpenInference Instrumentation for OpenAI SDK. | PyPI Version |
| openinference-instrumentation-llama-index | OpenInference Instrumentation for LlamaIndex. | PyPI Version |
| openinference-instrumentation-dspy | OpenInference Instrumentation for DSPy. | PyPI Version |
| openinference-instrumentation-bedrock | OpenInference Instrumentation for AWS Bedrock. | PyPI Version |
| openinference-instrumentation-langchain | OpenInference Instrumentation for LangChain. | PyPI Version |
| openinference-instrumentation-mistralai | OpenInference Instrumentation for MistralAI. | PyPI Version |
| openinference-instrumentation-guardrails | OpenInference Instrumentation for Guardrails. | PyPI Version |
| openinference-instrumentation-vertexai | OpenInference Instrumentation for VertexAI. | PyPI Version |
| openinference-instrumentation-crewai | OpenInference Instrumentation for CrewAI. | PyPI Version |
| openinference-instrumentation-haystack | OpenInference Instrumentation for Haystack. | PyPI Version |
| openinference-instrumentation-litellm | OpenInference Instrumentation for LiteLLM. | PyPI Version |
| openinference-instrumentation-groq | OpenInference Instrumentation for Groq. | PyPI Version |
| openinference-instrumentation-instructor | OpenInference Instrumentation for Instructor. | PyPI Version |
| openinference-instrumentation-anthropic | OpenInference Instrumentation for Anthropic. | PyPI Version |

Examples

| Name | Description | Complexity Level |
| --- | --- | --- |
| OpenAI SDK | OpenAI Python SDK, including chat completions and embeddings | Beginner |
| MistralAI SDK | MistralAI Python SDK | Beginner |
| VertexAI SDK | VertexAI Python SDK | Beginner |
| LlamaIndex | LlamaIndex query engines | Beginner |
| DSPy | DSPy primitives and custom RAG modules | Beginner |
| Boto3 Bedrock Client | Boto3 Bedrock client | Beginner |
| LangChain | LangChain primitives and simple chains | Beginner |
| LiteLLM | A lightweight LiteLLM framework | Beginner |
| LiteLLM Proxy | LiteLLM Proxy to log OpenAI, Azure, Vertex, Bedrock | Beginner |
| Groq | Groq and AsyncGroq chat completions | Beginner |
| Anthropic | Anthropic Messages client | Beginner |
| LlamaIndex + Next.js Chatbot | A fully functional chatbot using Next.js and a LlamaIndex FastAPI backend | Intermediate |
| LangServe | A LangChain application deployed with LangServe using custom metadata on a per-request basis | Intermediate |
| DSPy | A DSPy RAG application using FastAPI, Weaviate, and Cohere | Intermediate |
| Haystack | A Haystack QA RAG application | Intermediate |

JavaScript

Libraries

| Package | Description | Version |
| --- | --- | --- |
| @arizeai/openinference-semantic-conventions | Semantic conventions for tracing of LLM apps. | NPM Version |
| @arizeai/openinference-core | Core utility functions for instrumentation. | NPM Version |
| @arizeai/openinference-instrumentation-openai | OpenInference Instrumentation for OpenAI SDK. | NPM Version |
| @arizeai/openinference-instrumentation-langchain | OpenInference Instrumentation for LangChain.js. | NPM Version |
| @arizeai/openinference-vercel | OpenInference support for Vercel AI SDK. | NPM Version |

Examples

| Name | Description | Complexity Level |
| --- | --- | --- |
| OpenAI SDK | OpenAI Node.js client | Beginner |
| LlamaIndex Express App | A fully functional LlamaIndex chatbot with a Next.js frontend and a LlamaIndex Express backend, instrumented using openinference-instrumentation-openai | Intermediate |
| LangChain OpenAI | A simple script to call OpenAI via LangChain, instrumented using openinference-instrumentation-langchain | Beginner |
| LangChain RAG Express App | A fully functional LangChain chatbot that uses RAG to answer user questions. It has a Next.js frontend and a LangChain Express backend, instrumented using openinference-instrumentation-langchain | Intermediate |
| Next.js + OpenAI | A Next.js 13 project bootstrapped with create-next-app that uses OpenAI to generate text | Beginner |

Supported Destinations

OpenInference spans can be sent to a number of destinations acting as span collectors, including arize-phoenix and any other OpenTelemetry-compatible backend.

Community

Join our community to connect with thousands of machine learning practitioners and LLM observability enthusiasts!