## Supported Frameworks

Phoenix provides first-class support for the most popular LLM application frameworks:

- **LangChain**: Auto-instrument LangChain applications in Python and JavaScript
- **LlamaIndex**: Trace LlamaIndex queries, retrievals, and agent workflows
- **OpenAI**: Monitor OpenAI API calls, including GPT-4, embeddings, and agents
- **Anthropic**: Instrument Claude API calls with streaming support
- **OpenTelemetry**: Custom instrumentation using OpenTelemetry primitives
## LLM Providers

Phoenix supports instrumentation for all major LLM providers.

### Python Integrations
| Provider | Package | Description |
|---|---|---|
| OpenAI | `openinference-instrumentation-openai` | GPT models, embeddings, and function calling |
| Anthropic | `openinference-instrumentation-anthropic` | Claude models with streaming support |
| Google GenAI | `openinference-instrumentation-google-genai` | Gemini and PaLM models |
| AWS Bedrock | `openinference-instrumentation-bedrock` | Amazon Bedrock models |
| MistralAI | `openinference-instrumentation-mistralai` | Mistral models |
| VertexAI | `openinference-instrumentation-vertexai` | Google Vertex AI models |
| Groq | `openinference-instrumentation-groq` | Groq inference engine |
| LiteLLM | `openinference-instrumentation-litellm` | Unified interface for 100+ LLMs |
### JavaScript/TypeScript Integrations
| Provider | Package | Description |
|---|---|---|
| OpenAI | `@arizeai/openinference-instrumentation-openai` | OpenAI Node.js SDK |
| LangChain.js | `@arizeai/openinference-instrumentation-langchain` | LangChain JavaScript framework |
| Vercel AI SDK | `@arizeai/openinference-vercel` | Vercel AI SDK streaming |
| BeeAI | `@arizeai/openinference-instrumentation-beeai` | BeeAI agent framework |
## Agent Frameworks

Phoenix supports popular agent and workflow frameworks:

- **DSPy**: `openinference-instrumentation-dspy`
- **CrewAI**: `openinference-instrumentation-crewai`
- **Haystack**: `openinference-instrumentation-haystack`
- **Guardrails**: `openinference-instrumentation-guardrails`
- **Instructor**: `openinference-instrumentation-instructor`
- **Agno**: `openinference-instrumentation-agno`
- **Pydantic AI**: `openinference-instrumentation-pydantic-ai`
## Getting Started

Most integrations follow a simple three-step pattern: install the instrumentation package, register Phoenix as the trace destination, and enable the instrumentor.

## How It Works
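As a sketch of that three-step pattern, using the LangChain integration as an example (package names as listed above; `my-app` is a placeholder project name):

```python
# Step 1 (shell): pip install arize-phoenix openinference-instrumentation-langchain

# Step 2: register Phoenix as the OpenTelemetry trace destination.
from phoenix.otel import register
from openinference.instrumentation.langchain import LangChainInstrumentor

tracer_provider = register(project_name="my-app")  # placeholder project name

# Step 3: enable the instrumentor; LangChain calls are traced from here on.
LangChainInstrumentor().instrument(tracer_provider=tracer_provider)
```

Other integrations swap in their own instrumentor class (for example, `OpenAIInstrumentor` from `openinference-instrumentation-openai`); steps 1 and 2 stay the same.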
Phoenix integrations use OpenTelemetry to capture traces from your LLM applications:

- **Auto-instrumentation**: Instrumentation packages automatically patch SDK methods to capture trace data
- **OpenInference conventions**: Traces follow standardized semantic conventions for LLM observability
- **OTLP export**: Trace data is exported to Phoenix using the OpenTelemetry Protocol (OTLP)
- **Zero code changes**: Most integrations require no changes to your application code
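A minimal sketch of the auto-instrumentation idea above, using a toy client class rather than a real SDK (`FakeChatClient`, `instrument`, and `captured_spans` are illustrative names, not part of any library):

```python
import functools

class FakeChatClient:
    """Stand-in for an LLM SDK class; not a real library."""
    def complete(self, prompt):
        return f"echo: {prompt}"

captured_spans = []

def instrument(cls, method_name):
    """Wrap a method so every call records a span-like dict,
    analogous to what auto-instrumentation does to real SDK methods."""
    original = getattr(cls, method_name)

    @functools.wraps(original)
    def wrapper(self, *args, **kwargs):
        result = original(self, *args, **kwargs)
        captured_spans.append({
            "name": f"{cls.__name__}.{method_name}",
            "input": args,
            "output": result,
        })
        return result

    setattr(cls, method_name, wrapper)

instrument(FakeChatClient, "complete")

client = FakeChatClient()
client.complete("hello")  # application code is unchanged
```

After the call, `captured_spans` holds one record naming the patched method with its input and output, which is why instrumented applications need no code changes of their own.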
All Phoenix integrations are open source and part of the OpenInference project.
## Next Steps

- **OpenAI Integration**: Get started with OpenAI tracing
- **LangChain Integration**: Instrument LangChain applications
- **Custom Instrumentation**: Build custom traces with OpenTelemetry
- **All Integrations**: Browse all available integrations