Core Concepts
Sessions
Sessions track all LLM calls within a context for debugging and analysis. They automatically capture traces and metadata, and provide access to the collected data.

Trajectories

Trajectories represent multi-step workflows where each LLM call becomes a step with an assignable reward. Use the @trajectory decorator to automatically convert function execution into structured trajectories.
Installation
The SDK is included in the rllm package:
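Assuming the package is published under the name `rllm`, installation is the usual pip command:

```shell
# Installs rllm, which bundles the SDK (package name assumed).
pip install rllm
```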
Quick Start
Basic Usage
Nested Sessions with Metadata Inheritance
Sessions can be nested, and metadata is automatically merged.

Architecture
Data Models
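The three models described below can be sketched as plain dataclasses for intuition; the SDK itself uses Pydantic models, and the field names here are assumptions:

```python
# Hypothetical sketch of the model hierarchy: Trace -> StepView -> TrajectoryView.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Trace:
    """Low-level record of a single LLM call."""
    prompt: str
    response: str
    model: str = "unknown"

@dataclass
class StepView:
    """A trace plus a reward, suitable for RL training."""
    trace: Trace
    reward: float = 0.0

@dataclass
class TrajectoryView:
    """An ordered collection of steps forming one workflow."""
    steps: List[StepView] = field(default_factory=list)

traj = TrajectoryView(steps=[
    StepView(Trace("q1", "a1"), reward=0.5),
    StepView(Trace("q2", "a2"), reward=1.0),
])
print(sum(s.reward for s in traj.steps))  # 1.5
```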
The SDK uses three primary data models:

Trace

Low-level trace from a single LLM call.

StepView

Trace wrapper with a reward field for RL training.

TrajectoryView

Collection of steps forming a complete workflow.

Core Functions
Session Management
Chat Clients
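One common shape for a tracing chat client is a thin wrapper that records a trace per call. This sketch is hypothetical and does not reflect the SDK's real chat-client API:

```python
# Hypothetical sketch: wrap any callable model so every chat() call is traced.
from typing import Callable, Dict, List

class TracedChatClient:
    def __init__(self, model_fn: Callable[[str], str]):
        self.model_fn = model_fn
        self.traces: List[Dict[str, str]] = []

    def chat(self, prompt: str) -> str:
        response = self.model_fn(prompt)
        self.traces.append({"prompt": prompt, "response": response})
        return response

client = TracedChatClient(lambda p: p.upper())  # stand-in model
client.chat("hi there")
print(client.traces)  # [{'prompt': 'hi there', 'response': 'HI THERE'}]
```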
Trajectory Decorator
Design Principles
- Minimal API surface: Simple, focused functions
- Context-based: Uses Python’s contextvars for automatic propagation
- Distributed-ready: OpenTelemetry backend for cross-process tracing
- Pluggable storage: Supports in-memory, SQLite, or custom backends
- Type-safe: Full type annotations with Pydantic models
- Async-native: First-class async/await support
- Proxy-integrated: Built-in support for LiteLLM proxy routing
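The "Context-based" point above is what lets sessions work without threading a handle through every call signature: a `ContextVar` set in a coroutine is visible to everything it awaits. A small standalone example:

```python
# Standalone illustration of contextvars propagation in async code;
# `session_id` and `llm_call` are illustrative names, not SDK API.
import asyncio
import contextvars

session_id: contextvars.ContextVar = contextvars.ContextVar("session_id", default=None)

async def llm_call() -> str:
    # Reads the ambient value without it being passed as an argument.
    return f"traced under {session_id.get()}"

async def main() -> str:
    session_id.set("run-42")
    return await llm_call()  # awaited code shares the caller's context

result = asyncio.run(main())
print(result)  # traced under run-42
```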
Next Steps
Sessions
Learn about session contexts and management
Trajectories
Understand trajectory tracking and rewards
Integrations
Integrate with LangGraph, SmolAgent, and more
API Reference
Explore the complete API documentation