What is Observatory?
Observatory is an open-source AI agent observability SDK that helps you monitor, trace, and debug your AI applications in real time. Built by The Context Company, it provides deep visibility into how your AI agents behave, what they do, and where they might be going wrong.
Local-first by default — Observatory can run completely offline with no account, API key, or external dependencies required.
Why Observatory?
Building AI agents is fundamentally different from traditional software development. With AI agents:
- Non-deterministic behavior makes it hard to predict what your agent will do
- Workflows are complex and multi-step, involving LLM calls, tool usage, and reasoning steps
- Token costs and latency require careful monitoring and optimization
- User feedback is critical for understanding agent performance in production
Key Features
Local-First Mode
Run Observatory completely offline without any account or API key. Perfect for development, debugging, and privacy-sensitive applications.
Local mode currently supports Vercel AI SDK on Next.js. Support for other frameworks is coming soon.
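As a rough sketch, local-mode setup in a Next.js app happens in instrumentation.ts, which Next.js runs on server startup. The registerObservatory name and its options below are assumptions for illustration, not the documented API; see the quickstart for the actual call.

```typescript
// instrumentation.ts — Next.js executes this file when the server starts.
// NOTE: `registerObservatory` and its options are hypothetical names used
// for illustration; check the Observatory quickstart for the real export.
import { registerObservatory } from "@contextcompany/otel";

export function register() {
  registerObservatory({
    local: true, // local-first mode: no account or API key required
  });
}
```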
Real-Time Visualization Widget
See your AI agent traces in real time with an in-browser overlay that shows:
- LLM requests and responses
- Token usage and estimated costs
- Tool calls and execution times
- Multi-step agent workflows
- Performance metrics and latency
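As a mental model, each trace the widget renders can be thought of as a list of typed steps. The interfaces below are illustrative only, not the widget's actual schema:

```typescript
// Illustrative shape of a trace (not Observatory's real schema).
interface LlmSpan {
  model: string;
  promptTokens: number;
  completionTokens: number;
  estimatedCostUsd: number;
  latencyMs: number;
}

interface ToolSpan {
  toolName: string;
  durationMs: number;
}

interface AgentTrace {
  steps: Array<LlmSpan | ToolSpan>;
}

// Example: aggregate token usage across a two-step run.
const trace: AgentTrace = {
  steps: [
    { model: "example-model", promptTokens: 120, completionTokens: 48, estimatedCostUsd: 0.002, latencyMs: 850 },
    { toolName: "web_search", durationMs: 320 },
  ],
};

const totalTokens = trace.steps
  .filter((s): s is LlmSpan => "model" in s)
  .reduce((sum, s) => sum + s.promptTokens + s.completionTokens, 0); // 168
```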

OpenTelemetry-Based Instrumentation
Observatory uses industry-standard OpenTelemetry for tracing, making it:
- Compatible with existing observability tools
- Extensible for custom instrumentation needs
- Standards-based rather than proprietary
Multi-Framework Support
Observatory provides first-class integrations for popular AI frameworks:
Vercel AI SDK
Automatic instrumentation via OpenTelemetry for Next.js applications
Claude Agent SDK
Built-in observability for Anthropic’s Claude agents
Mastra
Native integration with the Mastra AI framework
Custom Agents
Manual instrumentation SDK for any TypeScript or Python agent
Session and Run Tracking
Track conversations and individual AI interactions with built-in session management.
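Conceptually, a session groups the runs of one conversation, and each run represents a single AI interaction. The sketch below is an illustrative in-memory model, not Observatory's actual API:

```typescript
// Illustrative in-memory model of sessions and runs (not Observatory's API).
interface Run {
  id: string;
  input: string;
  feedback?: "up" | "down"; // user feedback can be linked to a specific run
}

class Session {
  readonly runs: Run[] = [];
  constructor(readonly id: string) {}

  // Start one run per AI interaction within the conversation.
  startRun(input: string): Run {
    const run: Run = { id: `run-${this.runs.length + 1}`, input };
    this.runs.push(run);
    return run;
  }
}

// Usage: one session per conversation, one run per request.
const session = new Session("session-abc");
const run = session.startRun("Summarize this document");
run.feedback = "up";
```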
User Feedback Integration
Capture and link user feedback (thumbs up/down, comments) directly to specific agent runs for continuous improvement.
How It Works
Observatory follows a simple architecture:
- Instrumentation — Observatory wraps your AI framework calls using OpenTelemetry
- Collection — Traces are collected locally or sent to The Context Company backend
- Visualization — The widget displays traces in your browser in real time
- Analysis — View detailed metrics, logs, and performance data
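The four steps above can be sketched as a toy pipeline. This is illustrative only; Observatory's real implementation is built on OpenTelemetry spans:

```typescript
// Toy sketch of the instrument → collect → visualize flow
// (illustrative only, not Observatory's internals).

interface Span {
  name: string;
  durationMs: number;
}

// 2. Collection — a local, in-memory trace buffer.
const collected: Span[] = [];

// 1. Instrumentation — wrap any agent call in a timed span.
function traced<T>(name: string, fn: () => T): T {
  const start = Date.now();
  try {
    return fn();
  } finally {
    collected.push({ name, durationMs: Date.now() - start });
  }
}

// 3./4. Visualization and analysis — render the collected spans.
function report(): string {
  return collected.map((s) => `${s.name}: ${s.durationMs}ms`).join("\n");
}

// Usage:
traced("llm.generate", () => "draft answer");
traced("tool.search", () => ["result"]);
// report() now contains one line per span.
```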
Observatory Packages
Observatory is a monorepo containing multiple packages for different use cases:
TypeScript/JavaScript
- @contextcompany/otel — OpenTelemetry integration for instrumenting AI SDK calls in Node.js and Next.js
- @contextcompany/widget — Local-first UI overlay for visualizing traces in real time
- @contextcompany/claude — Instrumentation for Claude Agent SDK
- @contextcompany/mastra — Integration for the Mastra AI framework
- @contextcompany/custom — Manual instrumentation SDK for custom agents
Python
- contextcompany — Observatory SDK for Python agents
Use Cases
Development
- Debug agent behavior in real time
- Understand multi-step reasoning workflows
- Identify performance bottlenecks
- Monitor token usage and costs
Testing
- Validate agent outputs and tool calls
- Ensure proper error handling
- Test edge cases and failure modes
Production
- Monitor live agent performance
- Collect user feedback
- Track costs and usage patterns
- Identify and fix issues quickly
Privacy & Telemetry
When running in local mode, Observatory collects limited anonymous usage data by default. No sensitive or personally identifiable information is ever collected. You can view exactly which events and values are tracked in the source code. To disable anonymous telemetry, set the TCC_DISABLE_ANONYMOUS_TELEMETRY environment variable.
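For example, in your .env file (the exact value the SDK checks for, e.g. 1 or true, is visible in the source):

```shell
# Disable Observatory's anonymous usage telemetry
TCC_DISABLE_ANONYMOUS_TELEMETRY=1
```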
Open Source
Observatory is 100% open source and available on GitHub. Contributions are welcome!
View Source Code
Explore the code, report issues, or contribute to Observatory
Contributing Guide
Learn how to contribute to the project
Next Steps
Ready to get started? Follow the quickstart guide to add Observatory to your AI application:
Quickstart Guide
Get Observatory running in your Next.js + AI SDK application in 5 minutes
