
What is OpenLLMetry?
OpenLLMetry is an open source project that enables monitoring and debugging of LLM application execution. It provides non-intrusive tracing built on top of OpenTelemetry, allowing you to export traces to your existing observability stack. Because it’s built on OpenTelemetry, you can use any OpenTelemetry-compatible backend (Jaeger, Zipkin, Datadog, New Relic, and so on) or the hosted Traceloop platform.

What you get with the Writer integration
The Writer integration with OpenLLMetry provides enhanced observability beyond standard OpenTelemetry traces. You’ll see detailed information about your Writer API calls as additional span attributes:
- gen_ai.request.model: the model requested (for example, palmyra-x5)
- gen_ai.prompt: array of prompts sent to the Writer model
- gen_ai.completion: array of completions returned from Writer
- gen_ai.usage.total_tokens: total tokens used
- gen_ai.usage.prompt_tokens: number of tokens used for prompts
- gen_ai.usage.completion_tokens: number of tokens used for completions
For a complete list of supported attributes, see the full GenAI Semantic Conventions.
Prerequisites
Before you begin, make sure you have:
- Python 3.11 or higher installed
- A Writer AI Studio account
- A Traceloop account
- A Writer API key. See instructions in the API Quickstart
- Basic familiarity with Python and OpenLLMetry concepts
The example below uses Traceloop as the observability provider. See the connect to an external provider section to learn how to connect to other providers.
Configuration details
| Environment Variable | Required | Default | Description |
| --- | --- | --- | --- |
| WRITER_API_KEY | Required | None | Writer API key for authenticating API calls. Must be set for the application to function. |
| TRACELOOP_API_KEY | Optional | None | If unset and TRACELOOP_BASE_URL is https://api.traceloop.com, the SDK generates a new API key in the Traceloop dashboard on first run. |
| TRACELOOP_BASE_URL | Optional | https://api.traceloop.com | OpenTelemetry endpoint to connect to. If prefixed with http or https, the SDK uses the OTLP/HTTP protocol; otherwise it uses OTLP/gRPC. The SDK appends /v1/traces to the URL. |
| TRACELOOP_HEADERS | Optional | None | Custom HTTP headers for authentication. If set, the API key is ignored. |
| TRACELOOP_TRACE_CONTENT | Optional | true | Enable or disable logging of prompts, completions, and embeddings to span attributes. |
| TRACELOOP_TELEMETRY | Optional | true | Enable or disable anonymous telemetry data collection. |
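For example, a minimal local setup that exports traces to the Traceloop platform might only set the two variables below; both values are placeholders.

```bash
# Placeholder values: substitute your own keys from Writer AI Studio and Traceloop.
export WRITER_API_KEY="<your-writer-api-key>"
# Optional: if unset, the SDK generates a key in the Traceloop dashboard on first run.
export TRACELOOP_API_KEY="<your-traceloop-api-key>"
```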
For additional configuration option details, see the SDK initialization docs.
Integrate Writer with OpenLLMetry
1. Install the SDK
Install OpenLLMetry in your Python environment:
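For example, with pip (the writer-sdk package name for the Writer Python SDK is an assumption here; skip it if your application already includes the Writer client):

```bash
# traceloop-sdk is the OpenLLMetry Python package; writer-sdk is the Writer Python client.
pip install traceloop-sdk writer-sdk
```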
2. Initialize and configure
In your Writer application, initialize the Traceloop tracer. For local development, you may want to disable batch sending so you can see traces immediately.
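Here’s a minimal sketch, assuming you use the Writer Python SDK (writerai) alongside the Traceloop SDK; the app name, model, and prompt are placeholders.

```python
import os

from traceloop.sdk import Traceloop
from writerai import Writer

# Initialize OpenLLMetry. disable_batch=True exports each span immediately,
# which is convenient for local development.
Traceloop.init(app_name="writer-openllmetry-demo", disable_batch=True)

# The Writer client authenticates with your Writer API key.
client = Writer(api_key=os.environ["WRITER_API_KEY"])

# This call is traced automatically; no extra instrumentation code is required.
response = client.chat.chat(
    model="palmyra-x5",
    messages=[{"role": "user", "content": "Write a haiku about observability."}],
)
print(response.choices[0].message.content)
```

With batch sending enabled (the default), spans are buffered and exported periodically, which is generally the better choice for production workloads.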
3. (Optional) Annotate your workflows
Workflow annotations are useful when you have related operations that you want to group together for better trace organization and debugging. Workflow annotations are optional: OpenLLMetry traces Writer API calls automatically, without any decorators or additional code.
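For example, using the workflow and task decorators from the Traceloop SDK (the function names and workflow names below are placeholders):

```python
from traceloop.sdk import Traceloop
from traceloop.sdk.decorators import task, workflow
from writerai import Writer

Traceloop.init(app_name="writer-openllmetry-demo", disable_batch=True)
client = Writer()  # Reads WRITER_API_KEY from the environment

@task(name="summarize_article")
def summarize(text: str) -> str:
    # Writer calls made inside this function are attached to the task span.
    response = client.chat.chat(
        model="palmyra-x5",
        messages=[{"role": "user", "content": f"Summarize this article:\n{text}"}],
    )
    return response.choices[0].message.content

@workflow(name="article_pipeline")
def article_pipeline(text: str) -> str:
    # The workflow span groups every task and Writer call made within it.
    return summarize(text)
```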
Connect to external providers
Because the Traceloop SDK is built on OpenTelemetry, the data it generates can be used in any observability platform that supports the OpenTelemetry standard. OpenLLMetry extends this by using the OTLP protocol to connect with external observability providers, letting you send traces directly to your existing stack without relying on Traceloop’s hosted platform.

Configure external provider connection
To connect to an external provider, configure the following environment variables:

| Environment Variable | Description |
| --- | --- |
| TRACELOOP_BASE_URL | The OTLP endpoint URL for your observability provider |
| TRACELOOP_HEADERS | Authentication headers (if required by your provider) |
Example configurations
OpenLLMetry provides example configurations for providers such as Grafana Cloud, Datadog, and New Relic.
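The general pattern looks like the sketch below; the endpoint URL and header name are placeholders, so substitute the OTLP endpoint and authentication header documented by your provider.

```bash
# Placeholder values: replace with your provider's OTLP endpoint and the
# authentication header it expects (for example, an API key or basic auth token).
export TRACELOOP_BASE_URL="https://otlp.your-provider.example.com"
export TRACELOOP_HEADERS="x-api-key=<your-provider-api-key>"
```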
For a complete list of supported integrations, see the full OpenLLMetry integrations catalog.
Next steps
Now that you’ve set up Writer with OpenLLMetry, you can start monitoring your LLM applications. See the following resources to help you get the most out of your integration:
- Explore OpenLLMetry’s advanced features for more detailed tracing capabilities
- Learn about workflow annotations to better organize your traces
- Set up custom integrations with your existing observability stack
- Review the Writer OpenLLMetry integration for additional details and examples