This guide shows you how to integrate Writer with OpenLLMetry for monitoring and debugging your LLM applications. After completing these steps, you can trace and observe your Writer API calls alongside other LLM operations in your observability stack.

(Screenshot: Traceloop dashboard showing Writer integration with OpenLLMetry)

What is OpenLLMetry?

OpenLLMetry is an open-source project that enables monitoring and debugging of LLM application execution. It provides non-intrusive tracing built on top of OpenTelemetry, allowing you to export traces to your existing observability stack: any OpenTelemetry-compatible backend (Jaeger, Zipkin, Datadog, New Relic, and so on) or the hosted Traceloop platform.

What you get with the Writer integration

The Writer integration with OpenLLMetry provides enhanced observability beyond standard OpenTelemetry traces. You’ll see detailed information about your Writer API calls as additional span attributes:
  • gen_ai.request.model - The model requested (e.g., palmyra-x5)
  • gen_ai.prompt - Array of prompts sent to the Writer model
  • gen_ai.completion - Array of completions returned from Writer
  • gen_ai.usage.total_tokens - Total tokens used
  • gen_ai.usage.prompt_tokens - Number of tokens used for prompts
  • gen_ai.usage.completion_tokens - Number of tokens used for completions
This gives you comprehensive visibility into your Writer API usage, including prompt engineering insights, token consumption patterns, and performance characteristics that aren’t available in standard OpenTelemetry traces.
For a complete list of supported attributes, see the full GenAI Semantic Conventions.
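If you want to spot-check these attributes locally, one option is a small OpenTelemetry span processor that prints every gen_ai.* attribute as spans end. This is a minimal sketch, not part of the integration itself: it assumes Traceloop installs a standard OpenTelemetry SDK tracer provider, and the GenAIAttributePrinter class is illustrative.

from opentelemetry import trace
from opentelemetry.sdk.trace import SpanProcessor

class GenAIAttributePrinter(SpanProcessor):
    # Illustrative helper: print gen_ai.* span attributes as spans end
    def on_end(self, span):
        for key, value in (span.attributes or {}).items():
            if key.startswith("gen_ai."):
                print(f"{span.name}: {key} = {value}")

# Assumes Traceloop.init() has set up an SDK TracerProvider that
# supports add_span_processor (true for the standard OpenTelemetry SDK)
trace.get_tracer_provider().add_span_processor(GenAIAttributePrinter())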

Prerequisites

Before you begin, make sure you have:
  • A Writer API key
  • A Python environment with the Writer Python SDK installed
  • A Traceloop account, or another OpenTelemetry-compatible observability backend
The example below uses Traceloop as the observability provider. See the connect to external providers section to learn how to connect to other providers.

Configuration details

  • WRITER_API_KEY (required; no default) - Writer API key used to authenticate API calls. Must be set for the application to function.
  • TRACELOOP_API_KEY (optional; no default) - If unset and TRACELOOP_BASE_URL is https://api.traceloop.com, the SDK generates a new API key in the Traceloop dashboard on first run.
  • TRACELOOP_BASE_URL (optional; default https://api.traceloop.com) - The OpenTelemetry endpoint to connect to. If the URL is prefixed with http or https, the SDK uses the OTLP/HTTP protocol; otherwise it uses OTLP/gRPC. The SDK appends /v1/traces to the URL.
  • TRACELOOP_HEADERS (optional; no default) - Custom HTTP headers for authentication. If set, the API key is ignored.
  • TRACELOOP_TRACE_CONTENT (optional; default true) - Enables or disables logging of prompts, completions, and embeddings to span attributes.
  • TRACELOOP_TELEMETRY (optional; default true) - Enables or disables anonymous telemetry data collection.
For additional configuration option details, see the SDK initialization docs.
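For example, if your prompts or completions contain sensitive data, you can turn off content capture with the TRACELOOP_TRACE_CONTENT variable from the table above. A minimal sketch; the variable must be set before Traceloop.init() runs:

import os

# Redact prompts, completions, and embeddings from span attributes;
# other attributes, such as token usage, are unaffected
os.environ["TRACELOOP_TRACE_CONTENT"] = "false"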

Integrate Writer with OpenLLMetry

1. Install the SDK

Install OpenLLMetry in your Python environment:
pip install traceloop-sdk

2. Initialize and configure

In your Writer application, initialize the Traceloop tracer:
import os
from traceloop.sdk import Traceloop
from writerai import Writer

# Initialize Traceloop for Writer monitoring
Traceloop.init(
    app_name="YOUR_APP_NAME",
    api_key=os.environ.get("TRACELOOP_API_KEY")
)

# Initialize Writer client
client = Writer(api_key=os.environ.get("WRITER_API_KEY"))
For local development, you may want to disable batch sending to see traces immediately:
Traceloop.init(
    app_name="YOUR_APP_NAME",
    disable_batch=True  # See Writer API traces immediately in local development
)
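If you don't yet have a backend to send traces to, you can also print spans to the console. A sketch, assuming the SDK's init() accepts a standard OpenTelemetry span exporter through its exporter parameter:

from opentelemetry.sdk.trace.export import ConsoleSpanExporter
from traceloop.sdk import Traceloop

# Print Writer API spans to stdout instead of sending them to a backend
Traceloop.init(
    app_name="YOUR_APP_NAME",
    exporter=ConsoleSpanExporter()
)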

3. (Optional) Annotate your workflows

Workflow annotations are useful when you want to group related operations together for better trace organization and debugging.
Workflow annotations are optional: OpenLLMetry traces Writer API calls automatically, with no decorators or additional code required.
from traceloop.sdk.decorators import workflow

@workflow(name="content_generation")
def generate_content():
    # This call will be automatically traced by OpenLLMetry
    completion = client.chat.chat(
        model="palmyra-x5",
        messages=[{"role": "user", "content": "Write a short story about AI"}],
        max_tokens=500
    )
    
    return completion.choices[0].message.content

# Works with async functions too, using the async client
from writerai import AsyncWriter

async_client = AsyncWriter(api_key=os.environ.get("WRITER_API_KEY"))

@workflow(name="async_content_generation")
async def async_generate_content():
    completion = await async_client.chat.chat(
        model="palmyra-x5",
        messages=[{"role": "user", "content": "Write a poem about technology"}],
        max_tokens=300
    )

    return completion.choices[0].message.content

# Execute the workflows
result = generate_content()
print(result)

# Execute async workflow
import asyncio
async_result = asyncio.run(async_generate_content())
print(async_result)
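For finer-grained traces, OpenLLMetry also ships a task decorator alongside workflow; tasks called inside a workflow show up as child spans of the workflow span. A short sketch reusing the client from above (the function and span names are illustrative):

from traceloop.sdk.decorators import task, workflow

@task(name="generate_story")
def generate_story() -> str:
    completion = client.chat.chat(
        model="palmyra-x5",
        messages=[{"role": "user", "content": "Write a short story about AI"}],
        max_tokens=500
    )
    return completion.choices[0].message.content

@task(name="summarize_story")
def summarize_story(story: str) -> str:
    completion = client.chat.chat(
        model="palmyra-x5",
        messages=[{"role": "user", "content": f"Summarize this story: {story}"}],
        max_tokens=200
    )
    return completion.choices[0].message.content

@workflow(name="story_pipeline")
def story_pipeline() -> str:
    # Both task calls appear as child spans of the story_pipeline span
    return summarize_story(generate_story())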

Connect to external providers

Because the Traceloop SDK is built on OpenTelemetry, the data it generates can be used in any observability platform that supports the OpenTelemetry standard. OpenLLMetry extends this by using the OTLP protocol to connect with external observability providers, letting you send traces directly to your existing stack without relying on Traceloop’s hosted platform.

Configure external provider connection

To connect to an external provider, configure the following environment variables:
  • TRACELOOP_BASE_URL - The OTLP endpoint URL for your observability provider.
  • TRACELOOP_HEADERS - Authentication headers, if your provider requires them.

Example configurations

Grafana Cloud

# Generate base64 encoded credentials
echo -n "<your stack id>:<your api key>" | base64

# Set environment variables
export TRACELOOP_BASE_URL=https://otlp-gateway-<zone>.grafana.net/otlp
export TRACELOOP_HEADERS="Authorization=Basic%20<base64 encoded stack id and api key>""

Datadog

# Connect to your Datadog Agent (requires OTLP HTTP collector enabled)
export TRACELOOP_BASE_URL="http://<datadog-agent-hostname>:4318"

New Relic

export TRACELOOP_BASE_URL=https://otlp.nr-data.net:443
export TRACELOOP_HEADERS="api-key=<YOUR_NEWRELIC_LICENSE_KEY>"
For a complete list of supported integrations, see the full OpenLLMetry integrations catalog.

Next steps

Now that you’ve set up Writer with OpenLLMetry, you can start monitoring your LLM applications. See the following resources to help you get the most out of your integration: