This guide provides an overview of Writer’s integrations with popular AI frameworks and platforms, and helps you choose the right integration for your use case and development needs.

Writer integrates with several leading AI frameworks and cloud platforms to provide flexible deployment options and enhanced capabilities. Each integration offers distinct benefits depending on your infrastructure, development preferences, and specific requirements.

Compare integrations

| Integration           | Best for                    | Key benefits                                   |
| --------------------- | --------------------------- | ---------------------------------------------- |
| Amazon Bedrock        | Enterprise AWS environments | Serverless scaling, AWS ecosystem integration  |
| Amazon Strands Agents | Multi-agent AI applications | Agent orchestration, durable sessions          |
| LangChain             | Complex AI workflows        | Tool integration, data source connections      |
| Traceloop/OpenLLMetry | Production monitoring       | Observability, debugging, performance tracking |
| Instructor            | Structured data extraction  | Pydantic validation, retry mechanisms          |

Amazon Bedrock

Amazon Bedrock is a fully managed service for building and scaling generative AI applications. Writer’s Palmyra X5 and X4 models are available on Bedrock, enabling you to use AWS’s infrastructure and ecosystem to build and scale your applications.
  • Serverless scaling: automatic scaling based on demand
  • AWS ecosystem integration: integration with other AWS services
  • Enterprise security: built-in security and compliance features
  • Cross-region inference: distribute traffic across multiple AWS regions
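A minimal sketch of invoking a Writer Palmyra model through Bedrock’s Converse API with boto3. The model ID below is an assumption for illustration; check the Bedrock console for the exact IDs available in your region and account.

```python
# Assumed model ID -- verify the exact Palmyra model ID in the Bedrock console.
MODEL_ID = "writer.palmyra-x5-v1:0"

def build_converse_request(prompt: str, max_tokens: int = 512) -> dict:
    """Build the keyword arguments for bedrock-runtime's converse() call."""
    return {
        "modelId": MODEL_ID,
        "messages": [{"role": "user", "content": [{"text": prompt}]}],
        "inferenceConfig": {"maxTokens": max_tokens},
    }

# With AWS credentials configured, the request can be sent like this:
# import boto3
# client = boto3.client("bedrock-runtime", region_name="us-west-2")
# response = client.converse(**build_converse_request("Summarize this report."))
# print(response["output"]["message"]["content"][0]["text"])
```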

Amazon Strands Agents

Amazon Strands Agents is a platform for building sophisticated multi-agent AI applications that require orchestration and collaboration. The Strands Agents SDK enables you to create AI applications using a model-driven approach, with multiple agents working together. Writer models integrate with the Strands ecosystem.
  • Multi-agent orchestration: coordinate multiple AI agents in complex workflows
  • Durable session management: maintain state across agent interactions
  • Asynchronous support: handle concurrent operations efficiently
  • Model-driven approach: define agent behavior through configuration
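As a sketch of the model-driven approach, the snippet below defines a plain Python function an agent could call as a tool. The `strands` package and `Agent` wiring in the comments are assumptions about the Strands Agents SDK; consult its documentation for the current interface.

```python
def word_count(text: str) -> int:
    """A simple tool an agent can call to count words in a document."""
    return len(text.split())

# With the SDK installed, the tool could be attached to an agent backed by a
# Writer model roughly like this (names are assumptions -- see the Strands docs):
# from strands import Agent, tool
# agent = Agent(tools=[tool(word_count)], model="writer.palmyra-x5-v1:0")
# result = agent("How many words are in the attached report?")
```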

LangChain

LangChain provides a framework for developing applications powered by language models. Writer’s integration allows you to use Writer’s capabilities within the LangChain ecosystem alongside other tools and data sources.
  • Tool integration: connect Writer with external tools and APIs
  • Data source connections: integrate with databases, files, and web sources
  • Chain composition: build complex workflows with multiple components
  • Ecosystem compatibility: works with the broader LangChain tool ecosystem
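A sketch of composing messages for a LangChain chat model backed by Writer. The `langchain-writer` package name and `ChatWriter` class in the comments are assumptions; check the LangChain integration docs for the current names.

```python
def make_messages(topic: str) -> list[tuple[str, str]]:
    """Compose the (role, content) pairs LangChain chat models accept."""
    return [
        ("system", "You are a concise technical writer."),
        ("user", f"Write a one-paragraph summary of {topic}."),
    ]

# With the integration installed, the messages can be sent through a chain:
# from langchain_writer import ChatWriter   # assumed package/class names
# llm = ChatWriter(model="palmyra-x5")      # reads WRITER_API_KEY from the env
# response = llm.invoke(make_messages("vector databases"))
# print(response.content)
```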

Traceloop/OpenLLMetry

OpenLLMetry provides observability for LLM applications through OpenTelemetry-compatible tracing. It gives you detailed insight into your Writer API usage and performance characteristics, along with debugging capabilities.
  • Comprehensive observability: track prompts, completions, and token usage
  • OpenTelemetry compatibility: works with any OpenTelemetry-compatible backend
  • Performance insights: monitor latency, throughput, and error rates
  • Debugging capabilities: trace issues across complex AI workflows
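A minimal instrumentation sketch. `Traceloop.init()` and the `workflow` decorator reflect the OpenLLMetry SDK’s documented entry points, but treat the exact names as assumptions and confirm them against its docs; the fallback keeps the sketch runnable when the SDK isn’t installed or configured.

```python
try:
    from traceloop.sdk import Traceloop
    from traceloop.sdk.decorators import workflow
    Traceloop.init(app_name="writer-demo")  # exports traces via OpenTelemetry
except Exception:  # SDK missing or unconfigured: fall back to a no-op decorator
    def workflow(name=None):
        def decorate(fn):
            return fn
        return decorate

@workflow(name="summarize")
def summarize(text: str) -> str:
    # In a real app this would call the Writer API; the surrounding span
    # records the prompt, completion, latency, and token usage.
    return text[:80]
```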

Instructor

Instructor provides a high-level framework for getting structured outputs from language models. It offers built-in retry mechanisms, Pydantic validation, and prompt optimization specifically designed for structured data extraction.
  • Structured data extraction: convert unstructured text into validated data structures
  • Built-in retry logic: automatic error handling and retry mechanisms
  • Pydantic validation: type-safe data validation and parsing
  • Prompt optimization: automatic prompt minimization and optimization
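A sketch of the `response_model` pattern Instructor is built on: you declare the target schema as a Pydantic model and the library validates (and retries) the model’s output against it. The client wiring in the comments is an assumption, since Instructor’s provider setup varies between versions; the dataclass fallback just keeps this sketch runnable without Pydantic installed.

```python
try:
    from pydantic import BaseModel

    class Invoice(BaseModel):
        vendor: str
        total: float
except ImportError:  # fallback so the sketch runs without pydantic installed
    from dataclasses import dataclass

    @dataclass
    class Invoice:
        vendor: str
        total: float

# With Instructor and a Writer client configured (wiring is an assumption --
# see the Instructor docs for the supported setup):
# import instructor
# client = instructor.from_provider("writer/palmyra-x5")
# invoice = client.chat.completions.create(
#     response_model=Invoice,
#     messages=[{"role": "user", "content": "Acme billed us $1,250.00."}],
# )
# print(invoice.vendor, invoice.total)
```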

Next steps

Once you’ve chosen an integration:
  1. Set up your development environment with the required dependencies
  2. Follow the integration-specific guide to implement Writer in your application
  3. Test your implementation with sample data and workflows
  4. Deploy to production following best practices for your chosen platform
For additional help, see the API Quickstart or explore the SDKs documentation for language-specific implementation details.