## Compare integrations
| Integration | Best for | Key benefits |
| --- | --- | --- |
| Amazon Bedrock | Enterprise AWS environments | Serverless scaling, AWS ecosystem integration |
| Amazon Strands Agents | Multi-agent AI applications | Agent orchestration, durable sessions |
| LangChain | Complex AI workflows | Tool integration, data source connections |
| Traceloop/OpenLLMetry | Production monitoring | Observability, debugging, performance tracking |
| Instructor | Structured data extraction | Pydantic validation, retry mechanisms |
## Amazon Bedrock

Amazon Bedrock is a fully managed service for building and scaling generative AI applications. Writer’s Palmyra X5 and X4 models are available on Bedrock, so you can run them on AWS’s infrastructure and integrate with the broader AWS ecosystem. A minimal usage sketch follows the feature list.

- Serverless scaling: automatic scaling based on demand
- AWS ecosystem integration: integration with other AWS services
- Enterprise security: built-in security and compliance features
- Cross-region inference: distribute traffic across multiple AWS regions
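For example, you can call a Palmyra model on Bedrock from Python with boto3’s Converse API. The model ID and Region below are assumptions; confirm the exact Palmyra model ID and its availability in your Region in the Amazon Bedrock console before running this.

```python
# Minimal sketch: invoke a Writer Palmyra model through the Amazon Bedrock Converse API.
# Assumes boto3 is configured with AWS credentials; the model ID below is an assumption,
# so verify it against the Palmyra listing in your Bedrock model catalog.
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-west-2")

response = bedrock.converse(
    modelId="writer.palmyra-x5-v1:0",  # assumed ID; check the Bedrock console
    messages=[
        {"role": "user", "content": [{"text": "Summarize our Q3 launch plan in three bullets."}]}
    ],
    inferenceConfig={"maxTokens": 512, "temperature": 0.7},
)

print(response["output"]["message"]["content"][0]["text"])
```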
## Amazon Strands Agents

Amazon Strands Agents is an SDK for building sophisticated multi-agent AI applications that require orchestration and collaboration. It uses a model-driven approach in which multiple agents work together, and Writer models integrate with the Strands ecosystem. A minimal usage sketch follows the feature list.

- Multi-agent orchestration: coordinate multiple AI agents in complex workflows
- Durable session management: maintain state across agent interactions
- Asynchronous support: handle concurrent operations efficiently
- Model-driven approach: define agent behavior through configuration
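As a rough sketch, the example below wires a single Strands agent to a Palmyra model hosted on Bedrock. The import paths, the `BedrockModel` class, and the model ID are assumptions based on the Strands Agents SDK documentation; verify them against the package version you install.

```python
# Minimal sketch: a single Strands agent backed by a Palmyra model on Bedrock.
# pip install strands-agents  (assumed package name; check the Strands docs)
# Import paths, the BedrockModel class, and the model ID are assumptions; verify
# them against the Strands Agents SDK version you use.
from strands import Agent
from strands.models import BedrockModel

model = BedrockModel(model_id="writer.palmyra-x5-v1:0")  # assumed Palmyra model ID

agent = Agent(
    model=model,
    system_prompt="You are a research assistant that answers concisely.",
)

# Calling the agent runs the model-driven loop and returns the final response.
result = agent("What are the key tradeoffs between RAG and fine-tuning?")
print(result)
```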
## LangChain

LangChain is a framework for developing applications powered by language models. Writer’s integration lets you use Writer models within the LangChain ecosystem alongside other tools and data sources. A minimal usage sketch follows the feature list.

- Tool integration: connect Writer with external tools and APIs
- Data source connections: integrate with databases, files, and web sources
- Chain composition: build complex workflows with multiple components
- Ecosystem compatibility: works with the broader LangChain tool ecosystem
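The sketch below assumes the `langchain-writer` partner package and its `ChatWriter` chat model, with a `WRITER_API_KEY` environment variable set; adjust the package and model name if your setup differs.

```python
# Minimal sketch: use Writer as a LangChain chat model inside a simple chain.
# pip install langchain-writer langchain-core  (package names are assumptions)
# Assumes the ChatWriter class reads WRITER_API_KEY from the environment and
# accepts the model name shown; adjust to the Palmyra model you use.
from langchain_core.prompts import ChatPromptTemplate
from langchain_writer import ChatWriter

llm = ChatWriter(model="palmyra-x5")

prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful product-marketing assistant."),
    ("human", "Write a one-sentence tagline for {product}."),
])

# Compose prompt -> model with LCEL and invoke the chain.
chain = prompt | llm
message = chain.invoke({"product": "an AI writing platform"})
print(message.content)
```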
## Traceloop/OpenLLMetry

OpenLLMetry provides observability for LLM applications through OpenTelemetry-compatible tracing. It gives you detailed insight into your Writer API usage and performance, and helps you debug issues across complex AI workflows. A minimal usage sketch follows the feature list.

- Comprehensive observability: track prompts, completions, and token usage
- OpenTelemetry compatibility: works with any OpenTelemetry-compatible backend
- Performance insights: monitor latency, throughput, and error rates
- Debugging capabilities: trace issues across complex AI workflows
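A minimal setup looks like the sketch below: `Traceloop.init` turns on OpenTelemetry-compatible tracing, and subsequent Writer SDK calls are captured if your `traceloop-sdk` version instruments the Writer client. The `writerai` client usage shown (`client.chat.chat`) is an assumption about the Writer Python SDK; check both packages’ docs for current details.

```python
# Minimal sketch: trace Writer API calls with OpenLLMetry (Traceloop SDK).
# pip install traceloop-sdk writer-sdk  (package names are assumptions; verify)
# Whether Writer SDK calls are auto-instrumented depends on your traceloop-sdk
# version, and client.chat.chat(...) is an assumption about the writerai client.
from traceloop.sdk import Traceloop
from writerai import Writer

# Send traces to your OpenTelemetry-compatible backend; configure the exporter
# via environment variables such as TRACELOOP_API_KEY or your OTLP settings.
Traceloop.init(app_name="writer-integration-demo")

client = Writer()  # reads WRITER_API_KEY from the environment

response = client.chat.chat(
    model="palmyra-x5",
    messages=[{"role": "user", "content": "Draft a status update for the team."}],
)
print(response.choices[0].message.content)
```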
## Instructor

Instructor is a high-level framework for getting structured outputs from language models. It offers built-in retry mechanisms, Pydantic validation, and prompt optimization designed specifically for structured data extraction. A minimal usage sketch follows the feature list.

- Structured data extraction: convert unstructured text into validated data structures
- Built-in retry logic: automatic error handling and retry mechanisms
- Pydantic validation: type-safe data validation and parsing
- Prompt optimization: automatic prompt minimization and optimization
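The sketch below shows the general pattern: define a Pydantic model, wrap the Writer client with Instructor, and pass `response_model` on the chat call. Both `instructor.from_writer` and the patched `client.chat.chat` signature are assumptions about Instructor’s Writer support and may differ between versions, so treat this as illustrative and check the Instructor docs.

```python
# Minimal sketch: extract a validated Pydantic object from free text with Instructor.
# pip install instructor writer-sdk  (package names are assumptions; verify)
# instructor.from_writer(...) and the patched client.chat.chat(...) signature are
# assumptions about Instructor's Writer support; confirm them for your version.
import instructor
from pydantic import BaseModel
from writerai import Writer


class Invoice(BaseModel):
    vendor: str
    total: float
    currency: str


client = instructor.from_writer(Writer())  # wraps the Writer client with retries + validation

invoice = client.chat.chat(
    model="palmyra-x5",
    response_model=Invoice,  # Instructor validates and retries until this schema is satisfied
    messages=[
        {"role": "user", "content": "Acme Corp billed us $1,250.00 USD for October."}
    ],
)

print(invoice.model_dump())
```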
## Next steps

Once you’ve chosen an integration:

- Set up your development environment with the required dependencies
- Follow the integration-specific guide to implement Writer in your application
- Test your implementation with sample data and workflows
- Deploy to production following best practices for your chosen platform