Using Writer with LangChain
The Writer LangChain integration allows you to leverage Writer’s capabilities within the LangChain ecosystem, making it easy to build sophisticated AI applications. In this tutorial, you’ll explore each component of the integration and understand how they work.
Prerequisites
Before you begin, make sure you have:
- Python 3.11 or higher installed
- A Writer AI Studio account
- A Writer API key. See instructions in the API Quickstart.
- Basic familiarity with Python and LangChain concepts
Installation and setup
First, install the necessary packages:
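For example, using pip (python-dotenv is optional and only needed if you load your key from a .env file as shown in the next step):

```
pip install -U langchain-writer python-dotenv
```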
Next, create a .env file with WRITER_API_KEY set to your Writer API key:
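A minimal sketch of loading the key with python-dotenv; the .env contents shown in the comment are a placeholder for your own key:

```python
# .env
# WRITER_API_KEY=your-api-key-here

from dotenv import load_dotenv

# Load WRITER_API_KEY from the .env file into the process environment
# so the Writer client can pick it up automatically.
load_dotenv()
```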
Components of the Writer LangChain integration
The langchain-writer package provides several key components:
- ChatWriter for text generation
- Tool calling capabilities, including:
  - GraphTool for Knowledge Graph integration
  - NoCodeAppTool for no-code applications
  - LLMTool for specialized model delegation
- Additional tools like PDFParser for parsing PDFs and WriterTextSplitter for intelligent text splitting
ChatWriter
ChatWriter is a LangChain chat model that provides access to Writer’s AI capabilities for text generation. It supports streaming, non-streaming, batching, and asynchronous operations. You can use any of the Palmyra chat models available in AI Studio.
See the full documentation for ChatWriter to learn more about the available parameters.
Usage
This example uses ChatWriter to ask the Palmyra X 004 model to explain what LangChain is in plain terms.
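A minimal sketch, assuming WRITER_API_KEY is set in your environment:

```python
from langchain_writer import ChatWriter

# Instantiate the chat model with the Palmyra X 004 model.
llm = ChatWriter(model="palmyra-x-004")

# Single, non-streaming invocation.
response = llm.invoke("Explain what LangChain is in plain terms.")
print(response.content)
```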
Streaming
Streaming allows you to receive the generated text in chunks as it’s being produced. This example shows how to stream a response from the Palmyra X 004 model using synchronous streaming:
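A minimal sketch, reusing the llm instance created above:

```python
# Chunks are printed as soon as they arrive from the API.
for chunk in llm.stream("Write a short poem about the ocean."):
    print(chunk.content, end="", flush=True)
```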
You can also use asynchronous streaming with the async for loop and the astream method:
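A sketch of the asynchronous equivalent:

```python
import asyncio

from langchain_writer import ChatWriter

llm = ChatWriter(model="palmyra-x-004")

async def main():
    # astream yields chunks as they are produced, without blocking the event loop.
    async for chunk in llm.astream("Write a short poem about the ocean."):
        print(chunk.content, end="", flush=True)

asyncio.run(main())
```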
Batch processing
You can batch process multiple prompts for efficient processing. The following example batches three individual LLM invocations and runs them in parallel:
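A minimal sketch, reusing the llm instance from above with three example prompts:

```python
prompts = [
    "Summarize the plot of Romeo and Juliet in one sentence.",
    "What is the capital of Australia?",
    "Give one tip for writing clear documentation.",
]

# batch runs the invocations in parallel and preserves input order.
results = llm.batch(prompts)
for result in results:
    print(result.content)
```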
Note that batch returns results in the same order as the inputs. You can use batch_as_completed to return results as they complete. Results may arrive out of order, but each includes the input index for matching.
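A sketch using the same prompts list:

```python
# Each yielded item is an (index, output) pair, so results that finish
# early can still be matched back to their original prompt.
for index, result in llm.batch_as_completed(prompts):
    print(index, result.content)
```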
You can also optionally set the max_concurrency parameter to control the number of concurrent requests, which can be useful when you want to limit the number of parallel calls to prevent overloading a server or API:
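For example, to allow at most two requests in flight at a time:

```python
# Only two of the three prompts are processed concurrently at any moment.
results = llm.batch(prompts, config={"max_concurrency": 2})
```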
See the LangChain documentation on parallel execution for more information.
Tool calling
ChatWriter supports tool calling, which allows the model to use external functions to enhance its capabilities. Tool calling is available with Palmyra X 004 and later.
Tool calling basics
To use tool calling, follow these steps:
- Define a function that will be called by the model and decorate it with the @tool decorator.
- Bind the tool to the chat model using the bind_tools method.
- Use the tool in a chat and append the response to the messages list.
- Execute the tool call with the arguments given by the model and append the response to the messages list.
- Invoke the chat model with the updated messages list to receive the final response.
Here’s an example of how to use tool calling:
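The sketch below walks through the five steps with a hypothetical get_weather tool:

```python
from langchain_core.messages import HumanMessage, ToolMessage
from langchain_core.tools import tool
from langchain_writer import ChatWriter

# 1. Define a function and decorate it with @tool.
@tool
def get_weather(city: str) -> str:
    """Get the current weather for a given city."""
    return f"It is sunny and 22°C in {city}."  # placeholder data for the example

chat = ChatWriter(model="palmyra-x-004")

# 2. Bind the tool to the chat model.
chat_with_tools = chat.bind_tools([get_weather])

# 3. Use the tool in a chat and append the response to the messages list.
messages = [HumanMessage("What's the weather like in Berlin right now?")]
ai_message = chat_with_tools.invoke(messages)
messages.append(ai_message)

# 4. Execute each tool call requested by the model and append the results.
for tool_call in ai_message.tool_calls:
    tool_output = get_weather.invoke(tool_call["args"])
    messages.append(ToolMessage(content=tool_output, tool_call_id=tool_call["id"]))

# 5. Invoke the chat model with the updated messages list for the final response.
final_response = chat_with_tools.invoke(messages)
print(final_response.content)
```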
GraphTool
GraphTool is a LangChain tool that allows the model to retrieve information from a Knowledge Graph to enhance its responses. The tool executes remotely, so you only need to provide the IDs of the Knowledge Graphs you want to use. For more details on the built-in Knowledge Graph chat tool in Writer, see the Knowledge Graph chat support guide.
Usage
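A minimal sketch, assuming the tool is importable from langchain_writer.tools and accepts a graph_ids parameter; replace the ID with a Knowledge Graph from your AI Studio account:

```python
from langchain_writer import ChatWriter
from langchain_writer.tools import GraphTool  # assumed import path

chat = ChatWriter(model="palmyra-x-004")

# The Knowledge Graph ID(s) come from your AI Studio account.
graph_tool = GraphTool(graph_ids=["your-knowledge-graph-id"])

# Because the tool executes remotely, binding it is enough; there is no
# manual tool-execution step as there is with function-type tools.
chat_with_graph = chat.bind_tools([graph_tool])

response = chat_with_graph.invoke("What does our style guide say about headings?")
print(response.content)
```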
NoCodeAppTool
NoCodeAppTool is a specialized tool that enables access to Writer’s no-code applications as LLM tools. This tool allows language models to interact with pre-built applications to enhance their responses.
The tool is designed as a standard “function” type tool and requires manual execution. Unlike the GraphTool and LLMTool, which are executed remotely by Writer’s servers, you’ll need to handle the execution of the NoCodeAppTool in your own code.
When initializing the tool, you must provide an app_id (either directly or using an environment variable) that corresponds to a no-code application created in your Writer account. The tool automatically retrieves the input parameters for the app during initialization, so you don’t need to specify them manually.
You can customize the tool’s name (default is “No-code application”) and description (default is “No-code application powered by Palmyra”) to provide better context to the model. The API key is typically read from environment variables but can also be provided directly.
When working with the tool, remember that all required inputs must be provided when invoking it, or a ValueError will be raised. Input values can be either strings or lists of strings, depending on what the no-code application expects.
This tool is particularly useful for generating content, performing specialized data transformations, creating custom outputs based on user inputs, integrating with domain-specific applications, enhancing responses with formatted content, or leveraging pre-built applications for common tasks.
Usage
Here’s an example of how to use the NoCodeAppTool:
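A minimal sketch, assuming the tool is importable from langchain_writer.tools; the app ID, name, and description below are placeholders:

```python
from langchain_writer.tools import NoCodeAppTool  # assumed import path

app_tool = NoCodeAppTool(
    app_id="your-app-id",  # the ID of a no-code app in your Writer account
    name="Product description generator",  # optional override of the default name
    description="Generates a product description from a short brief.",  # helps the model decide when to call it
)
```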
When providing inputs to the tool, you’ll need to format them as a dictionary:
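For example (the input name used here is hypothetical; use the input names defined by your own no-code app):

```python
# All required inputs must be present, or a ValueError is raised.
result = app_tool.invoke({"inputs": {"Product brief": "A lightweight, waterproof hiking jacket"}})
print(result)
```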
LLMTool
LLMTool is a specialized tool that enables delegation to specific Writer models. This tool allows language models to delegate calls to different Palmyra model types to enhance their responses.
Unlike most LangChain tools that use the standard “function” type, the LLMTool has a type of llm and is designed specifically for use within the Writer environment. Due to its remote execution nature, this tool doesn’t support direct invocation through the _run method; attempting to call this method will raise a NotImplementedError.
When initializing the tool, you can specify which Palmyra model to delegate to using the model_name parameter. Options include general-purpose models like palmyra-x-004, as well as domain-specific models such as palmyra-med for medical content, palmyra-fin for financial analysis, and palmyra-creative for creative content generation. The default model is palmyra-x-004. The description parameter plays a crucial role in helping the model understand when to use this tool.
When the model uses the LLM tool, the execution happens remotely on Writer’s servers, and the response includes additional data in the additional_kwargs["llm_data"] field that you can access in your application.
This tool is particularly valuable when you need specialized domain knowledge. For example, you might delegate medical questions to the palmyra-med model, use palmyra-creative for generating creative content, or leverage palmyra-fin for financial analysis.
Usage
Here’s an example of how to use the LLMTool:
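A minimal sketch, assuming the tool is importable from langchain_writer.tools:

```python
from langchain_writer import ChatWriter
from langchain_writer.tools import LLMTool  # assumed import path

chat = ChatWriter(model="palmyra-x-004")

# Delegate medical questions to the palmyra-med model.
med_tool = LLMTool(
    model_name="palmyra-med",
    description="Use this tool for questions about medicine, symptoms, or treatments.",
)

chat_with_delegate = chat.bind_tools([med_tool])

# Execution happens remotely on Writer's servers; there is no manual step.
response = chat_with_delegate.invoke("What are common symptoms of iron deficiency?")

print(response.content)
# Extra details about the delegated call, when present:
print(response.additional_kwargs.get("llm_data"))
```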
Additional tools
PDFParser
PDFParser is a document loader that uses Writer’s PDF parsing capabilities to extract text from PDF documents. It converts PDFs into LangChain Document objects that can be used in your applications. For more details on the underlying API, see the PDF parser API reference.
Usage
Here’s an example of how to use the PDFParser:
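A minimal sketch, assuming the parser is importable from langchain_writer.pdf_parser and follows LangChain’s blob-parser interface:

```python
from langchain_core.documents.base import Blob
from langchain_writer.pdf_parser import PDFParser  # assumed import path

parser = PDFParser(format="markdown")  # or "text"

# Wrap the PDF file as a Blob and parse it into LangChain Document objects.
blob = Blob.from_path("path/to/your/document.pdf")
documents = parser.parse(blob)

print(documents[0].page_content[:500])
```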
Output formats
The PDFParser supports different output formats:
- text: Plain text extraction
- markdown: Structured markdown with preserved formatting
WriterTextSplitter
WriterTextSplitter is a text splitter that uses Writer’s context-aware splitting capabilities to divide documents into semantically meaningful chunks. This is particularly useful for preparing documents for retrieval systems. For more details on the underlying API, see the context-aware splitting tool API reference.
Usage
Here’s an example of how to use the WriterTextSplitter:
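A minimal sketch, assuming the splitter is importable from langchain_writer.text_splitter:

```python
from langchain_writer.text_splitter import WriterTextSplitter  # assumed import path

splitter = WriterTextSplitter(strategy="llm_split")

long_text = "..."  # the document text you want to split

# Returns a list of semantically meaningful chunks.
chunks = splitter.split_text(long_text)
print(f"Produced {len(chunks)} chunks")
```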
Splitting strategies
The WriterTextSplitter supports different splitting strategies:
- llm_split: Uses a language model for precise semantic splitting
- fast_split: Uses a heuristic-based approach for quick splitting
- hybrid_split: Combines both approaches for a balance of speed and quality
Conclusion
In this tutorial, you’ve explored the LangChain integration with Writer, covering each of its components:
- ChatWriter for text generation
- Tool calling capabilities, including:
  - GraphTool for Knowledge Graph integration
  - NoCodeAppTool for no-code applications
  - LLMTool for specialized model delegation
- Additional tools like PDFParser for parsing PDFs and WriterTextSplitter for intelligent text splitting
This integration provides a powerful foundation for building AI applications with Writer and LangChain. Check out the package README and documentation, as well as the LangChain Documentation for more information on how to use LangChain with Writer.