Use a Knowledge Graph in a chat
This guide demonstrates how to send questions to a Knowledge Graph during a chat completion. Knowledge Graph chat is a predefined tool you can use to reference a Knowledge Graph when users chat with a Palmyra LLM.
You need an API key to access the Writer API. Get an API key by following the steps in the API quickstart.
We recommend setting the API key as an environment variable in a `.env` file with the name `WRITER_API_KEY`.
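As a minimal sketch, once the variable has been loaded from your `.env` file (for example, by your framework or a tool such as python-dotenv), you can read it from the environment like this; `get_writer_api_key` is a hypothetical helper, not part of the Writer SDK:

```python
import os


def get_writer_api_key() -> str:
    """Return the Writer API key set in the environment (e.g., via a .env file)."""
    key = os.environ.get("WRITER_API_KEY", "")
    if not key:
        raise RuntimeError("Set WRITER_API_KEY in your .env file or shell environment")
    return key
```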
Tool structure
Knowledge Graph chat is a predefined tool supported by Palmyra-X-004 and later models, for use with tool calling in the chat endpoint. To use Knowledge Graph chat, add the following object to the `tools` array when calling the chat endpoint:
| Parameter | Type | Description |
| --- | --- | --- |
| `type` | string | The type of tool. Must be `graph` for Knowledge Graph chat. |
| `function` | object | An object containing the `graph_ids`, `description`, and `subqueries` parameters. |
| `function.graph_ids` | array | An array of strings containing the graph IDs you wish to reference. |
| `function.description` | string | A description of the graphs you are referencing. This helps the model understand when to use the Knowledge Graph tool in the chat. If there are multiple graphs, include a description for each, referencing the graph by name. |
| `function.subqueries` | boolean | A Boolean indicating whether to include the subqueries used by Palmyra in the response. |
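Putting these parameters together, a tool object has the following shape. This is a sketch: the graph ID is a placeholder, and the description should describe your own graphs.

```python
# Sketch of a Knowledge Graph chat tool definition for the tools array.
# The graph ID and description below are placeholders, not real values.
tools = [
    {
        "type": "graph",
        "function": {
            "description": "A Knowledge Graph of internal product documentation",
            "graph_ids": ["your-graph-id"],
            "subqueries": False,  # set to True to include subqueries in the response
        },
    }
]
```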
Response format
When a chat completion uses the Knowledge Graph tool, the response from the tool is in the `graph_data` object. That object contains the following fields:
| Parameter | Type | Description |
| --- | --- | --- |
| `sources` | array | An array of objects containing the source file IDs and snippets that helped the model answer the question. |
| `sources.file_id` | string | The ID of the source file. |
| `sources.snippet` | string | A snippet from the source file that helped the model answer the question. |
| `status` | string | The status of the query. |
| `subqueries` | array | An array of objects containing the subqueries used by Palmyra in the response. |
| `subqueries.query` | string | The subquery Palmyra used to answer the question. |
| `subqueries.answer` | string | The answer to the subquery. |
| `subqueries.sources` | array | An array of objects containing the source file IDs and snippets that helped the model answer the subquery. |
The full response has the following structure. The subqueries and sources shown are abbreviated for readability. If the `subqueries` parameter is `false`, or if the model doesn't need subqueries to answer the question, the `subqueries` array is empty.
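As an illustrative sketch (not a verbatim API response), an abbreviated `graph_data` object could look like the following; the status value, file IDs, and snippets are all placeholders:

```python
# Abbreviated, hypothetical graph_data object; every value is a placeholder.
graph_data = {
    "status": "finished",  # illustrative status value
    "sources": [
        {
            "file_id": "example-file-id",
            "snippet": "A passage from the source file that supports the answer.",
        },
    ],
    "subqueries": [
        {
            "query": "A subquery Palmyra generated from the user's question.",
            "answer": "The answer to the subquery.",
            "sources": [
                {"file_id": "example-file-id", "snippet": "A supporting passage."},
            ],
        },
    ],
}
```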
Usage example
The following example uses a hypothetical product information Knowledge Graph to answer a question about which food products contain both food coloring and chocolate.
Create the tools array
First, define the `tools` array with the `type` set to `graph`. The `function` object contains the `graph_ids`, `description`, and `subqueries` parameters.
In this example, the subqueries are included in the response. Subqueries can be useful for debugging or for providing additional context to the user about how the model arrived at the answer.
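A sketch of the `tools` array for this example follows; the graph ID is hypothetical, so replace it with the ID of your own graph.

```python
# Tools array for the hypothetical product information Knowledge Graph.
# The graph ID is a placeholder; use your own graph's ID.
tools = [
    {
        "type": "graph",
        "function": {
            "description": (
                "Product information Knowledge Graph containing ingredient "
                "lists for all food products"
            ),
            "graph_ids": ["product-info-graph-id"],
            "subqueries": True,  # include Palmyra's subqueries in the response
        },
    }
]
```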
Send the request
Add the `tools` array to the chat endpoint call along with your array of messages. Setting `tool_choice` to `auto` allows the model to choose when to use the Knowledge Graph tool, based on the user's question and the description of the tool.
This example streams the response in real time as the model generates it.
If you are unfamiliar with the chat completions endpoint or streaming vs. non-streaming responses, learn more in the chat completion guide.
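The request can be sketched with only the standard library, as below. The endpoint URL, payload shape, and server-sent-events format here follow common chat-completions conventions and should be verified against the Writer API reference; the graph ID is a placeholder, and `send_chat_request` is a hypothetical helper.

```python
# Sketch of a streaming chat request that includes the Knowledge Graph tool.
# Endpoint URL, payload shape, and SSE parsing are assumptions to verify
# against the Writer API reference.
import json
import os
import urllib.request

payload = {
    "model": "palmyra-x-004",
    "messages": [
        {
            "role": "user",
            "content": "Which food products contain both food coloring and chocolate?",
        }
    ],
    "tools": [
        {
            "type": "graph",
            "function": {
                "description": (
                    "Product information Knowledge Graph containing ingredient "
                    "lists for all food products"
                ),
                "graph_ids": ["product-info-graph-id"],  # placeholder ID
                "subqueries": True,
            },
        }
    ],
    "tool_choice": "auto",
    "stream": True,
}


def send_chat_request() -> None:
    """Send the streaming request and print tokens as they arrive."""
    request = urllib.request.Request(
        "https://api.writer.com/v1/chat",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {os.environ['WRITER_API_KEY']}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(request) as response:
        for raw_line in response:
            line = raw_line.decode("utf-8").strip()
            # SSE chunks arrive as "data: {...}" lines; skip the terminator.
            if line.startswith("data: ") and line != "data: [DONE]":
                chunk = json.loads(line[len("data: "):])
                delta = chunk["choices"][0].get("delta", {})
                print(delta.get("content") or "", end="", flush=True)
```

Call `send_chat_request()` with `WRITER_API_KEY` set to run the request.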
Display Knowledge Graph subqueries and sources
You may want to display the `sources` or `subqueries` in your UI to help users understand how the model derived its answer. The following example shows how to display the subqueries as well as the status of the query from the Knowledge Graph.
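As a sketch, assuming the response has been parsed and `graph_data` is a dictionary with the fields described earlier, the status, subqueries, and their sources could be rendered like this (`display_graph_data` is a hypothetical helper):

```python
# Sketch: format the Knowledge Graph tool output for display.
# Assumes graph_data is the parsed graph_data object from the chat response.
def display_graph_data(graph_data: dict) -> str:
    """Return a printable summary of the query status, subqueries, and sources."""
    lines = [f"Status: {graph_data.get('status', 'unknown')}"]
    for subquery in graph_data.get("subqueries", []):
        lines.append(f"Subquery: {subquery['query']}")
        lines.append(f"Answer: {subquery['answer']}")
        for source in subquery.get("sources", []):
            lines.append(f"  Source {source['file_id']}: {source['snippet']}")
    return "\n".join(lines)
```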
Next steps
By following this guide, you can reference Knowledge Graphs in your application's chats.
Next, learn how to delegate user questions to a domain-specific LLM with the prebuilt LLM tool.