This guide demonstrates how to send questions to a Knowledge Graph during a chat completion. Knowledge Graph chat is a predefined tool you can use to reference a Knowledge Graph when users chat with a Palmyra LLM.

You need an API key to access the Writer API. Get an API key by following the steps in the API quickstart.

We recommend setting the API key as an environment variable in a .env file with the name WRITER_API_KEY.
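For example, you can set the variable in your shell session or persist it in a .env file for tools that load environment files (the key value shown is a placeholder):

```shell
# Set the key for the current shell session ("your-api-key" is a placeholder).
export WRITER_API_KEY="your-api-key"

# Or persist it in a .env file for tools that load environment files:
echo 'WRITER_API_KEY=your-api-key' > .env
```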

Tool structure

Knowledge Graph chat is a predefined tool, supported by Palmyra-X-004 and later models, that you use with tool calling in the chat endpoint. To use Knowledge Graph chat, add the following object to the tools array when calling the chat endpoint:

| Parameter | Type | Description |
| --- | --- | --- |
| `type` | string | The type of tool. Must be `graph` for Knowledge Graph chat. |
| `function` | object | An object containing the `graph_ids`, `description`, and `subqueries` parameters. |
| `function.graph_ids` | array | An array of strings containing the graph IDs you wish to reference. |
| `function.description` | string | A description of the graphs you are referencing. This helps the model understand when to use the Knowledge Graph tool in the chat. If there are multiple graphs, include a description for each, referencing the graph by name. |
| `function.subqueries` | Boolean | A Boolean indicating whether to include the subqueries used by Palmyra in the response. |
"tools": [
    {
        "type": "graph",
        "function": {
            "description": "Description of the graph(s)",
            "graph_ids": [
                "your-graph-id"
            ],
            "subqueries": true
        }
    }  
]
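If you're building the request in Python, the same tool definition can be written as a list of dicts (the graph ID and description below are placeholders):

```python
# Knowledge Graph chat tool definition; "your-graph-id" is a placeholder.
tools = [
    {
        "type": "graph",
        "function": {
            "description": "Description of the graph(s)",
            "graph_ids": ["your-graph-id"],
            "subqueries": True,
        },
    }
]
```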

Response format

When a chat completion uses the Knowledge Graph tool, the response from the Knowledge Graph tool is in the graph_data object. That object contains the following fields:

| Parameter | Type | Description |
| --- | --- | --- |
| `sources` | array | An array of objects containing the source file IDs and snippets that helped the model answer the question. |
| `sources.file_id` | string | The ID of the source file. |
| `sources.snippet` | string | A snippet from the source file that helped the model answer the question. |
| `status` | string | The status of the query. |
| `subqueries` | array | An array of objects containing the subqueries used by Palmyra in the response. |
| `subqueries.query` | string | The query used by Palmyra to answer the question. |
| `subqueries.answer` | string | The answer to the subquery. |
| `subqueries.sources` | array | An array of objects containing the source file IDs and snippets that helped the model answer the subquery. |

The full response has the following structure.

The subqueries and sources shown are abbreviated for readability. If the subqueries parameter is false, or if the model doesn’t need subqueries to answer the question, this array is empty.

{
    "id": "1234",
    "object": "chat.completion.chunk",
    "choices": [
        {
            "index": 0,
            "finish_reason": "stop",
            "delta": {
                "content": "None of our products contain both chocolate and food coloring. The products containing chocolate are different from those containing food coloring.",
                "role": "assistant",
                "tool_calls": null,
                "graph_data": {
                    "sources": [
                        {
                            "file_id": "1234",
                            "snippet": "with cocoa for an extra touch of chocolate…"
                        },
                        {
                            "file_id": "5678",
                            "snippet": "Sugar, corn syrup, artificial flavors, food coloring…"
                        }
                    ],
                    "status": "finished",
                    "subqueries": [
                        {
                            "query": "Which of our products contain food coloring?",
                            "answer": "The products that contain food coloring are...",
                            "sources": [
                                {
                                    "file_id": "1234",
                                    "snippet": "Sugar, citric acid, artificial flavors…"
                                },
                                {
                                    "file_id": "5678",
                                    "snippet": "Coffee, coconut milk, ice"
                                }
                            ]
                        },
                        {
                            "query": "Which of our products contain chocolate?",
                            "answer": "Several products contain chocolate. These include…",
                            "sources": [
                                {
                                    "file_id": "1234",
                                    "snippet": "with cocoa for an extra touch of chocolate…"
                                }
                            ]
                        }
                    ]
                }
            }
        }
    ],
    // Other fields omitted for brevity    
}
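As a sketch of how you might consume this structure, the snippet below walks a plain-dict version of the delta payload, abbreviated from the example above, and prints the status, top-level sources, and subqueries:

```python
# Abbreviated delta payload, following the response structure shown above.
delta = {
    "content": "None of our products contain both chocolate and food coloring.",
    "graph_data": {
        "status": "finished",
        "sources": [
            {"file_id": "1234", "snippet": "with cocoa for an extra touch of chocolate…"},
            {"file_id": "5678", "snippet": "Sugar, corn syrup, artificial flavors, food coloring…"},
        ],
        "subqueries": [
            {
                "query": "Which of our products contain food coloring?",
                "answer": "The products that contain food coloring are...",
                "sources": [{"file_id": "1234", "snippet": "Sugar, citric acid, artificial flavors…"}],
            }
        ],
    },
}

graph_data = delta["graph_data"]
print(f"Status: {graph_data['status']}")
for source in graph_data["sources"]:
    print(f"  [{source['file_id']}] {source['snippet']}")
for sub in graph_data["subqueries"]:
    print(f"Subquery: {sub['query']} -> {sub['answer']}")
```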

Usage example

The following example uses a hypothetical product information Knowledge Graph to answer a question about which food products contain both food coloring and chocolate.

Create the tools array

First, define the tools array with the type set to graph. The function object contains the graph_ids, description, and subqueries parameters.

In this example, the subqueries are included in the response. Subqueries can be useful for debugging or for providing additional context to the user about how the model arrived at the answer.

"tools": [
    {
        "type": "graph",
        "function": {
            "description": "Knowledge Graph containing information about Acme Inc. food products",
            "graph_ids": [
                "6029b226-1ee0-4239-a1b0-cdeebfa3ad5a"
            ],
            "subqueries": true
        }
    }
]

Send the request

Add the tools array to the chat endpoint call along with your array of messages. Setting tool_choice to auto allows the model to choose when to use the Knowledge Graph tool, based on the user’s question and the description of the tool.

This example streams the response in real time as the model generates it.

If you are unfamiliar with the chat completions endpoint or streaming vs. non-streaming responses, learn more in the chat completion guide.

curl --location 'https://api.writer.com/v1/chat' \
    --header 'Content-Type: application/json' \
    --header "Authorization: Bearer $WRITER_API_KEY" \
    --data '{
        "model": "palmyra-x-004",
        "messages": [
            {
                "role": "user",
                "content": "Which of our products contain both food coloring and chocolate?"
            }
        ],
        "tool_choice": "auto",
        "tools": [
            {
                "type": "graph",
                "function": {
                    "description": "Knowledge Graph containing information about Acme Inc. food products",
                    "graph_ids": [
                        "6029b226-1ee0-4239-a1b0-cdeebfa3ad5a"
                    ],
                    "subqueries": true
                }
            }
        ],
        "stream": true
    }'

Display Knowledge Graph subqueries and sources

You may want to display the sources or subqueries in your UI to help users understand how the model derived its answer. The following example shows how to display the subqueries as well as the status of the query from the Knowledge Graph.

from writerai import Writer

# Initialize the client. The client reads the API key from the
# WRITER_API_KEY environment variable.
client = Writer()

# Knowledge Graph chat tool definition from the earlier step.
tools = [
    {
        "type": "graph",
        "function": {
            "description": "Knowledge Graph containing information about Acme Inc. food products",
            "graph_ids": ["6029b226-1ee0-4239-a1b0-cdeebfa3ad5a"],
            "subqueries": True,
        },
    }
]

messages = [{"role": "user", "content": "Which of our products contain both food coloring and chocolate?"}]

response = client.chat.chat(
    model="palmyra-x-004",
    messages=messages,
    tools=tools,
    tool_choice="auto",
    stream=True
)

for chunk in response:
    if chunk.choices[0].delta.content is not None:
        print(chunk.choices[0].delta.content, end="", flush=True)

    if chunk.choices[0].delta.graph_data is not None:
        if chunk.choices[0].delta.graph_data.status is not None:
            print(f"Query status: {chunk.choices[0].delta.graph_data.status}")
        if chunk.choices[0].delta.graph_data.subqueries:
            print(f"Subquery: {chunk.choices[0].delta.graph_data.subqueries[0].query}")
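Building on this, you might also surface the sources behind each subquery. Below is a minimal sketch that formats a plain-dict graph_data payload (the dict shape follows the JSON example earlier; the helper name is hypothetical, and how you convert the SDK's streamed objects to dicts is up to your application):

```python
def format_subqueries(graph_data: dict) -> str:
    """Format subqueries and their sources for display (hypothetical helper)."""
    lines = []
    for sub in graph_data.get("subqueries", []):
        lines.append(f"Q: {sub['query']}")
        lines.append(f"A: {sub['answer']}")
        for src in sub.get("sources", []):
            lines.append(f"   source {src['file_id']}: {src['snippet']}")
    return "\n".join(lines)


# Usage with an abbreviated payload following the structure shown earlier:
example = {
    "subqueries": [
        {
            "query": "Which of our products contain chocolate?",
            "answer": "Several products contain chocolate. These include…",
            "sources": [{"file_id": "1234", "snippet": "with cocoa for an extra touch of chocolate…"}],
        }
    ]
}
print(format_subqueries(example))
```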

Next steps

By following this guide, you can reference Knowledge Graphs in your users’ chats in your application.

Next, learn how to delegate user questions to a domain-specific LLM with the prebuilt LLM tool.