If you choose yes, the answer will be streamed to the UI as it is generated. If you choose no, the answer will be generated and then returned to the agent once it is complete.

For example, to answer questions about company policies, set Question to `@{question}` (set from a text input in the UI), Graph Ids to `["Company Policies", "Employee Handbook"]`, and Link Variable to `policy_answer`, then display the `policy_answer` state variable in a text block. Leave Use streaming and Use subqueries set to yes.
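
The example above assumes two state variables: one that feeds the Question field and one that receives the answer. Below is a minimal sketch of that wiring, assuming the state keys `question` and `policy_answer` from the example:

```python
# A minimal sketch of the state wiring assumed by the example above.
# The text input writes to "question" (referenced as @{question} in the
# block's Question field), and the block's Link Variable fills
# "policy_answer", which a text block can display as @{policy_answer}.
import writer as wf

initial_state = wf.init_state({
    "question": "",       # bound to the text input in the UI
    "policy_answer": "",  # populated by the block via its Link Variable
})
```

The block's fields are listed in full below.
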
Name | Type | Control | Default | Description | Options | Validation |
---|---|---|---|---|---|---|
Question | Text | - | - | The natural language question to ask. | - | - |
Use streaming | Boolean | - | yes | Stream the answer to the UI as it is generated; if set to no, the answer is returned to the agent once it is complete. | - | - |
Link Variable | Binding | - | - | The state variable that stores the block's answer; set it here and use it across your agent. | - | - |
Graph Ids | Graph Ids | - | [] | IDs of the graphs to query. | - | - |
Use subqueries | Boolean | - | yes | Enables the LLM to ask follow-up questions to the knowledge graph. This improves answers, but may be slower. | - | - |
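
These fields map closely onto the parameters of the Knowledge Graph question-answering request. Below is a rough sketch of the equivalent direct call, assuming the `writerai` Python SDK exposes this endpoint as `client.graphs.question`; the method name, parameters, and response shape should be checked against the SDK reference, and the graph IDs shown are placeholders:

```python
# A hedged sketch of the underlying question-answering request.
from writerai import Writer

client = Writer()  # assumes WRITER_API_KEY is set in the environment

response = client.graphs.question(
    graph_ids=["graph-id-1", "graph-id-2"],         # Graph Ids field (placeholder IDs)
    question="What is our parental leave policy?",  # Question field
    subqueries=True,   # Use subqueries: allow follow-up questions to the graph
    stream=False,      # Use streaming: True streams the answer as it is generated
)
print(response.answer)  # with streaming off, the complete answer is returned at once
```

The outcomes the block can produce are listed below.
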
Name | Field | Type | Description |
---|---|---|---|
Success | - | success | Successfully streamed the answer. |
Error | - | error | If the function raises an Exception. |
The block's answer is stored in the state variable set in the Link Variable field. Access the output by referencing the state variable you defined, or use the `@{result}` variable in the next block.
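
In Python handlers, the linked variable can be read from the state object like any other state key. A minimal sketch, assuming the Link Variable was set to `policy_answer` as in the example above:

```python
# A minimal sketch of a handler that runs after the block has finished
# and reads the answer stored via the Link Variable field.
def handle_answer_ready(state):
    answer = state["policy_answer"]
    # Use the answer elsewhere in the agent, e.g. to update a status message.
    state["status"] = "Answer ready" if answer else "No answer yet"
```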