POST /agents/{id}/query
Query an agent for a specific question.
curl --request POST \
  --url https://dashboard.laburen.com/api/agents/{id}/query \
  --header 'Authorization: Bearer <token>' \
  --header 'Content-Type: application/json' \
  --data '{
  "query": "<string>",
  "conversationId": "<string>",
  "visitorId": "<string>",
  "temperature": 123,
  "streaming": true,
  "modelName": "gpt_4o",
  "maxTokens": 123,
  "presencePenalty": 123,
  "frequencyPenalty": 123,
  "topP": 123,
  "filters": {
    "custom_ids": [
      "<string>"
    ],
    "datasource_ids": [
      "<string>"
    ]
  },
  "systemPrompt": "<string>",
  "userPrompt": "<string>",
  "promptType": "raw",
  "promptTemplate": "<string>"
}'
{
  "answer": "<string>",
  "conversationId": "<string>",
  "visitorId": "<string>",
  "sources": [
    {}
  ]
}
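
For a non-streaming call, the same request can be made with a plain fetch. A minimal TypeScript sketch, assuming a placeholder agent ID, token, and query:

const response = await fetch(
  'https://dashboard.laburen.com/api/agents/YOUR_AGENT_ID/query',
  {
    method: 'POST',
    headers: {
      Authorization: 'Bearer YOUR_TOKEN',
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      query: 'What are your opening hours?',
    }),
  }
);

// Response fields match the schema documented below.
const { answer, conversationId, visitorId, sources } = await response.json();
console.log(answer);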

Streaming

When streaming is enabled, the endpoint emits “answer” events (chunks of the model's answer) and “endpoint_response” events (the full JSON response of the endpoint), followed by a final [DONE] message.
import { fetchEventSource } from '@microsoft/fetch-event-source';

let buffer = '';
let bufferEndpointResponse = '';
const ctrl = new AbortController();

await fetchEventSource(queryAgentURL, {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    Authorization: `Bearer ${token}`, // your API token
  },
  signal: ctrl.signal,
  body: JSON.stringify({
    streaming: true,
    query,
    conversationId,
    visitorId,
  }),

  async onopen(response) {
    if (response.status === 402) {
      throw new ApiError(ApiErrorType.USAGE_LIMIT);
    }
  },
  onmessage: (event) => {
    if (event.data === '[DONE]') {
      // End of stream: stop the request and parse the buffered endpoint response
      ctrl.abort();

      try {
        const { sources, conversationId, visitorId } = JSON.parse(
          bufferEndpointResponse
        ) as ChatResponse;
      } catch {}
    } else if (event.data?.startsWith('[ERROR]')) {
      // ...
    } else if (event.event === 'endpoint_response') {
      // Accumulate the full endpoint response (JSON) until [DONE]
      bufferEndpointResponse += event.data;
    } else if (event.event === 'answer') {
      // Accumulate chunks of the model's answer
      buffer += event.data;
      // ...
    }
  },
});
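
Note that ApiError, ApiErrorType, and ChatResponse above are application-side helpers, not part of the API. A minimal ChatResponse shape, assumed from the documented response fields:

// Assumed shape, mirroring the documented response fields.
interface ChatResponse {
  answer: string;
  conversationId: string;
  visitorId: string;
  sources: Record<string, unknown>[];
}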

Authorizations

Authorization
string
header
required

Bearer authentication header of the form Bearer <token>, where <token> is your auth token.

Path Parameters

id
string
required

ID (or handle) of the agent

Body

application/json
query
string
required

Query for the agent

conversationId
string

Existing conversation ID

visitorId
string

Visitor / user ID

temperature
number
streaming
boolean
modelName
enum<string>
Available options:
gpt_4o,
gpt_4o_mini,
gpt_41,
gpt_41_mini,
gpt_41_nano,
claude_3_7_sonnet,
claude_4_sonnet,
gemini_2_5_flash,
gemini_2_5_pro,
grok_3,
grok_3_beta,
grok_3_fast_beta,
grok_3_mini_beta,
grok_3_mini_fast_beta,
sonar_pro,
sonar,
llama_3_3_70b,
llama_4_maverick_17b,
model_route
maxTokens
number
presencePenalty
number
frequencyPenalty
number
topP
number
filters
object

Object with optional custom_ids and datasource_ids string arrays (see the example after this list)

systemPrompt
string
userPrompt
string
promptType
enum<string>
deprecated
Available options:
raw,
customer_support
promptTemplate
string
deprecated
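
As an illustration of the optional body fields, a request can pin a modelName and pass filters. A hedged TypeScript sketch with placeholder values (the datasource ID is not real, and the exact filtering semantics are assumed):

// Sketch of a request body using optional fields (placeholder values).
const body = JSON.stringify({
  query: 'Summarize the latest product update.',
  modelName: 'gpt_4o_mini', // any value from the enum listed above
  temperature: 0.2,
  streaming: false,
  filters: {
    datasource_ids: ['DATASOURCE_ID'], // assumed: limits the query to these datasources
  },
});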

Response

Success

answer
string
conversationId
string
visitorId
string
sources
object[]