POST https://dashboard.laburen.com/api/agents/{agentId}/query

Example request:
curl --location --request POST 'https://dashboard.laburen.com/api/agents/<agentId>/query' \
--header 'Content-Type: application/json' \
--header 'Authorization: Bearer <API_KEY>' \
--data-raw '{
    "query": "Hello, I need help with my order",
    "conversationId": "clxxxxxxxxxxxxxxxxx",
    "visitorId": "clxxxxxxxxxxxxxxxxx"
}'

Example response:
{
  "answer": "Hello! I'd be happy to help you with your order. Could you please provide your order number so I can look it up?",
  "usage": {
    "completionTokens": 28,
    "promptTokens": 1250,
    "totalTokens": 1278,
    "cost": 0.0064
  },
  "sources": [
    {
      "source": "FAQ.pdf",
      "chunk": "To check your order status, please provide your order number..."
    }
  ],
  "approvals": [],
  "messageId": "clxxxxxxxxxxxxxxxxx",
  "conversationId": "clxxxxxxxxxxxxxxxxx",
  "visitorId": "clxxxxxxxxxxxxxxxxx",
  "request_human": false,
  "status": "UNRESOLVED"
}
This endpoint allows you to send messages to an AI agent and receive responses. It supports:
  • Simple text queries
  • File attachments (documents, images, audio)
  • Real-time response streaming
  • Continuation of existing conversations
  • Contact association (CRM)
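
The same request can be made from JavaScript. Below is a minimal non-streaming sketch using the built-in fetch API; it relies only on the endpoint and fields documented on this page (replace the placeholders with your own values):

const apiUrl = 'https://dashboard.laburen.com/api';
const apiKey = '<API_KEY>';
const agentId = '<agentId>';

const res = await fetch(`${apiUrl}/agents/${agentId}/query`, {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'Authorization': `Bearer ${apiKey}`,
  },
  body: JSON.stringify({
    query: 'Hello, I need help with my order',
  }),
});

const data = await res.json();
console.log(data.answer);          // the agent's reply
console.log(data.conversationId);  // keep this to continue the conversation later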

Path

agentId (string, required): The ID of the agent you want to query (CUID format).

Body

Required

query (string, required): The user's message or question.

Optional - Basic Configuration

streaming (boolean, default: false): If true, the endpoint responds with Server-Sent Events in real time.
conversationId (string): ID to continue an existing conversation. Auto-generated if not provided.
visitorId (string): Unique ID of the visitor/user. Auto-generated if not provided.
context (string): Additional context for the AI (e.g., specific instructions).
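
For example, a request body that continues an existing conversation and passes extra instructions, using only the fields above, could look like this (IDs are placeholders):

{
  "query": "Where is my package?",
  "conversationId": "clxxxxxxxxxxxxxxxxx",
  "visitorId": "clxxxxxxxxxxxxxxxxx",
  "context": "The visitor is a premium customer; answer in a formal tone.",
  "streaming": false
}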

Optional - File Attachments

attachments (array): List of file attachments.

Optional - Contact (CRM)

contact (object): Contact data to associate with the conversation.
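
This page does not spell out the item schema for attachments or the field list for contact. Purely as an illustration, a payload combining both might be shaped like the sketch below; every field name inside attachments and contact is hypothetical and should be verified against your dashboard before use:

// Illustrative only: the attachment and contact field names below are hypothetical.
const body = {
  query: 'Can you review the attached invoice?',
  attachments: [
    {
      name: 'invoice.pdf',                           // hypothetical field
      url: 'https://example.com/files/invoice.pdf',  // hypothetical field
      mimeType: 'application/pdf',                   // hypothetical field
    },
  ],
  contact: {
    email: 'jane@example.com',  // hypothetical field
    firstName: 'Jane',          // hypothetical field
  },
};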

Response (without streaming)

answer (string): The agent's response.
sources (array): Sources used to generate the response.
messageId (string): ID of the response message.
conversationId (string): ID of the conversation (save this to continue the conversation).
visitorId (string): ID of the visitor.
request_human (boolean): Whether the agent requested human intervention.
status (string): Conversation status (e.g., UNRESOLVED, RESOLVED).
usage (object): Token usage and cost information.
approvals (array): List of approvals for tool executions (if any).
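
A common pattern is to store the returned conversationId (and visitorId) and send them back on the next call so the agent keeps the conversation context. A sketch of that flow, using a small helper that is not part of the API itself:

const apiUrl = 'https://dashboard.laburen.com/api';
const apiKey = '<API_KEY>';
const agentId = '<agentId>';

// Illustrative helper: posts a body to the query endpoint and returns the parsed response
async function queryAgent(body) {
  const res = await fetch(`${apiUrl}/agents/${agentId}/query`, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'Authorization': `Bearer ${apiKey}`,
    },
    body: JSON.stringify(body),
  });
  if (!res.ok) throw new Error(`Query failed with status ${res.status}`);
  return res.json();
}

// First message: the API generates conversationId and visitorId
const first = await queryAgent({ query: 'Hello, I need help with my order' });

// Follow-up message: send the IDs back so the agent keeps the conversation context
const followUp = await queryAgent({
  query: 'My order number is 12345',
  conversationId: first.conversationId,
  visitorId: first.visitorId,
});

// React if the agent asked for a human
if (followUp.request_human) {
  console.log('Human intervention requested; conversation status:', followUp.status);
}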

Streaming Response

When streaming: true, the endpoint responds with Server-Sent Events:
Content-Type: text/event-stream

event: answer
data: Hello,

event: answer
data:  how can

event: answer
data:  I help you?

event: endpoint_response
data: {"messageId":"...","answer":"...","conversationId":"..."}

data: [DONE]
Events:
  • answer: Partial response text (concatenate to build the full answer)
  • endpoint_response: Full response object (JSON) with all metadata
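
The stream can also be tried from the command line by enabling streaming and turning off curl's output buffering with -N, so events print as they arrive:

curl -N --location --request POST 'https://dashboard.laburen.com/api/agents/<agentId>/query' \
--header 'Content-Type: application/json' \
--header 'Authorization: Bearer <API_KEY>' \
--data-raw '{
    "query": "Hello, I need help",
    "streaming": true
}'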

Error Responses

Status Code   Type             Description
401           UNAUTHORIZED     Invalid API Key or insufficient permissions
404           NOT_FOUND        Agent not found
500           Internal Error   Error processing the message
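
In a non-streaming integration, these map directly to the HTTP status of the response. A small helper (hypothetical, not part of any SDK) could translate them into errors:

// Throws a descriptive error for the status codes documented above; `res` is a fetch Response.
function checkQueryResponse(res) {
  if (res.status === 401) {
    throw new Error('Invalid API key or insufficient permissions');
  }
  if (res.status === 404) {
    throw new Error('Agent not found: check the agentId');
  }
  if (!res.ok) {
    throw new Error(`Error processing the message (status ${res.status})`);
  }
  return res;
}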

Streaming Example (JavaScript)

import {
  EventStreamContentType,
  fetchEventSource,
} from '@microsoft/fetch-event-source';

const apiUrl = 'https://dashboard.laburen.com/api';
const apiKey = '<API_KEY>';
const agentId = '<agentId>';

let answer = '';
let endpointResponse = '';
const ctrl = new AbortController();

await fetchEventSource(`${apiUrl}/agents/${agentId}/query`, {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'Authorization': `Bearer ${apiKey}`,
  },
  signal: ctrl.signal,
  body: JSON.stringify({
    streaming: true,
    query: 'Hello, I need help',
    conversationId: 'optional-conversation-id',
    visitorId: 'optional-visitor-id',
  }),

  // Validate the connection before consuming the stream
  async onopen(response) {
    if (response.ok && response.headers.get('content-type')?.startsWith(EventStreamContentType)) {
      return; // Connected and the server is sending an event stream
    }
    if (response.status === 401) {
      throw new Error('Unauthorized');
    }
    if (response.status === 402) {
      throw new Error('Usage limit exceeded');
    }
    throw new Error(`Unexpected response: ${response.status}`);
  },

  onmessage: (event) => {
    if (event.data === '[DONE]') {
      // End of stream
      ctrl.abort();

      // Parse the full response
      const fullResponse = JSON.parse(endpointResponse);
      console.log('Full response:', fullResponse);
    } else if (event.data?.startsWith('[ERROR]')) {
      console.error('Stream error:', event.data);
    } else if (event.event === 'endpoint_response') {
      endpointResponse += event.data;
    } else if (event.event === 'answer') {
      answer += event.data;
      // Update UI with partial answer
      console.log('Partial answer:', answer);
    }
  },

  onerror: (error) => {
    console.error('Connection error:', error);
  },
});