Agent - Query
Send messages to an AI agent and receive responses. Supports text queries, file attachments, real-time streaming, and conversation continuity.
POST https://dashboard.laburen.com/api/agents/{agentId}/query
curl --location --request POST 'https://dashboard.laburen.com/api/agents/<agentId>/query' \
  --header 'Content-Type: application/json' \
  --header 'Authorization: Bearer <API_KEY>' \
  --data-raw '{
    "query": "Hello, I need help with my order",
    "conversationId": "clxxxxxxxxxxxxxxxxx",
    "visitorId": "clxxxxxxxxxxxxxxxxx"
  }'
{
  "answer": "Hello! I'd be happy to help you with your order. Could you please provide your order number so I can look it up?",
  "usage": {
    "completionTokens": 28,
    "promptTokens": 1250,
    "totalTokens": 1278,
    "cost": 0.0064
  },
  "sources": [
    {
      "source": "FAQ.pdf",
      "chunk": "To check your order status, please provide your order number..."
    }
  ],
  "approvals": [],
  "messageId": "clxxxxxxxxxxxxxxxxx",
  "conversationId": "clxxxxxxxxxxxxxxxxx",
  "visitorId": "clxxxxxxxxxxxxxxxxx",
  "request_human": false,
  "status": "UNRESOLVED"
}
This endpoint allows you to send messages to an AI agent and receive responses. It supports:
Text queries
File attachments
Real-time streaming via Server-Sent Events
Conversation continuity via conversationId and visitorId
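For example, a minimal non-streaming call from Node.js 18+ (using the built-in fetch) might look like the sketch below; <agentId> and <API_KEY> are placeholders for your own values, and the body fields mirror the cURL example above:

// Sketch: basic query from Node.js 18+ using the built-in fetch.
// <agentId> and <API_KEY> are placeholders; body fields mirror the cURL example.
async function queryAgent() {
  const response = await fetch('https://dashboard.laburen.com/api/agents/<agentId>/query', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'Authorization': 'Bearer <API_KEY>',
    },
    body: JSON.stringify({
      query: 'Hello, I need help with my order',
      conversationId: 'clxxxxxxxxxxxxxxxxx', // reuse to continue an existing conversation
      visitorId: 'clxxxxxxxxxxxxxxxxx',
    }),
  });

  const data = await response.json();
  console.log(data.answer);          // the agent's reply
  console.log(data.conversationId);  // keep this to continue the conversation
}

queryAgent();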
When streaming is set to true in the request body, the endpoint responds with Server-Sent Events:
Content-Type: text/event-stream

event: answer
data: Hello,

event: answer
data: how can

event: answer
data: I help you?

event: endpoint_response
data: {"messageId":"...","answer":"...","conversationId":"..."}

data: [DONE]
Events:
answer: Partial response text (concatenate to build the full answer)
endpoint_response: Full response object (JSON) with all metadata
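A minimal streaming consumer in Node.js 18+ might look like the sketch below; it assumes the same request body as the basic example plus "streaming": true, and parses the raw event/data lines shown above by hand:

// Sketch: consume the SSE stream with Node.js 18+ fetch.
// Assumes the request body from the basic example plus "streaming": true.
async function streamQuery() {
  const response = await fetch('https://dashboard.laburen.com/api/agents/<agentId>/query', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'Authorization': 'Bearer <API_KEY>',
    },
    body: JSON.stringify({ query: 'Hello, I need help with my order', streaming: true }),
  });

  const decoder = new TextDecoder();
  let buffer = '';
  let currentEvent = '';
  let answer = '';

  for await (const chunk of response.body) {
    buffer += decoder.decode(chunk, { stream: true });
    const lines = buffer.split('\n');
    buffer = lines.pop(); // keep a possibly incomplete trailing line for the next chunk

    for (const line of lines) {
      if (line.startsWith('event: ')) {
        currentEvent = line.slice('event: '.length).trim();
      } else if (line.startsWith('data: ')) {
        const data = line.slice('data: '.length);
        if (data === '[DONE]') return answer;
        if (currentEvent === 'answer') answer += data;        // concatenate partial text
        if (currentEvent === 'endpoint_response') {
          console.log('Metadata:', JSON.parse(data));         // full response object
        }
      }
    }
  }
  return answer;
}

streamQuery().then((answer) => console.log('Full answer:', answer));

In a browser, read response.body with getReader() instead, since iterating a ReadableStream with for await is not supported in every engine.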
When your agent has conversational mode enabled, use webhookUrl to receive the AI response asynchronously:
curl --location --request POST 'https://dashboard.laburen.com/api/agents/<agentId>/query' \
  --header 'Content-Type: application/json' \
  --header 'Authorization: Bearer <API_KEY>' \
  --data-raw '{
    "query": "Hello, I need help",
    "channel": "api",
    "webhookUrl": "https://your-server.com/webhook/laburen"
  }'
Immediate Response:
{
  "status": "queued",
  "conversationId": "clxxxxxxxxxxxxxxxxx",
  "visitorId": "visitor-abc123",
  "inputMessageId": "msg_input123",
  "message": "Your message has been queued. Response will be sent to the provided webhookUrl after processing.",
  "webhookUrl": "https://your-server.com/webhook/laburen"
}
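As a sketch, the same queued request can be sent from Node.js; the acknowledgement fields mirror the immediate response above, and the conversationId can be stored to correlate the later webhook delivery:

// Sketch: queue a message for asynchronous processing (Node.js 18+ fetch, same placeholders as above).
async function queueQuery() {
  const response = await fetch('https://dashboard.laburen.com/api/agents/<agentId>/query', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'Authorization': 'Bearer <API_KEY>',
    },
    body: JSON.stringify({
      query: 'Hello, I need help',
      channel: 'api',
      webhookUrl: 'https://your-server.com/webhook/laburen',
    }),
  });

  const ack = await response.json();
  console.log(ack.status);          // "queued"
  console.log(ack.conversationId);  // store this to match the webhook delivery later
}

queueQuery();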
Webhook Receiver Example (Node.js/Express):
const express = require('express');
const app = express();

app.use(express.json());

app.post('/webhook/laburen', (req, res) => {
  const event = req.headers['x-laburen-event'];

  if (event === 'agent.response') {
    const { conversationId, messages, agentResponse } = req.body;

    console.log('Conversation:', conversationId);
    console.log('Messages:', messages.length);

    // Process each split message
    messages.forEach((msg, i) => {
      console.log(`Message ${i + 1}:`, msg.text);
    });

    // Full answer available in agentResponse
    console.log('Full answer:', agentResponse.answer);
    console.log('Usage:', agentResponse.usage);
  }

  res.status(200).send('OK');
});

app.listen(4000, () => {
  console.log('Webhook server listening on port 4000');
});