Stop Stream

Stop a chat response that's currently generating.

Endpoint

POST /api/ai/chat/:chatId/stop

Request

curl -X POST BOX_URL/api/ai/chat/my-chat-id/stop \
  -H "x-api-key: YOUR_API_KEY"

Replace my-chat-id with the ID you used in the original chat request.

Response

{
  "success": true,
  "message": "Stream stopped successfully"
}

Or, if there is no active stream:

{
  "success": false,
  "message": "No active stream found"
}
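
In code, you can wrap the stop call in a small helper and use the success flag to tell the two cases apart. A minimal sketch (stopStream is a hypothetical name; BOX_URL and YOUR_API_KEY are the same placeholders used throughout this page):

const BOX_URL = 'https://your-box.intelligencebox.it'; // Your server URL

// Ask the box to stop generating for the given chat.
// Resolves to true if a stream was stopped, false if none was active.
async function stopStream(chatId) {
  const res = await fetch(`${BOX_URL}/api/ai/chat/${chatId}/stop`, {
    method: 'POST',
    headers: { 'x-api-key': 'YOUR_API_KEY' }
  });
  const { success, message } = await res.json();
  console.log(message); // "Stream stopped successfully" or "No active stream found"
  return success;
}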

Check Stream Status

You can also check if a stream is still active:

GET /api/ai/chat/:chatId/stream-status

curl BOX_URL/api/ai/chat/my-chat-id/stream-status \
  -H "x-api-key: YOUR_API_KEY"

Response

{
  "status": "active",
  "isActive": true,
  "startTime": "2024-01-15T10:30:00Z",
  "lastActivity": "2024-01-15T10:30:05Z",
  "messageCount": 42
}

Status Values

Status       Description
active       Currently generating
completed    Finished normally
aborted      Stopped by user
error        Error occurred
not_found    No stream for this ID
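
A common use for these values is to wait until generation ends, whichever way it ends. The sketch below polls the stream-status endpoint once per second until isActive turns false (the waitForStream name and the polling interval are assumptions, not part of the API):

const BOX_URL = 'https://your-box.intelligencebox.it'; // Your server URL

// Poll stream-status until the stream is no longer active, then
// return the final status: completed, aborted, error, or not_found.
async function waitForStream(chatId, intervalMs = 1000) {
  while (true) {
    const res = await fetch(`${BOX_URL}/api/ai/chat/${chatId}/stream-status`, {
      headers: { 'x-api-key': 'YOUR_API_KEY' }
    });
    const { isActive, status } = await res.json();
    if (!isActive) return status;
    await new Promise(resolve => setTimeout(resolve, intervalMs));
  }
}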

JavaScript Example

chat-with-timeout.js
const BOX_URL = 'https://your-box.intelligencebox.it'; // Your server URL

async function chatWithTimeout(message, timeoutMs = 30000) {
  const chatId = 'chat-' + Date.now();

  // Start the chat
  const chatPromise = fetch(`${BOX_URL}/api/ai/chat`, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'x-api-key': 'YOUR_API_KEY'
    },
    body: JSON.stringify({
      id: chatId,
      messages: [{ role: 'user', content: message }],
      boxAddress: BOX_URL
    })
  });

  // Set up timeout
  const timeoutId = setTimeout(async () => {
    console.log('Timeout - stopping stream...');
    await fetch(`${BOX_URL}/api/ai/chat/${chatId}/stop`, {
      method: 'POST',
      headers: { 'x-api-key': 'YOUR_API_KEY' }
    });
  }, timeoutMs);

  try {
    const response = await chatPromise;
    // Parse response...
    clearTimeout(timeoutId);
    return response;
  } catch (e) {
    clearTimeout(timeoutId);
    throw e;
  }
}
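
Calling the helper could look like this (the prompt text and the 10-second timeout are arbitrary example values):

// Give the model 10 seconds before the stream is stopped server-side.
chatWithTimeout('Summarize the uploaded documents', 10000)
  .then(response => console.log('Chat finished with HTTP status', response.status))
  .catch(err => console.error('Chat failed:', err));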

When to Stop

  • Response is taking too long
  • User clicks "Stop" button (see the sketch after this list)
  • You have enough of the response
  • AI is going off-topic
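
For the "Stop" button case, a minimal browser sketch might wire a click handler straight to the stop endpoint (the stop-button element ID, the currentChatId variable, and BOX_URL are assumptions for this example):

// Stop the active stream when the user clicks "Stop".
document.getElementById('stop-button').addEventListener('click', () => {
  fetch(`${BOX_URL}/api/ai/chat/${currentChatId}/stop`, {
    method: 'POST',
    headers: { 'x-api-key': 'YOUR_API_KEY' }
  });
});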