POST /browser/sessions/{id}/ai/evaluate/stream
AI Evaluate (Stream)
curl --request POST \
  --url https://api.example.com/browser/sessions/{id}/ai/evaluate/stream \
  --header 'Content-Type: application/json' \
  --data '
{
  "prompt": "<string>",
  "mode": "<string>",
  "maxSteps": 123,
  "model": "<string>",
  "schema": {}
}
'

Overview

Same input as POST /browser/sessions/{id}/ai/evaluate, but the response is a long-lived Server-Sent Events stream. Response Content-Type is text/event-stream. Each chunk follows the OpenAI Chat Completions streaming shape — a data: line carrying a JSON object, separated by a blank line. The stream terminates with data: [DONE].
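The framing described above (a `data:` line per chunk, blank-line separators, a `[DONE]` sentinel) can be handled with a few lines of Python. A minimal sketch using only the standard library; the helper name is our own:

```python
import json

def parse_sse_events(raw: str) -> list:
    """Parse an OpenAI-style SSE body into a list of JSON chunks.

    Splits on blank lines, strips the "data:" prefix, and stops at
    the "[DONE]" sentinel that terminates the stream.
    """
    chunks = []
    for block in raw.split("\n\n"):
        block = block.strip()
        if not block.startswith("data:"):
            continue
        payload = block[len("data:"):].strip()
        if payload == "[DONE]":
            break
        chunks.append(json.loads(payload))
    return chunks
```

In practice you would feed this incrementally from a streaming HTTP response rather than a complete string, but the framing rules are the same.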

Path Parameters

id
string
required
Browser session ID (UUID).

Body

Same shape as the non-streaming endpoint.
prompt
string
required
Natural-language instruction for the model.
mode
string
default:"simple"
One of simple, agent.
maxSteps
integer
default:20
Agent-mode step cap.
model
string
Explicit LLM model ID.
schema
object
JSON schema for structured output.
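The fields above can be combined into a request body as follows. The JSON Schema shown for `schema` is purely illustrative; any valid schema describing the structured output you want should work:

```python
import json

# Request body for the streaming evaluate endpoint, built from the
# fields documented above. Only "prompt" is required.
body = {
    "prompt": "Extract the page title and author.",
    "mode": "simple",            # or "agent"
    "schema": {                  # illustrative JSON Schema, not a required shape
        "type": "object",
        "properties": {
            "title": {"type": "string"},
            "author": {"type": "string"},
        },
        "required": ["title"],
    },
}

payload = json.dumps(body)
```

Omit `schema` entirely for free-form text output; omit `maxSteps` unless you are in agent mode and want to change the default cap.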

Example Request

curl -N -X POST "https://api.scrapengine.io/api/v1/browser/sessions/baa3f390-fa6e-4a24-b84a-a575a5f3a9c7/ai/evaluate/stream" \
  -H "Authorization: Bearer $SCRAPENGINE_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "prompt": "Summarize this page in 3 bullets.",
    "mode": "simple"
  }'

Response

Success Response (200)

  • Content-Type: text/event-stream
  • Cache-Control: no-cache
  • Connection: keep-alive
Each chunk is a JSON payload matching OpenAI’s streaming format. Example stream:
data: {"id":"agent-baa3f390","object":"chat.completion.chunk","choices":[{"index":0,"delta":{"role":"assistant"}}]}

data: {"id":"agent-baa3f390","object":"chat.completion.chunk","choices":[{"index":0,"delta":{"content":"- Example"}}]}

data: {"id":"agent-baa3f390","object":"chat.completion.chunk","choices":[{"index":0,"delta":{"content":" Domain is..."}}]}

data: {"id":"agent-baa3f390","object":"chat.completion.chunk","choices":[{"index":0,"delta":{},"finish_reason":"stop"}]}

data: [DONE]
On error, a final chunk is emitted with delta.content: "Error: ..." and finish_reason: "stop", followed by data: [DONE].
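Putting the chunk and error conventions together, a consumer can accumulate `delta.content` pieces and flag the error sentinel. A sketch against already-parsed chunks; the helper name is our own:

```python
def collect_completion(chunks: list) -> tuple:
    """Join streamed delta.content pieces and apply the error convention.

    Per the docs, an error surfaces as a final chunk whose content
    starts with "Error:" and whose finish_reason is "stop".
    Returns (text, finished, is_error).
    """
    text_parts = []
    finished = False
    for chunk in chunks:
        for choice in chunk.get("choices", []):
            content = choice.get("delta", {}).get("content")
            if content:
                text_parts.append(content)
            if choice.get("finish_reason") == "stop":
                finished = True
    text = "".join(text_parts)
    return text, finished, text.startswith("Error:")
```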

Error Responses

Status | Description
400    | Invalid body; same validations as the non-streaming endpoint.
401    | Unauthorized; invalid or missing API key.
404    | Session not found or not owned by the caller.

Notes

  • Use -N (no-buffer) with cURL to see chunks as they arrive.
  • Any OpenAI-compatible SSE parser can consume this endpoint.
  • For one-shot JSON output without streaming, use POST /browser/sessions/{id}/ai/evaluate.