Stream an AI evaluation against the current page as Server-Sent Events in an OpenAI-compatible format.
POST /browser/sessions/{id}/ai/evaluate. The response is a long-lived Server-Sent Events stream with Content-Type text/event-stream.
Each chunk follows the OpenAI Chat Completions streaming shape — a data: line carrying a JSON object, separated by a blank line. The stream terminates with data: [DONE].
The request body matches the non-streaming endpoint (e.g. the simple and agent modes).

Response headers:

| Header | Value |
|---|---|
| Content-Type | text/event-stream |
| Cache-Control | no-cache |
| Connection | keep-alive |

Errors that occur after the stream has opened are delivered in-band: a final chunk with delta.content: "Error: ..." and finish_reason: "stop", followed by data: [DONE].
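Because both normal output and in-band errors arrive as ordinary chunks, a client only needs one loop that reads data: lines, decodes the JSON, and stops at the [DONE] sentinel. A minimal sketch (field names follow the OpenAI Chat Completions streaming shape described above; the sample chunks are illustrative):

```python
import json

def iter_deltas(lines):
    """Yield each chunk's delta.content from an iterable of SSE lines.

    Skips blank separator lines and stops at the data: [DONE] sentinel.
    """
    for line in lines:
        line = line.strip()
        if not line.startswith("data:"):
            continue  # blank separators between chunks
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":
            return  # end of stream
        chunk = json.loads(payload)
        # Each chunk mirrors a Chat Completions streaming object:
        # choices[0].delta may carry a "content" fragment.
        delta = chunk["choices"][0].get("delta", {})
        if "content" in delta:
            yield delta["content"]

# Two content chunks followed by the terminator (hypothetical payloads):
sample = [
    'data: {"choices": [{"delta": {"content": "Hel"}, "finish_reason": null}]}',
    '',
    'data: {"choices": [{"delta": {"content": "lo"}, "finish_reason": "stop"}]}',
    '',
    'data: [DONE]',
]
print("".join(iter_deltas(sample)))  # → Hello
```

The same loop handles the error case for free: an in-band error surfaces as a chunk whose delta.content starts with "Error: ", so a client can check the accumulated text if finish_reason was "stop".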
| Status | Description |
|---|---|
| 400 | Invalid body — same validations as the non-streaming endpoint. |
| 401 | Unauthorized — invalid or missing API key. |
| 404 | Session not found or not owned by the caller. |
Tip: use -N (--no-buffer) with cURL to see chunks as they arrive.
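For example (the base URL, session id, auth header, and request body below are illustrative placeholders — the document does not specify the body schema):

```shell
# -N disables output buffering so each SSE chunk prints as it arrives.
curl -N \
  -X POST "https://api.example.com/browser/sessions/SESSION_ID/ai/evaluate" \
  -H "Authorization: Bearer $API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"prompt": "Summarize this page"}'
```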