Open Responses API Streaming Events
This document describes the Server-Sent Events (SSE) emitted by an Open Responses API implementation.
It is written from the perspective of clients (UI, CLI, notebook integrations) that consume streaming responses.
Overview
The API uses SSE (`text/event-stream`) and emits events in this wire format:

```
event: <event_name>
data: {"type":"<event_name>","event":"<event_name>", ...payload...}
```
Notes:
- Terminal state is represented by `response.completed` or `response.failed`.
- Some runtimes also emit a separate `error` event before `response.failed`.
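As an illustration of the wire format above, a minimal parser can split one frame into its event name and JSON payload. This is a sketch, not part of any SDK; the helper name is hypothetical.

```python
import json

def parse_sse_frame(frame: str) -> tuple[str, dict]:
    """Parse one SSE frame of the form `event: ...` / `data: {...}`."""
    event_name = ""
    payload: dict = {}
    for line in frame.splitlines():
        if line.startswith("event:"):
            event_name = line[len("event:"):].strip()
        elif line.startswith("data:"):
            payload = json.loads(line[len("data:"):].strip())
    return event_name, payload

frame = 'event: response.created\ndata: {"type":"response.created","event":"response.created"}'
name, data = parse_sse_frame(frame)
# name == "response.created"; data["type"] == "response.created"
```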
Common Event Families
Events can belong to different categories:
1. Response lifecycle
- `response.created`
- `response.in_progress`
- `response.completed`
- `response.failed`
- `error` (non-terminal helper event; usually followed by `response.failed`)
`response.*` lifecycle events typically include:

- `response`: a response snapshot object with fields such as `id`, `object` (`"response"`), `status`, `model`, `createdAt`, `metadata`, `usage`, and `output`
2. Output/message assembly events
These allow a client to render an assistant message incrementally.
- `response.output_item.added`
- `response.content_part.added`
- `response.output_text.delta`
- `response.text.done`
- `response.content_part.done`
- `response.output_item.done`
Typical fields:
- `response_id`
- `output_index`
- `content_index`
- `item` or `part`
- `delta` (incremental text)
- `text` (final text for the part)
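A client can assemble a message part from these events by concatenating `delta` chunks and preferring the final `text` when the done event arrives. The sketch below assumes events are plain dicts keyed by `type`; the helper name is illustrative.

```python
def assemble_text(events: list[dict]) -> str:
    """Accumulate response.output_text.delta chunks into a part's final text."""
    chunks: list[str] = []
    for ev in events:
        if ev.get("type") == "response.output_text.delta":
            chunks.append(ev.get("delta", ""))
        elif ev.get("type") == "response.text.done":
            # Prefer the authoritative final text when present.
            return ev.get("text", "".join(chunks))
    return "".join(chunks)
```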
3. Function-call events
- `response.function_call.arguments.delta`
- `response.function_call.arguments.done`
Typical fields:
- `response_id`
- `output_index`
- `call_id`
- `delta` (JSON string chunk)
- `arguments` (final JSON string)
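Because `delta` chunks are fragments of a JSON string, a client should buffer them per `call_id` and only parse once `arguments.done` arrives. A minimal sketch, assuming dict-shaped events:

```python
import json

def collect_arguments(events: list[dict]) -> dict:
    """Join arguments.delta chunks per call_id; parse JSON on arguments.done."""
    buffers: dict[str, list[str]] = {}
    parsed: dict[str, dict] = {}
    for ev in events:
        call_id = ev.get("call_id", "")
        if ev.get("type") == "response.function_call.arguments.delta":
            buffers.setdefault(call_id, []).append(ev.get("delta", ""))
        elif ev.get("type") == "response.function_call.arguments.done":
            # Prefer the final `arguments` string; fall back to joined deltas.
            raw = ev.get("arguments") or "".join(buffers.get(call_id, []))
            parsed[call_id] = json.loads(raw)
    return parsed
```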
4. File search events
Emitted in rich streams when file-search-style tools run.
- `response.file_search_call.in_progress`
- `response.file_search_call.searching`
- `response.file_search_call.completed`
Typical fields:
- `response_id`
- `call_id`
- `tool_name`
5. Code interpreter events
Emitted in `code_mode`, `biomni`, and `ds_star`.

- `response.code_interpreter.in_progress`
- `response.code_interpreter.call.code.delta`
- `response.code_interpreter.call.code.done`
- `response.code_interpreter.call.interpreting`
- `response.code_interpreter.call.completed`
- `response.code_interpreter.output_text.delta`
- `response.code_interpreter.output_image.delta`
Typical fields:
- `response_id`
- `call_id`
- `delta` (code or output text)
- `mime_type`, `base64` (for image deltas)
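For image deltas, the payload carries the bytes base64-encoded alongside a `mime_type`. A sketch of decoding one such event, assuming `base64` holds a standard base64 string (the helper name is hypothetical):

```python
import base64

def decode_image_delta(event: dict) -> tuple[str, bytes]:
    """Decode a response.code_interpreter.output_image.delta payload into
    (mime_type, raw_bytes) for rendering or saving to disk."""
    mime_type = event.get("mime_type", "application/octet-stream")
    raw = base64.b64decode(event.get("base64", ""))
    return mime_type, raw
```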
6. Refusal / annotation events
Some implementations also emit OpenAI-style text metadata events when available:
- `response.refusal.delta`
- `response.refusal.done`
- `response.output_text.annotation.added`
These are optional and depend on upstream container output.
Response Snapshot Shape (what clients should expect)
Across runtimes, `response.*` events carry a `response` object with a similar core shape:

```json
{
  "id": "uuid",
  "object": "response",
  "status": "running|completed|failed|deleted",
  "model": "gpt-5-mini",
  "createdAt": "unix-ts-string",
  "metadata": { "...": "..." },
  "usage": {
    "input_tokens": 0,
    "output_tokens": 0,
    "total_tokens": 0,
    "input_token_details": {},
    "output_token_details": {},
    "usage_usd": 0.0
  },
  "output": []
}
```
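Clients that want type checking can model this shape directly. A minimal sketch using `TypedDict` (field names taken from the snapshot above; the `is_terminal` helper is illustrative):

```python
from typing import Any, TypedDict

class Usage(TypedDict, total=False):
    input_tokens: int
    output_tokens: int
    total_tokens: int
    input_token_details: dict
    output_token_details: dict
    usage_usd: float

class ResponseSnapshot(TypedDict, total=False):
    id: str
    object: str          # "response"
    status: str          # "running" | "completed" | "failed" | "deleted"
    model: str
    createdAt: str       # unix timestamp as a string
    metadata: dict
    usage: Usage
    output: list[Any]

def is_terminal(snapshot: ResponseSnapshot) -> bool:
    """A snapshot is terminal once it will no longer change."""
    return snapshot.get("status") in ("completed", "failed", "deleted")
```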
Client Integration Guidance
Event handling strategy (recommended)
Handle the stream as:
- `response.created` → initialize UI state
- `response.in_progress` → refresh status/usage/progress
- output/function/code events → render incremental content and tools
- `error` (optional) → show error banner/log
- `response.completed` or `response.failed` → finalize UI state
Treat unknown events as non-fatal
New event types may be added. A robust client should:
- ignore unknown events
- log them for debugging
- continue processing
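The handling strategy and unknown-event tolerance above can be sketched as a small dispatcher: known event types get handlers, everything else is logged and skipped rather than raising. The handler table here is illustrative, not exhaustive.

```python
import logging
from typing import Optional

logger = logging.getLogger("sse-client")

# Illustrative handlers; a real client would update UI state here.
HANDLERS = {
    "response.created": lambda ev: "init",
    "response.in_progress": lambda ev: "progress",
    "response.completed": lambda ev: "done",
    "response.failed": lambda ev: "done",
}

def dispatch(event: dict) -> Optional[str]:
    """Route known events; log and ignore unknown ones instead of failing."""
    handler = HANDLERS.get(event.get("type", ""))
    if handler is None:
        logger.debug("ignoring unknown event type: %s", event.get("type"))
        return None
    return handler(event)
```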
Use response snapshots as source of truth
Incremental events are useful for UX, but final status/usage should come from the latest `response` snapshot in:

- `response.in_progress`
- `response.completed`
- `response.failed`
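In practice this means keeping only the most recent snapshot from those three events and reading status/usage from it. A minimal sketch (the class name is illustrative):

```python
SNAPSHOT_EVENTS = {"response.in_progress", "response.completed", "response.failed"}

class SnapshotTracker:
    """Keep the most recent response snapshot as the source of truth."""

    def __init__(self) -> None:
        self.latest: dict = {}

    def observe(self, event: dict) -> None:
        # Only snapshot-bearing lifecycle events overwrite the latest state.
        if event.get("type") in SNAPSHOT_EVENTS and "response" in event:
            self.latest = event["response"]

    @property
    def status(self) -> str:
        return self.latest.get("status", "unknown")

    @property
    def usage(self) -> dict:
        return self.latest.get("usage", {})
```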
Cancellation / Delete Semantics
Current implementation notes:
- A `DELETE /responses/{id}` marks the task `deleted`.
- In coarse polling, a deleted task is streamed as `response.failed` (terminal mapping).
- Rich streams typically terminate as soon as the server-side stream ends; deletion is primarily represented through the response object state and delete endpoint behavior.
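These semantics can be sketched client-side: issuing the DELETE and mapping a polled `deleted` status to the terminal `failed` event. Both helper names are hypothetical; the request is built but not sent here.

```python
import urllib.request

def build_delete_request(base_url: str, response_id: str) -> urllib.request.Request:
    # DELETE /responses/{id} marks the task deleted (per the semantics above).
    return urllib.request.Request(
        f"{base_url}/responses/{response_id}", method="DELETE"
    )

def map_polled_status(status: str) -> str:
    # In coarse polling, a deleted task surfaces as a terminal failure.
    return "failed" if status == "deleted" else status
```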
Minimal SSE Client Example (Python)
```python
from openai import OpenAI

client = OpenAI(
    api_key="<TWENTYMINDS_API_KEY>",
    base_url="https://api.20minds.ai/v1/border-collie",
)

stream = client.responses.create(
    model="gpt-5.2",
    stream=True,
    input="Hello",
)

for event in stream:
    print(event.type, event)
    if event.type in ("response.completed", "response.failed"):
        break
```
Minimal SSE Client Example (TypeScript)
Using the `openai` package (recommended for Node/server-side clients):

```typescript
import OpenAI from "openai";

const client = new OpenAI({
  apiKey: process.env.TWENTYMINDS_API_KEY,
  baseURL: "https://api.20minds.ai/v1/border-collie",
});

async function streamResponse(): Promise<void> {
  const stream = await client.responses.create({
    model: "gpt-5.2",
    stream: true,
    input: "Hello",
  });

  for await (const event of stream) {
    console.log(event.type, event);
    if (event.type === "response.completed" || event.type === "response.failed") {
      break;
    }
  }
}
```
Notes:
- Set `baseURL` to the runtime-specific endpoint prefix (for example `/v1/border-collie`).
- The SDK yields typed stream events with `event.type` values like `response.output_text.delta`, `response.in_progress`, and `response.completed`.
- Do not use this pattern directly in the browser with a secret API key.
Related Document
For runtime base URLs and the full endpoint surface (`/responses`, `/files`, `/health`, `/reset`) across research agents, see Open Responses APIs for Research Agents.