Open Responses APIs for Research Agents

This document summarizes the API endpoints exposed by the 20minds research-agent runtimes through their Open Responses-compatible interface.
It focuses on the runtime URLs used by clients such as the OpenAI SDK, CLI tools, and web apps.

Runtime Base URLs

Each runtime exposes the same core Open Responses endpoint shape under a different base URL:
Runtime                  | Base URL                                 | Reference
20minds (Border Collie)  | https://api.20minds.ai/v1/border-collie  | 20minds.ai
AgentMD                  | https://api.20minds.ai/v1/agentmd        | Nature Communications
Biomni                   | https://api.20minds.ai/v1/biomni         | bioRxiv
DeepEvidence             | https://api.20minds.ai/v1/deepevidence   | arXiv
DS-Star                  | https://api.20minds.ai/v1/ds-star        | arXiv
DSWizard                 | https://api.20minds.ai/v1/dswizard       | Nature Biomedical Engineering
GeneAgent                | https://api.20minds.ai/v1/geneagent      | Nature Methods
InformGen                | https://api.20minds.ai/v1/informgen      | JAMIA
TrialGPT                 | https://api.20minds.ai/v1/trialgpt       | Nature Communications
TrialMind SLR            | https://api.20minds.ai/v1/trialmind-slr  | npj digital medicine
VirtualLab               | https://api.20minds.ai/v1/virtuallab     | Nature
Use your 20minds API key as the api_key in the OpenAI SDK (or as a Bearer token in raw HTTP requests).
These runtimes use the same core endpoint shape described below (/responses, /files, /health, /reset), with runtime-specific behavior in models, tools, and response metadata.

Core Endpoints

These are the main endpoints clients should use.

Health

  • GET /health
Purpose:
  • Basic service health check for the selected runtime.
Example:
curl -sS https://api.20minds.ai/v1/biomni/health

Responses API (Open Responses-compatible)

Create a response

  • POST /responses
Purpose:
  • Create a new response/task.
  • Supports polling (stream: false) or SSE streaming (stream: true).
Notes:
  • This is the primary endpoint used by client.responses.create(...).
  • previous_response_id continues a conversation/session when supported by the runtime.
  • extra_body.task_mode is primarily relevant to Border Collie (agent vs coding).
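For clients not using the OpenAI SDK, the same call can be issued over raw HTTP. A minimal sketch using Python's standard library, assuming Bearer-token auth and that SDK extra_body fields (such as task_mode) merge into the top-level JSON body:

```python
import json
import urllib.request

BASE_URL = "https://api.20minds.ai/v1/border-collie"  # any runtime prefix works

def build_create_request(api_key, model, text,
                         previous_response_id=None, stream=False, task_mode=None):
    """Build a POST /responses request. task_mode ("agent"/"coding") is
    Border Collie-specific; placing it at the top level of the JSON body
    mirrors how the OpenAI SDK merges extra_body (assumption)."""
    body = {"model": model, "input": text, "stream": stream}
    if previous_response_id is not None:
        body["previous_response_id"] = previous_response_id
    if task_mode is not None:
        body["task_mode"] = task_mode
    return urllib.request.Request(
        BASE_URL + "/responses",
        data=json.dumps(body).encode(),
        method="POST",
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )

# To actually send: urllib.request.urlopen(build_create_request(...))
```

With stream=True the server replies with an SSE stream instead of a single JSON body, so the client must consume the connection incrementally rather than reading one document.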

Retrieve a response

  • GET /responses/{response_id}
Purpose:
  • Fetch the current response snapshot (status, usage, output, metadata).
Typical statuses:
  • running
  • completed
  • failed
  • deleted
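A polling client re-fetches the snapshot until the status leaves running. A sketch with Python's standard library, assuming the snapshot is a JSON body with a top-level "status" field and that completed/failed/deleted are the terminal states:

```python
import json
import time
import urllib.request

# Of the statuses listed above, only "running" indicates further change.
TERMINAL_STATUSES = {"completed", "failed", "deleted"}

def is_terminal(status):
    """True once a response will no longer change state."""
    return status in TERMINAL_STATUSES

def poll_response(base_url, api_key, response_id, interval=2.0, timeout=600):
    """Poll GET /responses/{response_id} until a terminal status or timeout."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        req = urllib.request.Request(
            f"{base_url}/responses/{response_id}",
            headers={"Authorization": f"Bearer {api_key}"},
        )
        with urllib.request.urlopen(req) as r:
            snapshot = json.load(r)
        if is_terminal(snapshot["status"]):
            return snapshot
        time.sleep(interval)
    raise TimeoutError(f"response {response_id} still running after {timeout}s")
```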

Delete a response

  • DELETE /responses/{response_id}
Purpose:
  • Cancel/delete a response task.
Notes:
  • For container-backed runtimes (for example Biomni and DS-Star), this also tears down the underlying session state for that response.

File Upload Endpoints

These endpoints support the upload flow used by the OpenAI SDK-compatible file interface.

Create file slot

  • POST /files
Purpose:
  • Create a file record and receive an upload URL.
Typical response fields:
  • id
  • response_id
  • filename
  • status
  • upload_url
  • upload_expires_at

Upload file bytes

  • PUT /files/{file_id}
Purpose:
  • Upload raw bytes to the file record created by POST /files.
Notes:
  • This is usually called via the returned upload_url.
  • The upload URL is time-limited.
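The two-step flow (create the slot, then PUT the bytes) can be sketched with Python's standard library. The JSON body for POST /files shown here (just a filename) is an assumption inferred from the response fields listed above:

```python
import json
import urllib.request

def build_create_slot_request(base_url, api_key, filename):
    """POST /files: create a file record; the server returns id,
    upload_url, upload_expires_at, and related fields."""
    return urllib.request.Request(
        base_url + "/files",
        data=json.dumps({"filename": filename}).encode(),
        method="POST",
        headers={"Authorization": f"Bearer {api_key}",
                 "Content-Type": "application/json"},
    )

def build_upload_request(upload_url, api_key, payload):
    """PUT raw bytes to the time-limited upload_url returned by POST /files,
    so upload promptly after creating the slot."""
    return urllib.request.Request(
        upload_url,
        data=payload,
        method="PUT",
        headers={"Authorization": f"Bearer {api_key}",
                 "Content-Type": "application/octet-stream"},
    )

# Send either request with urllib.request.urlopen(...), then list
# GET /files to confirm the file's status.
```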

List files

  • GET /files
Purpose:
  • List uploaded/available files for the runtime/session context.

Download file

  • GET /files/{file_id}
Purpose:
  • Retrieve file contents (or file metadata proxy behavior, depending on runtime/file type).
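A download sketch using Python's standard library; the file_url helper is illustrative, not part of the API:

```python
import urllib.request

def file_url(base_url, file_id):
    """Build the URL for GET /files/{file_id}."""
    return f"{base_url}/files/{file_id}"

def download_file(base_url, api_key, file_id, dest_path):
    """GET the file and write its bytes to dest_path. Depending on the
    runtime/file type this endpoint may proxy metadata rather than raw
    contents, per the note above."""
    req = urllib.request.Request(
        file_url(base_url, file_id),
        headers={"Authorization": f"Bearer {api_key}"},
    )
    with urllib.request.urlopen(req) as r, open(dest_path, "wb") as out:
        out.write(r.read())
    return dest_path
```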

Delete file

  • DELETE /files/{file_id}
Purpose:
  • Remove an uploaded file.

Runtime Utility Endpoints

Reset workspace/runtime state

  • POST /reset
Purpose:
  • Reset runtime workspace/session state for the authenticated user.
Notes:
  • Available on the routed runtime API surface used by the 20minds application.
  • Useful when you want a clean runtime without deleting individual responses/files.
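A reset sketch, again using Python's standard library and the same Bearer auth as the other endpoints:

```python
import urllib.request

def build_reset_request(base_url, api_key):
    """POST /reset: clear the runtime workspace/session state for the
    authenticated user without touching individual responses or files."""
    return urllib.request.Request(
        base_url + "/reset",
        data=b"",
        method="POST",
        headers={"Authorization": f"Bearer {api_key}"},
    )

# urllib.request.urlopen(build_reset_request(BASE_URL, API_KEY))
```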

Endpoint Matrix by Runtime

All of the runtimes listed above expose the same core shape:
  • GET /health
  • POST /responses
  • GET /responses/{response_id}
  • DELETE /responses/{response_id}
  • POST /files
  • PUT /files/{file_id}
  • GET /files
  • GET /files/{file_id}
  • DELETE /files/{file_id}
  • POST /reset
Runtime-specific behavior differs mainly in:
  • Streaming event richness (response.code_interpreter.*, file-search events, etc.)
  • Supported model lists and defaults
Use the OpenAI SDK and point base_url to the runtime prefix:
from openai import OpenAI

client = OpenAI(
    base_url="https://api.20minds.ai/v1/border-collie",
    api_key="<TWENTYMINDS_API_KEY>",
)

resp = client.responses.create(
    model="gpt-5.2",
    input="Summarize the key pathways involved in apoptosis.",
)

print(resp.id, resp.status)