Task API
Call LLM
The Call endpoint is a simple interface for issuing a task to an LLM. It is a declarative interface with input and output schemas; it supports text, image, and audio inputs and outputs, and is highly model-agnostic.
POST
Authorizations
Bearer authentication header of the form `Bearer <token>`, where `<token>` is your auth token.
Body
application/json
Response
200
application/json
Successful Response
The response is of type `object`.
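As a sketch, a request to the Call endpoint can be prepared with Python's standard library. The base URL, path, and payload fields below are illustrative assumptions, not part of this reference; only the `Bearer <token>` header format and the JSON body requirement come from the documentation above:

```python
import json
import urllib.request


def build_call_request(base_url: str, token: str, payload: dict) -> urllib.request.Request:
    """Prepare (but do not send) a POST request for the Call endpoint.

    The "/call" path and the payload shape are hypothetical placeholders;
    consult the body schema for the real field names.
    """
    return urllib.request.Request(
        url=f"{base_url}/call",  # hypothetical path
        method="POST",
        headers={
            "Authorization": f"Bearer {token}",  # documented auth header format
            "Content-Type": "application/json",  # documented body content type
        },
        data=json.dumps(payload).encode("utf-8"),
    )


# Build the request; actually sending it requires a real auth token.
req = build_call_request(
    "https://api.example.com/v1",       # placeholder base URL
    "YOUR_AUTH_TOKEN",                  # placeholder token
    {"input": "Summarize this text."},  # illustrative payload
)
print(req.get_method(), req.full_url)
```

On success the server returns a `200` with a JSON object body, which could be read with `json.loads(urllib.request.urlopen(req).read())`.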
Stream LLM Call
Stream a function call execution in real-time using Server-Sent Events (SSE).
This endpoint returns a continuous stream of ServerSentEvent objects as the function executes,
allowing for real-time streaming of responses. The response follows the Server-Sent Events
specification with proper event structure for SDK compatibility.
Each ServerSentEvent contains:
- `id`: Optional event identifier
- `event`: Optional event type
- `data`: StreamingChunk with actual content
- `retry`: Optional retry interval
The StreamingChunk data payload includes:
- `delta`: Incremental text content (if any)
- `span_id`: Unique identifier for the execution span (when available)
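The event structure above can be sketched with a small client-side parser. The line framing (`id:`, `event:`, `data:` fields separated by blank lines) follows the SSE specification, and the JSON keys inside `data` (`delta`, `span_id`) come from the StreamingChunk description; the sample stream itself is hypothetical:

```python
import json


def parse_sse(stream_text: str) -> list[dict]:
    """Parse raw SSE text into a list of event dicts.

    Events are separated by blank lines; each line within an
    event is a "field: value" pair per the SSE specification.
    """
    events = []
    for block in stream_text.strip().split("\n\n"):
        event = {}
        for line in block.splitlines():
            field, _, value = line.partition(": ")
            event[field] = value
        events.append(event)
    return events


# A hypothetical two-chunk stream as this endpoint might emit it.
raw = (
    "id: 1\n"
    "event: chunk\n"
    'data: {"delta": "Hello", "span_id": "abc123"}\n'
    "\n"
    "id: 2\n"
    "event: chunk\n"
    'data: {"delta": ", world"}\n'
)

# Accumulate the incremental `delta` fields into the full response text.
text = "".join(
    json.loads(e["data"])["delta"] for e in parse_sse(raw) if "data" in e
)
print(text)  # prints "Hello, world"
```

Note that `span_id` is only present "when available", so a real client should treat it as optional, as it does `delta`.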
Note: When streaming is enabled, any output_schema will be ignored as structured output
cannot be streamed. The response will be unstructured text content.