Completions Create Stream

POST /completions/create_stream

Example request:
curl --request POST \
  --url https://api.example.com/completions/create_stream \
  --header 'Authorization: Basic <encoded-value>' \
  --header 'Content-Type: application/json' \
  --data '
{
  "project_id": "<string>",
  "inputs": {
    "model": "<string>",
    "messages": [],
    "timeout": 123,
    "temperature": 123,
    "top_p": 123,
    "n": 123,
    "stop": "<string>",
    "max_completion_tokens": 123,
    "max_tokens": 123,
    "modalities": [
      "<unknown>"
    ],
    "presence_penalty": 123,
    "frequency_penalty": 123,
    "stream": true,
    "logit_bias": {},
    "user": "<string>",
    "response_format": {},
    "seed": 123,
    "tools": [
      "<unknown>"
    ],
    "tool_choice": "<string>",
    "logprobs": true,
    "top_logprobs": 123,
    "parallel_tool_calls": true,
    "extra_headers": {},
    "functions": [
      "<unknown>"
    ],
    "function_call": "<string>",
    "api_version": "<string>"
  },
  "wb_user_id": "<string>",
  "track_llm_call": true
}
'
Example response:
[
  {}
]

Authorizations

Authorization
string · header · required

Basic authentication header of the form Basic <encoded-value>, where <encoded-value> is the base64-encoded string username:password.
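
For example, the <encoded-value> can be produced on the command line; the username:password pair below is a placeholder, not a real credential:

# Base64-encode username:password to produce <encoded-value>
printf 'username:password' | base64
# Output: dXNlcm5hbWU6cGFzc3dvcmQ=
# The header then reads:
#   Authorization: Basic dXNlcm5hbWU6cGFzc3dvcmQ=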

Body

application/json

project_id
string · required

inputs
CompletionsCreateRequestInputs · object · required

wb_user_id
string | null

Do not set directly. The server will automatically populate this field.

track_llm_call
boolean | null · default: true

Whether to track this LLM call in the trace server.
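
As a sketch, a minimal request body contains only project_id and inputs. Which inputs sub-fields are mandatory is not spelled out above, so the model and messages shown here, along with all field values, are illustrative assumptions:

{
  "project_id": "my-entity/my-project",
  "inputs": {
    "model": "gpt-4o",
    "messages": [
      { "role": "user", "content": "Hello!" }
    ],
    "stream": true
  }
}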

Response

Stream of data in JSONL format (one JSON object per line).
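
Because the endpoint streams JSONL, each response line is a standalone JSON object and can be processed as it arrives. A rough command-line sketch, assuming jq is installed and reusing the placeholder values from the example request:

curl --no-buffer --request POST \
  --url https://api.example.com/completions/create_stream \
  --header 'Authorization: Basic <encoded-value>' \
  --header 'Content-Type: application/json' \
  --data '{"project_id": "<string>", "inputs": {"model": "<string>", "messages": [], "stream": true}}' |
while IFS= read -r line; do
  # Each line is one JSON object; pretty-print it as soon as it arrives
  printf '%s\n' "$line" | jq '.'
done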