Authorizations
API key for authentication. Make sure to include the word apikey, followed by a single space and then your token.
Example: apikey 1234$abcdef
Path Parameters
Model ID
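For illustration, a request that combines the API key and the Model ID path parameter might look like the sketch below. The base URL, endpoint path, and the use of the Authorization header are assumptions not stated on this page; only the apikey prefix format and the example values come from here.

  # Placeholder host and path; substitute your own model ID and API key.
  curl 'https://api.example.com/models/3fa85f64-5717-4562-b3fc-2c963f66afa6' \
    -H 'Authorization: apikey 1234$abcdef'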
Response
OK
Category of the model. Example: "Text Classification"
Default flavor for the model. Example: "inference-16vcpu-232gib-1xh100-80gb"
Description of the model. Example: "My first model"
Developer of the model. Example: "Stability AI"
Path to the documentation page. Example: "/docs"
URL to the EULA text. Example: "https://example.com/eula"
Example curl request to the model: "curl -X POST http://localhost:8080/predict -d '{\"data\": \"sample\"}'"
Whether the model has an EULA. Example: true
Model ID. Example: "3fa85f64-5717-4562-b3fc-2c963f66afa6"
Image registry of the model. Example: "123e4567-e89b-12d3-a456-426614174999"
Image URL of the model. Example: "registry.hub.docker.com/my_model:latest"
Describes the underlying inference engine. Examples: "torch", "tensorflow"
Describes the model frontend type. Examples: "gradio", "vllm", "triton"
Model name used when performing inference calls. Example: "mistralai/Pixtral-12B-2409"
Name of the model. Example: "model1"
OpenAI compatibility level. Examples: "full", "partial", "none"
Port on which the model runs. Example: 8080
Version of the model. Example: "v0.1"