GET /cloud/v3/inference/models
import os
from gcore import Gcore

client = Gcore(
    api_key=os.environ.get("GCORE_API_KEY"),  # This is the default and can be omitted
)
page = client.cloud.inference.models.list()
model = page.results[0]
print(model.id)
{
  "count": 1,
  "results": [
    {
      "category": "Text Classification",
      "default_flavor_name": "inference-16vcpu-232gib-1xh100-80gb",
      "description": "My first model",
      "developer": "Stability AI",
      "documentation_page": "/docs",
      "eula_url": "https://example.com/eula",
      "example_curl_request": "curl -X POST http://localhost:8080/predict -d '{\"data\": \"sample\"}'",
      "has_eula": true,
      "id": "3fa85f64-5717-4562-b3fc-2c963f66afa6",
      "image_registry_id": "123e4567-e89b-12d3-a456-426614174999",
      "image_url": "registry.hub.docker.com/my_model:latest",
      "inference_backend": "torch",
      "inference_frontend": "gradio",
      "model_id": "mistralai/Pixtral-12B-2409",
      "name": "model1",
      "openai_compatibility": "full",
      "port": 8080,
      "version": "v0.1"
    }
  ]
}

Authorizations

APIKey
string
header
required

API key for authentication.

Query Parameters

limit
integer
default:1000

Optional. Limits the number of returned items.

Required range: 0 < x <= 1000
Examples:

1000

offset
integer
default:0

Optional. Skips the given number of records at the start of the result set.

Required range: x >= 0
Examples:

0
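The limit and offset parameters together support offset-based pagination: request up to `limit` records, then advance `offset` by the number received until `count` is reached. A minimal sketch of that loop, with `fetch_page` as a stand-in for the real HTTP call (the real endpoint returns the `count`/`results` shape shown in the response example above):

```python
# Fake server-side data standing in for the models collection.
ALL_MODELS = [{"id": f"model-{i}"} for i in range(5)]

def fetch_page(limit=1000, offset=0):
    """Stand-in for GET /cloud/v3/inference/models; honors the documented
    query parameters limit (default 1000) and offset (default 0)."""
    return {"count": len(ALL_MODELS), "results": ALL_MODELS[offset:offset + limit]}

def list_all_models(limit=2):
    """Collect every model by advancing offset until count records are seen."""
    results, offset = [], 0
    while True:
        page = fetch_page(limit=limit, offset=offset)
        results.extend(page["results"])
        offset += len(page["results"])
        if offset >= page["count"] or not page["results"]:
            break
    return results

models = list_all_models(limit=2)
print([m["id"] for m in models])
# → ['model-0', 'model-1', 'model-2', 'model-3', 'model-4']
```

Note the SDK's own `list()` call returns a page object and can iterate for you; this sketch only illustrates the raw parameter semantics.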

order_by
enum<string>

Order results by the specified field and direction.

Available options:
name.asc,
name.desc
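The order_by value follows a `<field>.<direction>` pattern. A hypothetical local illustration of what the server does with it (the helper name and in-memory sorting are illustrative, not part of the API):

```python
def sort_models(models, order_by="name.asc"):
    """Illustrative: split '<field>.<direction>' and sort accordingly."""
    field, _, direction = order_by.partition(".")
    return sorted(models, key=lambda m: m[field], reverse=(direction == "desc"))

models = [{"name": "bravo"}, {"name": "alpha"}, {"name": "charlie"}]
print([m["name"] for m in sort_models(models, "name.desc")])
# → ['charlie', 'bravo', 'alpha']
```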

Response

200 - application/json

OK

The response is of type object.