LLM providers connect Sinas to language model APIs. Supported providers:
| Type | Description |
| --- | --- |
| `openai` | OpenAI API (GPT-4, GPT-4o, o1, etc.) and OpenAI-compatible endpoints |
| `anthropic` | Anthropic API (Claude 3, Claude 4, etc.) |
| `mistral` | Mistral AI (Mistral Large, Pixtral, etc.) |
| `ollama` | Local models via Ollama |
Key properties:
| Property | Description |
| --- | --- |
| `name` | Unique provider name |
| `provider_type` | `openai`, `anthropic`, `mistral`, or `ollama` |
| `api_key` | API key (encrypted at rest, never returned in API responses) |
| `api_endpoint` | Custom endpoint URL (required for Ollama, useful for proxies) |
| `default_model` | Model used when agents don't specify one |
| `config` | Additional settings (e.g., `max_tokens`, `organization_id`) |
| `is_default` | Whether this is the system-wide default provider |
Provider resolution for agents:
  1. The agent's explicit `llm_provider_id`, if set
  2. The agent's `model` field, paired with the resolved provider
  3. The provider's `default_model`
  4. The system default provider as the final fallback
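The fallback order above can be sketched in Python. This is an illustrative model only, not Sinas's actual implementation; the `Provider`, `Agent`, and `resolve` names are assumptions made for the example:

```python
from dataclasses import dataclass
from typing import Optional, Tuple, Dict

@dataclass
class Provider:
    name: str
    default_model: Optional[str] = None
    is_default: bool = False

@dataclass
class Agent:
    llm_provider_id: Optional[str] = None
    model: Optional[str] = None

def resolve(agent: Agent, providers: Dict[str, Provider]) -> Tuple[Provider, Optional[str]]:
    """Pick the (provider, model) pair for an agent, following the fallback order."""
    # Step 1: the agent's explicit llm_provider_id wins, if set.
    provider = providers.get(agent.llm_provider_id) if agent.llm_provider_id else None
    # Step 4: otherwise fall back to the system-wide default provider.
    if provider is None:
        provider = next(p for p in providers.values() if p.is_default)
    # Steps 2-3: the agent's own model, else the provider's default_model.
    model = agent.model or provider.default_model
    return provider, model
```

For example, an agent with `llm_provider_id` set resolves to that provider and its `default_model`, while an agent with neither field resolves to the system default provider.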
Endpoints (admin only):
```
POST   /api/v1/llm-providers             # Create provider
GET    /api/v1/llm-providers             # List providers
GET    /api/v1/llm-providers/{name}      # Get provider
PATCH  /api/v1/llm-providers/{id}        # Update provider
DELETE /api/v1/llm-providers/{id}        # Delete provider
```
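As a sketch, a create request body could combine the properties above like this (all field values are illustrative, not defaults):

```json
{
  "name": "openai-main",
  "provider_type": "openai",
  "api_key": "sk-...",
  "default_model": "gpt-4o",
  "config": { "max_tokens": 4096 },
  "is_default": true
}
```

Since `api_key` is encrypted at rest and never returned in API responses, the create response would omit it.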