OpenAI

Multi supports two OpenAI providers:

  • OpenAI - Direct API access with your OpenAI API key
  • OpenAI Compatible - Connect to any OpenAI-compatible API endpoint (e.g., local servers, third-party hosts)
To set up the OpenAI provider:

  1. Get an API key from platform.openai.com
  2. Open the Multi panel → Settings (gear icon)
  3. Click Add Profile
  4. Select OpenAI as the provider
  5. Paste your API key (starts with sk-)
  6. Choose a model
  7. Save
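Once saved, Multi uses the key for every request made under that profile. As a quick local sanity check before pasting, you can confirm the key at least has the expected shape — the helper below is our own illustration, not part of Multi:

```python
def looks_like_openai_key(key: str) -> bool:
    """Shape check only: OpenAI secret keys start with "sk-".

    A well-formed key can still be revoked or invalid; only a live
    API call confirms it actually works.
    """
    key = key.strip()
    return key.startswith("sk-") and len(key) > 20

print(looks_like_openai_key("sk-" + "x" * 45))  # True
print(looks_like_openai_key("my-key"))          # False
```

If this returns False, double-check that you copied the full secret key rather than an organization or project ID.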

OpenAI Compatible

Use any OpenAI-compatible API endpoint. This is useful for:

  • Self-hosted models with OpenAI-compatible APIs
  • Third-party providers that expose an OpenAI-compatible interface
  • Local inference servers (vLLM, text-generation-inference, etc.)
To set it up:

  1. Open the Multi panel → Settings
  2. Click Add Profile
  3. Select OpenAI Compatible as the provider
  4. Enter your Base URL (e.g., http://localhost:8000/v1)
  5. Enter your API key (if required)
  6. Enter the model ID
  7. Save
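A common stumbling block with compatible endpoints is the /v1 path segment: most servers (including vLLM) expose the OpenAI-style routes under /v1, so the Base URL usually needs to end with it. A minimal sketch of that normalization (our own helper, not something Multi provides):

```python
def normalize_base_url(url: str) -> str:
    # OpenAI-compatible servers typically serve chat/completions under
    # a /v1 prefix (e.g. http://localhost:8000/v1/chat/completions),
    # so append /v1 when it is missing.
    url = url.rstrip("/")
    if not url.endswith("/v1"):
        url += "/v1"
    return url

print(normalize_base_url("http://localhost:8000"))     # http://localhost:8000/v1
print(normalize_base_url("http://localhost:8000/v1"))  # http://localhost:8000/v1
```

If requests fail with 404 errors, a missing or doubled /v1 in the Base URL is the first thing to check.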
Configuration options:

Option         Description
API Key        Your OpenAI API key
Base URL       API endpoint (OpenAI Compatible only)
Model          Select or enter a model ID
Temperature    Controls response randomness
Max Tokens     Maximum output tokens
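These options map onto the standard OpenAI chat-completions request body. A sketch of the kind of payload such a profile corresponds to — the model ID and values here are illustrative, not what Multi actually sends:

```python
import json

# Illustrative payload only; the request Multi builds is internal.
payload = {
    "model": "gpt-4o-mini",  # Model: any model ID your key can access
    "temperature": 0.2,      # Temperature: OpenAI accepts 0-2; lower is more deterministic
    "max_tokens": 256,       # Max Tokens: upper bound on generated output tokens
    "messages": [{"role": "user", "content": "Hello"}],
}
print(json.dumps(payload, indent=2))
```

Lower temperatures suit coding and factual tasks; higher values produce more varied output.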

See OpenAI’s pricing page for current rates.