OpenAI
Multi supports two OpenAI providers:
- OpenAI - Direct API access with your OpenAI API key
- OpenAI Compatible - Connect to any OpenAI-compatible API endpoint (e.g., local servers, third-party hosts)
OpenAI (Direct API)
- Get an API key from platform.openai.com
- Open the Multi panel → Settings (gear icon)
- Click Add Profile
- Select OpenAI as the provider
- Paste your API key (starts with sk-)
- Choose a model
- Save
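Once a direct-API profile is saved, requests go to OpenAI's standard chat-completions endpoint. As an illustrative sketch only (not Multi's internals), the pieces of such a request look like this; the model name and the `OPENAI_API_KEY` environment variable are assumptions for the example:

```python
import json
import os

def build_request(model: str, prompt: str) -> tuple[str, dict, dict]:
    """Assemble the URL, headers, and JSON body for one chat completion.

    Hypothetical sketch of a direct OpenAI API call; "sk-..." is a placeholder.
    """
    api_key = os.environ.get("OPENAI_API_KEY", "sk-...")
    url = "https://api.openai.com/v1/chat/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",  # keys start with sk-
        "Content-Type": "application/json",
    }
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return url, headers, body

url, headers, body = build_request("gpt-4o-mini", "Hello")
print(url)
print(json.dumps(body))
```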
OpenAI Compatible
Use any OpenAI-compatible API endpoint. This is useful for:
- Self-hosted models with OpenAI-compatible APIs
- Third-party providers that expose an OpenAI-compatible interface
- Local inference servers (vLLM, text-generation-inference, etc.)
- Open the Multi panel → Settings
- Click Add Profile
- Select OpenAI Compatible as the provider
- Enter your Base URL (e.g., http://localhost:8000/v1)
- Enter your API key (if required)
- Enter the model ID
- Save
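The reason a single Base URL setting is enough is that OpenAI-compatible servers expose the same API paths under a different root. A minimal sketch of that idea, using the localhost URL from the example above (illustrative, not a required value):

```python
def endpoint_for(base_url: str, path: str = "chat/completions") -> str:
    """Join a profile's Base URL with a standard API path.

    Tolerates a trailing slash on the Base URL so both forms work.
    """
    return base_url.rstrip("/") + "/" + path

# Only the Base URL differs between a local server (e.g. vLLM)
# and the hosted OpenAI API; the request shape stays the same.
print(endpoint_for("http://localhost:8000/v1"))
print(endpoint_for("https://api.openai.com/v1/"))
```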
Configuration Options
| Option | Description |
|---|---|
| API Key | Your OpenAI API key |
| Base URL | API endpoint (OpenAI Compatible only) |
| Model | Select or enter a model ID |
| Temperature | Controls response randomness; lower values are more deterministic |
| Max Tokens | Caps the number of output tokens per response |
Pricing
See OpenAI’s pricing page for current rates.