Online Providers

Connect hosted AI providers and manage cloud model access

Online providers let you use hosted models with your API keys, without running model inference on your device.

Need API key links? See Find Online Model API Keys.

Connect an Online Provider

Open Model Providers

Go to Model Hub > Model Providers.

Add Provider

Select Add Provider and choose the service you want to connect.

Enter Credentials

Add your API key (and any required endpoint details), then save. If you need key locations, use Find Online Model API Keys.

Test in Chat

Open a conversation, select one of the provider's models, and send a message to confirm the connection works.

Supported Providers

Msty Studio supports the following online providers:

  • Amazon Bedrock
  • Anthropic
  • Azure OpenAI
  • Cohere
  • Gemini
  • Groq
  • Mistral AI
  • OpenAI
  • OpenRouter
  • Perplexity
  • SambaNova
  • Together AI
  • xAI

For key locations and endpoint references, see Find Online Model API Keys.

Bring Your Own Provider

If your provider exposes an OpenAI-compatible API, you can connect it by adding an OpenAI-compatible endpoint in Model Providers and supplying the endpoint URL and API key.

This is useful for:

  • Self-hosted gateways
  • Enterprise AI platforms
  • Private inference endpoints that follow the OpenAI API format
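"OpenAI-compatible" means the endpoint accepts the same chat completions request shape as OpenAI's API: a `/v1/chat/completions` path, a `Bearer` authorization header, and a JSON body with `model` and `messages`. As a rough sketch of what such a request looks like (the base URL, key, and model name below are placeholders, not real values):

```python
import json

# Hypothetical values -- substitute your own gateway's base URL,
# API key, and model name.
BASE_URL = "https://your-gateway.example.com/v1"
API_KEY = "sk-placeholder"

def build_chat_request(model: str, user_message: str) -> dict:
    """Assemble the pieces of an OpenAI-compatible chat completion call."""
    return {
        "url": f"{BASE_URL}/chat/completions",
        "headers": {
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        "body": json.dumps({
            "model": model,
            "messages": [{"role": "user", "content": user_message}],
        }),
    }

request = build_chat_request("my-model", "Hello!")
print(request["url"])
```

Any endpoint that answers this request shape can be connected the same way a hosted provider is.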

Manage Provider Setup

After connecting, you can:

  • Rename providers and models for readability
  • Set a default model in chat
  • Apply model purpose tags from Managing Models