Overview

Bring Your Own Model (BYOM) lets you connect your own AI provider API keys to Harmonica. When BYOM is active, your sessions use your own keys instead of Harmonica’s — removing usage limits and giving you full control over which models power your facilitation.

Why use BYOM?

  • No credit limits — BYOM sessions don’t consume your Harmonica balance
  • Model choice — pick the exact model for your use case (faster, cheaper, higher quality, or open-source)
  • Data control — requests go directly to your chosen provider under your own terms of service
  • Self-hosting — combine BYOM with a self-hosted Harmonica instance for fully self-contained deployment

Supported providers

Harmonica supports API keys from these providers:
| Provider | Example models | Notes |
| --- | --- | --- |
| OpenAI | GPT-4o, GPT-4o mini, GPT-5 | Most widely used |
| Anthropic | Claude Sonnet 4.6, Claude Haiku | High-quality facilitation |
| Google | Gemini 2.5 Flash, Gemini 3.1 Pro | Good balance of speed and quality |
| Ollama | Llama, Mistral, Qwen | Local models, fully offline |
| OpenAI-compatible | Any provider with an OpenAI-compatible API | Custom base URL supported |

Setting up BYOM

1. Go to Settings

Navigate to the Settings → AI Models tab in your Harmonica dashboard.
2. Add an API key

Click Add API Key and enter:
  • Provider — select from the supported list, or choose “OpenAI-compatible” for custom endpoints
  • API Key — your key from the provider’s dashboard
  • Base URL — auto-filled for standard providers, customizable for self-hosted or compatible APIs
  • Name — a label to help you identify this key
3. Select models for each tier

Harmonica uses three model tiers for different tasks:
  • Small — lightweight tasks (e.g., classification, short responses)
  • Medium — main facilitation conversations
  • Large — summaries, synthesis, and complex analysis
Choose which of your available models to use for each tier.
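The tier system amounts to a small routing table: each task type maps to a tier, and each tier maps to whichever of your models you selected. A sketch under those assumptions (the tier names match the docs; the task names and function are illustrative, not Harmonica's API):

```python
# Task type -> tier, following the three tiers described above.
TASK_TIER = {
    "classification": "small",   # lightweight tasks, short responses
    "facilitation": "medium",    # main facilitation conversations
    "summary": "large",          # summaries, synthesis, complex analysis
}

def model_for_task(task: str, tier_models: dict[str, str]) -> str:
    """Pick the configured model for a task, given your tier -> model selection."""
    tier = TASK_TIER.get(task)
    if tier is None:
        raise ValueError(f"unknown task type: {task!r}")
    return tier_models[tier]
```

For example, with `{"small": "gpt-4o-mini", "medium": "claude-haiku", "large": "gpt-5"}`, a summary task would route to `gpt-5`.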
4. Create sessions as usual

Once configured, all new sessions automatically use your models. No other changes needed — the facilitation experience is the same.

Platform models vs BYOM

Even without BYOM, Harmonica provides platform models:
| Model | Provider | Tier | Available on Free plan |
| --- | --- | --- | --- |
| GPT-4o mini | OpenAI | Small | Yes |
| Gemini 2.5 Flash | Google | Small / Medium / Large | Yes |
| Apertus 70B | PublicAI (Swiss AI) | Small / Medium | Yes |
| Claude Sonnet 4.6 | Anthropic | Medium / Large | Pro only |
| GPT-5 | OpenAI | Medium / Large | Pro only |
| Gemini 3.1 Pro | Google | Medium / Large | Pro only |
BYOM gives you access to any model from any supported provider, regardless of your Harmonica plan.
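The plan gating in the table above can be read as a simple filter. This sketch transcribes the table; the function is an illustration, not Harmonica's actual code:

```python
# Platform model availability, transcribed from the table above.
PLATFORM_MODELS = {
    "GPT-4o mini":       {"tiers": {"small"}, "free": True},
    "Gemini 2.5 Flash":  {"tiers": {"small", "medium", "large"}, "free": True},
    "Apertus 70B":       {"tiers": {"small", "medium"}, "free": True},
    "Claude Sonnet 4.6": {"tiers": {"medium", "large"}, "free": False},
    "GPT-5":             {"tiers": {"medium", "large"}, "free": False},
    "Gemini 3.1 Pro":    {"tiers": {"medium", "large"}, "free": False},
}

def platform_models_for(plan: str, tier: str) -> list[str]:
    """Platform models usable at a tier on a given plan; Pro unlocks everything."""
    return [
        name for name, info in PLATFORM_MODELS.items()
        if tier in info["tiers"] and (info["free"] or plan == "pro")
    ]
```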

How billing works with BYOM

  • BYOM users pay their AI provider directly — Harmonica doesn’t charge for LLM usage
  • Session creation doesn’t reserve balance — no credit checks when creating or running sessions
  • Summary generation skips settlement — no per-participant costs deducted
  • You still need a Harmonica account (free plan is sufficient with BYOM)
BYOM is especially useful for organizations that already have enterprise agreements with AI providers, or for self-hosters who want to run Harmonica with local models via Ollama.
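Taken together, the billing rules above mean a BYOM session simply bypasses the credit path. A minimal sketch of that logic (function and field names are assumptions for illustration, not Harmonica internals):

```python
def reserve_credits(session: dict, balance: int) -> int:
    """Return the credits to reserve for a session: zero when BYOM is active."""
    if session.get("byom"):
        return 0  # BYOM: your provider bills you directly, no balance check
    cost = session["estimated_cost"]
    if cost > balance:
        raise RuntimeError("insufficient Harmonica balance")
    return cost
```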