Apr 1, 2026
Enterprise Server v5.0.0

Bring Your Own Key (BYOK)

DeepSource Enterprise Server customers can now run AI Review using their own model provider credentials. Inference calls go directly from your Enterprise Server instance to your chosen provider, without passing through DeepSource Cloud or any other intermediary.

Supported providers

Model               Providers
Anthropic Claude    Amazon Bedrock, direct API
OpenAI GPT Codex    Azure OpenAI, direct API
Google Gemini       GCP Vertex AI, direct API

Configuration requires two model deployments:

  • A flagship model that powers AI Code Review
  • A smaller, faster model that handles everything else (generating issue descriptions, filtering, summarization)
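As an illustration only, a BYOK setup pairing the two deployments might look like the sketch below. Every key name here is hypothetical, not DeepSource's actual configuration schema; the provider, endpoint, and deployment values are placeholders. Consult the docs for the real setup instructions.

```yaml
# Hypothetical BYOK configuration sketch -- key names are illustrative,
# not DeepSource's actual schema. Shown for an Azure OpenAI deployment;
# Anthropic (Bedrock or direct) and Gemini (Vertex AI or direct) would
# follow the same two-deployment shape.
ai_review:
  provider: azure_openai
  flagship_model:                  # powers AI Code Review itself
    deployment: your-flagship-deployment
    endpoint: https://your-resource.openai.azure.com
    api_key_secret: AZURE_OPENAI_API_KEY   # resolved from your secret store
  utility_model:                   # issue descriptions, filtering, summarization
    deployment: your-utility-deployment
    endpoint: https://your-resource.openai.azure.com
    api_key_secret: AZURE_OPENAI_API_KEY
```

Because both endpoints belong to your own provider account, every inference call originates from your Enterprise Server instance and terminates inside your compliance boundary.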

Security and compliance

With BYOK, inference calls stay within your existing compliance boundary. If your org has a BAA with Azure OpenAI or a data residency agreement with GCP Vertex AI, those terms govern every AI feature on DeepSource. This matters for teams operating under SOC 2, HIPAA, FedRAMP, or internal policies that require DPAs with every vendor in the data path.

BYOK is available on all Enterprise Server v5.0.0 deployments. See the blog post for details and the docs for setup instructions.
