# Bring Your Own Key (BYOK)
DeepSource Enterprise Server customers can now run AI Code Review using their own model provider credentials. Inference calls go directly from your Enterprise Server instance to your chosen provider, without passing through DeepSource Cloud or any third-party endpoint.
## Supported providers
| Model | Providers |
|---|---|
| Anthropic Claude | Amazon Bedrock, direct API |
| OpenAI GPT Codex | Azure OpenAI, direct API |
| Google Gemini | GCP Vertex AI, direct API |
Configuration requires two model deployments:
- A flagship model that powers AI Code Review
- A smaller, faster model that handles everything else (generating issue descriptions, filtering, summarization)
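As a rough sketch, the two-deployment split might look like the following. All names and keys here are illustrative assumptions, not DeepSource's actual configuration schema; consult the docs for the real setup.

```python
# Hypothetical sketch of a BYOK setup: one flagship model for AI Code Review,
# one smaller model for auxiliary tasks. Key and deployment names are illustrative.
byok_config = {
    "provider": "azure-openai",  # or e.g. "bedrock", "vertex-ai", "direct"
    "flagship_model": "my-large-deployment",   # powers AI Code Review
    "utility_model": "my-small-deployment",    # issue descriptions, filtering, summarization
}

def model_for(task: str) -> str:
    """Route code review to the flagship model; everything else to the smaller one."""
    key = "flagship_model" if task == "code-review" else "utility_model"
    return byok_config[key]
```

The point of the split is cost and latency: only the review itself needs the flagship model, while high-volume housekeeping tasks run on the cheaper deployment.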
## Security and compliance
With BYOK, inference calls stay within your existing compliance boundary. If your org has a BAA with Azure OpenAI or a data residency agreement with GCP Vertex AI, those terms govern every AI feature on DeepSource. This matters for teams operating under SOC 2, HIPAA, FedRAMP, or internal policies that require DPAs with every vendor in the data path.
BYOK is available on all Enterprise Server v5.0.0 deployments. See the blog post for details and the docs for setup instructions.