Enterprise AI deployment usually narrows to three options: a self-hosted dedicated stack, Azure OpenAI Service, or AWS Bedrock.
In short: Azure OpenAI offers strong compliance, GPT-4-class model access, and deep Azure integration. AWS Bedrock offers a multi-model catalogue (Claude, Llama, and others) with AWS-native integration and comparable enterprise features. Self-hosting is the cheapest at sustained high volume and gives full data control, but requires a dedicated ops team. In practice, most large enterprises end up running a hybrid of the three.
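The integration differences show up at the request level. As a minimal sketch (payload fields follow each API's documented chat shape, but model names and values here are illustrative assumptions, and nothing is sent to a live endpoint):

```python
import json

# Hypothetical prompt reused across all three back ends.
PROMPT = "Summarise this contract clause."

# Azure OpenAI uses the OpenAI chat-completions shape; "model" is the
# Azure deployment name rather than a raw model ID.
openai_style = {
    "model": "gpt-4o",  # assumed deployment name
    "messages": [{"role": "user", "content": PROMPT}],
}

# AWS Bedrock wraps each vendor's native format; for an Anthropic model
# the body follows the Messages API shape, serialised as a JSON string.
bedrock_body = json.dumps({
    "anthropic_version": "bedrock-2023-05-31",
    "max_tokens": 512,
    "messages": [{"role": "user", "content": PROMPT}],
})

# A self-hosted server (e.g. vLLM) typically exposes an
# OpenAI-compatible /v1/chat/completions route, so openai_style can be
# pointed at a local endpoint unchanged.
```

The practical consequence: moving between Azure OpenAI and a self-hosted OpenAI-compatible server is mostly a base-URL change, while moving to Bedrock means adapting to per-vendor body formats.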
Comparison
| Aspect | Azure OpenAI | AWS Bedrock | Self-hosted |
|---|---|---|---|
| Frontier-model access | GPT-4o, o1 | Claude (Anthropic), Llama (Meta), others | Open-weight only |
| Cost at high volume | Per-token | Per-token | Fixed monthly |
| Data residency | Specific regions | Specific regions | Anywhere |
| Compliance certifications | SOC2, HIPAA, etc. | SOC2, HIPAA, etc. | Inherits from datacenter |
| Integration | Azure-native | AWS-native | Standalone |
| Operational overhead | Low | Low | High (dedicated ops team) |
| Customisation | Limited fine-tuning | Limited fine-tuning | Full |
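The per-token vs fixed-cost row is the crux of the self-hosting case, and the break-even point is simple arithmetic. A sketch, with illustrative assumed prices (not quoted rates):

```python
def monthly_managed_cost(tokens_per_month: int, usd_per_1k_tokens: float) -> float:
    """Per-token spend on a managed service (Azure OpenAI / Bedrock)."""
    return tokens_per_month / 1_000 * usd_per_1k_tokens

def breakeven_tokens(fixed_monthly_usd: float, usd_per_1k_tokens: float) -> float:
    """Token volume at which a fixed self-hosted bill matches per-token spend."""
    return fixed_monthly_usd / usd_per_1k_tokens * 1_000

# Assumptions: $0.01 per 1k tokens managed, $8,000/month for a GPU node.
be = breakeven_tokens(8_000, 0.01)  # 800M tokens/month
```

Below the break-even volume the managed services win on cost as well as overhead; above it, the fixed self-hosted bill starts to pay for the ops team.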
Verdict
- Need GPT-4o + Azure-native: Azure OpenAI
- Need Claude + AWS-native: AWS Bedrock
- Cost-anchored at scale: self-hosted
- UK/EU residency, no US transfer: self-hosted
- Hybrid: most large enterprises
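The verdict bullets above can be read as a decision rule. A sketch, with hypothetical flag names and the residency/cost constraints checked first because they are hard requirements:

```python
def recommend(*, needs_gpt4o=False, azure_native=False, needs_claude=False,
              aws_native=False, cost_anchored=False, strict_eu_residency=False) -> str:
    """Map deployment requirements onto one of the three shapes."""
    # Hard constraints first: residency with no US transfer, or a
    # cost ceiling at scale, both force self-hosting.
    if strict_eu_residency or cost_anchored:
        return "self-hosted"
    if needs_gpt4o and azure_native:
        return "Azure OpenAI"
    if needs_claude and aws_native:
        return "AWS Bedrock"
    # No single dominant constraint: the common enterprise outcome.
    return "hybrid"
```

For example, `recommend(strict_eu_residency=True, needs_gpt4o=True)` still returns `"self-hosted"`, reflecting that residency outranks model preference.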
Bottom line
The three shapes coexist comfortably: most large organisations end up routing workloads across at least two of them, split by data sensitivity, model requirements, and cost. See SageMaker alternatives.