Cerebras
GDPR Compliance
Data Handling
Cerebras has announced inference data centers in the United States and Canada, with a Europe region planned for Q4 2025. The primary-source materials reviewed do not state that API customers can enforce EU-only routing or processing, and Cerebras stated in April 2025 that all inference-serving data centers were in North America at that time.
Cerebras states that it does not retain the inputs and outputs of its training, inference, and chatbot services, and that it deletes the associated logs once they are no longer necessary to provide those services.
Prompt caching documentation states that caches are ZDR-compliant: they remain ephemeral in memory, are never persisted, and live in key-value stores colocated in the same data center as the serving model instance, with a guaranteed TTL of at least 5 minutes and up to 1 hour depending on load.
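The ephemeral, TTL-bounded cache behavior described above can be sketched as a small in-memory key-value store. This is an illustrative model only, not Cerebras's implementation; the `min_ttl`/`max_ttl` bounds map to the documented 5-minute floor and 1-hour ceiling, and the load-based TTL scaling is an assumption for illustration.

```python
import time


class TTLCache:
    """Illustrative in-memory key-value cache with per-entry TTL.

    A minimal sketch of an ephemeral, never-persisted prompt cache;
    not Cerebras's actual implementation.
    """

    def __init__(self, min_ttl=300.0, max_ttl=3600.0, clock=time.monotonic):
        self.min_ttl = min_ttl   # guaranteed floor (5 minutes)
        self.max_ttl = max_ttl   # ceiling (1 hour)
        self._clock = clock      # injectable clock for testing
        self._store = {}         # key -> (value, expires_at); memory only

    def put(self, key, value, load=0.0):
        # Assumption: higher load shortens the TTL toward the guaranteed floor.
        load = min(max(load, 0.0), 1.0)
        ttl = self.max_ttl - (self.max_ttl - self.min_ttl) * load
        self._store[key] = (value, self._clock() + ttl)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if self._clock() >= expires_at:
            del self._store[key]  # expired entries are dropped, never persisted
            return None
        return value
```

With an injected fake clock, an entry written under full load expires after the 5-minute floor: `get` returns the value before 300 seconds have elapsed and `None` afterward.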
Certifications & EU AI Act
Verification
- https://inference-docs.cerebras.ai
- https://inference-docs.cerebras.ai/capabilities/prompt-caching
- https://inference-docs.cerebras.ai/api-reference/authentication
- https://www.cerebras.ai/privacy-policy
- https://www.cerebras.ai/pricing
- https://www.cerebras.ai/contact
- https://www.cerebras.ai/company
- https://www.cerebras.ai/press-release/cerebras-announces-six-new-ai-datacenters-across-north-america-and-europe-to-deliver-industry-s
- https://www.cerebras.ai/news/meta-unleashes-llama-api-running-18x-faster-than-openai-cerebras-partnership-delivers-2-600
- https://trust.cerebras.ai/
Cerebras publishes a privacy policy and a trust center indicating that a DPA and subprocessor information are available, but the public pages reviewed did not expose enough primary-source detail to confirm SCC terms, EU-only routing guarantees, or whether customer API data is used for model training. The current public posture suggests predominantly North American inference infrastructure; a Europe region has been announced, but no verified customer-selectable EU-only processing commitment was found.