Tero connects to your observability platform, analyzes your telemetry, and helps you understand what’s valuable and what’s waste. This page explains what we see, what we store, and how you stay in control.

What Tero sees

Tero connects via API to your observability platform (Datadog, Splunk, etc.). We read your telemetry to understand its structure and patterns. From this analysis, we build a semantic catalog: field names, types, volume patterns, quality classifications. This metadata powers everything Tero does: identifying waste, answering questions, suggesting optimizations.
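To make the idea concrete, here is a minimal sketch of what one semantic-catalog entry could look like. Every key and value below is illustrative, not Tero’s actual schema: the point is that the catalog holds metadata about a field (name, type, volume, cost, classification), never the telemetry content itself.

```python
# Hypothetical catalog entry: metadata describing a telemetry field.
# Keys and values are illustrative assumptions, not Tero's real schema.
catalog_entry = {
    "field": "http.status_code",        # field name observed in telemetry
    "type": "integer",                  # inferred type
    "volume_per_day": 42_000_000,       # how often the field appears
    "estimated_cost_usd_month": 8000,   # cost attribution for this data
    "quality": "high-value",            # classification, e.g. vs. "waste"
}

# The catalog describes shape and patterns only; raw log/metric/trace
# values are never stored alongside it.
assert "values" not in catalog_entry
```

This shape is what lets Tero say "this log event costs $8,000/month" without ever persisting what the logs contain.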
We don’t store your log content, metric values, or trace data. The catalog describes the shape and patterns of your telemetry, not the content itself. We can tell you “this log event costs $8,000/month” without storing what those logs say.
For AI classification, we send samples to an AI provider. Samples are processed in memory and discarded, not persisted in our systems or the provider’s. This is enforced through API contracts with zero-retention guarantees.

Each integration documents exactly what permissions it requires. You control what access to grant, and you can revoke or scope it down anytime. See How Tero works for the full picture.

Self-hosted: we never see your data

If you can’t send data to Tero Cloud, self-host instead. The control plane runs in your infrastructure. AI classification uses your provider: AWS Bedrock, Azure OpenAI, or anything compatible. Your network, your security boundary, your compliance certifications. Tero provides the software. You run it. We never see your telemetry, your samples, or your metadata. We provide updates and support; you control everything else. This isn’t a limited version. It’s the same product, running where you need it.
Need SOC 2 or HIPAA compliance now? Self-host the control plane. Your existing certifications apply, no waiting on ours.

Bring your own AI

Even with Tero Cloud, you can use your own AI provider. By default, we use Anthropic Claude with zero-retention API agreements. If you prefer your own provider (AWS Bedrock, Azure OpenAI, or others), configure Tero to use your API keys and your data agreements. Your samples go to your approved provider, under your compliance terms.
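The setup can be sketched as follows. The setting names here are hypothetical (Tero’s actual configuration keys may differ); the idea is that you point classification at your own provider and credentials, so samples flow only under your data agreements.

```python
import os

# Hypothetical bring-your-own-AI configuration sketch. The environment
# variable names and endpoint URLs are assumptions for illustration,
# not Tero's documented settings.
provider = os.environ.get("TERO_AI_PROVIDER", "anthropic")  # e.g. "bedrock"
api_key = os.environ.get("TERO_AI_API_KEY", "")             # your key, your terms

endpoints = {
    "anthropic": "https://api.anthropic.com",
    "bedrock": "https://bedrock-runtime.us-east-1.amazonaws.com",
}
# Fall back to the default provider if an unknown name is configured.
endpoint = endpoints.get(provider, endpoints["anthropic"])
```

Because the key and endpoint are yours, requests never touch Tero-held credentials, and your existing provider agreements (retention, residency, compliance) apply unchanged.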

Security

Application security: All code changes require review. Automated tests run before deploy. Dependencies are scanned for vulnerabilities and patched promptly. Secrets are managed through Doppler and GCP Secret Manager, never in code.

Infrastructure security: Everything runs on GCP with encryption in transit (TLS 1.3) and at rest (AES-256). Access requires SSO with MFA. Production access is time-limited and logged. See Details for specifics.

Incident response: We monitor for security issues continuously. If something affects your data, we notify you within 24 hours.

Vulnerability reporting: Found an issue? Email . We respond within 24 hours.

Compliance

Framework             Status
GDPR / CCPA           Compliant (DPA available)
SOC 2 Type 2          2026
Penetration testing   Q1 2025
If you need SOC 2 today, self-host. Your existing certifications cover Tero running in your infrastructure.

Get what you need

We respond to security questionnaires (SIG, CAIQ, VSA) and provide:
  • Data Processing Agreement with SCCs
  • Architecture documentation
  • Security questionnaire responses

Request documents

Email with what you need and your timeline.
For infrastructure details, sub-processors, and compliance specifics, see Details.