Use your own Anthropic API key for Tero’s AI classification. This is our recommended provider.

Connect

1. Run tero

Open your terminal and run:
tero
This opens the Tero TUI, which guides you through setup.
Don’t have the CLI installed? See the quickstart.
2. Log in to Tero

The TUI opens your browser to create an account or log in. Complete the flow in your browser, then confirm the code shown in your terminal. The TUI logs you in automatically.
3. Select Anthropic

The TUI asks which integration you want to connect. Select Anthropic.
4. Create an API key

Go to the Anthropic Console and create an API key. Copy the key.
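If you want to confirm the new key works before pasting it into the TUI, a minimal check like the sketch below will do. It assumes the Anthropic Python SDK is installed and the key is exported as ANTHROPIC_API_KEY; the model name is just an example. Any successful reply means the key authenticates.

# Optional sanity check for a freshly created key.
# Assumes: `pip install anthropic` and the key exported as ANTHROPIC_API_KEY.
import os
import anthropic

client = anthropic.Anthropic(api_key=os.environ["ANTHROPIC_API_KEY"])
response = client.messages.create(
    model="claude-sonnet-4-20250514",  # example model name
    max_tokens=16,
    messages=[{"role": "user", "content": "ping"}],
)
print(response.content[0].text)  # any reply means the key is valid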
5. Paste your API key

Paste the API key into the TUI. Tero validates the key and confirms the connection.
6. Done

Tero is now configured to use your Anthropic API key for classification.

How Tero uses AI

Tero sends telemetry samples to the AI provider for classification. This powers:
  • Log event classification: Understanding what each log event represents and whether it’s valuable
  • Service enrichment: Adding context about what services do based on their telemetry patterns
  • Waste identification: Determining which telemetry is debug noise, health checks, or otherwise low-value
See How Tero evaluates waste for the full classification process.
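As a rough illustration, the verdict produced for a single event type might carry information like the following. The field names are hypothetical, not Tero's actual schema.

# Hypothetical shape of a classification verdict for one log event type.
verdict = {
    "event_type": "GET /healthz returned 200",
    "category": "health_check",     # waste identification
    "valuable": False,              # is this event worth keeping?
    "service_context": "readiness probe on the API gateway",  # service enrichment
}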

What gets sent

Tero sends sample log lines, metric names, and span names. These samples include:
  • Log message bodies and attributes
  • Service names and descriptions
  • Metric and span metadata
Samples are processed in memory and discarded. They are not stored by Tero or the AI provider.
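For illustration, a single sample could look roughly like this. The structure and field names are hypothetical, not Tero's wire format.

# Hypothetical example of one telemetry sample sent for classification.
sample = {
    "service": "checkout",
    "log_body": "payment authorized order_id=8841 amount=42.10",
    "log_attributes": {"level": "info", "env": "prod"},
    "metric_names": ["http_server_duration", "payment_attempts_total"],
    "span_names": ["POST /v1/payments", "db.query orders"],
}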

Usage scales with diversity, not volume

AI usage depends on how many unique event types you have, not how much telemetry you generate. If you have 1 billion logs but they all match 1,000 distinct event types, Tero only needs to classify those 1,000 events. The other 999,999,000 logs match existing classifications and don’t require AI calls. Rough estimates:
Environment             | Unique event types | Initial analysis
Small (10 services)     | ~500-2,000         | ~50K-200K tokens
Medium (50 services)    | ~2,000-10,000      | ~200K-1M tokens
Large (200+ services)   | ~10,000-50,000     | ~1M-5M tokens
After initial analysis, ongoing usage is minimal. New event types are rare once your telemetry patterns are understood.
These are rough estimates. Actual usage depends on telemetry diversity: a service with verbose, free-form logging creates more unique event types than one that emits structured logs.
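To make the scaling concrete, here is a minimal sketch of the idea, assuming a hypothetical signature-based cache; the normalization and names are illustrative, not Tero's implementation.

# Minimal sketch: AI usage tracks unique event types, not log volume.
# Only signatures that have not been classified yet trigger an AI call.
import re

classified: dict[str, str] = {}  # signature -> verdict

def call_ai_provider(sig: str) -> str:
    return "valuable"  # placeholder for the real classification request

def signature(log_line: str) -> str:
    # Collapse numbers and hex-like IDs so repeated events share one signature.
    return re.sub(r"\b(\d+|[0-9a-f]{8,})\b", "<id>", log_line.lower())

def classify(log_line: str) -> str:
    sig = signature(log_line)
    if sig not in classified:
        classified[sig] = call_ai_provider(sig)  # AI call only for new event types
    return classified[sig]

Under this scheme, a billion log lines that reduce to 1,000 signatures trigger only 1,000 AI calls.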

Data handling

Anthropic’s API has a zero-retention policy for API requests. Samples sent for classification are:
  • Processed in memory
  • Not stored by Anthropic
  • Not used to train models

Fallback

You can configure OpenAI as a fallback in case Anthropic is unavailable. Fallback triggers on unrecoverable errors (API unavailable, authentication failures). Rate limits do not trigger fallback.
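The rule can be summarized with a short sketch; the error and function names are hypothetical, not Tero's internals.

# Sketch of the fallback rule: unrecoverable errors fall back to OpenAI,
# while rate limits are surfaced for retry against Anthropic instead.
class ProviderUnavailable(Exception): ...
class AuthenticationFailed(Exception): ...
class RateLimited(Exception): ...

def classify_with_fallback(sample, anthropic_classify, openai_classify):
    try:
        return anthropic_classify(sample)
    except (ProviderUnavailable, AuthenticationFailed):
        return openai_classify(sample)  # unrecoverable: use the fallback provider
    except RateLimited:
        raise                           # rate limits do not trigger fallback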

API key

Requirement  | Details
Model        | Claude Sonnet (latest)
Permissions  | Default API key permissions
Rate limits  | Standard limits are sufficient