Use case

Data reduction

Automatically remove irrelevant and low-value logs at scale so your SIEM receives cleaner, lighter, and more actionable telemetry.

Start free trial

The challenge

Filter noise before it becomes cost

SOC teams are buried under duplicate, redundant, and low-value logs that inflate SIEM costs and hide real threats. Relying on brittle scripts and one-off regex rules adds overhead, slows detection, and fails to scale in enterprise environments.

The solution

DataStream – intelligent security telemetry pipeline

Filter with confidence

DataStream removes low-value events and redundant fields in real time using clear, transparent policies. You keep the information needed for detections and investigations, while dropping the rest before it becomes costly to store, process, or search.

Reduce without losing fidelity

Because reduction happens after normalization and enrichment, decisions are based on clean, context-rich data. The result is far fewer false drops and more predictable, reliable reduction outcomes.
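To see why the ordering matters, here is a minimal Python sketch of a pipeline that normalizes and enriches each event before any drop decision is made. All function names, field names, and the example rule are illustrative assumptions, not DataStream's actual API.

```python
# Minimal sketch of reduce-after-enrich ordering; all names are
# illustrative, not DataStream's internals.

def normalize(raw: dict) -> dict:
    """Map vendor-specific keys onto a common schema."""
    return {
        "event_id": raw.get("EventID") or raw.get("event_id"),
        "src_ip": raw.get("SourceIp") or raw.get("src_ip", ""),
        "message": raw.get("Message", ""),
    }

def enrich(event: dict) -> dict:
    """Attach context the drop decision can rely on."""
    event["internal_src"] = str(event["src_ip"]).startswith("10.")
    return event

def reduce(event: dict) -> dict | None:
    """Decide on normalized, enriched data -- not raw vendor fields."""
    # Example rule: drop heartbeat chatter, but only from internal hosts.
    if event["internal_src"] and "heartbeat" in event["message"].lower():
        return None
    return event

def pipeline(raw_events):
    for raw in raw_events:
        event = reduce(enrich(normalize(raw)))
        if event is not None:
            yield event
```

Because `reduce` runs last, the rule can key on enriched context (here, whether the source is internal) instead of guessing from raw vendor fields, which is what keeps false drops rare.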

Two-step filtering

DataStream reduces telemetry in two safe steps. First, it trims redundant and low-signal fields, typically cutting ingest volume by around 50% without affecting detection quality. Then, it filters out low-value events using clear rules such as event IDs, datasets, IPs, or known-benign activity. These steps preserve security-critical data while achieving up to 90% total reduction before SIEM ingestion.
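To make the two steps concrete, here is a hedged Python sketch of the same idea: trim low-signal fields first, then filter whole events with simple, auditable rules. The field names, event IDs, and lists below are illustrative assumptions, not DataStream's actual policy format.

```python
# Illustrative two-step reduction; not DataStream's real policy engine.

# Step 1: trim redundant, low-signal fields from every event.
DROP_FIELDS = {"raw_xml", "thread_id", "opcode", "keywords"}  # hypothetical

def trim_fields(event: dict) -> dict:
    return {k: v for k, v in event.items() if k not in DROP_FIELDS}

# Step 2: filter out whole low-value events with clear rules.
BENIGN_EVENT_IDS = {5156, 5158}   # e.g., noisy Windows Filtering Platform events
KNOWN_BENIGN_IPS = {"10.0.0.5"}   # hypothetical internal vulnerability scanner

def keep_event(event: dict) -> bool:
    if event.get("event_id") in BENIGN_EVENT_IDS:
        return False
    if event.get("src_ip") in KNOWN_BENIGN_IPS:
        return False
    return True

def reduce_stream(events):
    for event in events:
        event = trim_fields(event)   # field trimming: roughly halves volume
        if keep_event(event):        # event filtering: pushes total reduction higher
            yield event
```

Keeping both steps as explicit, inspectable rules is what makes the reduction outcome predictable and easy to audit.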

No-code configuration

Reduction policies are built in the UI using contextual filters and processors (not regex, brittle rules, or custom scripts). Teams can tune policies quickly as environments evolve, without introducing operational overhead.

Key benefits

Why this approach works

Lower ingestion costs

Reduce volumes by up to 90% before SIEM ingestion

Cleaner signals

Less noise means fewer false positives and faster triage

Less manual work

Policies replace brittle scripts and ad-hoc regex

Easier compliance

Only the right data is retained, reducing audit friction

Frequently asked questions

How much reduction can we expect?

Results vary by source and policy, but customers typically see a 50-90% reduction in SIEM ingest volume.

Can we keep certain events unfiltered? 

Yes. You can create exceptions/allow-lists by source, dataset, severity, or field to ensure critical logs always pass. 
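For illustration only, an exception check like the sketch below could run before any drop rule; the match keys mirror the options named above, but the structure is an assumption, not DataStream's configuration format.

```python
# Hypothetical allow-list: events matching any rule bypass drop policies.
ALLOW_RULES = [
    {"dataset": "windows.security", "severity": "critical"},  # assumed values
    {"source": "domain-controller-01"},
]

def is_allowed(event: dict) -> bool:
    return any(
        all(event.get(key) == value for key, value in rule.items())
        for rule in ALLOW_RULES
    )

def apply_drop_policy(event: dict, should_drop) -> dict | None:
    if is_allowed(event):
        return event                 # critical logs always pass
    return None if should_drop(event) else event
```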

How can we handle sensitive data?

Use policy-based masking and redaction to remove or obfuscate fields like credentials or PII before forwarding.
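As a rough sketch of what policy-based masking does (the field names and hashing scheme are assumptions, not DataStream's implementation):

```python
import hashlib

# Hypothetical masking policy applied before forwarding.
REDACT_FIELDS = {"password", "api_key"}   # removed outright
HASH_FIELDS = {"username", "email"}       # obfuscated but still joinable

def mask_event(event: dict) -> dict:
    masked = {}
    for key, value in event.items():
        if key in REDACT_FIELDS:
            masked[key] = "[REDACTED]"
        elif key in HASH_FIELDS:
            # A stable hash keeps the field usable for correlation
            # without exposing the underlying value.
            masked[key] = hashlib.sha256(str(value).encode()).hexdigest()[:12]
        else:
            masked[key] = value
    return masked
```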

Do we need to write scripts?

No. Policies and vendor packs control filtering; you tune them in the UI instead of maintaining custom code.

Does filtering impact system performance?

Filtering is applied in the pipeline using vectorized processing. Benchmarks show no measurable latency impact even at 150k events per second (EPS).
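As a loose illustration of what vectorized processing means here (NumPy stands in for the idea; it is not necessarily what DataStream uses internally):

```python
import numpy as np

# Vectorized filtering evaluates a rule across a whole batch at once
# instead of looping event by event.
event_ids = np.array([4624, 5156, 4625, 5158, 4688])
benign = np.isin(event_ids, [5156, 5158])  # one pass over the batch
kept = event_ids[~benign]                  # array([4624, 4625, 4688])
```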

Get DataStream on Azure Marketplace

Deploy DataStream in minutes with Azure Managed Identity support built in. No credential management, no manual setup.