Overview
The organization is a major operator of gas and LNG infrastructure, responsible for the import, storage, and distribution of natural gas across national networks. Its assets include LNG import terminals, transmission pipelines, and gas storage facilities, classified as critical national infrastructure.
Operations run 24/7 in highly regulated, safety-critical environments, combining modern IT platforms with long-lived operational technology (OT) such as SCADA systems and field devices. Any disruption can have economic and societal impact, making cybersecurity a core operational concern.
As digitalization increased IT/OT connectivity, the operator faced growing pressure to modernize its security data architecture. Log volumes rose due to expanded network segmentation, firewalling between IT and OT zones, and stricter regulatory logging requirements.
At the same time:
- Microsoft Sentinel ingestion costs were growing faster than the budget
- Log delivery from distributed sites and terminals was unreliable
- Security engineers spent excessive time maintaining fragile ingestion pipelines
The organization was preparing to expand security monitoring across additional LNG terminals, pipeline assets, and OT environments, requiring reliable log delivery, consistent data quality, and intelligent telemetry routing without increasing SIEM spend.
By adopting VirtualMetric DataStream, the operator consolidated fragmented pipelines, closed long-standing IT/OT visibility gaps, and reduced Sentinel ingestion costs by more than half, without sacrificing security coverage.
Challenges
Key security and SIEM challenges
1. Unreliable log delivery across distributed IT/OT assets
Legacy syslog servers, custom forwarders, and site-specific scripts caused inconsistent log delivery from terminals and remote OT environments. During congestion or maintenance, data was intermittently lost, creating blind spots the SOC could not tolerate.
2. SIEM cost pressure from high-volume network and OT logs
Extensive segmentation between IT, OT, and DMZ zones generated large volumes of firewall and network telemetry. While required for security and compliance, this data pushed Sentinel ingestion costs beyond budget limits.
The organization needed to:
– Reduce Sentinel ingestion by 50–75%
– Filter noise before logs reached Sentinel
– Maintain full security and forensic visibility
– Avoid disabling critical gas infrastructure telemetry
3. Manual, high-risk pipeline engineering
Security engineers maintained dozens of custom pipelines using Logstash, scripts, and vendor-specific parsers. In OT environments, even small pipeline changes required strict change management and narrow maintenance windows, increasing operational risk as these systems operate 24/7 and sit close to safety-critical control networks.
Challenges included:
– Frequent schema changes from network and OT vendors
– Brittle parsing logic tied directly to Sentinel schemas
– High effort to onboard new terminals or pipeline assets
Engineers spent more time fixing pipelines than improving detection coverage.
4. Inefficient log routing and retention
Long-term retention requirements meant the organization had to store 6–12 months of telemetry, but keeping all logs inside Sentinel’s Log Analytics workspace was too expensive.
Without a centralized routing engine, it was difficult to:
– Separate high-value security events from bulk or compliance logs
– Route high-volume data (e.g., DNS, firewall, and syslog) to lower-cost storage (Sentinel data lake, ADX, or Blob Storage)
– Control Sentinel ingestion without losing visibility
– Meet retention and compliance requirements without inflating SIEM costs
Solution
DataStream: a unified security data pipeline
The operator selected VirtualMetric DataStream as a unified security data pipeline, replacing fragmented ingestion paths with an automated, high-performance telemetry engine.
DataStream enabled the organization to:
– Apply real-time filtering and field reduction before data reached Microsoft Sentinel
– Deliver consistent, normalized, ASIM-ready data to Microsoft Sentinel
– Route high-volume logs to low-cost storage
– Ensure zero data loss with a WAL-based architecture
– Replace dozens of manual pipelines with automated, scalable processing
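DataStream's internals are not described here, but the zero-loss guarantee mentioned above rests on a well-known pattern: a write-ahead log (WAL), where every event is persisted to durable storage before it is acknowledged, and undelivered events are replayed after a crash. A minimal sketch of that pattern (class and file layout are illustrative, not DataStream's implementation):

```python
import json
import os


class WriteAheadLog:
    """Minimal write-ahead log: events are persisted to disk before
    being forwarded, so a crash between receipt and delivery loses
    nothing that was acknowledged."""

    def __init__(self, path):
        self.path = path
        self.offset_path = path + ".offset"

    def append(self, event):
        # Persist the event durably before acknowledging it upstream.
        with open(self.path, "a") as f:
            f.write(json.dumps(event) + "\n")
            f.flush()
            os.fsync(f.fileno())

    def commit(self, count):
        # Record how many events have been delivered successfully.
        with open(self.offset_path, "w") as f:
            f.write(str(count))

    def replay_unsent(self):
        # After a restart, re-read everything past the last committed offset.
        committed = 0
        if os.path.exists(self.offset_path):
            with open(self.offset_path) as f:
                committed = int(f.read() or 0)
        with open(self.path) as f:
            lines = f.readlines()
        return [json.loads(line) for line in lines[committed:]]
```

The key design choice is that `append` fsyncs before returning, so the sender can safely treat an acknowledged event as delivered even if the downstream SIEM is temporarily unreachable.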
Implementation highlights
Stabilizing Syslog and network log collection
DataStream replaced unstable syslog servers with a reliable, source-aware collector. Firewall and network logs were ingested consistently, normalized, and routed to the appropriate destinations.
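"Source-aware" collection means each incoming syslog line is decoded and tagged with its origin so that downstream rules can act per device. As a rough sketch of what that parsing step involves (a simplified BSD-syslog parse; the function and field names are illustrative, not DataStream's API):

```python
import re

# RFC 3164 severity names, indexed by severity code (PRI modulo 8).
SEVERITIES = ["emerg", "alert", "crit", "err",
              "warning", "notice", "info", "debug"]


def parse_syslog(line, source_ip):
    """Parse a BSD-syslog line like '<134>Oct 11 22:14:15 fw01 ...'
    and tag it with its source so routing rules can act per device."""
    m = re.match(r"<(\d{1,3})>(.*)", line)
    if not m:
        # Malformed line: keep it, but flag it rather than drop it.
        return {"source": source_ip, "severity": "unknown", "message": line}
    pri = int(m.group(1))
    return {
        "source": source_ip,
        "facility": pri // 8,
        "severity": SEVERITIES[pri % 8],
        "message": m.group(2),
    }
```

Note that malformed lines are tagged rather than discarded, which matters in OT environments where losing telemetry is worse than storing an oddly formatted record.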
Re-enabling high-volume firewall telemetry
Firewall logs that had previously been disabled due to cost and overload were safely re-enabled. DataStream filtered noise and compressed data before forwarding, restoring complete visibility without exceeding budget.
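Pre-forwarding noise filtering typically means keeping everything security-relevant (denies, cross-zone traffic) while dropping routine high-volume "allow" chatter. A minimal sketch of such a filter, assuming hypothetical field names (`action`, `src_zone`, `dst_zone`) rather than any specific firewall schema:

```python
def keep_event(event):
    """Illustrative pre-forwarding filter: keep security-relevant
    firewall events, drop routine intra-zone 'allow' chatter."""
    if event.get("action") != "allow":
        return True   # denies, drops, resets: always keep
    if event.get("src_zone") != event.get("dst_zone"):
        return True   # cross-zone traffic (e.g. IT -> OT) is always of interest
    return False      # high-volume intra-zone allows: drop before the SIEM


events = [
    {"action": "allow", "src_zone": "IT", "dst_zone": "IT"},
    {"action": "deny",  "src_zone": "IT", "dst_zone": "OT"},
    {"action": "allow", "src_zone": "IT", "dst_zone": "OT"},
]
forwarded = [e for e in events if keep_event(e)]  # intra-zone allow is dropped
```

In practice the dropped events would still be archived (see the tiering strategy below in this document), so forensic visibility is preserved even though they never reach Sentinel.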
Automating and standardizing ingestion pipelines
DataStream eliminated the need for Logstash, custom scripts, and manual parsers. Normalization, enrichment, and field mapping were fully automated, enabling engineers to focus on threat detection rather than pipeline maintenance.
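Automated field mapping means vendor-specific field names are renamed onto one common schema before detection rules ever see them. A simplified sketch of the idea, using illustrative vendor mappings (the real ASIM schemas are defined by Microsoft and are far more extensive):

```python
# Per-vendor field maps: vendor field name -> normalized field name.
# These mappings are illustrative examples, not actual vendor schemas.
FIELD_MAPS = {
    "fortinet": {"srcip": "SrcIpAddr", "dstip": "DstIpAddr", "action": "DvcAction"},
    "paloalto": {"src": "SrcIpAddr", "dst": "DstIpAddr", "action": "DvcAction"},
}


def normalize(vendor, raw):
    """Rename vendor fields onto one common, ASIM-style schema so
    downstream rules only ever see normalized names; unmapped fields
    pass through unchanged."""
    mapping = FIELD_MAPS[vendor]
    return {mapping.get(k, k): v for k, v in raw.items()}
```

The payoff is that a single detection rule querying `SrcIpAddr` works across every firewall vendor, which is exactly why schema changes from one vendor stop breaking the whole pipeline.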
Tiering data intelligently across destinations
Using DataStream’s conditional routing, the organization implemented a cost-efficient tiering strategy: high-value events → Microsoft Sentinel; bulk, verbose, or compliance logs → ADX or Blob Storage; non-actionable logs → archived with 80% compression.
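The tiering strategy above can be sketched as a small set of routing rules. The thresholds and field names here are assumptions for illustration, not DataStream configuration:

```python
def route(event):
    """Illustrative conditional routing mirroring the tiering strategy:
    high-value events to Sentinel, bulk/compliance data to ADX,
    everything else to a compressed archive."""
    if event.get("severity") in ("emerg", "alert", "crit") or \
            event.get("action") == "deny":
        return "sentinel"   # high-value security events
    if event.get("category") in ("dns", "netflow", "compliance"):
        return "adx"        # bulk, verbose, or compliance telemetry
    return "archive"        # non-actionable logs: compressed archive
```

Because routing decisions happen before ingestion, the expensive tier only ever receives events that detection rules actually act on, while cheaper tiers absorb the volume needed for retention and forensics.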
Results
Security, cost, and operational impact
✔ 63% reduction in Sentinel ingestion costs
By removing noise, dropping redundant fields, and routing non-critical data to cost-effective storage, the operator dramatically reduced monthly ingestion costs.
✔ Hours of manual work eliminated
DataStream replaced fragile pipelines and manual configuration with automated, no-code processors, reducing engineering workload by 45%.
✔ Reliable, complete, and consistent log delivery
The SOC now has predictable, gap-free data from syslog, firewalls, Windows, and OT sources, backed by a zero-loss WAL architecture.
✔ Faster threat detection and better data quality
Normalized, enriched, ASIM-ready logs improved rule accuracy and reduced false positives, giving analysts cleaner signals and fewer distractions.
✔ Future-ready data architecture
With DataStream in place, the organization can confidently scale into OT telemetry ingestion and long-term log retention without increasing SIEM costs.
Talk to our experts
Schedule a technical session with our engineering team to explore DataStream’s architecture, deployment options, and integration capabilities.
Try DataStream
Test DataStream’s vectorized pipeline with your SIEM environment. Process terabytes of data with 99% compression while maintaining complete security visibility.
Try now