Impact Story - Energy & Utilities

They Deleted 93% of Their Grid Data

A major power utility had tens of thousands of grid sensors. After 7 days, the data was deleted - not because anyone wanted it gone, but because storing petabytes at cloud rates would bankrupt the department. Engineers learned to diagnose problems fast or not at all.

You Can't Run a Power Grid from the Cloud

Why cloud-centric architectures fail for critical infrastructure

7-Day Retention Limit

Engineers needed sensor data from 8 days ago. IT had to explain it was already gone. The retention limit wasn't policy - it was budget reality. Cloud storage for petabytes would cost more than the entire IT budget.

Data deleted before analysis

Bandwidth Costs Astronomical

High-frequency sensor data arrives at thousands of samples per second from every substation. Shipping it all to the cloud would cost a fortune, and most of it is noise that's never needed anyway.

Prohibitive egress costs

Security Is a Non-Starter

Critical infrastructure can't expose operational data over the public internet. Compliance requires air-gapped operations. PII from customer meters must be scrubbed before any data leaves the substation.

Massive attack surface

Index at the Substation, Store Locally, Search Globally

Each substation runs indexing and analysis. Raw sensor data stays on local storage. Only metadata and anomalies go to the cloud.

Substation-Level Indexing

Each substation cluster runs ML models that index sensor events - voltage anomalies, frequency deviations, equipment signatures, timestamps. Instead of storing petabytes of raw waveforms, store gigabytes of metadata and flagged events.

99% data reduction
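
For illustration only, here is a minimal Python sketch of what a substation-side indexing pass could look like: it reduces a raw waveform window to a few searchable features plus an anomaly flag. The thresholds, field names, and sensor IDs are assumptions, not the utility's actual pipeline.

```python
# Minimal sketch: turn a raw waveform window into a compact index record.
# Thresholds and field names are illustrative, not the utility's actual schema.
from dataclasses import dataclass, asdict
from statistics import mean
import json, math, time

NOMINAL_VOLTAGE = 120.0      # assumed nominal RMS voltage
NOMINAL_FREQUENCY = 60.0     # assumed grid frequency (Hz)

@dataclass
class IndexRecord:
    sensor_id: str
    window_start: float      # epoch seconds
    rms_voltage: float
    freq_deviation_hz: float
    anomaly: bool

def index_window(sensor_id: str, samples: list[float],
                 measured_freq: float, window_start: float) -> IndexRecord:
    """Reduce a raw sample window to a few searchable features."""
    rms = math.sqrt(mean(s * s for s in samples))
    freq_dev = measured_freq - NOMINAL_FREQUENCY
    anomaly = abs(rms - NOMINAL_VOLTAGE) > 0.05 * NOMINAL_VOLTAGE or abs(freq_dev) > 0.5
    return IndexRecord(sensor_id, window_start, round(rms, 2),
                       round(freq_dev, 3), anomaly)

# Kilobytes of metadata per window flow upstream instead of the raw waveform itself.
record = index_window("substation-12/feeder-3", [119.7, 120.1, 118.9, 121.2], 59.98, time.time())
print(json.dumps(asdict(record)))
```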

Local Long-Term Storage

Raw sensor data stays on local NAS at each substation. Cost: roughly $0.003/GB/month locally vs. $0.023/GB/month for cloud storage. Go from 7-day retention to 5 years for less money. Engineers search the index and retrieve only what they need.

5-year retention, lower cost
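
A quick back-of-the-envelope using the per-GB rates above makes the gap concrete; the 1 PB data volume is an assumed figure for illustration only.

```python
# Back-of-the-envelope comparison using the per-GB rates quoted above.
# The 1 PB retained volume is an assumption for illustration, not a measured figure.
GB_PER_PB = 1_000_000
volume_gb = 1 * GB_PER_PB        # 1 PB of retained raw sensor data

local_rate = 0.003               # $/GB/month on local NAS
cloud_rate = 0.023               # $/GB/month in cloud storage

local_monthly = volume_gb * local_rate
cloud_monthly = volume_gb * cloud_rate

print(f"local NAS: ${local_monthly:,.0f}/month")   # ~$3,000/month
print(f"cloud:     ${cloud_monthly:,.0f}/month")   # ~$23,000/month
print(f"cloud costs {cloud_monthly / local_monthly:.1f}x more per GB retained")
```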

PII Scanning and Removal

Customer meter data gets scanned for PII before leaving the substation. Automated redaction ensures compliance. Only anonymized, aggregated data flows upstream.

Compliance by default
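
As a rough illustration, a redaction step could look like the sketch below: known PII fields are dropped and PII-looking strings are scrubbed before anything flows upstream. The record shape, field names, and regex patterns are assumptions, not the actual ruleset.

```python
# Minimal sketch: redact PII from meter readings before anything leaves the substation.
# The record shape and regex patterns are illustrative only.
import re

PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "phone": re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b"),
}
PII_FIELDS = {"customer_name", "service_address", "account_number"}

def redact(record: dict) -> dict:
    """Drop known PII fields and scrub PII-looking strings from the rest."""
    clean = {}
    for key, value in record.items():
        if key in PII_FIELDS:
            continue                       # drop the field outright
        if isinstance(value, str):
            for pattern in PII_PATTERNS.values():
                value = pattern.sub("[REDACTED]", value)
        clean[key] = value
    return clean

reading = {
    "meter_id": "MTR-4471",
    "kwh": 12.7,
    "customer_name": "Jane Doe",
    "notes": "callback 555-867-5309 re: outage",
}
print(redact(reading))   # only meter_id, kwh, and the scrubbed notes flow upstream
```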

Signed for Auditability

Every data transformation, every anomaly detection, every control decision gets cryptographically signed. Full audit trail for regulatory compliance, all air-gapped.

Regulatory compliance
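
One way to picture this is the sketch below, which signs each audit record with an Ed25519 key (via the third-party Python `cryptography` package) so a verifier can later prove the record was not altered. The record schema and key handling are illustrative assumptions, not Expanso's actual audit format.

```python
# Minimal sketch: sign each transformation record so auditors can verify it later.
# Uses Ed25519 from the third-party `cryptography` package; the record fields
# are illustrative, not the actual audit schema.
import json, time
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

signing_key = Ed25519PrivateKey.generate()    # in practice, a protected per-node key
verify_key = signing_key.public_key()

def signed_audit_entry(operation: str, detail: dict) -> dict:
    """Serialize an audit record deterministically and attach a signature."""
    record = {"ts": time.time(), "operation": operation, "detail": detail}
    payload = json.dumps(record, sort_keys=True).encode()
    return {"record": record, "signature": signing_key.sign(payload).hex()}

entry = signed_audit_entry("pii_redaction", {"meter_id": "MTR-4471", "fields_dropped": 3})

# Later, an auditor re-serializes the record and checks the signature.
payload = json.dumps(entry["record"], sort_keys=True).encode()
verify_key.verify(bytes.fromhex(entry["signature"]), payload)   # raises if tampered
print("audit entry verified")
```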

How Expanso Makes This Possible

1. Deploy at Points of Presence

Lightweight compute nodes at each substation cluster. Ingest sensor streams from nearby transformers, switches, and meters. Process locally before data moves.

Near the sensors, not on them

2. Index Everything, Store Metadata

Run time-series analysis, anomaly detection, and event classification. Extract features, timestamps, and signatures. A petabyte problem becomes a gigabyte problem.

99% data reduction

3. Keep Raw Data Local

Raw waveforms stay on cheap local storage at the substation. Engineers search the index, retrieve only the specific time windows they need. No bandwidth costs.

Search globally, retrieve locally

4. Sign and Audit Everything

Cryptographically sign all transformations and decisions. Scan for PII, redact automatically. Only clean, signed, anonymized data flows to central systems.

Compliance and auditability

The Results

  • 99% Data Reduction (Bandwidth Savings) - index and metadata instead of raw waveforms
  • 5-Year Retention (was 7 days) - local storage cheaper than cloud, longer retention for less money
  • Zero PII Leakage (Air-Gapped Compliance) - automated scanning and redaction at the substation
  • $1.8M Annual Savings (vs. Cloud Architecture) - bandwidth and storage costs eliminated

What This Enables

Searchable Historical Analysis

Challenge: Engineers needed waveform data from 8 days ago to diagnose recurring faults. Data was already deleted. No way to search - you had to know exactly when and where to look.
Solution: Index every sensor event with timestamp, location, and signature. Search returns relevant time windows in seconds. Retrieve only what you need from local storage.
Result: 5-year searchable history
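
To make the search-then-retrieve pattern concrete, the sketch below queries a hypothetical SQLite metadata index and maps hits to raw files on the substation NAS. The table layout, database path, and file naming are assumptions for illustration, not the actual deployment.

```python
# Minimal sketch: search the metadata index, then retrieve only the matching
# raw-waveform windows from local storage. Table layout and paths are assumed.
import sqlite3

def find_anomaly_windows(db_path: str, substation: str, start_ts: float, end_ts: float):
    """Return (sensor_id, window_start) pairs for flagged events in a time range."""
    con = sqlite3.connect(db_path)
    try:
        rows = con.execute(
            "SELECT sensor_id, window_start FROM index_records "
            "WHERE substation = ? AND anomaly = 1 AND window_start BETWEEN ? AND ? "
            "ORDER BY window_start",
            (substation, start_ts, end_ts),
        ).fetchall()
    finally:
        con.close()
    return rows

def waveform_path(sensor_id: str, window_start: float) -> str:
    """Map an index hit to the raw file kept on the substation NAS (assumed layout)."""
    return f"/mnt/nas/waveforms/{sensor_id}/{int(window_start)}.parquet"

# Search days (or years) back, then fetch only the few windows that matter.
for sensor_id, window_start in find_anomaly_windows("index.db", "substation-12",
                                                    start_ts=1_700_000_000,
                                                    end_ts=1_700_700_000):
    print("retrieve:", waveform_path(sensor_id, window_start))
```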

Bandwidth Cost Elimination

Challenge: Thousands of sensors at thousands of samples per second. Shipping petabytes to cloud would cost millions annually in egress fees alone.
Solution: Process and index at the substation. Send only metadata and flagged anomalies upstream. Raw data stays local on cheap storage.
Result: 99% bandwidth reduction

PII Compliance

Challenge: Customer meter data contains PII. Regulatory requirements prohibit sending it to cloud without anonymization. Manual redaction doesn't scale.
Solution: Automated PII scanning at the substation. Redact before data leaves the local network. Only anonymized aggregates flow upstream.
Result: Zero PII leakage

Audit Trail for Compliance

Challenge: Regulators require proof of every data transformation and control decision. Cloud-based systems can't guarantee air-gapped audit trails.
Solution: Cryptographically sign every operation at the substation. Full audit log stored locally and replicated to compliance systems.
Result: Regulatory compliance

Cloud-Centric vs. Substation Indexing

Cloud-Centric Approach

  • Ship all sensor data to cloud (petabytes)
  • Pay massive egress and storage costs
  • Delete after 7 days to control budget
  • No search - comb through raw waveforms manually
  • PII compliance nightmare
7-day retention, millions in costs

With Expanso at Substations

  • Index at substation (gigabytes of metadata)
  • Store raw data locally on cheap storage
  • Search 5 years of history in seconds
  • Retrieve only what you need
  • PII scrubbed before leaving substation
5-year retention, fraction of the cost

Deleting Data You Might Need?

If your retention limits are budget-driven, not policy-driven, we should talk. We've deployed at utilities, cities, and other critical infrastructure operators.