
Bigger inputs.
Smarter models.
Smaller footprints.

Embedded software that breaks through CPU/GPU and memory constraints: training data-intensive models that would otherwise require hundreds of gigabytes on memory-limited GPUs, and running inference that would otherwise require hundreds of megabytes on edge devices with single-digit megabytes.


01

Edge AI in Constrained Devices

Multiple CNNs running in parallel at the edge with minimal CPU/GPU and memory usage.

02

ML Training and Inference for Extremely High Data Intensity

Standard object detection models running on pixel arrays up to 4K resolution; more data means better learning and higher accuracy.

Real World Applications

Utilities:
Grid-Edge Intelligence

Real-time anomaly detection and classification for predictive grid maintenance. 

Industrial: Electrical Panel and Motor Current Analysis 

Condition-based monitoring of on-premise electrical infrastructure and powered equipment. 

Health and Wellness

Enabling edge AI on resource-constrained devices, and enhanced ML training and inference for extremely high data-intensity applications.

Event Spotlight

Gordian at
Itron Inspire 2025

At Itron Inspire 2025, CTO Shekar Mantha outlined how Gordian’s edge-AI platform now powers key elements of Itron’s Grid Edge Intelligence suite. By enabling high-performance, low-latency inference on resource-constrained devices, Gordian makes real-time anomaly detection and advanced grid diagnostics possible at massive scale.

This collaboration highlights both the technical maturity of Gordian’s platform and its growing commercial impact across the utility ecosystem.

Why Gordian?

Edge ML inference

with minimal CPU/GPU usage, memory footprint, and power draw.

Shared memory across multiple models

for extremely high data-intensity model training. 

Workflow friendly

drop-in embedded software layer, no model customization required. 
