DePIN networks generate enormous amounts of physical-world data — sensor readings, node performance, energy output, geographic coverage. Most of it has never been collected in one place, let alone analyzed. We built KAIROS to change that: 350+ projects, 254+ variables each, cross-referenced against 800,000+ physical sensors, and processed through 17 layers of econophysics and information theory. Query it all with SQL on Snowflake — the dataset that lets you see what's actually happening in decentralized infrastructure.
KAIROS Signal provides data and analytics for informational and research purposes only. Not investment advice. Past performance does not guarantee future results.
DePIN is a new asset class, and until now, there hasn't been a standardized data layer for it. No common schema, no shared ground truth. We built this for the teams who need that foundation — delivered via Snowflake, queryable with standard SQL.
Most crypto data platforms repackage the same exchange feeds. We took a different approach: connect directly to 350+ DePIN project APIs and 800,000+ physical sensors from sources like NOAA, EIA, and Maritime AIS — then apply the kind of mathematics usually reserved for physics research. The result is a data layer that connects market activity to physical reality, delivered through Snowflake so you can query it all with SQL.
Query everything with standard SQL in your existing Snowflake environment. No SDKs to install, no API keys to manage, no rate limits to worry about. Just connect our data share and start writing queries. It fits right into your existing dbt, Python, or BI workflows.
800,000+ sensors from NOAA Space Weather, US Energy Grids, Maritime AIS, and 15+ government agencies. By correlating real-world conditions with on-chain data, we can see whether a network's reported performance matches what's actually happening on the ground.
17 analytical layers applying methods from statistical mechanics, information theory, game theory, and nonlinear dynamics. These are the same mathematical frameworks physicists use to model complex systems — adapted here for decentralized infrastructure markets.
A simple but powerful idea: compare a token's market capitalization to its verified physical utility — real node counts, actual throughput, measured uptime. The gap between market price and physical reality often tells you something the charts alone can't.
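To make the idea concrete, here is a minimal sketch of a reality-gap style metric. The formula, function names, and weights below are illustrative assumptions, not KAIROS Signal's actual computation: it simply folds verified physical metrics into a toy utility score and measures how far market capitalization runs ahead of it.

```python
import math

def utility_score(active_nodes, uptime_30d, throughput_mbps,
                  node_weight=1.0, uptime_weight=1.0, throughput_weight=1.0):
    """Toy composite of physical utility; the weights are arbitrary."""
    return (node_weight * math.log1p(active_nodes)
            + uptime_weight * uptime_30d
            + throughput_weight * math.log1p(throughput_mbps))

def reality_gap(market_cap_usd, utility, baseline_cap_per_utility):
    """Normalized gap between what the market pays and measured utility.
    Positive values mean the market price runs ahead of physical reality."""
    implied_cap = utility * baseline_cap_per_utility
    return (market_cap_usd - implied_cap) / market_cap_usd

# Hypothetical numbers for illustration only.
u = utility_score(active_nodes=382_941, uptime_30d=0.9847, throughput_mbps=1200.0)
gap = reality_gap(market_cap_usd=1.2e9, utility=u, baseline_cap_per_utility=5.0e7)
```

The point is the shape of the comparison, not the specific weights: any monotone utility measure built from verified node counts, uptime, and throughput supports the same kind of gap analysis.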
Helium, Render, Filecoin, Akash, Hivemapper, and hundreds more — refreshed continuously. Node counts, uptime, throughput, earnings, churn, geographic coverage. Each metric validated against physical ground truth rather than self-reported data.
Markets don't behave the same way all the time — they trend, they consolidate, they enter chaotic phases. Our regime detection identifies which state a market is in, so you can adjust your analysis accordingly. Think of it as context for every data point.
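One simple way to see what regime detection means in practice is Kaufman's efficiency ratio: net price change divided by total path length. A value near 1 indicates a clean trend; near 0, consolidation or chop. This is a sketch of the concept, not our actual detector, and the thresholds below are arbitrary assumptions.

```python
def efficiency_ratio(prices):
    """Net move divided by total distance traveled: 1.0 = pure trend, ~0 = chop."""
    net = abs(prices[-1] - prices[0])
    path = sum(abs(b - a) for a, b in zip(prices, prices[1:]))
    return net / path if path else 0.0

def label_regime(prices, trend_threshold=0.6, chop_threshold=0.2):
    """Map the ratio to a coarse regime label; thresholds are illustrative."""
    er = efficiency_ratio(prices)
    if er >= trend_threshold:
        return "trending"
    if er <= chop_threshold:
        return "consolidating"
    return "transitional"

trend = [100 + i for i in range(20)]          # monotone ramp
chop = [100 + (-1) ** i for i in range(20)]   # pure oscillation
```

Real regime models use far richer features, but the intuition is the same: the same price series reads very differently once you know which state it is in.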
Here's what you'll be working with in Snowflake. This is the live schema — not a mockup. Some analytical outputs are redacted below, but the structure gives you a clear picture of what's available.
{
  "project": "HNT",
  "timestamp": "2026-02-15T09:30:00Z",
  "network": {
    "active_nodes": 382941,
    "uptime_30d": 0.9847,
    "throughput_mbps": ████████,
    "geographic_entropy": ██.████
  },
  "reality_gap": 0.34,
  "regime": "accumulation",
  "composite_score": ██.████,
  "layer_outputs": {
    "L1_sieve": ████,
    "L2_regime": ████,
    "L3_nash": ████,
    "...": ████,
    "L17_synthesis": ████
  },
  "sensor_refs": ["NOAA:SPW", "EIA:ELEC", ████]
}
SELECT project, timestamp, reality_gap,
       regime, composite_score,
       layer_outputs:████ AS l3_nash,
       layer_outputs:████ AS l7_ising,
       sensor_correlation:████
FROM kairos_signal.public.depin_vectors
WHERE project = 'HNT'
  AND timestamp >= '2026-02-01'
  AND reality_gap > 0.25
ORDER BY timestamp DESC
LIMIT 1000;
Full schema documentation, sample queries, and table catalog available.
View Data Catalog →

We borrow heavily from physics, information theory, and game theory — fields that have spent decades modeling complex systems with many interacting parts. Markets, especially young ones like DePIN, behave a lot like physical systems: they have phase transitions, emergent behavior, and hidden structure. Here's a glimpse of how we think about the data. Some details are redacted, but the ideas are real.
In any market, participants are making strategic decisions based on what they think others will do. This is a Nash payoff matrix — it models those interactions mathematically. Think of it as mapping the incentive landscape: who benefits from what, and where the equilibria settle.
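For readers who want to see the mechanics, here is a tiny illustration of the underlying game-theory idea (not the platform's actual model): a brute-force search for pure-strategy Nash equilibria in a small payoff matrix, where `payoffs[r][c]` holds the (row player, column player) payoffs.

```python
def pure_nash(payoffs):
    """Return all (row, col) cells where neither player gains by deviating."""
    rows = len(payoffs)
    cols = len(payoffs[0])
    equilibria = []
    for r in range(rows):
        for c in range(cols):
            # Row player can't do better by switching rows against column c...
            row_best = all(payoffs[r][c][0] >= payoffs[r2][c][0] for r2 in range(rows))
            # ...and column player can't do better by switching columns against row r.
            col_best = all(payoffs[r][c][1] >= payoffs[r][c2][1] for c2 in range(cols))
            if row_best and col_best:
                equilibria.append((r, c))
    return equilibria

# Classic prisoner's dilemma (0 = cooperate, 1 = defect):
# mutual defection is the lone pure-strategy equilibrium.
pd = [[(3, 3), (0, 5)],
      [(5, 0), (1, 1)]]
```

In a real market the "players" are node operators, token holders, and speculators, and the payoffs come from data rather than a textbook, but the equilibrium-finding logic is the same.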
This one's intuitive: take everything a network actually does (uptime, throughput, coverage) and compare it to what the market says it's worth. The gap between real utility and market price is often where the most interesting insights live.
This is an Ising model — borrowed from condensed matter physics, where it describes how atoms align in magnets. Applied here, it models how market participants influence each other's behavior. When the "coupling constant" J shifts, the system can undergo phase transitions — sudden, collective changes in behavior. Sound familiar?
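A toy version makes the physics concrete. The sketch below (an illustration only, not the platform's code) puts ±1 spins on a ring where a positive coupling J rewards neighbors that agree; greedily flipping any spin that lowers the energy pulls neighboring spins toward agreement, a cartoon of market participants influencing each other.

```python
def energy(spins, J):
    """Ising energy on a ring: -J * sum of neighbor products.
    Aligned neighbors lower the energy when J > 0."""
    n = len(spins)
    return -J * sum(spins[i] * spins[(i + 1) % n] for i in range(n))

def relax(spins, J, sweeps=10):
    """Zero-temperature greedy dynamics: flip any spin that lowers the energy."""
    spins = list(spins)
    n = len(spins)
    for _ in range(sweeps):
        for i in range(n):
            flipped = spins[:i] + [-spins[i]] + spins[i + 1:]
            if energy(flipped, J) < energy(spins, J):
                spins = flipped
    return spins

mixed = [1, -1, 1, 1, -1, -1, 1, -1]
aligned = relax(mixed, J=1.0)
```

Even this stripped-down version shows the key behavior: disordered configurations relax toward locally aligned domains, and a fully aligned chain is a stable fixed point. Add temperature and a shifting J and you get the phase transitions the text describes.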
Markets look different at different timescales. A 15-minute trend might be noise; the same pattern at 6 hours might be structural. This decomposes the signal across multiple windows simultaneously, helping separate meaningful moves from random fluctuations.
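A minimal sketch of the multi-window idea (illustrative, not the actual pipeline): smooth the same series with several window sizes, and treat the residual at each scale as the "fast" component that shorter windows track but longer windows average away. The window sizes and the synthetic series below are arbitrary assumptions.

```python
import math

def rolling_mean(series, window):
    """Trailing mean; the window expands at the start of the series."""
    out = []
    for i in range(len(series)):
        lo = max(0, i - window + 1)
        chunk = series[lo:i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

def decompose(series, windows=(4, 16, 64)):
    """Return {window: (smooth, residual)} for each timescale."""
    result = {}
    for w in windows:
        smooth = rolling_mean(series, w)
        residual = [x - s for x, s in zip(series, smooth)]
        result[w] = (smooth, residual)
    return result

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

# Slow oscillation plus a faster, smaller one.
series = [math.sin(i / 8) + 0.3 * math.sin(i * 2.5) for i in range(128)]
scales = decompose(series)
var_fast = variance(scales[4][1])    # short window: only the fast wiggle remains
var_slow = variance(scales[64][1])   # long window: the slow swing shows up too
```

The short window's residual carries only the fast wiggle, while the long window's residual also contains the slow swing, which is exactly the separation between noise-scale and structural-scale movement the text describes.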
Full technical brief, SQL schema, and integration guide available now.
Read Technical Brief →

We ingest roughly 100 million data points per day from 198 sources. By the time it lands in your Snowflake environment, every data point has been cleaned, validated against physical sensors, enriched across 17 analytical layers, and tagged with regime context. Here's what that journey looks like:
530K symbols · 198 sources · ~100M new/day
20% anomaly gate · Poison filtered · Deduped
Regime-tagged · Reality Gap scored · AI-graded
Standard SQL · Your warehouse · Your models
Connects to your existing Snowflake environment as a data share — query with standard SQL, plug directly into your warehouse, dbt models, or notebooks.
Request Access

All data delivered through Snowflake. Start with what you need, upgrade as your work evolves. Every plan includes schema documentation, sample queries, and support from people who actually understand the data.
A strong starting point for understanding DePIN through data.
Snowflake data share provisioned on subscription.
Deeper layers, more tables, cross-domain correlations.
Expanded table set with cross-domain data.
The complete dataset and every analytical layer we produce.
Everything we compute. Every table we maintain.
For teams that need isolation, compliance, or dedicated support.
We'll shape the delivery around your workflow.
Snowflake is our primary delivery today. Here's what we're building next as the platform grows — including purpose-trained AI models and a REST API for developers who prefer programmatic access.
A lightweight JSON API for teams that want programmatic access without Snowflake. Same data, same schema — delivered as HTTP endpoints with API key auth. Perfect for dashboards, bots, and lightweight integrations.
In development

Real-time streaming for latency-sensitive workflows. Sub-minute updates pushed directly to your application — ideal for dashboards, monitoring, and time-critical analysis.
In development

Ask questions about DePIN in plain language, get answers grounded in our full dataset. Purpose-trained models for macro context, risk assessment, contrarian signals, and pattern recognition.
Training on 1B+ data points

Interactive dashboards for teams that want visual exploration alongside their SQL workflows. Portfolio monitoring, project comparison, and real-time health status at a glance.
Planned

Have a feature you'd love to see? We'd genuinely love to hear about it — we're building this for the people who use it.
Share Your Ideas →

Tell us a bit about what you're working on and we'll provision your Snowflake data share. We read every submission and typically respond within a day.