Operating in Statistical Darkness
From Agreed Transparency to Engineered Opacity
In late February, a developer posted a demo of WorldView — a browser-based tool that layers live flight data, real satellite orbits, and public CCTV feeds onto a photorealistic 3D globe. He’d built it over a weekend. Someone joked he’d “vibe-coded Palantir.” The actual co-founder of Palantir felt compelled to respond on a live show.
A week later, the Iran strikes happened. Bilawal Sidhu, the developer in question, was sitting on his couch watching the news when he realised he already had the infrastructure to monitor it. By evening he had a full 4D reconstruction of the operation — flight paths clearing in real time as strikes approached, satellite passes over the strike zone, GPS jamming signatures lighting up before zero hour, tankers scattering through the Strait of Hormuz. All from public data. All replayable, minute by minute.
He titled the piece “The Intelligence Monopoly Is Over.”
The implication was seductive: anyone with enough technical skill and public data can now see what governments once saw alone.
Except that’s not quite what’s happening. And the gap between those two things is what F01 — From Agreed Transparency to Engineered Opacity — is about.
More data. Worse baselines.
The information environment isn’t becoming more legible. It’s fragmenting into tiers — and the tier most organisations, governments, and communities depend on is the one degrading fastest.
Official statistics — the shared baselines that make planning, risk assessment, and policy possible — are thinning. Stefaan Verhulst of NYU's GovLab warned in 2024 of a coming "data winter": a prolonged period in which data assets that could serve the public interest are instead frozen, restricted, or commodified. Social media platforms that were once open to researchers have locked their data behind commercial APIs. Climate data has shifted from public good to market commodity. Generative AI anxiety is causing further hoarding — organisations restricting access not because data is sensitive, but because they fear how it might be used.
Meanwhile the public statistical infrastructure underneath most planning is under direct pressure. The American Statistical Association’s 2025 report documented the systematic hollowing of the US federal statistical system — leadership positions replaced with political appointees, expert panels disbanded, major surveys cancelled. The USDA ended its annual food security survey in September 2025, eliminating the baseline for tracking hunger at the precise moment the largest cuts to food assistance in decades took effect. This is the leading edge of a pattern F01 identified globally: governments increasingly treating statistics as tools of statecraft rather than public goods.
And where data simply doesn’t exist at all, the consequences compound differently. A December 2025 UNDP report by MIT Sloan researchers Catherine Tucker and Nan Clement identified “data deserts” — gaps where populations, communities, or entire countries don’t exist in the datasets that now determine access to credit, healthcare, disaster assistance, and employment. As AI systems are built on those datasets, the gaps compound. Absence produces the same planning problem as concealment.
Polymarket can price the probability of a regional conflict in real time. The US no longer tracks food insecurity. Both are data choices — and they reflect the same underlying logic: data flows toward commercial value and away from public utility.
What fills the void
The space left by degrading official data isn’t empty — it’s filling with alternatives that perform intelligence without necessarily producing it.
A proprietary data market worth $14–18 billion sells early warning signals — satellite imagery, credit card transactions, geolocation data — to those who can afford them, surfacing patterns that were once embedded in public statistics but are now visible only to institutional investors and large capital allocators. Open-source aggregation tools layer public signals into visually sophisticated reconstructions. Kalshi now has data deals with CNN and CNBC; Polymarket with the Wall Street Journal. Crowd-priced probabilities are being positioned as a replacement for degrading official data — but they measure sentiment and position-taking, not ground conditions.
All of it performs intelligence. Whether any of it produces reliable grounds for decisions is a harder question — and mostly isn’t being asked. The result is an environment where the degradation of shared baselines is partially obscured by the proliferation of alternatives, making it harder to identify what you’re actually missing.
F01 named this scenario pathway statistical darkness: cascading opacity creating blind spots large enough that problems aren’t detected until they’re systemic. The conditions are assembling now.
What to do with this
Map your three most important planning assumptions to their data sources. For each one, ask: when was this data last independently verified? Who collected it, and do they still have the capacity and independence to collect it reliably? If you can’t answer those questions, you’re working from a baseline you can’t actually defend.
Identify one domain where you’re likely being out-informed. Who in your operating environment — a competitor, a regulator, a counterpart — has access to data you don’t? What decisions might they be making on the basis of that visibility? This isn’t hypothetical: the alternative data market exists because the information asymmetry is real and exploitable.
Name one early warning signal your organisation used to rely on that is now degraded, delayed, or gone. What has replaced it — formally or informally? If the answer is nothing, that’s the gap worth addressing first.
The full analysis is in F01: From Agreed Transparency to Engineered Opacity. Read it at 10fconsortium.org.
This pattern — decisions made on degrading or asymmetric information — runs through several of the 10F forecasts, including F09 on money and F10 on technology infrastructure. If F01 lands for you, those are worth reading alongside it.
Bring this to your organization
The 10F Consortium has developed a Local Convening Toolkit for organizations and teams wanting to work through these forecasts together. It includes a facilitation guide, discussion prompts structured around the AREAS framework, and a post-convening capture form. Designed for groups of any size, with or without a professional facilitator. Access the toolkit here.
Follow the Project
Beyond this newsletter, you can find 10F on:
LinkedIn — 10F Consortium — longer-form updates and discussion
Bluesky — @10fconsortium.bsky.social — signal tracking and real-time commentary
Instagram — @10fconsortium — visual summaries and forecast highlights
GitHub — 10F Consortium — full forecast archive with version history and transparency, and new tools as we develop them
