Coherence Without Comprehension

“When the proxy becomes the interface, it starts to define reality.”

How modern systems stay stable when nobody understands them end-to-end—and why AI makes this everyone’s problem.

January 16, 2026
“But Brawndo has what plants crave! It’s got electrolytes! …Okay—what are electrolytes? Do you know? Yeah. It’s what they use to make Brawndo. But why do they use them in Brawndo? What do they do? They’re part of what plants crave. But why do plants crave them? Because plants crave Brawndo, and Brawndo has electrolytes.” — Idiocracy

There are days when the world feels like it moved because a number twitched.

A jobs print surprises and markets reprice the future. A dashboard needle shifts and a team ships a rollback. A model score dips and a launch freezes. The day’s “what’s real” gets routed through an interface: a metric, an eval, a gate, a rating, a chart.

That isn’t irrational. It’s how large systems coordinate when nobody can afford to rebuild reality from first principles every morning.

The thesis

When shared understanding gets too expensive, we don’t stop acting. We act through proxies.

And when proxies become the interface to truth, whoever controls the proxy layer starts to control reality.

The crossing point

This pattern shows up when two curves cross:

Change accelerates. Tools get better, iteration gets cheaper, loops get tighter. With AI, the marginal cost of producing a plausible change—code, copy, strategy, analysis—drops toward zero.

Shared understanding saturates. Human comprehension doesn’t scale with iteration rate. Mentorship, deep ownership, and “I can explain the whole system” knowledge saturate early, then stall.

At that crossing point, a system has a choice: slow down to preserve comprehension, or keep pace by replacing comprehension with controls.

Most systems pick pace. Not because they’re careless, but because the environment pays for tempo—and slowing down feels like choosing to lose on purpose.

The Proxy Stack

Once you’re past the crossing point, organizations and institutions converge on the same template—software, finance, media, medicine, education, anywhere speed and scale outrun shared comprehension.

The stack
  • Reality gets too fast to hold in heads. End-to-end understanding stops being the default.
  • Proxies compress reality. Tests, evals, KPIs, dashboards, risk ratings, checklists.
  • Proxies become procedure. You can’t ship without “green.”
  • “Green” expands into “true.” “Passes checks” becomes “acceptable,” then “safe,” then “real.”
  • Contestability declines. Disagreement becomes an excavation—slow, costly, access-dependent.
  • Conflicts force arbitration. When proxies disagree with reality, someone has to decide what “green” means.

That last step is where authority enters—not as ideology, but as an operational necessity.

Why AI makes this louder

AI doesn’t just make output cheap. It makes plausible output cheap.

It increases the number of changes that could be shipped, refactors that might be “fine,” product variants that might work, model updates that might improve a benchmark.

That does two things: it explodes throughput faster than humans can form shared mental models, and it turns selection into the bottleneck. When you can generate ten directions before lunch, the scarce resource isn’t code—it’s “what counts.”

So organizations shift from comprehension to governance: eval harnesses, tests, policy checks, rollout gates, dashboards. Those become the coordination layer.

Scene 1: Healthcare

In healthcare, the Proxy Stack is impossible to miss once you’ve been on the receiving end of it.

A living body is compressed into vitals, lab values, imaging results, diagnostic codes, and risk scores. Care decisions route through what can be charted. Coverage routes through what can be coded.

If it’s not in the chart, it didn’t happen.

This is how medicine scales. Clinicians rotate. Shifts end. Specialists consult briefly. No one holds the whole patient continuously, but the chart persists. Coherence is preserved even when comprehension is fragmented.

The failure mode is familiar. Symptoms that don’t map cleanly to codes are discounted. “Normal labs” override lived experience. Contesting care requires navigating billing systems, appeals, second opinions, and institutional escalation.

Disagreement doesn’t vanish—it becomes procedural. The question shifts from “what is happening to this person?” to “what does the chart justify?”

Scene 2: Education

Education runs on proxies because understanding doesn’t scale.

Curiosity, confusion, mastery, and insight are compressed into grades, test scores, GPAs, rankings, and completion rates. Advancement routes through what can be measured. Opportunity routes through what can be compared.

The score becomes the student.

This enables massive coordination. Teachers change year to year. Institutions vary. Evaluators never meet most students. Yet the system remains legible because the proxies are portable.

Over time, the proxies begin to shape behavior. Teaching optimizes for the test. Learning optimizes for the rubric. Understanding that doesn’t map cleanly to the measurement surface fades from view.

Contesting the outcome means contesting the proxy—appeals, exceptions, accommodations. The disagreement isn’t about learning. It’s about whether the number can be changed.

Scene 3: Social Media

On platforms, the Proxy Stack operates in real time.

Expression is compressed into engagement: likes, views, shares, follows, watch time. These metrics determine visibility, monetization, and reach. What spreads is what the interface can reward.

The metric becomes the audience.

This makes coordination possible at planetary scale. No one understands the full network. No one needs to. The numbers route attention automatically.

Authority emerges quietly. Content isn’t judged directly; it’s ranked, downweighted, flagged, or boosted. Contesting outcomes means appealing moderation systems, reputation scores, or opaque policy thresholds.

Most people don’t argue with the algorithm. They adapt to it. Over time, expression shifts toward what survives the proxy layer.

These aren’t edge cases. They’re the default pattern when scale outruns shared understanding. Once a proxy layer becomes the interface, it doesn’t just report reality—it trains everyone to optimize for what the interface can see.

AI accelerates that shift because it floods the system with plausible moves. When change becomes cheap, “what counts” becomes governance—and governance becomes power.

Proxy debt and epistemic power

A proxy is a claim: if this stays within bounds, reality is okay. That claim decays. As systems evolve, proxies drift away from what they were meant to represent. The gap between what the proxy certifies and what reality has become accumulates like debt.

Proxy debt is why systems can look stable right up until they snap—and why, after the snap, everyone says: “But everything looked green.”

Once proxies are the interface, maintaining them becomes governance. Who decides which evals count, what thresholds mean “ship,” what gets instrumented, what shows up on dashboards—decides what the system can even perceive. That’s power over truth.
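The decay is easy to see in a toy model. This sketch is purely illustrative (the drift rates, noise levels, and threshold are invented): a proxy is calibrated against reality once, never revalidated, and stays "green" for the entire run while the reality it was supposed to track walks away.

```python
import random

random.seed(0)

def run(steps=200, drift=0.01):
    """Toy model of proxy debt: the proxy matched reality at
    calibration time (step 0) and is never revalidated, so the gap
    between them (the 'debt') grows while every check stays green."""
    reality = 1.0
    proxy = 1.0          # snapshot of reality at calibration time
    threshold = 0.5      # the gate: ship only while proxy > threshold
    for step in range(steps):
        reality += random.gauss(-drift, 0.02)  # reality drifts downward
        proxy += random.gauss(0, 0.005)        # proxy barely moves
        if proxy <= threshold:                 # the gate finally trips
            return step, abs(proxy - reality)
    return steps, abs(proxy - reality)

steps_green, final_debt = run()
print(f"green for {steps_green} steps, final debt {final_debt:.2f}")
```

The point of the sketch is the asymmetry: nothing in the loop ever compares the proxy to reality, so "everything looked green" is exactly what the system reports right up until the debt is collected somewhere else.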

The real design goal: preserve contestability

The wrong goal is “keep full shared understanding forever.” Once change becomes cheap and continuous, that’s fantasy—AI or otherwise.

The winnable goal is: preserve contestability. When comprehension stops scaling, your last line of defense is whether the system can still be challenged without asking permission.

Mechanisms that travel well
  • Orthogonal verification paths: Pair every critical proxy with an independent grounding signal.
  • Validity windows: Make proxies expire. Revalidate, document blind spots, define invalidation triggers.
  • Proxy-conflict protocol: Pre-commit how conflicts are resolved so you don’t improvise politics.
  • Protected dissent: Rotating stop-the-line roles, real escalation paths, anti-retaliation policy.
  • Ground-truth drills: Rebuild KPIs from raw logs, audit evals against outcomes, trace behavior end-to-end.
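The second mechanism, validity windows, can be made concrete with a small sketch. Everything here is hypothetical (the `ProxyCheck` shape, the field names, the example check are all invented for illustration): the idea is simply that a proxy result carries its own expiry and documented blind spots, so "green" is a claim with a shelf life rather than a permanent fact.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone

@dataclass
class ProxyCheck:
    """A proxy result that expires: after its validity window closes,
    it stops gating anything until someone revalidates it."""
    name: str
    passed: bool
    validated_at: datetime
    validity_window: timedelta
    blind_spots: list = field(default_factory=list)  # documented, not hidden

    def status(self, now=None):
        now = now or datetime.now(timezone.utc)
        if now - self.validated_at > self.validity_window:
            return "expired"  # must be revalidated before it counts as green
        return "green" if self.passed else "red"

# Hypothetical example: an eval validated on Jan 1 with a 30-day window.
check = ProxyCheck(
    name="latency-eval",
    passed=True,
    validated_at=datetime(2026, 1, 1, tzinfo=timezone.utc),
    validity_window=timedelta(days=30),
    blind_spots=["cold-start paths", "non-US traffic"],
)

print(check.status(now=datetime(2026, 1, 10, tzinfo=timezone.utc)))  # green
print(check.status(now=datetime(2026, 3, 1, tzinfo=timezone.utc)))   # expired
```

The design choice worth noticing: expiry is a property of the check itself, not of a calendar reminder somewhere else. A proxy that can silently stay green forever is exactly the kind that accumulates debt.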

The thing I actually worry about

AI will make change cheaper than our old instincts can handle. That’s not just a software story. It’s a coordination story.

We’re going to respond the way big systems always respond: by moving truth into interfaces, and interfaces into gates.

The question isn’t whether we’ll use proxies. We will. The question is whether proxies remain tools—or become sovereigns.

Once challenging reality requires permission, the system may stay coherent. It may even be reliable. But it stops being corrigible.

And once corrigibility dies, authority doesn’t just arbitrate disputes. It arbitrates reality.