physics, quantum gravity, dimensionality, information theory

Why Dimensional Collapse Quantizes

Gravity, language, and the lattice at the bottleneck

January 3, 2026

You can't say a thought that has no words.

Your brain operates in high-dimensional state space—millions of neurons, continuous dynamics, superpositions of intention. But to communicate, all of that must squeeze through language: a discrete vocabulary, sequential words, finite bandwidth.

The vocabulary is a lattice (in the loose sense: a discrete codebook with adjacency and grammar constraints). It's pre-existing structure in the low-dimensional output space. Your thought doesn't get "translated"—it gets selected by what the lattice admits. The continuous becomes discrete not because thought is discrete, but because the bottleneck has structure.
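To make "selected by what the lattice admits" concrete, here is a minimal sketch (my illustration, not the paper's construction): a continuous state is squeezed through a random projection and then snapped to the nearest entry of a fixed codebook. The dimensions, the projection, and the codebook are all toy values.

```python
import numpy as np

# Toy sketch of "selection by the lattice": a continuous high-D state is
# squeezed through a random projection, then snapped to the nearest
# admissible code word. Illustration only; all values are made up.
rng = np.random.default_rng(0)

codebook = rng.normal(size=(50, 8))     # 50 admissible "words" in an 8-D output space
projection = rng.normal(size=(1000, 8)) / np.sqrt(1000)  # 1000-D "thought" -> 8-D bottleneck

thought = rng.normal(size=1000)         # continuous high-dimensional state
squeezed = thought @ projection         # dimensional collapse

# The lattice selects: the nearest admissible code word wins.
word = int(np.argmin(np.linalg.norm(codebook - squeezed, axis=1)))
print(f"continuous state -> discrete symbol #{word}")
```

Nothing about the thought itself is discrete; the discreteness enters only at the argmin, that is, at the interface.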

This is quantization. And it's the same thing that happens in gravity.

(This post is deliberately schematic; the paper pins down the definitions and shows how the correction channels drop out.)


Why Low-Dimensional Space Quantizes

Two things force discreteness when you project from high to low dimensions:

1. Collisions. In sufficiently low-dimensional projections, many distinct high-D states become operationally indistinguishable—forced overlap, aliasing. (By "collisions" I mean many-to-one projection: distinct high-D states map to the same low-D point.) You cannot represent continuous dynamics faithfully—you're forced into equivalence classes.

2. Pre-existing structure. The low-dimensional space isn't a blank canvas. It already has geometry, grammar, a lattice (or discretizing codebook) of admissible configurations. New information must be compatible with this structure. The existing lattice selects what can pass through.

Together: dimensional collapse forces collisions, and the lattice resolves collisions by selecting which equivalence class you land in. That's quantization. Discreteness is imposed by the interface, not by the high-dimensional system itself.
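Here is the same point as a toy numerical experiment (mine, with arbitrary parameters): many distinct high-dimensional states are projected down to two dimensions and binned onto a fixed grid standing in for the lattice. Most grid cells end up holding many states.

```python
import numpy as np

# Toy illustration of both ingredients: (1) projection to low dimension
# forces collisions, (2) a pre-existing lattice (here a fixed grid of
# spacing `resolution`) decides which equivalence class you land in.
rng = np.random.default_rng(1)

n_states, high_dim, low_dim = 10_000, 500, 2
resolution = 0.5                                    # lattice spacing of the interface

states = rng.normal(size=(n_states, high_dim))      # distinct high-D configurations
projection = rng.normal(size=(high_dim, low_dim)) / np.sqrt(high_dim)
shadows = states @ projection                       # dimensional collapse

cells = np.round(shadows / resolution).astype(int)  # lattice selection
occupied = {tuple(c) for c in cells}

print(f"{n_states} distinct states -> {len(occupied)} distinguishable cells")
# Many states share a cell: the interface cannot tell them apart.
```

Shrink the grid spacing or raise the output dimension and the collisions go away, which is the sense in which the discreteness is imposed by the interface rather than by the states themselves.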


Gravity as an Instance

Now apply this to spacetime.

In 1995, Ted Jacobson showed that Einstein's equation—the governing law of general relativity—could be derived from thermodynamics applied to local null surfaces. Heat flow across a tiny horizon, plus the entropy-area relationship from black hole physics, gives you gravity as an equation of state.

But this left a question: if gravity is thermodynamic, what should be quantized to get quantum gravity?

The standard answers—quantize the metric, or posit mysterious "horizon microstates"—have struggled for decades. Here's an alternative:

The null surface is a bottleneck.

Matter dynamics are high-dimensional. The null surface is a low-dimensional interface where those dynamics get certified as admissible. The existing metric structure is the lattice—it determines which deformations can pass through.

Quantization appears at this interface for the same reason it appears in language: the lattice has finite resolution, and compatibility with existing structure forces discreteness.

(This is not a competing quantization scheme; it's a claim about where the quantumness lives in Jacobson-style thermodynamic gravity.)


The Clausius Residual: What You Couldn't Say

Make this precise. Define the Clausius residual:

Δ_Σ ≡ δQ − TδS

This measures how much a local patch fails to satisfy the admissibility constraint. It's the gap between energy flux and certified entropy change.

In classical general relativity: ⟨Δ_Σ⟩ = 0. Perfect certification. Every thought finds its word.

In quantum gravity: Var(Δ_Σ) > 0. The residual fluctuates. The interface has finite resolution—it can't certify arbitrarily fine distinctions.

Think of Δ_Σ as what you couldn't say. It plays the role of the conditional entropy H(X|Y): everything in the high-dimensional input that didn't survive projection to the low-dimensional output. Classical GR is perfect translation. Quantum gravity is the acknowledgment that some information is lost at the bottleneck.
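A toy illustration of the two regimes, under an assumption that is mine rather than the paper's: model "certification" as rounding the true entropy flux to a finite resolution dS_min, and look at the statistics of the resulting residual.

```python
import numpy as np

# Toy model: the interface can only certify entropy flux to resolution
# dS_min. The Clausius residual is what slips through: Delta = dQ - T*dS_cert.
rng = np.random.default_rng(2)

T = 1.0
dS_true = rng.normal(size=100_000)     # "true" entropy flux across many patches
dQ = T * dS_true                       # classically, dQ = T dS exactly

def residual_stats(ds_min):
    dS_cert = np.round(dS_true / ds_min) * ds_min   # finite-resolution certification
    delta = dQ - T * dS_cert
    return delta.mean(), delta.var()

for ds_min in (1e-6, 0.1, 1.0):
    mean, var = residual_stats(ds_min)
    print(f"dS_min={ds_min:<6}  <Delta>={mean:+.2e}  Var(Delta)={var:.2e}")

# As dS_min -> 0 the residual vanishes (the classical limit); at finite
# resolution the mean stays ~0 while the variance does not.
```

In this toy the mean residual stays near zero while its variance grows with the resolution, which is the ⟨Δ_Σ⟩ = 0 versus Var(Δ_Σ) > 0 pattern above.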

(For the formal definition of the certification map and how Δ_Σ generates the correction channels, see the full paper.)


ℏ as Lattice Spacing

Planck's constant gets a clean interpretation: it sets the resolution of the lattice.

A natural candidate scale is ΔA_min ~ Gℏ—the minimum distinguishable area increment at a null surface. Combine with the Bekenstein-Hawking relation, and you get ΔS_min ~ 1—one bit of certification precision.

This is the lattice spacing. ℏ doesn't appear because spacetime "knows about" quantum mechanics. It appears because the admissibility interface has built-in discreteness, and Gℏ is the scale at which that discreteness matters.
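A quick back-of-envelope check of those scales (standard constants in SI units, k_B set to 1; nothing here is specific to the paper):

```python
# Planck area and the entropy gained per minimal area increment.
G    = 6.674e-11      # m^3 kg^-1 s^-2
hbar = 1.055e-34      # J s
c    = 2.998e8        # m / s

planck_area = G * hbar / c**3             # "G hbar" with factors of c restored
dS_min = planck_area / (4 * planck_area)  # Bekenstein-Hawking: S = A / (4 l_P^2)

print(f"Planck area ~ {planck_area:.2e} m^2")           # ~2.6e-70 m^2
print(f"Entropy per Planck-area increment ~ {dS_min}")  # order one
```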


Two Correction Channels

If the interface has finite resolution, two things happen:

1. Stochastic noise. The Raychaudhuri equation picks up a noise term—not from matter fluctuations, but from certification fluctuations. The interface can't decide perfectly; that uncertainty shows up as random focusing of geodesics. The amplitude scales parametrically as ℓ_P²/A—negligible at macroscopic scales, significant near Planck-scale horizons (a rough numerical comparison appears below).

2. Higher-curvature corrections. When certification is slow (non-equilibrium), the effective action picks up curvature-squared terms. This is dissipative response—the interface can't keep up with rapid metric changes, and the lag shows up as a correction to Einstein's equation.

These aren't posited; they follow from the bottleneck having finite capacity.
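To get a feel for the ℓ_P²/A scaling in the first channel, here is a rough order-of-magnitude comparison (illustrative round numbers, not results from the paper):

```python
import math

# Rough scaling of the stochastic channel: amplitude ~ l_P^2 / A.
l_P2 = 2.6e-70                          # Planck area, m^2

horizons = {
    "solar-mass black hole (r_s ~ 3 km)": 4 * math.pi * (3.0e3) ** 2,
    "microscopic black hole (r_s ~ 1e-20 m)": 4 * math.pi * (1.0e-20) ** 2,
    "Planck-scale patch": l_P2,
}

for name, area in horizons.items():
    print(f"{name:42s}  l_P^2/A ~ {l_P2 / area:.1e}")
# Utterly negligible for astrophysical horizons; order one only at the Planck scale.
```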


The General Principle

Quantization isn't mysterious. It's dimensional collapse through structured bottlenecks.

The brain projecting to language. A statistical manifold projecting to observables. Matter dynamics projecting through null surfaces. In each case:

  • High-dimensional dynamics must pass through a low-dimensional interface
  • The interface has pre-existing structure (a lattice)
  • The lattice forces discreteness by selecting admissible outputs
  • The "quantum" of discreteness is set by the lattice resolution

The big claim: standard approaches to quantum gravity treat quantization as ontological—the substrate itself is discrete at the Planck scale. This paper proposes instead that the substrate can be arbitrarily high-dimensional and continuous. Quantization describes the communication channel, not the substrate. Discreteness is a property of the interface, not of reality.

The metric stays classical—it's the high-dimensional embedding. What's quantum is the resolution limit of the channel through which we observe it.

This connects to a broader pattern: whenever you project high-dimensional systems through low-dimensional channels, information is lost (conditional entropy appears), continuous dynamics are forced discrete (collision/lattice selection), and the channel's structure determines what survives.


This post is the intuition layer. The full paper develops the mathematical framework—postulates, definitions, comparison with Jacobson/Eling-Jacobson/Einstein-Langevin stochastic gravity. If you want the careful version, go there.
