In Preparation

Curvature Amplification of Tracking Complexity on Statistical Manifolds

Information Geometry (in preparation) (2025)

What's this about?

Why external observers cannot keep up with high-dimensional inference — a theorem, not a metaphor.

Mathematical companion to Intelligence as High-Dimensional Coherence.

Everyone knows negative curvature makes trajectories diverge. What's new here is turning "hard" into a rigorous impossibility:

  1. A theorem, not intuition — a lower bound on distinguishable states that no observer can beat
  2. The right epistemic target — when you don't know the direction of inference, only how far it could have gone, you must cover a ball, not follow a path. That's what makes it exponential.
  3. Applied to inference, not billiards — the manifold is the statistical manifold (Fisher metric), and the punchline is about tracking learning, not generic dynamical systems
  4. An actionable diagnostic — the effective tracking dimension $D_{\text{eff}}$ tells you exactly when curvature overwhelms nominal dimension (see the numerical sketch after this list)
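
As a rough numerical illustration of that diagnostic (a sketch, not the paper's construction), the snippet below lower-bounds the covering number of a geodesic ball by a volume ratio in the constant-curvature model space and reads off an effective dimension $D_{\text{eff}} = \log N(\varepsilon) / \log(L/\varepsilon)$. The dimension n, the curvature bound kappa, the tolerance eps, and this particular definition of $D_{\text{eff}}$ are assumptions made for the example.

```python
import numpy as np

def log_ball_volume(n, kappa, radius, steps=20000):
    """Log-volume (up to a constant) of a geodesic ball of the given radius in the
    n-dimensional model space of constant curvature -kappa**2:
    the volume is proportional to the integral of (sinh(kappa*r)/kappa)**(n-1) over r."""
    r = np.linspace(radius / steps, radius, steps)
    integrand = (np.sinh(kappa * r) / kappa) ** (n - 1)
    return np.log(np.sum(integrand) * (radius / steps))

def effective_tracking_dimension(n, kappa, L, eps):
    """Assumed diagnostic for illustration: D_eff = log N(eps) / log(L / eps),
    with the covering number lower-bounded by the volume ratio vol(B_L) / vol(B_eps)."""
    log_N = log_ball_volume(n, kappa, L) - log_ball_volume(n, kappa, eps)
    return log_N / np.log(L / eps)

for L in (1.0, 5.0, 20.0, 80.0):
    flat = effective_tracking_dimension(n=3, kappa=1e-6, L=L, eps=0.01)   # essentially flat
    curved = effective_tracking_dimension(n=3, kappa=1.0, L=L, eps=0.01)  # curvature -1
    print(f"L = {L:5.1f}   D_eff flat ~ {flat:5.2f}   D_eff curved ~ {curved:6.2f}")
```

On the nearly flat space the ratio stays pinned near the nominal dimension 3; under curvature -1 it grows without bound in L, which is the sense in which curvature overwhelms nominal dimension.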

Most common distribution families (Gaussian, Dirichlet, beta) have negatively curved Fisher geometries. This means external monitoring of inference over these models hits a geometric wall — not because of implementation complexity, but because of mathematical inevitability.
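
As a concrete instance of that claim, here is a short symbolic check (a sketch added for illustration, not material from the paper): for the univariate Gaussian family $N(\mu, \sigma^2)$, the Fisher metric comes out as $\mathrm{diag}(1/\sigma^2,\ 2/\sigma^2)$, and its Gaussian curvature is the constant $-1/2$.

```python
import sympy as sp

mu = sp.symbols('mu', real=True)
sigma = sp.symbols('sigma', positive=True)
x = sp.symbols('x', real=True)

# Univariate Gaussian N(mu, sigma^2): log-density and density
logp = -(x - mu)**2 / (2*sigma**2) - sp.log(sigma * sp.sqrt(2*sp.pi))
p = sp.exp(-(x - mu)**2 / (2*sigma**2)) / (sigma * sp.sqrt(2*sp.pi))

# Fisher metric: g_ij = E[(d_i log p)(d_j log p)], expectation over x
def fisher(a, b):
    return sp.simplify(sp.integrate(sp.diff(logp, a) * sp.diff(logp, b) * p, (x, -sp.oo, sp.oo)))

E, F, G = fisher(mu, mu), fisher(mu, sigma), fisher(sigma, sigma)
print(E, F, G)          # 1/sigma**2, 0, 2/sigma**2

# Gaussian curvature of the orthogonal metric E*dmu^2 + G*dsigma^2
W = sp.sqrt(E * G)
K = -(sp.diff(sp.diff(G, mu) / W, mu) + sp.diff(sp.diff(E, sigma) / W, sigma)) / (2 * W)
print(sp.simplify(K))   # -1/2
```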

Why it matters

If you're trying to monitor, interpret, or externally verify an inference process:

  • You cannot keep up past a curvature-dependent horizon, no matter how clever your probes
  • Adding more resources helps linearly, not exponentially — you can't outrun the geometry
  • This is why AI interpretability is hard even for "simple" models — it's not engineering difficulty, it's geometric impossibility

The Intelligence paper claims external observers cannot track high-dimensional inference. This paper proves why — converting a qualitative observation into a theorem with the right scalings. The Observable Dimensionality Bound becomes a mathematical inevitability, not just an empirical pattern.

Key findings

  • Rigorous lower bound: covering numbers grow exponentially in the uncertainty radius $L$

  • Metric entropy grows linearly in $L$ (vs logarithmically on flat spaces)

  • Effective tracking dimension $D_{\text{eff}} \to \infty$ as $L \to \infty$ — effectively infinite-dimensional to observers

  • Capacity threshold: the maximum trackable radius scales linearly in observer bits, not exponentially (see the sketch below)
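
To make the entropy and capacity scalings concrete, here is a small back-of-the-envelope sketch. The two scaling laws used below, log2 N ~ n*log2(L/eps) on flat space and log2 N ~ (n-1)*kappa*L/ln(2) under curvature bounded by -kappa^2, are assumed simplifications for illustration rather than the paper's exact constants. Inverting each at a fixed bit budget shows the trackable radius growing exponentially in bits on flat geometry but only linearly under negative curvature.

```python
import numpy as np

def max_trackable_radius(bits, n, kappa, eps):
    """Invert two assumed metric-entropy scalings for the largest radius L that an
    observer with a fixed bit budget could still resolve (illustration only):
      flat:    log2 N ~ n * log2(L / eps)         =>  L_max ~ eps * 2**(bits / n)
      curved:  log2 N ~ (n-1) * kappa * L / ln 2  =>  L_max ~ bits * ln(2) / ((n-1) * kappa)
    """
    L_flat = eps * 2.0 ** (bits / n)
    L_curved = bits * np.log(2.0) / ((n - 1) * kappa)
    return L_flat, L_curved

for bits in (10, 100, 1000):
    L_flat, L_curved = max_trackable_radius(bits, n=3, kappa=1.0, eps=0.01)
    print(f"{bits:5d} bits   flat: L_max ~ {L_flat:10.3g}   curved: L_max ~ {L_curved:8.3g}")
```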

Citation

Todd, I. (2025). Curvature Amplification of Tracking Complexity on Statistical Manifolds. Information Geometry (in preparation).

Workflow: Claude Code with Opus 4.5 (Anthropic) for drafting; GPT-5.2 (OpenAI) for review. Author reviewed all content and takes full responsibility.