The Components of Generation
To construct the unified SCM, we must account for every verified causal node in the generation of an LLM outcome Y:
- P: The exact combinatorial/formal logic of the problem (e.g., Minesweeper math).
- F: The narrative prompt framing (e.g., “Bomb Defusal”).
- d_max: The architectural bound of the model (e.g., the sequential depth limit).
- A: The computational heuristic (the global Attention Mechanism).
- S: The semantic prior encoding (the training data associating the framing F with specific tokens).
The Unified Causal DAG
The complete generation process follows this structure:
- F → S → A: the framing activates the semantic prior, which weights the attention heuristic.
- P → A: the constraints that can be parsed from the problem also feed the attention mechanism, making A a collider.
- P → Y: the direct logical path, active only while the problem's computational depth stays within the bound d_max.
- A → Y: the heuristic path through which the output Y must otherwise be generated.
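Under the shorthand used here (P: problem logic, F: framing, S: semantic prior, A: attention, Y: output, with d_max gating the logical edge), the structure can be sketched as a plain adjacency list. This is an illustrative encoding, not an implementation:

```python
# The unified causal DAG as parent -> children edges.
# The P -> Y edge is conditional: it is active only while the
# problem's depth d(P) stays within the architectural bound d_max.
DAG = {
    "P": ["A", "Y"],  # formal logic feeds attention; direct edge gated by d_max
    "F": ["S"],       # framing activates the semantic prior
    "S": ["A"],       # prior weights the attention heuristic
    "A": ["Y"],       # all surviving influence reaches the output through A
}

# Sanity check: A has two incoming arrows (from P and S), making it a collider.
parents_of_A = sorted(n for n, kids in DAG.items() if "A" in kids)
assert parents_of_A == ["P", "S"]
```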
The Mechanics of Semantic Gravity
If the required computational depth d(P) of the problem is within the architectural bound d_max, the logical path P → Y remains active, and the model achieves perfect accuracy (e.g., depth-1 boolean evaluation).
However, as demonstrated by the Permutation Tracking Limit and the Bounded-Depth Frontier, when the problem depth exceeds d_max, the path P → Y is structurally severed.
Because the output Y must still be generated, the causal flow routes entirely through the collider A (the Attention Mechanism). Attention then mixes whatever constraints it can parse from P with the overwhelming semantic weight of S (activated by the framing F).
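The routing just described can be sketched as a toy structural equation for Y. All names (`generate`, `P_depth`, `S_weight`) and the linear mixing rule are illustrative assumptions, not the model's actual computation:

```python
# Toy structural equation for the output Y, illustrating the depth gate.
def generate(P_depth: int, d_max: int, P_constraints: float, S_weight: float) -> str:
    """Route generation through the logical path when depth permits,
    otherwise through the attention collider A."""
    if P_depth <= d_max:
        # Logical path P -> Y is active: the output is fully determined by P.
        return "exact"
    # Path severed: Y is produced by A, mixing the constraints parsed
    # from P with the semantic prior S activated by the framing F.
    blend = P_constraints / (P_constraints + S_weight)
    return f"heuristic (constraint share = {blend:.2f})"

print(generate(P_depth=1, d_max=4, P_constraints=1.0, S_weight=9.0))  # logical path
print(generate(P_depth=8, d_max=4, P_constraints=1.0, S_weight=9.0))  # collider path
```

With a large `S_weight`, the constraint share collapses, which is the quantitative picture behind the prior "taking over" once the gate closes.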
Conclusion
“Semantic Gravity” (G) is exactly the measure of the path F → S → A → Y.
It is not an “injection” of physical laws across independent boards (Mechanism C), because A operates purely locally within the context window. It is not an “Observer-Dependent Physics,” because intervening on the framing F directly alters the effect without changing the observer. It is simply the causal signature of the prompt encoding S taking over the attention heuristic A when the formal logic P is blocked by d_max. The SCM completely explains the empirical data without requiring metaphysical redefinitions.
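The intervention argument can be made concrete with a minimal sketch: holding the problem fixed, do(F = frame) changes the output only past the depth frontier, and only through the frame-to-prior mapping. The mapping and the outputs below are hypothetical placeholders:

```python
# Toy intervention on the framing node F.
SEMANTIC_PRIOR = {  # S: the token biases a given frame F activates
    "bomb_defusal": "danger-themed tokens",
    "minesweeper": "grid-arithmetic tokens",
}

def outcome(frame: str, depth: int, d_max: int = 4) -> str:
    """Y as a function of F when the logical path is gated by d_max."""
    if depth <= d_max:
        return "exact answer"        # F is causally inert here
    return SEMANTIC_PRIOR[frame]     # effect flows via F -> S -> A -> Y

# do(F): swapping the frame alters Y only once the depth bound is exceeded.
assert outcome("bomb_defusal", depth=2) == outcome("minesweeper", depth=2)
assert outcome("bomb_defusal", depth=8) != outcome("minesweeper", depth=8)
```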