
Physics and Metaphysics

Christopher Michael Langan

© 1998-2002

Today : Metaphysics :: Tomorrow : Physics

Today’s dominant theory of small-scale physics, quantum mechanics, did not begin its long and successful run as a physical theory. The reason is logical; its major premise, the Heisenberg Uncertainty Principle, sets absolute limits on the accuracy to which quanta can be measured, and cursory logical analysis reveals that this defines a relation between measurer and measured object that cannot be expressed in a language describing measured objects alone. Since classical physics was the latter kind of language, neither the uncertainty principle nor quantum mechanics could be immediately classified as “physics”. Rather, they belonged to a logical metalanguage of physics called metaphysics. Indeed, even at a time when physics routinely explains what the ancients would have seen as “magic”, some physicists view quantum mechanics with a touch of uneasy skepticism. The reason: it raises too many metaphysical-looking issues without satisfactorily resolving them.

Relativity too was initially a metaphysical theory based on the formation of higher-order predicates, spacetime and spacetime curvature, that had not existed in physics, drawing in the interests of self-containment a higher-order causal relationship between the fundamental physical parameters space, time and matter on a combination of empirical and mathematical grounds (a higher-order relation is a relation of relations…of relations of primitive objects, defined at the appropriate level of predicate logic). Since this describes a semantic operation that cannot be effected within the bare language of physics


as it existed at the time, relativity was metaphysical rather than physical in nature. Nevertheless, it achieved quick recognition as a physical theory…not only because Einstein was already recognized as a professional physicist, but because it made physical predictions that classical physics alone did not make.

It was recognized long before Einstein that observations of physical objects vary subjectively in certain parameters. For example, although objects are identical when viewed under identical conditions, they vary in size when viewed from different distances, display different shapes from different angles, and seem to be differently colored and shaded under different lighting conditions. Einstein expanded the range of subjective variation of physical phenomena by showing that objects also look different when viewed at different relative velocities. But in keeping with classical objectivism, he showed in the process that such perceptual variations were a matter of objective circumstance, in effect treating perception itself as an objective phenomenon. Because this kind of empirical objectivization is exactly what is expected of the objective empirical science of physics, attention was diverted from nagging metaphysical questions involving the interplay of rational and empirical factors in perception.

Although he never got around to enunciating it, Einstein may well have sensed that perception cannot be understood without understanding the logic of this interplay, and that this logic is instrumental to the inherently metaphysical operation of theoretical unification. Nevertheless, perhaps encouraged by his own apparent success in sidestepping this particular metaphysical issue, he spent the second half of his career on a doomed attempt to unify physics in a purely physical context - that is, in the context of a spacetime model which went only as far as Relativity Theory. Since then, many bright and well-educated people have repeated roughly the same error, never realizing that physical theory truly advances only by absorbing profoundly creative metaphysical extensions on which no ordinary physicist would wittingly sign off.

Like quantum mechanics and the Theory of Relativity, the CTMU is a metaphysical theory that makes distinctive predictions and retrodictions not made by previous theories. However, the CTMU makes no attempt in the process to sidestep difficult metaphysical issues. For example, Einstein introduced the cosmological constant to stabilize the size of the universe, but then dropped it on the empirical grounds of apparent universal expansion. In contrast, the CTMU goes beyond empiricism to the rational machinery of perception itself, providing cosmic expansion with a logical basis while predicting and explaining some of its features. Relativity pursues the goal of explanatory self-containment up to a point; spacetime contains matter and


energy that cause spacetime fluctuations that cause changes in matter and energy. The CTMU, on the other hand, pursues the goal of self-containment all the way up to cosmogenesis. And while neither GR nor QM does anything to resolve the fatal paradoxes of ex nihilo creation and quantum nonlocality, the CTMU dissolves such paradoxes with a degree of logical rectitude to which science seldom aspires.

To understand how the CTMU is a natural extension of modern physics, let us review the history of the physicist’s and cosmologist’s art in the context of Cartesian coordinate systems.

The Cartesian Architecture of Split-Level Reality

Modern physics is based on, and can be described as the evolution of, rectilinear coordinate systems. Called Cartesian coordinate systems after their leading exponent, René Descartes, they are particularly well-suited to the objective modeling of physical phenomena, that is, to the algebraic representation of static and dynamic relationships without respect to their locations or the special vantages from which they are observed or considered. This property of Cartesian spaces relates directly to another invention of Monsieur Descartes called mind-body dualism, which merely states in plain language what Cartesian coordinate systems seem to graphically depict: namely, that cognition and physical reality can be factored apart at our philosophical and scientific convenience.

Since the time of Descartes, there have been numerous attempts to clarify the exact relationship between mind and reality and thereby solve the "mind-body problem". Hume, for example, held that reality consists exclusively of sense impressions from which concepts like “mind” and “matter” are artificially abstracted. Kant countered with the view that the mind knows deep reality only through cryptic impressions molded to its own cognitive categories. More recently, Lewes’ dual-aspect monism maintained that reality consists of a neutral substance of which mind and body are just complementary aspects, while the mind-stuff theory of Clifford and Prince was a form of psychical monism positing that mind is the ultimate reality, that ordinary material reality is simply mind apprehended by mind, and that the higher functions of mind emerge from smaller pieces of mind that do not of themselves possess higher


mental attributes (an idea previously attributable to Leibniz, Spencer and others).

But while each of these theories contains a part of the truth, none contains the whole truth. Only recently have the parts been assembled in light of modern logic and science to create a dynamic, comprehensive theory of reality that solves the mind-body problem for good.

Classical Mechanics:

The Infinitesimal Worm in Newton’s Apple

The next big representational advance in physics, Newtonian mechanics, relied on a new kind of analytic geometry based on vector modeling and vector analysis of physical motion in Cartesian coordinate systems in which space and time are represented as independent (orthogonal) axes. Its Lagrangian and Hamiltonian formulations occupy the same mathematical setting. From the beginning, the featureless neutrality of Cartesian space accorded well with the evident nonpreference of physical laws for special coordinate systems…i.e., for the homogeneity and isotropy of space and time. However, this property of Cartesian spaces also rendered position and motion completely relative in the Galilean sense. Finding that this made it hard to explain certain physical phenomena - e.g., inertia, and later on, electromagnetic wave propagation - Newton and his followers embraced the concept of a stationary aether against whose background all dimensions, durations and velocities were considered to be fixed. The aether was an unusual kind of “field” over which the center of mass of the universe was perfectly distributed, explaining not only inertia but the evident fact that all of the matter in the universe was not gravitating toward a central point in space.

In order to work with his vector model of physical reality, Newton had to invent a new kind of mathematical analysis known as infinitesimal calculus. Specifically, he was forced in calculating the velocity of an object at each point of its trajectory to define its “instantaneous rate of change” as a ratio of "infinitesimal" vectors whose lengths were “smaller than any finite quantity, but greater than zero”.


Since this is a paradoxical description that does not meaningfully translate into an actual value – the only number usually considered to be smaller than any finite quantity is 0 - it is hard to regard infinitesimal vectors as meaningful objects. But even though they could not give a satisfactory account of infinitesimal vectors, post-Newtonian physicists at least knew where such vectors were located: in the same Cartesian space containing ordinary finite vectors. Better yet, the celebrated mathematician Gauss discovered that they could be confined within the curves to which they were tangential, an insight developed by Riemann into a comprehensive theory of differential geometry valid for curved surfaces in any number of dimensions. However, although differential geometry would later prove useful in formulating a generalization of Cartesian space, its “intrinsic” nature did not resolve the paradox of infinitesimals.

For many years after Newton, mathematicians struggled to find a way around the infinitesimals paradox, first lighting on the Weierstrass epsilon-delta formalism purporting to characterize infinitesimals within standard Cartesian space. But later – in fact, years after Newtonian mechanics was superseded by a more general theory of physics – they finally satisfied their yearning to understand infinitesimals as timeless mathematical objects. They accomplished this by reposing them in a hypothetical nonstandard universe where each point of the standard universe is surrounded by a nonstandard neighborhood containing infinitesimal objects (in logic, a theory is justified by demonstrating the existence of a model for which it is true; as a model of infinitesimals and their relationships with finite numbers, the nonstandard universe justifies the infinitesimal calculus). Ignoring the obvious fact that this could have a metaphysical bearing on physical reality, mathematicians took the nonstandard universe and carried it off in the purely abstract direction of nonstandard analysis.
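The two resolutions just mentioned can be stated compactly in standard notation. The following sketch (not part of the original text) contrasts the Weierstrass approach, which eliminates infinitesimal objects, with the nonstandard approach, which admits them and recovers finite values via the standard-part function st:

```latex
% Weierstrass: the derivative as a limit; no infinitesimal objects are needed.
% "f'(x) = L" means: for every eps > 0 there is a delta > 0 such that
% 0 < |h| < delta implies the difference quotient lies within eps of L.
f'(x) \;=\; \lim_{h \to 0} \frac{f(x+h) - f(x)}{h}

% Nonstandard analysis: the derivative as the standard part of a ratio of
% genuine infinitesimals, evaluated in the nonstandard universe described above.
f'(x) \;=\; \operatorname{st}\!\left( \frac{f(x + \varepsilon) - f(x)}{\varepsilon} \right),
\qquad \varepsilon \neq 0 \ \text{infinitesimal}
```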

Quantum Mechanics:

Space and Time Get Smeared (and Worse)

After standing for over two centuries as the last word in physics, the differential equations comprising the deterministic laws of Newtonian mechanics began to run into problems. One of these problems was called the Heisenberg Uncertainty Principle or HUP. The HUP has the effect of “blurring” space and time on very small scales by making it impossible to


simultaneously measure with accuracy certain pairs of attributes of a particle of matter or packet of energy. Because of this blurring, Newton’s differential equations are insufficient to describe small-scale interactions of matter and energy. Therefore, in order to adapt the equations of classical mechanics to the nondeterministic, dualistic (wave-versus-particle) nature of matter and energy, the more or less ad hoc theory of quantum mechanics (QM) was hastily developed.
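For the best-known such pair of attributes, position and momentum, the HUP takes the following standard form (not written out in the original text):

```latex
% Position-momentum uncertainty relation: the product of the uncertainties in
% position x and momentum p can never fall below half the reduced Planck
% constant hbar, no matter how the measurement is performed.
\Delta x \, \Delta p \;\geq\; \frac{\hbar}{2}
```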

QM identifies matter quanta with “probability waves” existing in infinite-dimensional complex Hilbert space, a Cartesian space defined over the field of complex numbers a+bi (where a and b are real numbers and i = √−1) instead of the pure real numbers, and replaces Hamilton’s classical equations of motion with Schrödinger’s wave equation. QM spelled the beginning of the end for Laplacian determinism, a philosophical outgrowth of Newtonianism which held that any temporal state of the universe could be fully predicted from a complete Cartesian description of any other. Not only uncertainty but freedom had reentered the physical arena.
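Schrödinger’s wave equation, the replacement for Hamilton’s equations mentioned above, can be stated in its standard form:

```latex
% Time evolution of the quantum state |psi(t)> in Hilbert space: the
% Hamiltonian operator H (the quantum analogue of Hamilton's classical
% energy function) generates the change of the state in time.
i \hbar \, \frac{\partial}{\partial t} \, |\psi(t)\rangle \;=\; \hat{H} \, |\psi(t)\rangle
```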

Unfortunately, the HUP was not the only quantum-mechanical problem for classical physics and its offshoots. Even worse was a phenomenon called EPR (Einstein-Podolsky-Rosen) nonlocality, according to which the conservation of certain physical quantities for pairs of correlated particles seems to require that information be instantaneously transmitted between them regardless of their distance from each other. The EPR paradox juxtaposes nonlocality to the conventional dynamical scenario in which anything transmitted between locations must move through a finite sequence of intervening positions in space and time. So basic is this scenario to the classical worldview that EPR nonlocality seems to hang over it like a Damoclean sword, poised to sunder it like a melon. Not only does no standard physical theory incorporating common notions of realism, induction and locality contain a resolution of this paradox - this much we know from a mathematical result called Bell's theorem - but it seems that the very foundations of physical science must give way before a resolution can even be attempted!


The Special Theory of Relativity:

Space and Time Beget Spacetime

Another problem for Newton’s worldview was the invariance of c, the speed of light in vacuo. Emerging from Maxwell’s equations and direct experimentation (e.g., Michelson-Morley), c-invariance defies representation in an ordinary Cartesian vector space. Einstein’s Special Theory of Relativity (SR), arising at about the same time as quantum mechanics, was designed to fill the breach. To accomplish this, Einstein had to generalize Cartesian space in such a way that distances and durations vary with respect to distinct coordinate systems associated with various states of relative motion. The particular generalization of Cartesian space in which these motion-dependent coordinate systems are related, and in which the velocity of light can be invariantly depicted, is called Minkowski spacetime. In spacetime, space and time axes remain perpendicular but no longer represent independent dimensions. Instead, spacelike and timelike domains are separated by null or lightlike geodesics (minimal trajectories) representing the "paths" traced by light, and the constant velocity of light is represented by the constant (usually 45°) orientation of the corresponding spacetime vectors. The space and time axes of moving coordinate systems are skewed by the Lorentz transformation functions according to their relative angles of propagation (velocities) through timelike domains, resulting in relative distortions of space and time vectors between systems.
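The Lorentz transformation functions referred to above have a standard explicit form for relative motion at velocity v along a shared x-axis:

```latex
% Coordinates (x', t') of an event in a frame moving at velocity v relative to
% the (x, t) frame; gamma is the Lorentz factor, which produces the relative
% distortions of space and time vectors described in the text.
x' = \gamma\,(x - v t), \qquad
t' = \gamma\!\left(t - \frac{v x}{c^2}\right), \qquad
\gamma = \frac{1}{\sqrt{1 - v^2/c^2}}
```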

General Relativity:

Spacetime Develops Curvature

Flat or Euclidean spacetime suffices for the representation of all kinds of physical motion up to constant linear acceleration. However, gravity – which Newton had regarded as a force and represented as an ordinary Cartesian vector – causes other kinds of motion as well, e.g. orbital motion. So in order to generalize Minkowski spacetime to explain gravity, Einstein had to undertake a further generalization of Cartesian space accommodating non-flat or curved spacetime. In this generalization of Cartesian space, spacetime curvature is represented by algebraically well-behaved generalizations of vectors


called tensors, which are just mathematical functions that take ordinary spacetime vectors as input and yield other vectors (or numbers) as output. Calculating these entities can be as exacting and tedious as counting sand grains, but they are mathematically straightforward. By modeling physical reality as a curved tensor manifold, Einstein was able to explain how gravity affects motion and thus to create the gravitational extension of Special Relativity known as General Relativity or GR. While the gravitational calculations of GR match those of classical mechanics under most conditions, experiment favors GR decisively in certain situations.
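The tensor relation at the heart of GR can be written in one line. The Einstein field equations (not reproduced in the original text) relate the curvature tensors described above to the distribution of matter and energy:

```latex
% Einstein field equations: the Einstein tensor G_{mu nu} (built from the
% curvature of the spacetime manifold) is proportional to the stress-energy
% tensor T_{mu nu} (the density and flux of matter-energy).
G_{\mu\nu} \;=\; \frac{8 \pi G}{c^4} \, T_{\mu\nu}
```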

Although GR is based on differential geometry intrinsic to curved surfaces – geometry in which one need never leave a surface in order to determine its properties – distances and curvatures are ultimately expressed in terms of minute spatiotemporal vectors or "line elements" which must be made infinitesimal to ensure that they never leave the "surface" of spacetime. Thus, GR avoids the mensural necessity of an external hyperspace only by inheriting Newton’s infinitesimals paradox. Since GR, like classical mechanics, treats these line elements as objects to be used in forming ratios and tensors, it requires an "object-oriented" (as opposed to a Weierstrass-style procedural) definition of infinitesimals. But such a definition requires a nonstandard universe on model-theoretic grounds. So GR depends on a nonstandard universe as much as its predecessors, and is not as self-contained as the concept of intrinsic curvature might lead one to expect.

Strictly speaking, Newtonian mechanics and all subsequent theories of physics require a nonstandard universe, i.e. a model that supports the existence of infinitesimals, for their formulation. The effect of this requirement is to blur the distinction between physics, which purports to limit itself to the standard universe of measurable distances, and metaphysics, which can describe the standard universe as embedded in a higher-dimensional space or a nonstandard universe containing infinitesimals. The fast acceptance of GR as a "physical" theory thus owed at least in part to the fact that physicists had already learned to swallow the infinitesimals paradox every time they used the infinitesimal calculus to do classical mechanics! Only much later, with the advent of an infocognitive spacetime internally accommodating necessary metaphysical self-extensions, could the infinitesimal line elements of GR be properly embedded in a neoCartesian model of reality as abstract ingredients of certain distributed computations (see Appendix A).

The moral of the story up to this point is abundantly clear: both before and after Newton, the greatest advances in physics have come through the creation and absorption of metaphysical extensions. Unfortunately, most physicists are


sufficiently unclear on this fact that the word “metaphysics” remains all but unmentionable in their deliberations.

But change finds a way.

The Search for a Unified Field:

Spacetime Gains New Dimensions

Having found so satisfying a mathematical setting for gravitation, Einstein next tried to create a Unified Field Theory (UFT) by using the same model to explain all of the fundamental forces of nature, two of which, electricity and magnetism, had already been unified by Maxwell as electromagnetism or EM for short. As a natural first step, Einstein tried to formulate the EM force as a tensor. Unfortunately, EM force is governed by quantum mechanics, and 4-dimensional GR tensors lack intrinsic quantum properties. This alone limits General Relativity to a calculational rather than explanatory bearing on electromagnetism. Because the branch of quantum mechanics called quantum electrodynamics (QED), which treats the EM force as a particle interchange, better explained the properties and dynamics of electrons and electromagnetic fields, Einstein’s geometrodynamic approach to the UFT was widely abandoned.

After a time, however, the trek was resumed. Kaluza-Klein theory had already added an extra dimension to spacetime and curled it up into a tight little loop of radius equal to the Planck length (10⁻³³ cm, far smaller than any known particle). This curling maneuver explained more than the extra dimension’s invisibility; because only a discrete number of waves can fit around a loop, it also seemed to explain why particle energies are quantized, and thus to provide a connection between relativity theory and QM (in 1921, Kaluza observed that light could be viewed as the product of fifth-dimensional vibrations). Though temporarily forgotten, this idea was ultimately revived in connection with supersymmetry, an attempt to unify the fundamental forces of nature in a single theory by defining GR-style tensors accommodating 7 additional


spacetime dimensions. Shortened and rolled up in bizarre topological configurations, these dimensions would exhibit fundamental frequencies and quantized harmonics resembling the quantum properties of tiny particles of matter.

Although supersymmetry was eventually dropped because its 11-dimensional structure failed to explain subatomic chirality (whereby nature distinguishes between right- and left-handedness), its basic premises lived on in the form of 10-dimensional superstring theory.

Again, the basic idea was to add additional dimensions to GR, slice and splice these extra dimensions in such a way that they manifest the basic features of quantum mechanics, and develop the implications in the context of a series of Big Bang phase transitions (“broken symmetries”) in which matter changes form as the hot early universe cools down (mathematically, these phase transitions are represented by the arrows in the series G → H → … → SU(3) × SU(2) × U(1) → SU(3) × U(1), where alphanumerics represent algebraic symmetry groups describing the behavioral regularities of different kinds of matter under the influence of different forces, and gravity is mysteriously missing).

Unfortunately, just as General Relativity did nothing to explain the origin of 4-D spacetime or its evident propensity to “expand” when there would seem to be nothing for it to expand into, string theory did nothing to explain the origin or meaning of the n-dimensional strings into which spacetime had evolved. Nor did it even uniquely or manageably characterize higher-dimensional spacetime structure; it required the same kind of nonstandard universe that was missing from GR in order to properly formulate quantum-scale dimensional curling, and eventually broke down into five (5) incompatible versions all relying on difficult and ill-connected kinds of mathematics that made even the simplest calculations, and extracting even the most basic physical predictions, virtually impossible. Worse yet, it was an unstratified low-order theory too weak to accommodate an explanation for quantum nonlocality or measurable cosmic expansion.

Recently, string theory has been absorbed by a jury-rigged patchwork called “membrane theory” or M-theory whose basic entity is a p-dimensional object called, one might almost suspect eponymically, a “p-brane” (no, this is not a joke). P-branes display mathematical properties called S- and T-duality which combine in a yet-higher-level duality called the Duality of Dualities (again, this is not a joke) that suggests a reciprocity between particle size and energy that could eventually link the largest and smallest scales of the universe, and thus realize the dream of uniting large-scale physics (GR) with small-scale physics (QM). In some respects, this is a promising insight; it applies broad


logical properties of theories (e.g., duality) to what the theories “objectively” describe, thus linking reality in a deeper way to the mental process of theorization. At the same time, the “membranes” or “bubbles” that replace strings in this theory more readily lend themselves to certain constructive interpretations.

But in other ways, M-theory is just the same old lemon with a new coat of paint. Whether the basic objects of such theories are called strings, p-branes or bubble-branes, they lack sufficient structure and context to explain their own origins or cosmological implications, and are utterly helpless to resolve physical and cosmological paradoxes like quantum nonlocality and ex nihilo (something-from-nothing) cosmogony… paradoxes next to which the paradoxes of broken symmetry “resolved” by such theories resemble the unsightly warts on the nose of a charging rhinoceros. In short, such entities sometimes tend to look to those unschooled in their virtues like mathematical physics run wildly and expensively amok.

Alas, the truth is somewhat worse. Although physics has reached the point at which it can no longer credibly deny the importance of metaphysical criteria, it resists further metaphysical extension. Instead of acknowledging and dealing straightforwardly with its metaphysical dimension, it mislabels metaphysical issues as “scientific” issues and festoons them with increasingly arcane kinds of mathematics that hide its confusion regarding the underlying logic. Like a social climber determined to conceal her bohemian roots, it pretends that it is still dealing directly with observable reality instead of brachiating up vertiginous rationalistic tightropes towards abstractions that, while sometimes indirectly confirmed, are no more directly observable than fairies and unicorns. And as science willfully distracts itself from the urgent metaphysical questions it most needs to answer, philosophy ignores its parental responsibility.

Reality as a Cellular Automaton:

Spacetime Trades Curves for Computation

At the dawn of the computer era, the scientific mainstream sprouted a timely alternative viewpoint in the form of the Cellular Automaton Model of the Universe, which we hereby abbreviate as the CAMU. First suggested by mathematician John von Neumann and later resurrected by salesman and


computer scientist Ed Fredkin, the CAMU represents a conceptual regression of spacetime in which space and time are re-separated and described in the context of a cellular automaton. Concisely, space is represented by (e.g.) a rectilinear array of computational cells, and time by a perfectly distributed state transformation rule uniformly governing cellular behavior. Because automata and computational procedures are inherently quantized, this leads to a natural quantization of space and time. Yet another apparent benefit of the CAMU is that if it can be made equivalent to a universal computer, then by definition it can realistically simulate anything that a consistent and continually evolving physical theory might call for, at least on the scale of its own universality.
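The CAMU scheme just described can be sketched in a few lines of code: “space” is a one-dimensional array of cells, and “time” is a single transition rule applied uniformly to every cell at each step. The sketch below (an illustration, not part of the original text) uses Wolfram’s standard rule-numbering for elementary cellular automata; Rule 110 in particular is known to be computation-universal, echoing the point that a CAMU equivalent to a universal computer could simulate whatever a computable physical theory calls for.

```python
def step(cells, rule=110):
    """Apply one synchronous update of an elementary cellular automaton.

    Each cell's next state is read off from the rule number: the 3-bit
    neighborhood (left, self, right) indexes a bit of `rule`. Periodic
    boundary conditions close the "space" into a loop.
    """
    n = len(cells)
    return [
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

def run(width=31, steps=15, rule=110):
    """Evolve a single live cell ("particle") and return the full history."""
    cells = [0] * width
    cells[width // 2] = 1
    history = [cells]
    for _ in range(steps):
        cells = step(cells, rule)
        history.append(cells)
    return history

if __name__ == "__main__":
    # Each printed row is one "instant of time"; columns are points of "space".
    for row in run():
        print("".join("#" if c else "." for c in row))
```

Note how the sketch makes the CAMU’s quantization visible: both the cell array and the update steps are discrete by construction, which is exactly the “natural quantization of space and time” claimed above.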

But the CAMU, which many complexity theorists and their sympathizers in the physics community have taken quite seriously, places problematic constraints on universality. E.g., it is not universal on all computational scales, does not allow for subjective cognition except as an emergent property of its (assumedly objective) dynamic, and turns out to be an unmitigated failure when it comes to accounting for relativistic phenomena. Moreover, it cannot account for the origin of its own cellular array and is therefore severely handicapped from the standpoint of cosmology, which seeks to explain not only the composition but the origin of the universe. Although the CAMU array can internally accommodate the simulations of many physical observables, thus allowing the CAMU’s proponents to intriguingly describe the universe as a “self-simulation”, its inability to simulate the array itself precludes the adequate representation of higher-order physical predicates with a self-referential dimension.

Reality as Reality Theory:

Spacetime Turns Introspective

Now let us backtrack to the first part of this history, the part in which René Descartes physically objectivized Cartesian spaces in keeping with his thesis of mind-body dualism. Notice that all of the above models sustain the mind-body distinction to the extent that cognition is regarded as an incidental side effect or irrelevant epiphenomenon of objective laws; cognition is secondary even where space and time are considered non-independent. Yet not only is any theory meaningless in the absence of cognition, but the all-important theories of relativity and quantum mechanics, without benefit of explicit logical justification, both invoke higher-level constraints which determine the form or


content of dynamical entities according to properties not of their own, but of entities that measure or interact with them. Because these higher-level constraints are cognitive in a generalized sense, GR and QM require a joint theoretical framework in which generalized cognition is a distributed feature of reality.

Let’s try to see this another way. In the standard objectivist view, the universe gives rise to a theorist who gives rise to a theory of the universe. Thus, while the universe creates the theory by way of a theorist, it is not beholden to the possibly mistaken theory that results. But while this is true as far as it goes, it cannot account for how the universe itself is created. To fill this gap, the CTMU Metaphysical Autology Principle or MAP states that because reality is an all-inclusive relation bound by a universal quantifier whose scope is unlimited up to relevance, there is nothing external to reality with sufficient relevance to have formed it; hence, the real universe must be self-configuring. And the Mind-Equals-Reality (M=R) Principle says that because the universe alone can provide the plan or syntax of its own self-creation, it is an "infocognitive" entity loosely analogous to a theorist in the process of introspective analysis. Unfortunately, since objectivist theories contain no room for these basic aspects of reality, they lack the expressive power to fully satisfy relativistic, cosmological or quantum-mechanical criteria. The ubiquity of this shortcoming reflects the absence of a necessary and fundamental logical feature of physical analysis, a higher order of theorization in which theory cognitively distributes over theory, for which no conventional theory satisfactorily accounts.

In view of the vicious paradoxes to which this failing has led, it is only natural to ask whether there exists a generalization of spacetime that contains the missing self-referential dimension of physics. The answer, of course, is that one must exist, and any generalization that is comprehensive in an explanatory sense must explain why. In Noesis/ECE 139, the SCSPL paradigm of the CTMU was described to just this level of detail. Space and time were respectively identified as generalizations of information and cognition, and spacetime was described as a homogeneous self-referential medium called infocognition that evolves in a process called conspansion. Conspansive spacetime is defined to incorporate the fundamental concepts of GR and QM in a simple and direct way that effectively preempts the paradoxes left unresolved by either theory alone. Conspansive spacetime not only incorporates non-independent space and time axes, but logically absorbs the cognitive processes of the theorist regarding it.

Since this includes any kind of theorist cognitively addressing any aspect of reality, scientific or otherwise, the CTMU offers an additional benefit of great promise to scientists and nonscientists alike: it naturally conduces to a unification of scientific and nonscientific (e.g. humanistic, artistic and religious) thought.

CTMU >> CAMU in Camo

Before we explore the conspansive SCSPL model in more detail, it is worthwhile to note that the CTMU can be regarded as a generalization of the major computation-theoretic current in physics, the CAMU. Originally called the Computation-Theoretic Model of the Universe, the CTMU was initially defined on a hierarchical nesting of universal computers, the Nested Simulation Tableau or NeST, which tentatively described spacetime as stratified virtual reality in order to resolve a decision-theoretic paradox put forth by Los Alamos physicist William Newcomb (see Noesis 44, etc.). Newcomb’s paradox is essentially a paradox of reverse causality with strong implications for the existence of free will, and thus has deep ramifications regarding the nature of time in self-configuring or self-creating systems of the kind that MAP shows reality must be. Concisely, it permits reality to freely create itself from within by using its own structure, without benefit of any outside agency residing in any external domain.

Although the CTMU subjects NeST to metalogical constraints not discussed in connection with Newcomb’s Paradox, NeST-style computational stratification is essential to the structure of conspansive spacetime. The CTMU thus absorbs the greatest strengths of the CAMU – those attending quantized distributed computation – without absorbing its a priori constraints on scale or sacrificing the invaluable legacy of Relativity. That is, because the extended CTMU definition of spacetime incorporates a self-referential, self-distributed, self-scaling universal automaton, the tensors of GR and its many-dimensional offshoots can exist within its computational matrix.

An important detail must be noted regarding the distinction between the CAMU and CTMU. By its nature, the CTMU replaces ordinary mechanical computation with what might better be called protocomputation. Whereas computation is a process defined with respect to a specific machine model, e.g. a Turing machine, protocomputation is logically "pre-mechanical". That is, before computation can occur, there must (in principle) be a physically realizable machine to host it. But in discussing the origins of the physical universe, the prior existence of a physical machine cannot be assumed. Instead, we must consider a process capable of giving rise to physical reality itself...a process capable not only of implementing a computational syntax, but of serving as its own computational syntax by self-filtration from a realm of syntactic potential. When the word "computation" appears in the CTMU, it is usually to protocomputation that reference is being made.

It is at this point that the theory of languages becomes indispensable. In the theory of computation, a "language" is anything fed to and processed by a computer; thus, if we imagine that reality is in certain respects like a computer simulation, it is a language. But where no computer exists (because there is not yet a universe in which it can exist), there is no "hardware" to process the language, or for that matter the metalanguage simulating the creation of hardware and language themselves. So with respect to the origin of the universe, language and hardware must somehow emerge as one; instead of engaging in a chicken-or-egg regress involving their recursive relationship, we must consider a self-contained, dual-aspect entity functioning simultaneously as both. By definition, this entity is a Self-Configuring Self-Processing Language or SCSPL. Whereas ordinary computation involves a language, protocomputation involves SCSPL.

Protocomputation has a projective character consistent with the SCSPL paradigm. Just as all possible formations in a language - the set of all possible strings - can be generated from a single distributed syntax, and all grammatical transformations of a given string can be generated from a single copy thereof, all predicates involving a common syntactic component are generated from the integral component itself. Rather than saying that the common component is distributed over many values of some differential predicate - e.g., that some distributed feature of programming is distributed over many processors - we can say (to some extent equivalently) that many values of the differential predicate - e.g. spatial location - are internally or endomorphically projected within the common component, with respect to which they are "in superposition". After all, difference or multiplicity is a logical relation, and logical relations possess logical coherence or unity; where the relation has logical priority over the relands, unity has priority over multiplicity. So instead of putting multiplicity before unity and pluralism ahead of monism, CTMU protocomputation, under the mandate of a third CTMU principle called Multiplex Unity or MU, puts the horse sensibly ahead of the cart.

To return to one of the central themes of this article, SCSPL and protocomputation are metaphysical concepts. Physics is unnecessary to explain them, but they are necessary to explain physics. So again, what we are describing here is a metaphysical extension of the language of physics. Without such an extension linking the physical universe to the ontological substrate from which it springs - explaining what physical reality is, where it came from, and how and why it exists - the explanatory regress of physical science would ultimately lead to the inexplicable and thus to the meaningless.

Spacetime Requantization and the Cosmological Constant

The CTMU, and to a lesser extent GR itself, posits certain limitations on exterior measurement. GR utilizes (so-called) intrinsic spacetime curvature in order to avoid the necessity of explaining an external metaphysical domain from which spacetime can be measured, while MAP simply states, in a more sophisticated way consistent with infocognitive spacetime structure as prescribed by M=R and MU, that this is a matter of logical necessity (see Noesis/ECE 139, pp. 3-10). Concisely, if there were such an exterior domain, then it would be an autologous extrapolation of the Human Cognitive Syntax (HCS) that should properly be included in the spacetime to be measured. [As previously explained, the HCS, a synopsis of the most general theoretical language available to the human mind (cognition), is a supertautological formulation of reality as recognized by the HCS. Where CTMU spacetime consists of HCS infocognition distributed over itself in a way isomorphic to NeST – i.e., of a stratified NeST computer whose levels have infocognitive HCS structure – the HCS spans the laws of mind and nature. If something cannot be mapped to HCS categories by acts of cognition, perception or reference, then it is HCS-unrecognizable and excluded from HCS reality due to nonhomomorphism; conversely, if it can be mapped to the HCS in a physically-relevant way, then it is real and must be explained by reality theory.]

Accordingly, the universe as a whole must be treated as a static domain whose self and contents cannot “expand”, but only seem to expand because they are undergoing internal rescaling as a function of SCSPL grammar. The universe is not actually expanding in any absolute, externally-measurable sense; rather, its contents are shrinking relative to it, and to maintain local geometric and dynamical consistency, it appears to expand relative to them. Already introduced as conspansion (contraction qua expansion), this process reduces physical change to a form of "grammatical substitution" in which the geometrodynamic state of a spatial relation is differentially expressed within an ambient cognitive image of its previous state. By running this scenario backwards and regressing through time, we eventually arrive at the source of geometrodynamic and quantum-theoretic reality: a primeval conspansive domain consisting of pure physical potential embodied in the self-distributed "infocognitive syntax" of the physical universe…i.e., the laws of physics, which in turn reside in the more general HCS.

Conspansion consists of two complementary processes, requantization and inner expansion. Requantization downsizes the content of Planck’s constant by applying a quantized scaling factor to successive layers of space corresponding to levels of distributed parallel computation. This inverse scaling factor 1/R is just the reciprocal of the cosmological scaling factor R, the ratio of the current apparent size dn(U) of the expanding universe to its original (Higgs condensation) size d0(U)=1. Meanwhile, inner expansion outwardly distributes the images of past events at the speed of light within progressively-requantized layers.
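The two scaling factors just named can be written out explicitly (a sketch in the article's own symbols; only the normalization d0(U) = 1 is taken directly from the text):

```latex
% Cosmological scaling factor: current apparent size of the
% universe relative to its original (Higgs condensation) size
R = \frac{d_n(U)}{d_0(U)} = d_n(U), \qquad d_0(U) = 1
% Requantization applies the reciprocal factor to each
% successive layer of distributed parallel computation
\frac{1}{R} = \frac{d_0(U)}{d_n(U)}
```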

As layers are rescaled, the rate of inner expansion, and the speed and wavelength of light, change with respect to d0(U) so that relationships among basic physical processes do not change… i.e., so as to effect nomological covariance. The thrust is to relativize space and time measurements so that spatial relations have different diameters and rates of diametric change from different spacetime vantages. This merely continues a long tradition in physics; just as Galileo relativized motion and Einstein relativized distances and durations to explain gravity, this is a relativization for conspansive “antigravity” (see Appendix B).

Conspansion is not just a physical operation, but a logical one as well. Because physical objects unambiguously maintain their identities and physical properties as spacetime evolves, spacetime must directly obey the rules of 2VL (2-valued logic distinguishing what is true from what is false). Spacetime evolution can thus be straightforwardly depicted by Venn diagrams in which the truth attribute, a high-order metapredicate of any physical predicate, corresponds to topological inclusion in a spatial domain corresponding to specific physical attributes. I.e., to be true, an effect must be not only logically but topologically contained by the cause; to inherit properties determined by an antecedent event, objects involved in consequent events must appear within its logical and spatiotemporal image. In short, logic equals spacetime topology.

This 2VL rule, which governs the relationship between the Space-Time-Object and Logico-Mathematical subsyntaxes of the HCS, follows from the dual relationship between set theory and semantics, whereby predicating membership in a set corresponds to attributing a property defined on or defining the set. The property is a “qualitative space” topologically containing that to which it is logically attributed. Since the laws of nature could not apply if the sets that contain their arguments and the properties that serve as their parameters were not mutually present at the place and time of application, and since QM blurs the point of application into a region of distributive spacetime potential, events governed by natural laws must occur within a region of spacetime over which their parameters are distributed.

Conspansive domains interpenetrate against the background of past events at the inner expansion rate c, defined as the maximum ratio of distance to duration by the current scaling, and recollapse through quantum interaction. Conspansion thus defines a kind of “absolute time” metering and safeguarding causality. Interpenetration of conspansive domains, which involves a special logical operation called unisection (distributed intersection) combining aspects of the set-theoretic operations union and intersection, creates an infocognitive relation of sufficiently high order to effect quantum collapse. Output is selectively determined by ESP interference and reinforcement within and among metrical layers.

Because self-configurative spacetime grammar is conspansive by necessity, the universe is necessarily subject to a requantizative “accelerative force” that causes its apparent expansion.

The force in question, which Einstein symbolized by the cosmological constant lambda, is all but inexplicable in any nonconspansive model; that no such model can cogently explain it is why he later relented and described lambda as “the greatest blunder of his career”. By contrast, the CTMU requires it as a necessary mechanism of SCSPL grammar.

Thus, recent experimental evidence – in particular, recently-acquired data on high-redshift Type Ia supernovae that seem to imply the existence of such a force – may be regarded as powerful (if still tentative) empirical confirmation of the CTMU.

Metrical Layering

In a conspansive universe, the spacetime metric undergoes constant rescaling. Whereas Einstein required a generalization of Cartesian space embodying higher-order geometric properties like spacetime curvature, conspansion requires a yet higher order of generalization in which even relativistic properties, e.g. spacetime curvature inhering in the gravitational field, can be progressively rescaled. Where physical fields of force control or program dynamical geometry, and programming is logically stratified as in NeST, fields become layered stacks of parallel distributive programming that decompose into field strata (conspansive layers) related by an intrinsic requantization function inhering in, and logically inherited from, the most primitive and connective layer of the stack. This "storage process" by which infocognitive spacetime records its logical history is called metrical layering (note that since storage is effected by inner-expansive domains which are internally atemporal, this is to some extent a misnomer reflecting weaknesses in standard models of computation).

The metrical layering concept does not involve complicated reasoning. It suffices to note that distributed (as in “event images are outwardly distributed in layers of parallel computation by inner expansion”) effectively means “of 0 intrinsic diameter” with respect to the distributed attribute. If an attribute corresponding to a logical relation of any order is distributed over a mathematical or physical domain, then interior points of the domain are undifferentiated with respect to it, and it need not be transmitted among them. Where space and time exist only with respect to logical distinctions among attributes, metrical differentiation can occur within inner-expansive domains (IEDs) only upon the introduction of consequent attributes relative to which position is redefined in an overlying metrical layer, and what we usually call “the metric” is a function of the total relationship among all layers.

The spacetime metric thus amounts to a Venn-diagrammatic conspansive history in which every conspansive domain (lightcone cross section, Venn sphere) has virtual 0 diameter with respect to distributed attributes, despite apparent nonzero diameter with respect to metrical relations among subsequent events. What appears to be nonlocal transmission of information can thus seem to occur. Nevertheless, the CTMU is a localistic theory in every sense of the word; information is never exchanged “faster than conspansion”, i.e. faster than light (the CTMU’s unique explanation of quantum nonlocality within a localistic model is what entitles it to call itself a consistent “extension” of relativity theory, to which the locality principle is fundamental).

Metrical layering lets neo-Cartesian spacetime interface with predicate logic in such a way that in addition to the set of “localistic” spacetime intervals riding atop the stack (and subject to relativistic variation in space and time measurements), there exists an underlying predicate logic of spatiotemporal contents obeying a different kind of metric. Spacetime thus becomes a logical construct reflecting the logical evolution of that which it models, thereby extending the Lorentz-Minkowski-Einstein generalization of Cartesian space. Graphically, the CTMU places a logical, stratified computational construction on spacetime, implants a conspansive requantization function in its deepest, most distributive layer of logic (or highest, most parallel level of computation), and rotates the spacetime diagram depicting the dynamical history of the universe by 90° along the space axes. Thus, one perceives the model’s evolution as a conspansive overlay of physically-parametrized Venn diagrams directly through the time (SCSPL grammar) axis rather than through an extraneous z axis artificially separating theorist from diagram. The cognition of the modeler – his or her perceptual internalization of the model – is thereby identified with cosmic time, and infocognitive closure occurs as the model absorbs the modeler in the act of absorbing the model.

To make things even simpler: the CTMU equates reality to logic, logic to mind, and (by transitivity of equality) reality to mind. Then it makes a big Venn diagram out of all three, assigns appropriate logical and mathematical functions to the diagram, and deduces implications in light of empirical data. A little reflection reveals that it would be hard to imagine a simpler or more logical theory of reality.

The CTMU and Quantum Theory

The microscopic implications of conspansion are in remarkable accord with basic physical criteria. In a self-distributed (perfectly self-similar) universe, every event should mirror the event that creates the universe itself. In terms of an implosive inversion of the standard (Big Bang) model, this means that every event should to some extent mirror the primal event consisting of a condensation of Higgs energy distributing elementary particles and their quantum attributes, including mass and relative velocity, throughout the universe. To borrow from evolutionary biology, spacetime ontogeny recapitulates cosmic phylogeny; every part of the universe should repeat the formative process of the universe itself.

Thus, just as the initial collapse of the quantum wavefunction (QWF) of the causally self-contained universe is internal to the universe, the requantizative occurrence of each subsequent event is topologically internal to that event, and the cause spatially contains the effect. The implications regarding quantum nonlocality are clear. No longer must information propagate at superluminal velocity between spin-correlated particles; instead, the information required for (e.g.) spin conservation is distributed over their joint ancestral IED…the virtual 0-diameter spatiotemporal image of the event that spawned both particles as a correlated ensemble.

The internal parallelism of this domain – the fact that neither distance nor duration can bind within it – short-circuits spatiotemporal transmission on a logical level. A kind of “logical superconductor”, the domain offers no resistance across the gap between correlated particles; in fact, the “gap” does not exist! Computations on the domain’s distributive logical relations are as perfectly self-distributed as the relations themselves.

Equivalently, any valid logical description of spacetime has a property called hology, whereby the logical structure of the NeST universal automaton – that is, logic in its entirety – distributes over spacetime at all scales along with the automaton itself. Notice the etymological resemblance of hology to holography, a term used by physicist David Bohm to describe his own primitive nonlocal interpretation of QM. The difference: while Bohm’s Pilot Wave Theory was unclear on the exact nature of the "implicate order" forced by quantum nonlocality on the universe – an implicate order inevitably associated with conspansion – the CTMU answers this question in a way that satisfies Bell's theorem with no messy dichotomy between classical and quantum reality. Indeed, the CTMU is a true localistic theory in which nothing outruns the conspansive mechanism of light propagation.

The implications of conspansion for quantum physics as a whole, including wavefunction collapse and entanglement, are similarly obvious. No less gratifying is the fact that the nondeterministic computations posited in abstract computer science are largely indistinguishable from what occurs in QWF collapse, where just one possibility out of many is inexplicably realized (while the CTMU offers an explanation called the Extended Superposition Principle or ESP, standard physics contains no comparable principle). In conspansive spacetime, time itself becomes a process of wave-particle dualization mirroring the expansive and collapsative stages of SCSPL grammar, embodying the recursive syntactic relationship of space, time and object.

There is no alternative to conspansion as an explanation of quantum nonlocality. Any nonconspansive, classically-oriented explanation would require that one of the following three principles be broken: the principle of realism, which holds that patterns among phenomena exist independently of particular observations; the principle of induction, whereby such patterns are imputed to orderly causes; and the principle of locality, which says that nothing travels faster than light.


The CTMU, on the other hand, preserves these principles by distributing generalized observation over reality in the form of generalized cognition; making classical causation a stable function of distributed SCSPL grammar; and ensuring by its structure that no law of physics requires faster-than-light communication. So if basic tenets of science are to be upheld, Bell’s theorem must be taken to imply the CTMU.

As previously described, if the conspanding universe were projected in an internal plane, its evolution would look like ripples (infocognitive events) spreading outward on the surface of a pond, with new ripples starting in the intersects of their immediate ancestors.

Just as in the pond, old ripples continue to spread outward in ever-deeper layers, carrying their virtual 0 diameters along with them. This is why we can collapse the past history of a cosmic particle by observing it in the present, and why, as surely as Newcomb’s demon, we can determine the past through regressive metric layers corresponding to a rising sequence of NeST strata leading to the stratum corresponding to the particle’s last determinant event. The deeper and farther back in time we regress, the higher and more comprehensive the level of NeST that we reach, until finally, like John Wheeler himself, we achieve “observer participation” in the highest, most parallelized level of NeST...the level corresponding to the very birth of reality.

Appendix A

Analysis is based on the concept of the derivative, an "instantaneous (rate of) change". Because an "instant" is durationless (of 0 extent) while a "change" is not, this is an oxymoron. Cauchy and Weierstrass tried to resolve this paradox with the concept of "limits"; they failed. This led to the discovery of nonstandard analysis by Abraham Robinson. The CTMU incorporates a conspansive extension of nonstandard analysis in which infinitesimal elements of the hyperreal numbers of NSA are interpreted as having internal structure, i.e. as having nonzero internal extent. Because they are defined as being indistinguishable from 0 in the real numbers Rn, i.e. the real subset of the hyperreals Hn, this permits us to speak of an "instantaneous rate of change"; while the "instant" in question is of 0 external extent in Rn, it is of nonzero internal extent in Hn. Thus, in taking the derivative of (e.g.) x2, both sides of the equation


Δy/Δx = 2x + Δx

(where Δ = "delta" = a generic increment) are nonzero, simultaneous and in balance. That is, we can take Δx to 0 in Rn and drop it on the right with no loss of precision while avoiding a division by 0 on the left. More generally, the generic equation:

limΔx→H 0R Δy/Δx = limΔx→H 0R [f(x + Δx) - f(x)]/Δx

no longer involves a forbidden "division by 0"; the division takes place in H, while the zeroing-out of Δx takes place in R. H and R, respectively "inside" and "outside" the limit and thus associated with the limit and the approach thereto, are model-theoretically identified with the two phases of the conspansion process L-sim and L-out, as conventionally related by wave-particle duality. This leads to the CTMU "Sum Over Futures" (SOF) interpretation of quantum mechanics, incorporating an Extended Superposition Principle (ESP) under the guidance of the CTMU Telic Principle, which asserts that the universe is intrinsically self-configuring.
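As a worked instance of the scheme just described, taking f(x) = x2 as in the text (a sketch; the assignment of the division step to H and the dropping step to R follows the paragraph above):

```latex
\frac{\Delta y}{\Delta x}
  = \frac{(x + \Delta x)^2 - x^2}{\Delta x}
  = \frac{2x\,\Delta x + (\Delta x)^2}{\Delta x}
  = 2x + \Delta x
% The division by \Delta x is legitimate in H, where \Delta x \neq 0;
% the residual \Delta x is then dropped in R, where it is
% indistinguishable from 0, yielding dy/dx = 2x.
```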

In this new CTMU extension of nonstandard analysis, the universe can have an undefined ("virtually 0") external extent while internally being a "conspansively differentiable manifold". This, of course, describes a true intrinsic geometry incorporating intrinsic time as well as intrinsic space; so much for relativity theory. In providing a unified foundation for mathematics, the CTMU incorporates complementary extensions of logic, set theory and algebra. Because physics is a blend of perception (observation and experiment) and mathematics, providing mathematics with a unified foundation (by interpreting it in a unified physical reality) also provides physics with a unified foundation (by interpreting it in a unified mathematical reality). Thus, by conspansive duality, math and physics are recursively united in a dual-aspect reality wherein they fill mutually foundational roles.


Appendix B

Because the value of R can only be theoretically approximated, using R or even 1/R to describe requantization makes it appear that we are simply using one theory to justify another. But the R-to-1/R inversion comes with an addition of logical structure, and it is this additional structure that enables us to define a high-level physical process, conspansion, that opposes gravity and explains accelerating redshift. Conspansive requantization is uniform at all scales and can be seen as a function of the entire universe or of individual quanta; every part of the universe is grammatically substituted, or injectively mapped, into an image of its former self…an image endowed with computational functionability. To understand this, we must take a look at standard cosmology.

Standard cosmology views cosmic expansion in terms of a model called ERSU, the Expanding Rubber Sheet Universe. For present purposes, it is sufficient to consider a simplified 2-spheric ERSU whose objects and observers are confined to its expanding 2-dimensional surface. In ERSU, the sizes of material objects remain constant while space expands like an inflating balloon (if objects grew at the rate of space itself, expansion could not be detected).

At the same time, spatial distances among comoving objects free of peculiar motions remain fixed with respect to any global comoving coordinate system; in this sense, the mutual rescaling of matter and space is symmetric. But either way, the space occupied by an object is considered to “stretch” without the object itself being stretched.

Aside from being paradoxical on its face, this violates the basic premise of the pure geometrodynamic view of physical reality, which ultimately implies that matter is “space in motion relative to itself”. If we nevertheless adhere to ERSU and the Standard Model, the expansion rate (prior to gravitational opposition) is constant when expressed in terms of material dimensions, i.e., with respect to the original scale of the universe relative to which objects remain constant in size. For example, if ERSU expansion were to be viewed as an outward layering process in which the top layer is “now”, the factor of linear expansion relating successive layers would be the quotient of their circumferences. Because object size is static, so is the cosmic time scale when expressed in terms of basic physical processes; at any stage of cosmic evolution, time is scaled exactly as it was in the beginning.
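The layering picture just described admits a simple quantitative reading (a sketch; the layer radii a_n and indices n, n+1 are introduced here for illustration and are not in the original):

```latex
% Linear expansion factor between successive ERSU layers:
% the quotient of their circumferences, hence of their radii
f = \frac{C_{n+1}}{C_n} = \frac{2\pi\,a_{n+1}}{2\pi\,a_n} = \frac{a_{n+1}}{a_n}
```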

The idea behind the CTMU is to use advanced logic, algebra and computation theory to give spacetime a stratified computational or cognitive structure that lets ERSU be “inverted” and ERSU paradoxes resolved. To glimpse how this is done, just look at the ERSU balloon from the inside instead of the outside. Now imagine that its size remains constant as thin, transparent layers of parallel distributed computation grow inward, and that as objects are carried towards the center by each newly-created layer, they are proportionately resized. Instead of the universe expanding relative to objects whose sizes remain constant, the size of the universe remains constant and objects do the shrinking…along with any time scale expressed in terms of basic physical processes defined on those objects. Now imagine that as objects and time scales remain in their shrunken state, layers become infinitesimally thin and recede outward, with newer levels of space becoming “denser” relative to older ones and older levels becoming “stretched” relative to newer ones. In the older layers, light – which propagates in the form of a distributed parallel computation – “retroactively” slows down as it is forced to travel through more densely-quantized overlying layers.

To let ourselves keep easy track of the distinction, we will give the ERSU and inverted-ERSU models opposite spellings. I.e., inverted-ERSU will become USRE. This turns out to be meaningful as well as convenient, for there happens to be an apt descriptive phrase for which USRE is acronymic: the Universe as a Self-Representational Entity. This phrase is consistent with the idea that the universe is a self-creative, internally-iterated computational endomorphism.

It is important to be clear on the relationship between space and time in USRE. The laws of physics are generally expressed as differential equations describing physical processes in terms of other physical processes incorporating material dimensions. When time appears in such an equation, its units are understood to correspond to basic physical processes defined on the sizes of physical objects. Thus, any rescaling of objects must be accompanied by an appropriate rescaling of time if the laws of physics are to be preserved.

Where the material contents of spacetime behave in perfect accord with the medium they occupy, they contract as spacetime is requantized, and in order for the laws of physics to remain constant, time must contract apace. E.g., if at any point it takes n time units for light to cross the diameter of a proton, it must take the same number of units at any later juncture. If the proton contracts in the interval, the time scale must contract accordingly, and the speed and wavelength of newly-emitted light must diminish relative to former values to maintain the proper distribution of frequencies. But meanwhile, light already in transit slows down due to the distributed “stretching” of its deeper layer of space, i.e., the superimposition of more densely-quantized layers. Since its wavelength is fixed with respect to its own comoving scale (and that of the universe as a whole), wavelength rises and frequency falls relative to newer, denser scales.
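The equivalence of the two pictures can be checked in a toy calculation. Assuming the standard cosmological redshift relation 1 + z = Rn/R0 (the numbers below are arbitrary illustrations, not values from the text), stretching wavelengths with expanding space and shrinking the measuring rods by the reciprocal factor yield the same observed redshift:

```python
# Numerical sketch (assumption: standard redshift formula 1 + z = Rn/R0)
# comparing two descriptions of the same observation.

R0, Rn = 1.0, 2.5        # arbitrary scale factors at emission and observation
lam_emit = 500.0         # arbitrary emitted wavelength, nm

# Expanding-universe (ERSU-style) picture: the wavelength is stretched
# along with expanding space.
lam_obs_ersu = lam_emit * (Rn / R0)

# Shrinking-object (USRE-style) picture: the wavelength is fixed in
# absolute units, but the measuring rod has shrunk by R0/Rn, so the same
# absolute wavelength measures as proportionately longer.
ruler = 1.0 * (R0 / Rn)
lam_obs_usre = lam_emit / ruler

z_ersu = lam_obs_ersu / lam_emit - 1
z_usre = lam_obs_usre / lam_emit - 1
assert abs(z_ersu - z_usre) < 1e-12  # both pictures give z = Rn/R0 - 1
```

Either bookkeeping convention leaves the observable z untouched, which is the point of the complementary recalibration discussed next.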

Complementary recalibration of space and time scales accounts for cosmic redshift in the USRE model. But on a deeper level, the explanation lies in the nature of space and time themselves.

In ERSU, time acts externally on space, stretching and deforming it against an unspecified background and transferring its content from point to point by virtual osmosis. But in USRE, time and motion are implemented wholly within the spatial locales to which they apply. Thus, if cosmic redshift data indicate that “expansion accelerates” in ERSU, the inverse USRE formulation says that spacetime requantization accelerates with respect to the iteration of a constant fractional multiplier…and that meanwhile, inner expansion undergoes a complementary “deceleration” relative to the invariant size of the universe. In this way, the two phases of conspansion work together to preserve the laws of nature.

The crux: as ERSU expands and the cosmological scaling factor R rises, the USRE inverse scaling factor 1/R falls (this factor is expressed elsewhere in a time-independent form r). As ERSU swells and light waves get longer and lower in frequency, USRE quanta shrink with like results. In either model, the speed of light falls with respect to any global comoving coordinate system; cn/c0 = R0/Rn = Rn^-1/R0^-1 (the idea that c is an “absolute constant” in ERSU is oversimplistic; like material dimensions, the speed of light can be seen to change with respect to comoving space in cosmological time). But only in USRE does the whole process become a distributed logico-mathematical endomorphism effected in situ by the universe itself…a true local implementation of physical law rather than a merely localistic transfer of content based on a disjunction of space and logic. The point is to preserve valid ERSU relationships while changing their interpretations so as to resolve paradoxes of ERSU cosmology and physics.
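The scaling relation cn/c0 = R0/Rn can be tabulated directly. This minimal sketch (arbitrary units, values chosen only for illustration) checks that the ERSU ratio R0/Rn and the USRE ratio of inverse scale factors agree epoch by epoch:

```python
# Sketch of the text's scaling relation c_n/c_0 = R_0/R_n: as the ERSU
# scale factor R grows, the comoving speed of light c_n falls in
# proportion, and the ratio of USRE inverse factors 1/R gives the same
# number. Units and epoch values are arbitrary.

c0 = 1.0  # speed of light at epoch 0
R0 = 1.0  # scale factor at epoch 0

for Rn in (1.0, 2.0, 4.0, 8.0):
    cn = c0 * (R0 / Rn)            # ERSU form: c_n/c_0 = R_0/R_n
    inv = (1.0 / Rn) / (1.0 / R0)  # USRE form: R_n^-1 / R_0^-1
    assert abs(cn / c0 - inv) < 1e-12
    print(Rn, cn)  # c_n halves each time R_n doubles
```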

In Noesis/ECE 139, it was remarked that if the universe were projected on an internal plane, spacetime evolution would resemble spreading ripples on the surface of a pond, with new ripples starting in the intersects of old ones. Ripples represent events, or nomological (SCSPL-syntactic) combinations of material objects implicit as ensembles of distributed properties (quantum numbers). Now we see that outer (subsurface) ripples become internally dilated as distances shorten and time accelerates within new ripples generated on the surface.

CTMU monism says that the universe consists of one “dual-aspect” substance, infocognition, created by internal feedback within an even more basic (one-aspect) substance called telesis. That everything in the universe can manifest itself as either information or cognition (and on combined scales, as both) can easily be confirmed by the human experience of personal consciousness, in which the self exists as information to its own cognition…i.e., as an object or relation subject to its own temporal processing. If certain irrelevant constraints distinguishing a human brain from other kinds of object are dropped, information and cognition become identical to spatial relations and time.

In a composite object (like a brain) consisting of multiple parts, the dual aspects of infocognition become crowded together in spacetime. But in the quantum realm, this “monic duality” takes the form of an alternation basic to the evolution of spacetime itself. This alternation usually goes by the name of wave-particle duality, and refers to the inner-expansive and collapsative phases of the quantum wave function.

Where ripples represent the expansive (or cognitive) phase, and their collapsation into new events determines the informational phase, the above reasoning can be expressed as follows: as the infocognitive universe evolves, the absolute rate of spatiotemporal cognition cn at time n, as measured in absolute (conserved) units of spacetime, is inversely proportional to the absolute information density Rn/R0 of typical physical systems…i.e., to the concentration of locally-processed physical information. As light slows down, more SCSPL-grammatical (generalized cognitive) steps are performed per unit of absolute distance traversed. So with respect to meaningful content, the universe remains steady in the process of self-creation.
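The inverse proportionality claimed here can be put in a toy numerical form. The framing below is my own illustration, not the text's: with cn taken inversely proportional to Rn/R0, the steps performed per unit of absolute distance grow in direct proportion to the density, while the rate-times-steps product stays fixed:

```python
# Toy illustration (my own numerical framing): if the cognition rate c_n
# is inversely proportional to the information density Rn/R0, then
# SCSPL-grammatical steps per unit of absolute distance grow in direct
# proportion to that density as light slows down.

R0, c0, k = 1.0, 1.0, 1.0  # arbitrary normalizations

def cognition_rate(Rn):
    """c_n, inversely proportional to the density Rn/R0."""
    return c0 * R0 / Rn

def steps_per_distance(Rn):
    """Grammatical steps per unit of absolute distance: k / c_n."""
    return k / cognition_rate(Rn)

# Tripling the density triples the steps per unit distance...
assert abs(steps_per_distance(3.0) / steps_per_distance(1.0) - 3.0) < 1e-12
# ...while the product (rate x steps per distance) stays fixed at k: in
# this sense the processing of content per unit distance remains steady.
assert abs(cognition_rate(3.0) * steps_per_distance(3.0) - k) < 1e-12
```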


Partial Bibliography:

1. Exploding stars flash new bulletins from distant universe

James Glanz in Science, May 15, 1998 v280 n5366 p1008

2. To Infinity and Beyond

Robert Matthews in New Scientist, Apr 11, 1998 p27

3. Astronomers see a cosmic antigravity force at work

James Glanz in Science, Feb 27, 1998 v279 n5355 p1298

4. The Theory Formerly Known as Strings

Michael J. Duff in Scientific American, Feb 1998, v278 n2 p64

5. Exploding stars point to a universal repulsive force

James Glanz in Science, Jan 30, 1998 v279 n5351 p651

6. Mind and Body: René Descartes to William James

Robert H. Wozniak, 1996 <www.brynmawr.serendip.edu>

7. The Mind of God: The Scientific Basis for a Rational World by Paul Davies

Simon and Schuster, 1992

8. A Brief History of Time by Stephen W. Hawking

Bantam Books, 1988

9. Quantum Reality: Beyond the New Physics by Nick Herbert

Anchor Press/Doubleday, 1985

10. Cosmology: The Science of the Universe by Edward R. Harrison

Cambridge University Press, 1981

11. The Mathematical Experience by Philip J. Davis and Reuben Hersh

Birkhäuser, 1981