This is not a religious tract, nor is it physics, nor is it philosophy. It is meant to be no less and no more than a rational discourse on some of the conundrums at the outer limits of our known physical understanding today. Since it's arguably B.S., let's just call it metaphysics and be done with it. The intent is, however, serious. The exploration here is meant to be one piece of a puzzle, along with a parallel exploration of free will and the existence of Platonic realms, and a critique of Heidegger. Some of this might even be experimentally testable. And maybe we can even associate some mathematical equations with these ideas, giving them at least a little bit of traction.

Bunk. Of course this isn't right, and with the exception of a few brave individuals (e.g. Julian Barbour), nobody but nobody actually believes that this is the case. Our anthropocentric experience is too strong to accept this view: we all know what the past is, we all sense that we live in the present, and we all seem to agree that much of the future, or at least the truly important parts, are unpredictable. Curiously though, in this popular view, it is never pointed out that the past fits the above description to a tee. The past is so perfectly "predictable" that, in fact, we don't even use that word: we "remember" the past, we don't "predict" it. The past is immutable and unchangeable: there is nothing that we can do to change the past. That is, the past is exactly like those immutable, unalterable equations of motion: predestined and inescapable. So it seems that the past is exactly like this Newtonian world-view: it's just that something funny happens in the present, somehow making the future unknowable.

One might think of the past as a block of ice forming, as the liquid present freezes onto it. The present, the 'here and now', is like a wave of crystallization, like Kurt Vonnegut's Ice-9, freezing, propagating through space-time, segregating what was from what might be. This might make for nice literary allusions, but doesn't fit with the Newtonian view. Differential equations know of no past, present, or future: they don't distinguish between these, and this is precisely where the metaphysical problem originates. Ordinary, 'classical' mechanics states that the future is just like the past, and this seems to be inescapable, even as we all know intuitively that this is not so.

Two modern developments in physics seem to offer avenues of escape from the conundrum of pre-determination. The first, Quantum Mechanics, introduces a certain amount of randomness that provides the wiggle-room needed to make the future unpredictable, and offers at least a glimmer of hope for free will. Unfortunately, Quantum Mechanics is saddled with a number of messy interpretational problems that leave an unsatisfying taste in the mouth of anyone who cares to take them seriously. The second, Chaos Theory, doesn't offer immediate escape from classical dynamics, but does show us how a whole lot of unpredictable things can happen in a short amount of time.

The need for a better interpretation is due to some curious puzzles that arise in discussions of quantum measurement. We review some of these below, and then we dive in.

Let's review some popular quantum conundrums:

- Schroedinger's Cat Paradox
- Mott, Heisenberg, 1929: alpha particles with spherical wave functions leave straight tracks in cloud chambers. (Alternatively, consider a detector consisting of several plates at various distances, covering a 4pi solid angle. The non-detection of a decay at a closer plate has caused the wave function to 'collapse' at least partly, avoiding that plate, and eventually being detected on a more distant plate. Clearly, the language used to describe this makes 'wave function collapse' sound even more disconcerting than it is. The Mott/Heisenberg straight-track observation is a special case of this, where the wave function collapse has occurred on some microscopic scale, by ionizing an atom, as opposed to being delayed for macroscopic time periods.)
- EPR; and specifically, the decay of a singlet to a pair of spin-1/2 particles. Bell's Theorem.
- Quantum interference over macroscopic scales: e.g. proton interferometer
- Clauser, Horne, Shimony, et al.

Let's review the popularly discussed quantum measurement proposals. All of these proposals seem to be lacking, in that none provides any sort of detailed description of the mechanics of wave function collapse. They attempt to resolve the metaphysical paradoxes without proposing the physics.

**Many Worlds**, aka 'Quantum Multiverse'. The 'universe splits in two' every time a measurement is made. Since measurements apparently happen all the time, there's a googolplex of parallel universes. Without a more detailed proposal as to what the process is, and what sort of experimental prediction it can make, it's just more metaphysics. Popular among freer thinkers.

**The 'brain/mind'** causes quantum measurements to occur. This is one extreme response to the Schroedinger Cat Paradox: the wave function doesn't collapse until the experimenter looks at the instruments. I don't think any mainstream thinkers take this seriously; there are too many ways to poke holes in it.

**Many-body/Thermodynamic Interaction Hypothesis**. This is probably what most mainstream physicists think is the 'correct' answer, mostly because they haven't really thought about it. The wave function collapse occurs when the wave function interacts with a many-body, chaotic, thermodynamic system. Indeed, there is a huge grain of truth to this: wave functions really don't collapse until they interact with a whole bunch of atoms, and there is a multitude of ways of supporting this position through experimental and theoretical arguments. Unfortunately, it has an Achilles heel: it fails to actually resolve any paradoxes. For example, consider the following: a spin-0 singlet state that decays into two very energetic spin-1/2 particles. We've arranged for the fast decay products to pass through space-like separated Stern-Gerlach magnets before hitting cloud chambers at each end. Think about it: you'll see why the mantra of 'thermodynamic interaction' won't work. (The singlet decays into a pair of EPR-correlated particles; we know their spins point in opposite directions (yada yada Bell's Theorem yada yada). As they pass through the Stern-Gerlach magnets, the trajectories are deflected up or down. Then, finally, the 'measurement': the interaction between many bodies in the cloud chamber that collapses the wave packets. To maintain correlation between the measurements, the final collapse can't just happen when the wave function interacts with the gas in the cloud chamber. What happens in one cloud chamber is intimately connected to what happened in the other. Measurements seem to violate locality.) There are various ways of dressing up this hypothesis in high-falutin' language.
For example: during interactions with multiple bodies, certain paths that may have at one point contributed to the Feynman path integral in fact bump into analytic 'cuts', and stop contributing in a phase-coherent fashion to the path integral, thereby causing a collapse of the wave function. These cuts in the analytic plane don't exist in two-body interactions, but absolutely litter the phase space when the N in N-body is large enough. In other words, for N sufficiently larger than two, most phases follow chaotic paths, and the phase relationships are no longer coherent, but become incoherent, thereby marking the wave function collapse. (For example, consider the chaotic regime of the forced harmonic oscillator.) This view of wave function collapse seems at first to be very appealing, precisely because it can be dressed up with all sorts of flowery appeals to chaos and the like. But, as we mentioned, it founders because it is ultimately a local theory, and fails to explain EPR correlations.
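The EPR/Bell argument above can be made concrete with a small numerical sketch. This is my own toy illustration, not part of the argument itself: it compares the quantum singlet correlation E(a,b) = -cos(a-b) against a simple local-hidden-variable model, using the CHSH combination of four correlators. Any local model is bounded by |S| <= 2, while quantum mechanics reaches 2*sqrt(2), about 2.83; this is the quantitative sense in which 'measurements seem to violate locality'.

```python
# Toy comparison: quantum singlet correlations vs. a local hidden-variable
# model, via the CHSH inequality. Local models obey |S| <= 2; QM gives 2*sqrt(2).
import math
import random

def E_quantum(a, b):
    """Singlet-state spin correlation for analyzers at angles a and b."""
    return -math.cos(a - b)

def E_local(a, b, n=200_000, seed=42):
    """Monte Carlo correlation for a toy local hidden-variable model: the
    hidden variable lam is a random direction; each side just reports the
    sign of cos(lam - angle), with side B anti-correlated to side A."""
    rng = random.Random(seed)
    total = 0
    for _ in range(n):
        lam = rng.uniform(0.0, 2.0 * math.pi)
        A = 1 if math.cos(lam - a) >= 0 else -1
        B = -1 if math.cos(lam - b) >= 0 else 1
        total += A * B
    return total / n

def chsh(E):
    """CHSH combination S at the standard optimal analyzer angles."""
    a, a2, b, b2 = 0.0, math.pi / 2, math.pi / 4, 3 * math.pi / 4
    return E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)

print("quantum |S| =", abs(chsh(E_quantum)))  # 2*sqrt(2), about 2.828
print("local   |S| =", abs(chsh(E_local)))    # about 2, never beyond
```

The particular local model (sign of a cosine) is just one choice; Bell's theorem says no such choice, however clever, can push |S| past 2.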

- Negative-energy influences travel backwards in time, thereby 'correcting'
the initial state. This hypothesis
states that when a measurement occurs, some influence travels backwards
in time, and modifies what happened 'back then', to bring it into a state
consistent with the measurement.
There's a certain amount of beauty to this hand-waving
argument. Time symmetry makes anti-particles look like particles traveling
backwards in time, and all solutions to the Dirac equation (with a
non-zero momentum) have a negative energy component.
This hypothesis, when coupled to the
'thermodynamic' hypothesis, actually seems capable of resolving the EPR
correlation issues. The measurements work out because some influence
travels backwards in time, alters the original state, and sets things
aright.
To put it another way: when we make
predictions about future experiments using Bell's theorem, we know that
the wave function must be in this or that state. When we analyze
experiments performed in the past, they are consistent with Bell's theorem.
The one thing we can't verify is that the experiments aren't somehow
"rearranging" themselves in the "past", with unexpected wave-function
"collapses" that harmonize the results.
Unfortunately, this hypothesis seems to get almost no serious
discussion, and is therefore quite vague as to specific mechanisms and
details.

We could attack the path integral from a second direction. For example, in thermodynamic problems, one averages over Avogadro's number of states, which, for all practical purposes, we can take as 'infinite'. This is because experimental evidence provides strong support for the taking of thermodynamic averages. We also know that thermodynamics breaks down when dealing with dozens or hundreds of atoms, because by then, the averages no longer accurately approximate the situation. Unfortunately, we have few similar experimental explorations of the contributions of the path integral to second quantization. We know that it must be correct for the most part, because otherwise quantum mechanics in general wouldn't work. The Casimir effect, studies of dielectrics, and surface-tension physics provide some experimental tests of second quantization, but the linkage is not precise. (In the Casimir effect, one sums over all standing waves trapped between a pair of metal plates. Or, at least, one sums up to a frequency at which the plates become transparent to EM waves. This summation is a kind of path integral. The experiments work, and the summation is quite sensitive to the cutoff frequency/regulator. In certain geometries, the summations lead to infinities that need to be regulated or dealt with, much like in free-field QED. What's different is that in the Casimir effect, or in surface-tension or dielectric calculations, the infinities are 'real' in the sense that they really do depend on the cutoff frequency, and different materials with different cutoffs can be measured in the lab. But there is still a big leap from these experiments to the path integral.)
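The cutoff sensitivity of Casimir-style mode sums can be illustrated with a toy calculation. This is my own, deliberately oversimplified, one-dimensional sketch, not a real two-plate computation: the divergent sum 1+2+3+... over mode numbers, smoothed with an exponential cutoff exp(-eps*n), splits into a cutoff-dependent divergence 1/eps^2 plus a universal finite remainder, -1/12, which is the piece that survives regularization in the textbook 1D Casimir energy.

```python
# Regulated mode sum: Sum_{n>=1} n*exp(-eps*n) = 1/eps^2 - 1/12 + O(eps^2).
# The divergence depends on the regulator; the -1/12 does not.
import math

def regulated_sum(eps):
    """Sum_{n>=1} n * exp(-eps*n), truncated once the terms are negligible."""
    nmax = int(50.0 / eps)  # beyond this, terms are of order e^-50
    return sum(n * math.exp(-eps * n) for n in range(1, nmax + 1))

for eps in (0.1, 0.05, 0.01):
    finite_part = regulated_sum(eps) - 1.0 / eps ** 2
    print(f"eps={eps}: finite part = {finite_part:+.6f}")  # tends to -1/12
```

In the lab, the analogue of eps is set by the frequency at which real plates become transparent, which is why the 'infinite' parts of these sums are physically meaningful, as noted above.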

**Abstract:**
By assuming that Planck-scale spacetime resembles a 'foam', it is deduced, by
means of hand-waving, that most quantum phenomena are best understood as interactions
of classical geodesics on this foam. Furthermore, it is argued that the arrow of
time, as something distinct from 3D space, and having a fixed past and unknowable
future, is a side-effect of the reconciliation of anomalies in this foam.

Let's assume that spacetime, at Planck scales (about 10^-35 meters), resembles a foam of wormholes. Let's imagine geodesics on this foam. Now, let's imagine that these geodesics interact like billiard balls, i.e. have point interactions with each other. This clearly leads to grandfather paradoxes throughout the foam: a future billiard ball could emerge from a wormhole in the past, and prevent itself from going into the very wormhole it emerged from.

Hypothesis: The space-time foam tries to organize or equilibrate itself so that such grandfather-paradoxes between billiard geodesics do not occur. Hypothesis: this act of organizing or equilibrating propagates through the foam as a 'wavefront', a 3D slab of a surface traveling along a fourth dimension. This thin 3D slab is what we call 'right now'; what lies behind this slab is 'the past', and what lies ahead is 'the future'.

Now let us pause to notice that this simple model does a decent job of 'explaining' second quantization. How does it do this?

First, note that the act of second quantization consists of writing a sum, a 'Feynman Path Integral', over all possible paths, weighted by the exponential of the action. The action defines the classical path; as any given path deviates from the classical path, it is weighted away. With only the slightest of handwaving, it should be obvious that if we imagine geodesics on a space-time foam, these could well be taken to be the 'paths' that contribute to the path integral. Now, a purist might start arguing about the Hausdorff measure of geodesics on a foam, as compared to the 'uniformly distributed' Jacobian of a path integral, but I will only shoot back that any such argument is already on tenuous mathematical footing. The set of geodesics on a foam should probably form a complete enough set to serve as the domain over which a path integral is taken. (Although not usually studied by physicists, there are deep fundamental problems with integrals of stochastic differential equations, which are well known to 'quants' working in the financial industry. These difficulties have been overcome for simple arbitrage and options-pricing formulas, but continue to plague more complex financial models. With some handwaving, it's clear that many of these problems apply to path integrals as well, especially when one considers the question of whether the 'paths' contributing to the integral are differentiable or even continuous.)
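To see what 'summing over all possible paths' means in the simplest setting, here is a sketch of mine, using the standard Euclidean (imaginary-time) simplification so everything stays real-valued: slice time into short steps, and integrating over every intermediate position becomes repeated convolution with a short-time Gaussian kernel. Composing the slices recovers the exact finite-time free-particle kernel, which is the whole content of the time-sliced path integral for a free particle.

```python
# Time-sliced Euclidean path integral for a free particle (m = hbar = 1):
# N-fold convolution of short-time Gaussian kernels reproduces the exact
# finite-time kernel, i.e. the "sum over all intermediate positions".
import math

def kernel(x, y, t):
    """Exact free-particle Euclidean (heat) kernel."""
    return math.exp(-(x - y) ** 2 / (2.0 * t)) / math.sqrt(2.0 * math.pi * t)

# Spatial grid, total time T, and the number of time slices.
L, npts, nslices, T = 8.0, 161, 16, 1.0
dx = L / (npts - 1)
xs = [-L / 2.0 + i * dx for i in range(npts)]
eps = T / nslices

# One short-time slice as a matrix; dx plays the role of the path measure.
K = [[kernel(a, b, eps) * dx for b in xs] for a in xs]

# Start from a (discrete) delta function at x = 0 and apply the slices:
# each application integrates over one intermediate position.
mid = npts // 2
psi = [1.0 / dx if i == mid else 0.0 for i in range(npts)]
for _ in range(nslices):
    psi = [sum(K[i][j] * psi[j] for j in range(npts)) for i in range(npts)]

# The composed slices reproduce the exact kernel at total time T.
err = max(abs(psi[i] - kernel(xs[i], 0.0, T)) for i in range(npts))
print("max deviation from the exact kernel:", err)
```

The real-time (oscillatory) version replaces the Gaussian with exp(iS/hbar) and is where the convergence troubles alluded to above actually live; the Euclidean version sidesteps them, which is precisely why it resembles diffusion.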

Next, note that the 'accidental' resemblance of various diffusion and stat-mech equations to various quantum equations is no longer so 'accidental': the statistical properties and distributions of geodesics on a foam should indeed have the same qualitative properties as Brownian motion in a hard-ball gas.
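That resemblance can be checked numerically with a minimal sketch of my own: for an unbiased random walk, the mean-square displacement grows linearly with the number of steps, which is exactly the spreading law of the diffusion equation, and the diffusion equation is the free Schroedinger equation continued to imaginary time.

```python
# Brownian-motion signature: for an unbiased random walk, <x^2> = n,
# the linear spreading law shared by the diffusion equation.
import random

def mean_square_displacement(nsteps, nwalkers=5_000, seed=1):
    """Average of x^2 over many independent unbiased random walks."""
    rng = random.Random(seed)
    total = 0
    for _ in range(nwalkers):
        x = 0
        for _ in range(nsteps):
            x += 1 if rng.random() < 0.5 else -1
        total += x * x
    return total / nwalkers

for n in (10, 40, 160):
    print(n, mean_square_displacement(n))  # grows linearly, roughly equal to n
```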

Thus, we've just hand-waved our way over a giant part of quantum mechanics. To be more precise, we will need to also cover why it is that Planck's constant appears in these path integrals, and how nothing in this handwaving contradicts the infinitely more rigorous theories of superstrings. Indeed, while we're handwaving, we should point out that talking of geodesics on a spacetime foam might be a lot like talking about the hydrogen atom as if it were a planetary system whose orbits are constrained to have integer-wavelength circumference. We know that Bohr action-angle formalisms gave way to solutions of the Schroedinger wave equation for the hydrogen atom; in a similar way, we might think of this talk of geodesics as a mental place-holder for something that ultimately phrases itself quite differently.

Note that the probabilities of quantum wave functions are re-interpreted as the possible future geometrical arrangements of the Planck-scale foam. Some arrangements are very likely, some are very unlikely. There are no hidden variables here: all one can talk about is possible future arrangements based on a knowledge of the current/past history. (Questions to be answered: if the past is not known, how does this affect the prediction? I.e. how would we envision a (non-relativistic) electron propagating in free space? What is happening to this 'wave' as this Planck-scale foam 'freezes' along? Generically, how does one derive the idea of a propagator from the foam? There are similarities to the diffusion equation, but how can one get more specific?)

Next, let us imagine what this world of geodesics implies for quantum measurement. There is a strong sense of non-locality that is induced by this foam. Imagine the classic spin-EPR experiment, where a singlet decays into a pair of spin-1/2 particles, whose spins are then measured, and found to be correlated. This is where, I think, my handwaving reaches a crux, and all the best stuff falls out. Here goes (this is a bit rocky):

Imagine that the spin-1/2 state is described by a collection of geodesics propagating on this foam. As this data propagates through the foam, a pair of measurements are made, and the data from these measurements are brought back together to roughly the same point in spacetime, where the experimenter can compare them and verify that, e.g., Bell's theorem has not been violated. Hypothesis: the geodesics in the space-time foam can only be made self-consistent in the backwards light cone. In particular, the geodesics from one measurement cannot be reconciled with those from the other until they have both entered into a common backwards light cone (i.e. the measurements have been brought together). When these paths are brought back together, the outcome 'freezes', i.e. becomes a part of the 'past'. This act of coming together defines not only the arrow of time, but defines time itself (or rather, distinguishes time from space).

The Past is Manifest Destiny. (Newtonian ...) The claim is that this is no accident. The creation of the past is what happens when geodesics are reconciled on a space-time foam.

- Hand-waving for quantum numbers and 'elementary particles'.
Quantum numbers find expression in being certain twisted
configurations of geodesics. That is, elementary particles,
such as the electron or neutrino, might just be topologically
stable arrangements of the Planck-scale space-time foam:
solitons of some sort.
In other words, they might be analogous to the 'particles'
seen in solid-state physics: bosonic 'phonons', or fermionic
crystalline dislocations/defects. Particle masses would then
presumably be functions of the number of handles, or some
other topological integer. Anti-particles would presumably
be some reversed topological knots that, when brought near
a particle, can be slipped to untie the knots. I don't
know enough topology to argue that the Pauli exclusion
principle may be topological in nature: some (anti-)symmetric
exchange of handles between two configurations.
The topological interpretation avoids the silly questions such as 'why do all electrons look alike'? They all look alike because they are all represented by the same set of self-consistent twists in the space-time foam. Asking 'why do all electrons look alike' is tantamount to asking 'why are all instances of the number two alike?' Or maybe better yet: 'why do all single-point dislocations of a lattice look alike?' Physical particles are configurations of an underlying relativistic 'ether'.

The above considerations would seem to dictate a program of topological research. However, this alone doesn't seem sufficient to resolve the issue of the arrow of time, or of the second law of thermodynamics. Topological arrangements would seem to be time-symmetric; topology is too static in itself. If we are to view the past as 'fixed', and the future as 'indeterminate', then we need a language of topology in which the past is fixed, and the present is a mad scramble to arrange a topology so that it is consistent with everything that came before.

I know of no topology that acts like this. There is no study that asks and answers the question: 'how do I tie the knot right here, so that the strands back there don't have to be rearranged?' Topological stability in crystal-lattice dislocations is independent of time; the dislocations are arrangements in 3D space. Instead, we need to ponder the evolution of topological entities as they are brought 'close' to each other in space.

The other problem is that we still have no insight into what free-will might be all about. Clearly, free-will is a phenomenon of the present: I have no free-will in the past, nor in the future. At best, I can try to control what I am doing 'right now'. Topological arguments seem to be deterministic, however fanciful they may get. It seems to me that any theory of the flow of time is somehow incomplete until it also somehow addresses the question of free-will.

- Re-emphasize that no matter how silly all of this sounds,
it does indeed avoid both 'many-worlds' and the 'Schroedinger
cat paradox' and all that baggage. No doubt, for some
philosophers, 'many worlds' or quantum-indeterminate brain
cells sound like more fun, but I think that my handwaving
is less loony than the alternatives.
- There are two ways of dealing with the Clauser, Horne, Shimony, et al. correlations.
One way is to view this as a bizarre kind of interferometric
experiment, where macroscopic indeterminism exists until
the results of both space-time measurements are brought to
the same general area of space-time. i.e. the cat is both
dead and alive until the experimenter brings both measurements
together to discover that correlations exist. But this
violates our macroscopic intuition about locality.
Another possibility is that upon wave function collapse in one location, an influence travels 'backwards in time', along one arm of the experiment, and thence forwards along the other, precipitating a change there. What is the form of this influence? Why, precisely that which can carry no information: a gauge-like rotation of the phase of the wave function. For space-like separated measurements, this communication via backwards-traveling geodesics helps keep each side in perfect correlation with the other, so that any interaction with a macroscopic, stochastic tank of particles that is the measuring instrument is 'simultaneously' echoed in the other arm.

Might it be possible to test this hypothesis experimentally, by placing two devices at space-like separations, blasting correlated particles into both of them, and then looking to see if they've achieved thermodynamic equilibrium? For, if the hypothesis is correct, then faster-than-light signalling is disallowed, but thermodynamic interaction is not. This might be experimentally testable.

However, this is tricky, and has a (fatal?) flaw: we can still do faster-than-light signalling. Let's say that the experimenter at location A measures the temperature of a container that has correlated photons being shot into it. The experimenter at location B can put one of two containers in the path of the photons: a hot tank or a cold tank. If quantum correlations could act to bring the two containers into thermodynamic equilibrium, then experimenter A would see the temperature rise, or drop, over time. With sufficient space-like separation, this could be used to perform faster-than-light signalling.

(Equally disturbing is to think of the situation in terms of entropy.)

- Point out some obvious similarities between closed geodesics and (super-) strings.
Note that if the spacetime foam has to wiggle around to close up inconsistent
(grandfather-paradoxish) geodesics, that this could well resemble vibrations
of strings. Add the weasel words to state that we may not be talking about four
dimensions here. Maybe add some super-symmetric handwaving.
- Say something about the future. How the future is sort-of predictable,
in the sense that I can predict the path of a baseball, although I cannot predict
that the baseball stadium will not be suddenly wiped out by a meteorite or other
unexpected event. It is this essence: the merging of all events into the
past light cone, that distinguishes the future from the past.
Emphasize that quantum uncertainty is a sum over all states is really
what this is all about: the uncertain future can only be expressed over all
possible quantum states precisely because the precise arrangement of the
spacetime foam won't be known until it becomes the past.
Emphasize also the oxymoron: the arrangement of the past affects the future.
- Explore how things like chaos theory have implications for the predictability
of the future, and in particular, how spacetime foams with positive Lyapunov
exponents would behave.
- Explore whether things like 'Hausdorff measure', Sierpinski carpets, etc. are relevant
to the conversation. Explore to see how classical chaos manifesting itself at
the Planck scale may explain certain quantum phenomena. I dunno, this is pretty
far out there.
- Consciousness as a quantum phenomenon.
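The 'positive Lyapunov exponent' mentioned in the chaos-theory bullet above can be made concrete with a standard toy system; this is a generic illustration of mine, nothing specific to spacetime foams. The logistic map at r = 4 is fully chaotic, and its Lyapunov exponent, the orbit-average of log|f'(x)|, comes out positive (in fact exactly ln 2), meaning that nearby initial conditions diverge exponentially: the butterfly effect in one line of algebra.

```python
# Lyapunov exponent of the logistic map f(x) = r*x*(1-x) at r = 4.
# A positive exponent means exponential divergence of nearby trajectories.
import math

def lyapunov(r=4.0, x0=0.2, nburn=1_000, nsteps=100_000):
    """Orbit-averaged log|f'(x)|, with f'(x) = r*(1 - 2x)."""
    x = x0
    for _ in range(nburn):  # discard the transient
        x = r * x * (1.0 - x)
    total = 0.0
    for _ in range(nsteps):
        total += math.log(abs(r * (1.0 - 2.0 * x)))
        x = r * x * (1.0 - x)
    return total / nsteps

print("Lyapunov exponent at r=4:", lyapunov())  # positive; close to ln 2
```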

- Arrow of Time
- We experience time to go forward, even though virtually all equations used in physics don't make a distinction between time moving forward, and time moving backwards. If the equations don't seem to care, then why does time move forward? This is the conundrum of the "Arrow of Time".
- Butterfly Effect
- A popular term in chaos theory. Best understood through an old litany: For want of a nail the shoe was lost, for want of a shoe the horse was lost, for want of a horse the rider was lost, for want of a rider the message was lost, for want of a message the battle was lost, all for want of a nail. It is a statement that small differences in initial conditions can lead to drastically different final outcomes. See also 'Positive Lyapunov Exponent'.
- EPR (Einstein Podolsky Rosen) Paradox
- A famous paper where a conundrum of the Heisenberg Uncertainty Principle is explored ...
- Feynmann Path Integral
- Gestalt of Determinism
- When one places oneself outside the context of time ...
- Grandfather Paradox
- If you had a time machine, could you go back and kill your own grandfather while he was just a child? And if you had, where would that leave you? Time travel is filled with paradoxes.
- Hidden Variables, Bell's Theorem
- The idea that the probabilistic outcomes of quantum mechanics might be accounted for by some as-yet undetected (hidden) controlling variables. Bell's Theorem shows that no local hidden-variable theory can reproduce all of the predictions of quantum mechanics.
- Anthropocentric Universe
- These are physical facts that can be deduced from the existence of mankind and/or of the self. In particular, we can deduce that the past is different from the future, because, well, that is just how we all experience time.
- Inflation
- Inflation for Beginners
- Light Cone
- Many Worlds Hypothesis
- PhD Thesis, 1957, Hugh Everett, Princeton University
- Pre-destination
- The idea that one's actions cannot change the future, that events will unfold in a certain way despite one's best efforts.
- Planck Length
- Platonic
- A concept advocated by Julian Barbour, arguing that time doesn't exist, and that instead one should view the universe as a giant configuration space, consisting of the positions of all particles relative to one another. (In my opinion, the fact that this hypothesis has trouble explaining history, or at least the human perception that there is a past, rather invalidates the whole thing. I am not happy about a theory that explains less, rather than more.)
- Quantum Entanglement
- Quantum Measurement
- Schroedinger's Cat Paradox
- Second Quantization
- Supersymmetry
- Superstring
- Wormhole

First Draft, November 1999

February 2000

Linas Vepstas linas@linas.org