E-034 SEP HDET - Determinism Hard (Thesis 3 Excerpt)

Exact excerpt

A huge range of responses to the Hole Argument has been published since 1989; after an initial burst of articles in the late 1980s and early-to-mid 1990s, there was a relatively quiet period, followed by a revival of interest from 2011 onward (see [the hole argument](https://plato.stanford.edu/entries/spacetime-holearg/) for in-depth consideration of a number of responses to the argument). One popular family of responses (e.g., Hoefer 1996, Pooley 2006) starts from the observation that the differences represented by the models <_M_, **g**, **T**> and <_M_, _h*_ **g**, _h*_ **T**> are purely _haecceitistic_ and therefore may be rejected if one adopts an anti-haecceitistic metaphysics. Following Belot & Earman (2001), anti-haecceitist substantivalism is sometimes called “sophisticated substantivalism”.
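For readers who want the notation unpacked, the following is a minimal sketch of the standard hole-argument construction; the label _H_ for the “hole” region and the schematic form of the field equations are our additions, not part of the excerpt.

```latex
% Hole-argument setup (standard presentation; the label H for the "hole" region is ours).
% h is a diffeomorphism of M that is the identity outside an open region H; general
% covariance guarantees that the drag-along of a solution is again a solution, so the
% two models agree everywhere outside H (in particular on all data to the past of it)
% yet differ inside H: the apparent failure of determinism the argument exploits.
\[
  h : M \to M, \qquad h\big|_{M \setminus H} = \mathrm{id}, \qquad
  G[\mathbf{g}] = 8\pi\,\mathbf{T} \;\Longrightarrow\; G[h^{*}\mathbf{g}] = 8\pi\,h^{*}\mathbf{T}.
\]
```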
#### 4.3.2 Singularities
The separation of space-time structures into manifold and metric (or connection) facilitates mathematical clarity in many ways, but also opens up Pandora’s box when it comes to determinism. The indeterminism of the Earman and Norton hole argument is only the tip of the iceberg; singularities make up much of the rest of the berg. In general terms, a singularity can be thought of as a “place where things go bad” in one way or another in the space-time model. For example, near the center of a Schwarzschild black hole, curvature increases without bound, and at the center itself it is undefined, which means that Einstein’s equations cannot be said to hold, which means (arguably) that this point does not exist as a part of the space-time at all! Some specific examples are clear, but giving a general definition of a singularity, like defining determinism itself in GTR, is a vexed issue (see the entry on [singularities and black holes](https://plato.stanford.edu/entries/spacetime-singularities/) and Earman (1995) for extensive treatments; Callender and Hoefer (2001) gives a brief overview). We will not attempt here to catalog the various definitions and types of singularity.
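To make “curvature increases without bound” concrete, one standard diagnostic (our illustration, not part of the excerpt) is the Kretschmann scalar of the Schwarzschild solution, written here in geometric units (G = c = 1) with mass parameter _M_:

```latex
% Kretschmann scalar for the Schwarzschild metric (geometric units G = c = 1).
% The divergence as r -> 0 is coordinate-independent; by contrast, at the event horizon
% r = 2M the scalar is finite (K = 3/(4M^4)), so the apparent horizon singularity of
% Schwarzschild coordinates is merely a coordinate artifact.
\[
  K \;=\; R_{abcd}\,R^{abcd} \;=\; \frac{48\,M^{2}}{r^{6}}
  \;\longrightarrow\; \infty \quad \text{as } r \to 0 .
\]
```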
Different types of singularity bring different types of threat to determinism. In the case of ordinary black holes, mentioned above, all is well outside the so-called “event horizon”, which is the spherical surface defining the black hole: once a body or light signal passes through the event horizon to the interior region of the black hole, it can never escape again. Generally, no violation of determinism looms outside the event horizon; but what about inside? Some black hole models have so-called “Cauchy horizons” inside the event horizon, i.e., surfaces beyond which determinism breaks down.
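For a concrete case (our illustration, not part of the excerpt): in the Reissner–Nordström model of a charged black hole, the event horizon and the interior Cauchy horizon are the two roots of a single quadratic, and determinism from exterior initial data is guaranteed only down to the inner root.

```latex
% Horizons of the Reissner-Nordstrom black hole (geometric units G = c = 1, charge 0 < Q < M).
% r_+ is the event horizon; r_- is the interior Cauchy horizon, beyond which initial data
% given outside the black hole no longer fixes the evolution uniquely.
% Setting Q = 0 recovers the Schwarzschild event horizon r_+ = 2M, with no Cauchy horizon.
\[
  r_{\pm} \;=\; M \pm \sqrt{M^{2} - Q^{2}} .
\]
```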
Another way for a model spacetime to be singular is to have points or regions go missing, in some cases by simple excision. Perhaps the most dramatic form of this involves taking a nice model with a space-like surface _t_ = _E_ (i.e., a well-defined part of the space-time that can be considered “the state of the world at time _E_”), and cutting out and throwing away this surface and all points temporally later. The resulting spacetime satisfies Einstein’s equations; but, unfortunately for any inhabitants, the universe comes to a sudden and unpredictable end at time _E_. This is too trivial a move to be considered a real threat to determinism in GTR; we can impose a reasonable requirement that space-time not “run out” in this way without some physical reason (the spacetime should be “maximally extended”). For discussion of precise versions of such a requirement, and whether they succeed in eliminating unwanted singularities, see Earman (1995, chapter 2).
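A toy version of the excision move, under the simplifying assumption of flat (Minkowski) spacetime, shows why the “maximal extension” requirement is needed:

```latex
% Excision toy model (our illustration): Minkowski spacetime truncated at t = E.
% (M', eta) satisfies the vacuum field equations at every point it contains, but the
% worldline of any inertial observer reaches t -> E in finite proper time and simply stops.
% Since (M', eta) embeds isometrically into full Minkowski space, it is not maximally
% extended, and the requirement mentioned above excludes it.
\[
  M' \;=\; \{\, (t,x,y,z) \in \mathbb{R}^{4} : t < E \,\}, \qquad
  \eta \;=\; -dt^{2} + dx^{2} + dy^{2} + dz^{2} .
\]
```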
The most problematic kinds of singularities, in terms of determinism, are _naked singularities_ (singularities not hidden behind an event horizon). When a singularity forms from gravitational collapse, the usual model of such a process involves the formation of an event horizon (i.e., a black hole). A universe with an ordinary black hole has a singularity, but, as noted above, nothing unpredictable happens as a result (outside the event horizon, at least). A naked singularity, by contrast, has no such protective barrier. In much the same way that anything can disappear by falling into an excised-region singularity, or appear out of a white hole (white holes themselves are, in fact, technically naked singularities), there is the worry that anything at all could pop out of a naked singularity, without warning (hence, violating determinism _en passant_). While most white hole models have Cauchy surfaces and are thus _arguably_ deterministic, other naked singularity models lack this property. Physicists disturbed by the unpredictable potentialities of such singularities have worked to try to prove various _cosmic censorship hypotheses_ that show—under (hopefully) plausible physical assumptions—that such things do not arise by stellar collapse in GTR (and hence are not liable to come into existence in our world). To date no very general and convincing forms of the hypothesis have been proven (see the entry on [singularities and black holes](https://plato.stanford.edu/entries/spacetime-singularities/), section 4), so the prospects for determinism in GTR as a mathematical theory do not look terribly good.
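Since much of this turns on whether a model possesses a Cauchy surface, it may help to have the standard definition on the table (our gloss of textbook usage):

```latex
% Cauchy surface and global hyperbolicity (standard textbook definitions).
% A spacetime admitting a Cauchy surface is called globally hyperbolic; by the
% Choquet-Bruhat--Geroch theorem, suitable initial data on \Sigma then has a unique
% maximal development, which is the sense in which such models are "arguably deterministic".
% Spacetimes with naked singularities typically fail to be globally hyperbolic.
\[
  \Sigma \subset M \ \text{is a Cauchy surface} \;\iff\;
  \text{every inextendible timelike curve in } (M, \mathbf{g}) \ \text{meets } \Sigma \ \text{exactly once}.
\]
```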
### 4.4 Quantum mechanics
As indicated above, QM is widely thought to be a strongly non-deterministic theory. Popular belief (even among most physicists) holds that phenomena such as radioactive decay, photon emission and absorption, and many others are such that only a _probabilistic_ description of them can be given. The theory does not say what happens in a given case, but only says what the probabilities of various results are. So, for example, according to QM the fullest description possible of a radium atom (or a chunk of radium, for that matter) does not suffice to determine when a given atom will decay, nor how many atoms in the chunk will have decayed at any given time. The theory gives only the probabilities for a decay (or a number of decays) to happen within a given span of time. Einstein and others thought that this was a defect of the theory that should eventually be removed, perhaps by a supplemental _hidden variable_ theory[[7](https://plato.stanford.edu/entries/determinism-causal/notes.html#note-7)] that restores determinism; but subsequent work showed that no such hidden variables account could exist. At the microscopic level the world is ultimately mysterious and chancy.
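As an illustration of this “probabilities only” character (our example, using the standard exponential-decay law; the half-life figure for radium-226 is approximate):

```latex
% Probabilistic description of radioactive decay (standard exponential-decay law;
% the half-life quoted for radium-226 is approximate). For a chunk of N independent atoms,
% the number decayed by time t is binomially distributed with success probability
% 1 - e^{-\lambda t}; nothing in the quantum state singles out which atoms decay, or when.
\[
  P(\text{decay within } t) \;=\; 1 - e^{-\lambda t}, \qquad
  \lambda \;=\; \frac{\ln 2}{T_{1/2}}, \qquad
  T_{1/2}(\text{Ra-226}) \approx 1600\ \text{yr}.
\]
```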
So goes the story; but like much popular wisdom, it is partly mistaken and/or misleading. Ironically, quantum mechanics is one of the best prospects for a genuinely deterministic theory in modern times. Everything hinges on what interpretational and philosophical decisions one adopts. The fundamental law at the heart of non-relativistic QM is the Schrödinger equation. The evolution of a wavefunction describing a physical system under this equation is normally taken to be perfectly deterministic.[[8](https://plato.stanford.edu/entries/determinism-causal/notes.html#note-8)] If one adopts an interpretation of QM according to which that’s it—i.e., nothing ever interrupts Schrödinger evolution, and the wavefunctions governed by the equation tell the complete physical story—then quantum mechanics is a perfectly deterministic theory. There are several interpretations that physicists and philosophers have given of QM which go this way. (See the entries on [quantum mechanics](https://plato.stanford.edu/entries/qm/) for general discussion and [Everettian quantum mechanics](https://plato.stanford.edu/entries/qm-everett/) and [many-worlds interpretation of quantum mechanics](https://plato.stanford.edu/entries/qm-manyworlds/) for discussion of the most prominent such interpretation).
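The determinism of pure Schrödinger evolution can be stated compactly; the following is the standard textbook form for a time-independent Hamiltonian:

```latex
% Deterministic Schrödinger evolution (time-independent Hamiltonian \hat{H}).
% Given \hat{H}, the state at t = 0 fixes the state at every other time, past and future,
% via a unitary operator U(t): same state now, same states at all other times.
\[
  i\hbar\,\frac{\partial}{\partial t}\,\psi(t) \;=\; \hat{H}\,\psi(t)
  \quad\Longrightarrow\quad
  \psi(t) \;=\; e^{-i\hat{H}t/\hbar}\,\psi(0) \;=\; U(t)\,\psi(0),
  \qquad U(t)^{\dagger}\,U(t) = I .
\]
```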
More commonly in the 20th century—and this is part of the basis for the popular wisdom—physicists resolved the [quantum measurement problem](https://plato.stanford.edu/entries/qt-issues/index.html#MeasProb) by postulating that some process of “collapse of the wavefunction” occurs during measurements or observations that interrupts Schrödinger evolution. The collapse process is usually postulated to be indeterministic, with probabilities for various outcomes, _via_ Born’s rule, calculable on the basis of a system’s wavefunction. The once-standard Copenhagen interpretation of QM posits such a collapse. It has the virtue of solving certain problems such as the infamous Schrödinger’s cat paradox, but few philosophers or physicists can take it very seriously unless they are instrumentalists about the theory. The reason is simple: the collapse process is not physically well-defined, is characterised in terms of an anthropomorphic notion (_measurement_), and feels too _ad hoc_ to be a fundamental part of nature’s laws.[[9](https://plato.stanford.edu/entries/determinism-causal/notes.html#note-9)] In recent decades, it has become more common to present the “collapse of the wavefunction” as a merely _effective_ or _apparent_ phenomenon, usually by appealing to some sort of “decoherence” that renders the continuing existence of superpositions empirically undetectable. On this approach, the deterministic and unitary evolution of quantum states is not interrupted (see the entry on the [many-worlds interpretation of quantum mechanics](https://plato.stanford.edu/entries/qm-manyworlds/)).
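The indeterministic step on collapse accounts enters through the Born rule; a compact statement for the simplest (ideal, non-degenerate) case, in our formulation, is:

```latex
% Born rule and collapse for an ideal measurement of an observable A with
% non-degenerate eigenstates |a_i>. Schrödinger evolution fixes \psi deterministically
% between measurements; the collapse step below is the only place chance enters, and it
% is triggered by "measurement", the anthropomorphic notion criticised in the text.
\[
  \Pr(\text{outcome } a_i) \;=\; \bigl|\langle a_i \mid \psi \rangle\bigr|^{2},
  \qquad
  \psi \;\longmapsto\; \lvert a_i \rangle \ \text{upon obtaining outcome } a_i .
\]
```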
In 1952 David Bohm created an alternative theoretical framework for non-relativistic QM that realizes Einstein’s dream of a hidden variable theory, restoring determinism and definiteness to micro-reality. In [Bohmian quantum mechanics](https://plato.stanford.edu/entries/qm-bohm/), unlike other interpretations, it is postulated that all particles have, at all times, a definite position and velocity. In addition to the Schrödinger equation, Bohm posited a _guidance equation_ that determines, on the basis of the system’s wavefunction and particles’ initial positions and velocities, what their future positions and velocities should be. As much as any classical theory of point particles moving under force fields, then, Bohm’s theory is deterministic. Amazingly, he was also able to show that, as long as the statistical distribution of initial positions and velocities of particles is chosen so as to meet a “quantum equilibrium” condition, his theory is empirically equivalent to standard Copenhagen QM. However, and unfortunately, as Wallace (2020) has forcefully argued, Bohmian mechanics and its later extensions do not yet offer an alternative to the standard quantum field theories of the Standard Model of particle physics. One approach to extending Bohmian mechanics to general quantum field theories, that of John Bell, makes the theory stochastic rather than deterministic.
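For reference, in the usual first-order (de Broglie–Bohm) formulation the velocity of each particle is itself fixed by the wavefunction and the particle positions, so the initial positions (together with the initial wavefunction) are the only extra data the theory needs; the equations below are the standard textbook form for spinless particles.

```latex
% de Broglie-Bohm dynamics for N spinless particles with actual positions Q_1, ..., Q_N.
% The universal wavefunction obeys the Schrödinger equation; the guidance equation below
% then fixes every velocity from the wavefunction and the positions, so the dynamics is
% deterministic. Quantum equilibrium: if the initial positions are |psi_0|^2-distributed,
% they remain |psi_t|^2-distributed (equivariance), reproducing the Born-rule statistics.
\[
  \frac{dQ_{k}}{dt} \;=\; \frac{\hbar}{m_{k}}\,
  \operatorname{Im}\!\left(\frac{\nabla_{k}\psi}{\psi}\right)\!\bigg|_{(Q_{1},\dots,Q_{N})},
  \qquad k = 1,\dots,N .
\]
```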