
Quantum mechanics

The flawed multiverse

22 Sep 2011

The Beginning of Infinity: Explanations that Transform the World
David Deutsch
2011 Allen Lane £25.00/$56.00 hb 496pp

Too many worlds?

According to the quantum-information theorist David Deutsch, our modern understanding of how the world works has provided us with “good explanations” that open up essentially infinite possibilities for future progress. One of these explanations is the idea of the quantum multiverse, which Deutsch discussed in the May issue of Physics World (pp34–38, print version only) and to which he devotes a chapter in his book The Beginning of Infinity.

In 1957 Hugh Everett III noted that if quantum mechanics is a universal theory, then it should be applicable to particle detectors, and indeed observers, as well as to individual particles. Consider an experiment where a photon interacts with a partially reflecting surface, and separate photon detectors are positioned to register transmitted and reflected photons. A straightforward quantum-mechanical calculation predicts that the resulting quantum state of this whole set-up should be a linear combination of one where the photon has been detected in the transmitted (but not the reflected) direction, and another where the reverse is true. Experimentally, of course, a photon is found in one or other of the detectors at random, with probabilities that depend on the properties of the reflecting surface.
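
In symbols (the notation here is mine, not the article’s), if t and r are the transmission and reflection amplitudes of the surface, the calculation gives the post-interaction state

$$
|\psi\rangle \;=\; t\,|\mathrm{click}\rangle_{T}\,|\mathrm{silent}\rangle_{R} \;+\; r\,|\mathrm{silent}\rangle_{T}\,|\mathrm{click}\rangle_{R},
\qquad |t|^{2}+|r|^{2}=1,
$$

where the subscripts label the transmitted-beam and reflected-beam detectors.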

To get agreement with experiment, we normally employ a further postulate, known as the Born rule, according to which a measurement causes the wavefunction of the system to “collapse” so that only one of the above outcomes actually occurs. The relative probabilities of the possible outcomes are given by the modulus squared of the corresponding parts of the wavefunction.
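
As a toy numerical check (the 70/30 amplitudes below are my own illustrative choice, not taken from the article):

```python
import numpy as np

# Illustrative amplitudes for a 70/30 beam splitter:
t = np.sqrt(0.7)         # transmission amplitude
r = 1j * np.sqrt(0.3)    # reflection amplitude (phase shift on reflection)

state = np.array([t, r])  # amplitudes for (transmitted, reflected)

probs = np.abs(state) ** 2   # Born rule: modulus squared
print(probs)                 # [0.7 0.3]
print(probs.sum())           # 1.0 (normalization)
```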

However, Everett proposed that the collapse postulate is unnecessary, thanks to decoherence. Once a photon has been detected, its quantum state becomes entangled with the state of the detector, and to perform an interference experiment, coherence would have to be maintained across all the phases associated with the huge number of particles making up the detectors. This is a practical impossibility, so even though the two outcomes coexist, they cannot affect each other in any way; each continues unaware of the other’s existence. In other words, the photon has gone both ways, and the detectors and everything that interacts with them have two different futures: one in which the photon was reflected and another in which it was transmitted.
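
A minimal numerical sketch of why that coherence is unrecoverable, assuming (in my toy model, not Deutsch’s) that registering the photon nudges each of the detector’s N particles by a tiny angle:

```python
import numpy as np

# Toy decoherence model: detecting the photon rotates each of the N particles
# in the detector by a small angle epsilon, so the two macroscopic detector
# states |D_T> and |D_R> differ slightly in every single-particle factor.
epsilon = 0.01                            # per-particle disturbance (illustrative)
overlap_per_particle = np.cos(epsilon)    # single-particle overlap <d_T|d_R>

for N in (1, 10**3, 10**6):
    # The total overlap <D_T|D_R> is a product over all N particles,
    # and it is this overlap that sets the size of any interference term.
    print(N, overlap_per_particle ** N)
# N = 10**6 gives ~2e-22: the two branches can no longer affect each other.
```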

An inevitable consequence of Everett’s theory is that this splitting occurs even if a human observer records the result of the experiment. The observer also evolves into a superposition of two states, each of which is completely unaware of the other, leading them to assume that collapse has occurred and only their branch exists. As Deutsch explains, this splitting spreads out from the experimental apparatus in what he calls a “wave of differentiation” until it eventually encompasses everything – hence the terms “parallel universes” or “many worlds”.

Because the assumption of collapse is no longer required, it has been said that the many-worlds interpretation is “economical with postulates although extravagant with universes”. This should surely be sufficient reason not to dismiss the idea out of hand. However, I believe the many-worlds theory is open to criticism for reasons other than extravagance. One of these concerns probabilities in a situation where both outcomes occur in parallel. If both options are happening, how can it be meaningful to say that one is more probable than the other – as is experimentally the case if the reflector is not exactly 50/50?

As he described in his Physics World article, Deutsch’s response is to propose that before the measurement, the photon is not just a single particle but is actually an (uncountable) infinity of identical or “fungible” particles. After interacting with the reflector, an infinite number of fungible photons exist in both output channels, but the ratio of these numbers is finite, so that each channel has a “measure” proportional to the squared modulus of the wavefunction. Even though an observer knows they are going to evolve into two copies of themselves, they can apparently assign relative probabilities to which copy they expect to become. These probabilities are given by the Born rule.
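
Schematically (again in my notation), if the transmitted and reflected branches carry measures $\mu_T$ and $\mu_R$, the proposal amounts to

$$
\frac{\mu_T}{\mu_R} \;=\; \frac{|t|^{2}}{|r|^{2}},
$$

so that the relative frequencies an observer expects to record reproduce the Born rule.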

Whether Deutsch’s fungibility formulation successfully resolves the question of probabilities is a moot point, but it seems to me that the multiverse concept raises another problem, which other supporters of many-worlds theories thought they had resolved some time ago. This is known as the “preferred basis” problem, and it arises from the fact that there is no unique way to express a quantum state as a superposition of two component states.

Consider a spin-half particle in an eigenstate of the x component of spin. This can be expressed as an equally weighted superposition of the positive and negative eigenstates of the z component – or of the y component, or, indeed, as an appropriately weighted superposition of any two linearly independent spin states. Together, these states constitute a “basis”. Suppose now that we pass such a particle through an apparatus oriented to measure some component of spin. According to the multiverse model, before the measurement, the ratio of the (infinite) numbers of instances of the particle that will appear in the two possible output channels corresponds to the relative probabilities given by the Born rule.
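
A short sketch of this basis dependence (the helper function and angles are mine, for illustration only): a particle prepared in the +x eigenstate yields a different Born-rule ratio for each choice of measurement direction.

```python
import numpy as np

# Spin-1/2 particle prepared in the +x eigenstate, written in the z basis:
plus_x = np.array([1.0, 1.0]) / np.sqrt(2)

def up_probability(theta):
    """Born probability of 'spin up' along a direction at angle theta
    from the z axis in the x-z plane (illustrative helper)."""
    plus_n = np.array([np.cos(theta / 2), np.sin(theta / 2)])  # +1 eigenstate along n
    return np.abs(plus_n @ plus_x) ** 2

for theta in (0.0, np.pi / 4, np.pi / 2):
    p = up_probability(theta)
    print(f"theta = {theta:.3f}: P(up) = {p:.3f}, P(down) = {1 - p:.3f}")
# theta = 0 (a z measurement) gives 50/50; theta = pi/2 (an x measurement)
# gives 1/0: the ratio of 'instances' differs for every measurement choice.
```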

However, this ratio is a function of the direction chosen for the measurement, so the initial state of the particle must depend on the nature of the measurement that is still to be performed. Choosing the basis beforehand to suit the properties of the subsequent measurement seems to me to destroy the objectivity of the description of the initial state and, indeed, of the multiverse. This process also implies an additional assumption, which means that we have lost the multiverse’s economy with postulates, while the extravagance with universes remains. Deutsch appears to recognize this difficulty to some extent; he indicates that it is related to the quantum electron “field”, but he does not explain how this could resolve the problem.

Deutsch’s belief in the existence of the multiverse inspired his ground-breaking contributions to quantum computing, and he believes that a successful implementation of a quantum computer would constitute incontrovertible evidence for it. He argues that a quantum computer can carry out some tasks very much faster than a classical one because it performs a large number of calculations simultaneously in parallel universes. However, I believe that this idea is also challenged by the preferred-basis problem.

To see how, let us take as an example the quantum Fourier transform, which is the core operation in Shor’s algorithm for efficiently obtaining the prime factors of large numbers using a quantum computer. This operation subjects a set of qubits – quantum objects such as spin-half particles that have two possible states – to a series of “unitary” operations, which in the spin-half example amount to subjecting the spins to a series of rotations. This creates an entangled state that is a linear superposition of binary representations of the components of the Fourier transform of whichever function was represented by the original configuration of qubits. These separate components certainly form a basis, but there is no obvious reason why this basis should be preferred over any other, or why this quantum process should not occur in a single universe.
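
A classical sketch of that operation (the qft_matrix helper below is my own construction, not code from the book), showing the transform as a single unitary acting on a three-qubit register:

```python
import numpy as np

def qft_matrix(n_qubits):
    """Unitary matrix of the quantum Fourier transform on n_qubits qubits."""
    N = 2 ** n_qubits
    j, k = np.meshgrid(np.arange(N), np.arange(N))
    return np.exp(2j * np.pi * j * k / N) / np.sqrt(N)

n = 3
F = qft_matrix(n)
assert np.allclose(F.conj().T @ F, np.eye(2 ** n))   # F is unitary

# Register encoding a function with period 4 over the 8 basis states:
state = np.zeros(2 ** n)
state[::4] = 1.0
state /= np.linalg.norm(state)

probs = np.abs(F @ state) ** 2
print(np.round(probs, 3))   # weight concentrates on multiples of N/period,
                            # the pattern that Shor's algorithm reads out
```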

Criticisms of many-worlds theories in general and of the quantum multiverse in particular have been around for a long time now, and it is a pity that Deutsch does not recognize and address some of them in his book. Instead, he devotes a chapter to attacking the “bad philosophy” underlying alternative interpretations, particularly the conventional “Copenhagen” interpretation, which relies on making a distinction between the quantum world of particles and the classical world of detectors and observers. Many of us can certainly see weaknesses in the Copenhagen approach, but this does not mean that the multiverse is immune to criticism.

Quantum physics occupies only two of the 18 chapters of this book, which also surveys a wide range of modern science and philosophy. Deutsch’s main theme is the possibilities for future progress that the ongoing scientific revolution has generated. These include the alleged explanation of quantum physics, the evolution of a culture based on a democratic political system and our ability to achieve anything that is not forbidden by the laws of nature – including immortality. Illness and old age, he writes, “are going to be cured … certainly within the next few lifetimes”.

Deutsch’s ideas are expressed very clearly and the text is enlivened by a number of amusing anecdotes. This is a book that should interest anyone who likes thinking about the deep issues that underlie our understanding of the modern world – provided they maintain some scepticism over its dogmatic tone and reluctance to countenance alternative viewpoints. Deutsch willingly accepts that much of his inspiration comes from the work of Karl Popper, whose mantra “we have a duty to be optimistic” clearly underlies his thinking. However, he would have done well to remember that Popper was often dogmatic, to the point where some wags said that his book The Open Society and its Enemies should have been called “The Open Society by one of its Enemies”!
