Twenty years ago, it was possible to write a book on the foundations of quantum mechanics in which the Everett interpretation was relegated to a footnote (Maudlin 1994, p.4). That would be unthinkable today, in large part because of the work of David Wallace. Nobody has done more to defend, clarify and advance the Everett interpretation over the past dozen years than Wallace, and this book is the culmination of his work in this area. As those who have read Wallace's articles will expect, it is an excellent book, and should be required reading for anyone interested in the foundations of quantum mechanics.

The trouble with quantum mechanics is that it doesn't provide us with unique outcomes to our experiments. If we represent the initial state of a system as quantum mechanics suggests, and we allow that representation to evolve according to its dynamical law, then the result typically does not correspond to any unique outcome, but instead is a superposition of terms representing several distinct outcomes. In response to this problem -- the measurement problem -- a variety of interpretations of quantum mechanics have been developed. These fall into two general categories: either one denies realism, so that quantum mechanics is not taken to be descriptive, or one denies that quantum mechanics is complete, and supplements it with extra physics that *does* pick out one result as actual at the end of an experiment.

Wallace wants to sweep all this away. Taking his lead from Everett (1957), Wallace argues that there is no measurement problem: the superposition described above is a perfectly good description of the system after the experiment. But of course this involves some significant bullet-biting; the reason that the representation doesn't pick out one outcome as actual is that *all* the superposed outcomes are actual, each existing in a different "branch" or "world". Hence the business of *interpreting* quantum mechanics to avoid the measurement problem is simply misguided, and indeed calling Everettian quantum mechanics (EQM) an *interpretation* is really a misnomer, since EQM is just standard quantum mechanics taken literally. However, many readers find Everett's position to be incomprehensible or incoherent, so Wallace sets himself the task of constructing a coherent Everettian theory and defending it against the many objections that have been leveled against such views. The result is a vigorous defense of a strikingly alien ontology, and this is why it makes such compelling reading.

The book is divided into three parts. The first part defends the claim that the formalism of quantum mechanics yields the structure of branching quasi-classical worlds that the Everettian says it does. The second part defends the claim that the weights of these branches -- the squared amplitudes of the terms in the superposition -- give the probabilities of the states of affairs they contain. Given that the arguments of the first two parts show that EQM is a coherent theory, the third part derives various consequences of that theory, concerning uncertainty and possibility, locality, the direction of time and a number of other topics. The three parts are punctuated by two Interludes and an Epilogue in which the author engages an imaginary (or rather, composite) Everett skeptic in dialogue on the merits of these arguments. The book is completed by four substantive technical appendices giving formal proofs of some of the results of the first two parts and providing a rigorous account of the decision theory required for part two.

The first chapter stakes out the territory. Wallace begins by pointing out that realism goes without saying in the sciences: scientists take their theories to provide true descriptions of the world, not just predictions of the behavior of pieces of lab equipment. Quantum mechanics is often taken to challenge this realism, whereas Wallace takes the (methodologically!) conservative position that quantum mechanics represents the structure of the physical world just like any other theory. The presentation of quantum mechanics is standard, except that in addition to identifying the state with a ray in a Hilbert space and the dynamics with a set of unitary transformations on the state, Wallace also includes as part of the theory "some additional structure on the Hilbert space, sufficient to specify the particular system which we are studying" (p.14). This additional structure typically takes the form of a preferred set of basis vectors or a particular decomposition of the Hilbert space into a tensor product of factor spaces. So a set of *N* particles, for example, is modeled as a tensor product of *N* Hilbert spaces, with three position operators (one for each dimension) defined on each individual Hilbert space.

One of the laudable features of the book is that Wallace tries to stay as close as possible to quantum mechanics as it exists in practice. Hence he describes quantum measurement theory in terms of positive operator valued measures (POVMs), rather than following the usual presentation (at least in the philosophical literature) in terms of projection. Wallace describes a POVM algorithm for extracting observable predictions (in the form of probabilities for various macroscopically distinct outcomes) from the quantum state. He formulates the measurement problem in these terms: the problem is that measurement cannot itself be represented as a physical process. In the rest of science, we treat measurement as just another physical process, but in quantum mechanics we appeal to an ad hoc measurement algorithm with no physical interpretation.

Wallace considers, and rejects, various traditional responses to the measurement problem. Instrumentalism doesn't do justice to the practice of physics and relies on an arbitrary micro-macro distinction. Steering a middle ground between realism and instrumentalism, for example by arguing that the world is made of information or that classical logic fails, is a desperate move that commits us to a reformulation of the whole of science. Alternative realist approaches, namely hidden variable theories and dynamical collapse theories, cannot be extended to the relativistic domain. The arguments here are rather quick, but that is fine: since Wallace's contention is that Everett's insight makes these moves unnecessary, the arguments are moot anyway.

Everett's insight, of course, is that the post-measurement superposition of terms described above describes *multiplicity* rather than indefiniteness. Rather than a single measuring device recording an indefinite outcome, the superposition describes multiple measuring devices recording distinct outcomes. Hence the measurement problem is not so much solved as dissolved; since there is no need to appeal to the measurement algorithm to connect quantum mechanics to the world, there is no need to respond to the problematic nature of this connection. But to make good on this proposal requires solving a number of problems, and Wallace finishes Chapter 1 by outlining the challenges facing the Everett interpretation. The major problems are the ontological problem -- what justifies us in regarding the quantum state as a collection of quasi-classical worlds? -- and the probability problem -- how can we make sense of probabilistic quantum predictions in a deterministic branching universe? The response to the former problem makes up the first part of the book, and the response to the latter problem the second part.

Chapters 2 and 3 constitute Wallace's argument that the quantum state is correctly regarded as describing a collection of quasi-classical worlds -- that it is not a misnomer to call EQM the *many-worlds* interpretation. Quantum mechanics makes no mention of worlds, and to modify the theory to add explicit mention of worlds would be to violate the spirit of Everett's proposal. So Wallace takes worlds to be emergent entities -- not directly definable in terms of the underlying physics, but not independent of the underlying physics either. Emergence is often taken to be a mysterious phenomenon, and it is Wallace's task in chapter 2 to argue instead that emergence is a mundane feature that is ubiquitous in science.

Wallace's basic contention -- and it is a plausible one -- is that just about all the entities studied by science are emergent entities. Wallace is a functionalist about emergent entities, regarding them as patterns that play an essential explanatory and predictive role in the theories that posit them. Tigers play an essential role in zoology, and zoology cannot be reduced to physics. Nevertheless, physics *instantiates* zoology, and Wallace gives a sketch of a formal definition of the instantiation relation. The key feature of his account is that the instantiation relation is a three-place one: theory A instantiates theory B over domain D. That is, the "reduction" is local rather than global, in the sense that a simple map from the possible histories of A to those of B may only exist in a particular situation. Whatever the merits of this account of inter-theoretic reduction, Wallace's methodological point is surely right: science routinely and unproblematically populates its theories with emergent entities, and Everettian worlds are just more of the same. That is, given that in certain situations quantum mechanics instantiates classical histories, then superpositions of the quantum-mechanical histories instantiate multiple classical histories.

The latter claim, of course, depends on the lack of significant interference between the quantum-mechanical histories. Chapter 3 provides a precise account of when and how this occurs in terms of decoherence, and along the way provides a corrective to the standard textbook account of the "classical limit" of quantum mechanics. According to the textbook account, the wavepacket of an isolated object approximates a classical trajectory when its characteristic action is large compared to *ħ*. But the reasoning behind this result fails for chaotic systems, and relies on the unrealistic assumption that a classical system can be treated as isolated. Instead, Wallace provides a rigorous formal model of decoherence, proving the Branching-Decoherence Theorem according to which decoherence is a necessary and sufficient condition for the existence of a branching structure of quasi-classical histories. Since entanglement with the environment is practically unavoidable for any macroscopic system, branching into quasi-classical histories will occur whenever microscopic superpositions are amplified -- which is to say, all the time.

An interesting corollary of conceiving of Everettian branches as emerging via decoherence is that there is no answer to the question of how many branches there are. One can choose a finer-grained set of histories, but there is no finest grain, because below a certain resolution interference between terms becomes significant and the branching structure disappears. Wallace likens branches to experiences in this regard: certainly you have experiences, and indeed more experiences of one kind than another, but it makes no sense to ask how many experiences you had yesterday. This feature has an important role in Wallace's treatment of the probability problem.

Wallace's response to the probability problem is the heart of the book, taking up chapters 4 through 6. This is appropriate: it is far from obvious to many people that probability has any place in a deterministic branching theory, and yet it is crucial to the empirical adequacy of EQM that it can recover the Born rule for ascribing probabilities to measurement outcomes. Wallace addresses the probability problem in two passes: Chapter 4 gives an informal overview of his solution, and Chapter 5 develops that solution via a formal, decision-theoretic argument.

Wallace's basic argument in Chapter 4 is that the accusation that EQM has a problem with probability rests on a double standard. *Everyone* has a problem with probability; the nature of probability remains an unsolved puzzle for philosophers, and EQM fares no worse, and perhaps considerably better, than classical physics as a basis for thinking about probability. Wallace rehearses the standard philosophical difficulties with probabilities. How are probabilities related to relative frequencies, and how do they serve as a guide to decision-making? Frequentists treat the link between probability and frequency as definitional, and the link between probability and decision as derivative. But there are well-known difficulties with this approach: whatever the link between probability and relative frequency, it cannot be straightforward. Some attempts to find a place for probability in EQM (including Everett's own) take this approach. They fail, but in precisely the same way that frequentist solutions always fail; their failure should not be taken as indicating that Everettians have special problems with probability.

Rationalists concerning probability turn the problem around; they treat the link between probability and decision as definitional and the link between probability and frequency as derivative. This approach seems to fare considerably better than the frequentist approach, and carries over straightforwardly to Everettian agents facing a branching future. One can then define objective chances functionally; the chances are whatever you are rationally compelled to set your credences to. Wallace's claim is that branch weight plays this role; if EQM is true, then you are rationally compelled to set your credence in an outcome to the weight of the branch containing that outcome. The argument is essentially a symmetry argument. Classically, you assign a chance of 1/6 to each face of a die because of a dynamical symmetry of die-throwing. But in the classical case, the symmetry argument ultimately fails, because only one outcome actually occurs for a given throw, so something in the initial conditions must break the symmetry on which the probability assignment depends. Here Everett can go one better: *nothing* breaks the symmetry, because each outcome occurs in some branch. Given these symmetries, it is provable that you should set your credence to the branch weight.

Chapter 5 is the formal presentation of this symmetry-based decision-theoretic argument. An agent bets on an experiment, receiving a payoff dependent on the outcome. Clearly there are some rationality constraints that her preferences over these bets must obey. Given these constraints, in the classical case one can prove that the agent's preferences are represented by a probability measure and a utility assignment over the outcomes. Wallace adopts the same strategy in the Everettian case, where the agent receives a payoff in each branch of the post-measurement state. He proves a similar representation theorem, except that whereas in the classical case the theorem shows that the agent's preferences are represented by *some* probability measure, in the Everettian case it shows that they are represented by the Born rule.

The proof, of course, depends on some axioms. Wallace assumes some fairly standard axioms of rationality (that the agent's preferences form a total ordering, and obey a diachronic consistency constraint), and some uncontroversial axioms concerning the richness of the space of experimental outcomes. But he also assumes some rationality axioms that are specific to the Everettian context: that an agent cares only about the macrostate and not the microstate that instantiates it, that an agent doesn't care about branching per se, that an agent's preferences supervene on the final physical state, and that the agent's preferences are insensitive to sufficiently small perturbations of the physical state. These are special axioms that require a special defense, and Wallace defends them as prohibitions on decision strategies that exploit artifacts of the model. He finishes the chapter by describing how various alternative proposed strategies for action in Everettian worlds violate these axioms. For example, the rule according to which each branch gets an equal probability violates the combination of the diachronic consistency and branching indifference axioms.

For the purposes of the above proof, it is assumed that you know that EQM is true, and you know the quantum state of the system in front of you. But what if you don't? Chapter 6 argues that Everettians have no special trouble with statistical inference -- inferring the state from statistical data, or inferring that EQM is true from statistical data. One might worry, for example, that since every possible sequence of outcomes for a series of measurements actually occurs, no such sequence can confirm one state over another, or confirm the correctness of quantum mechanics over some alternative theory. Wallace addresses these concerns in three passes. The first pass is essentially a classical statistical approach to hypothesis testing, ruling out hypotheses according to which the data in front of us have low probability. Assume that EQM is true, and EQM and some alternative theory ascribe different probabilities to a given experimental outcome. Then after 10,000 runs of the experiment, the experimenter correctly rules out the alternative theory in branches whose aggregate weight (and hence probability) is very close to 1. The second pass invokes a Bayesian approach to inference: Wallace shows that Bayesian updating applies unproblematically in an Everettian context, in the sense that agents who conditionalize on the data will take that data to confirm EQM in branches with aggregate weight close to 1. The third pass adopts a unified approach to the Born-rule theorem of Chapter 5 and the current statistical inference problem, since as Wallace notes, solving the two problems separately involves repeating many of the same steps twice. 
Wallace proves the Everettian Epistemic Theorem, according to which a rational agent who is unsure of the truth of EQM will have preferences represented by a utility function and a probability function, where conditional on EQM being true the probability function is given by a density operator, and the agent's credence in EQM is updated according to standard Bayesian inference using the density operator to calculate probabilities. This is a very powerful theorem.

Wallace concludes that probability is no problem for EQM: the Born rule ascribes probabilities to outcomes even if all those outcomes actually occur, and inferences from statistical data to credences go exactly as your favorite theory of statistical inference dictates. In fact, EQM is on firmer ground than non-branching theories when it comes to objective probability, since the unbroken symmetry of the branches means that in EQM you can *prove* that the objective probabilities are given by the Born rule. In non-branching theories there is inevitably a gap at this point; without branching, the principle that you should set your credences to the physical chances of events is mysterious.

That concludes the second part of the book and Wallace's defense of EQM. If Wallace is right, then EQM has no internal difficulties, and hence quantum mechanics, taken literally, shows us that the world is a branching structure of superposed histories. The third part of the book spins out various consequences of this realization. Chapter 7 addresses how notions of uncertainty, possibility and identity appear from an Everettian perspective. The first two parts are studiously silent on these topics; nothing in the treatment of probability, for example, assumes that probability is a measure of uncertainty. Indeed, one might take the lesson of EQM to be that probability is a measure of something other than uncertainty, such as how much you care about your various successors. Wallace takes a more conservative line, employing a principle of charity to infer that when Everettian agents say they are uncertain whether an event will occur, they just *mean* that the event in question occurs in some but not all future branches. Otherwise, agents in an Everettian universe would be radically mistaken in their uncertainty claims. Wallace defends this position by appealing to the tradition according to which there is no more to what makes a semantic theory true than goodness of fit with the way the language is used. Epistemic possibility is analyzed along the same lines, but Wallace notes that other uses of possibility may require other kinds of analysis.

In a similar vein, one might take EQM to have radical consequences for the identity of objects (including persons); one might take it that, since objects undergo branching, an object is a hydra-like rather than worm-like structure in spacetime. Again, Wallace takes a conservative line, noting that the hydra view of identity is incompatible with his conservative interpretation of uncertainty claims (since according to the hydra view, I will see both outcomes of a two-outcome experiment). He adopts instead a Lewisian view, according to which an object is indexed to a complete quasi-classical history. On this view, the hydra-like structure above is composed of a number of distinct objects whose histories initially overlap but later diverge. Alternatively, one might adopt a stage view according to which an object is a temporal stage of a Lewisian space-time worm; given Wallace's methodological naturalist scruples, he is inclined to think there is no fact of the matter about which is correct.

Chapter 8 explores the question of the relationship between the quantum state and the world. Wallace is a realist, and so takes quantum states to describe physical systems. But the quantum state is a vector in a high-dimensional Hilbert space, and we certainly aren't accustomed to thinking of systems this way; for one thing, we think of physical systems as existing in *three* spatial dimensions. Wallace's proposal, which he calls spacetime state realism, is that the quantum state of a system can be taken to represent the intrinsic properties of each local part of that system, and hence that the quantum state can be taken to describe a three-dimensional world after all. The way this is achieved is by tracing over all components of the state other than those representing the subsystem in question; hence the intrinsic properties of the subsystem are represented by a density operator. Wallace uses this proposal to demonstrate that EQM is local, in the sense that the state of any region depends only on the state of some cross-section of its past light cone. There is no action at a distance. However, EQM is nonseparable, since the density operators of two subsystems do not determine the density operator of their union. Hence the intrinsic properties of a system can outrun the intrinsic properties of its local subsystems. It is a virtue of Wallace's EQM that it provides such clear and unequivocal answers to the vexed question of quantum nonlocality.

Chapter 9 takes on the direction of time. EQM is time asymmetric on its face: branching occurs towards the future, but not towards the past. But what is the source of this asymmetry? Dissatisfied with existing accounts that locate the source of the asymmetry either in the large volume of the equilibrium region of phase space or in the low entropy of the early universe, Wallace starts from scratch. He takes it that any account of irreversibility needs to account for the success of quantitative theories of irreversible processes. To that end, he constructs a formal account in which an irreversible, stochastic dynamics for the macroscopic properties of systems emerges from a reversible deterministic underlying microdynamics. The irreversibility in the macrodynamics is put in by hand; one could just as well construct a backward macrodynamics which takes later states to earlier ones. What is striking is that the forward macrodynamics is highly empirically successful, and the backward macrodynamics is an empirical failure. Wallace suggests that the source of this asymmetry (in both the classical and quantum case) is that any *simple* microstate is forward predictable; only hopelessly gerrymandered microstates are not forward predictable (e.g., those obtained by evolving a simple state forwards and time-reversing the result). Hence the irreversibility inherent in EQM (and elsewhere) can be explained without appeal to a low-entropy past hypothesis; simplicity is a microphysical property, whereas entropy is a macrophysical quantity.

Chapter 10 covers six topics that did not, in Wallace's opinion, merit their own chapter, including quantum Russian roulette, the possibility of observing other branches, and the quantum mechanics of time travel. There is much of interest here, and plenty of potential for future development of the Everettian program, but for reasons of space I will not discuss this less fully-developed material in detail.

As a whole, the book is a tour de force. Wallace clears away the fog of confusion surrounding EQM with clear philosophical argumentation and rigorous (and creative) mathematical analysis. By this means, he makes a strong case that if one treats quantum mechanics just as one treats any other scientific theory -- as a literal description of the physical world -- then EQM and its branching structure of quasi-classical worlds is the inevitable result.

But is it an irresistible case? I'm not sure that it is, so let me point out a few places where resistance seems appropriate. First, I worry that the spacetime state realism laid out in Chapter 8 makes the intrinsic properties of systems quite mysterious. It is not that the density operators that represent these properties are complicated mathematical entities; as Wallace rightly points out, there is no rule stating that "only those items to which one is introduced sufficiently early on in the schoolroom get to count as possible representatives of physical quantities" (p.299). Rather, my worries are of a Humean nature: how are these complicated physical structures manifest in experience? Wallace notes that even in classical electromagnetism, we have only an indirect grasp of the vector field via the acceleration of test particles (p.297). But do we have even that much grasp of Wallace's density operator field? Consider a pair of spin-1/2 particles in the singlet state. The density operator representing the intrinsic properties of each particle can be represented as a 2x2 matrix, but how do we gain any intuitive grasp on the import of this structure, indirect or otherwise? Presumably the properties that instantiate this structure are somehow made manifest in the outcomes of spin measurements, but the matrix elements themselves determine a *probability distribution* over outcomes. How do the actual outcomes give us insight into probabilistic properties? At the very least, in order for us to have any grasp at all on the properties of spacetime state realism, Wallace's dissolution of the probability problem for EQM has to be successful.

So is it successful? I don't take issue with the technical aspects of Wallace's treatment; the Deutsch-Wallace proof strategy has been around long enough to have been well picked over. But it is worth noting how radical its conclusions are: it is not just that it shows how objective chance arises within EQM; it also gives us "some reason to think that objective chance does not even make sense outside the [Everettian] quantum-mechanical context" (p.396). Indeed, it seems hard to avoid this second conclusion; the physical symmetries on which the Deutsch-Wallace strategy depends are simply not available outside the branching context (p.147, p.229). There is something satisfying about this: it explains why we have had so much trouble understanding objective chance, because no such understanding was available prior to EQM. But the conclusion that chance makes no sense outside EQM -- for example, that de Moivre was unwittingly referring to quantum branching events -- is hard to swallow.

My suspicion here is that if branch weight gives us what we were looking for -- that to which we should set our credences -- then we have been looking for the wrong kind of thing. If we insist on setting our credences to physical properties of individual events, then perhaps it is no wonder that only an exotic physical theory like EQM could satisfy our demands. (If we think of *scarcity* as a physical property of individual birds, then only an exotic theory of ornithology could satisfy our demands.) But what is the alternative? As Wallace concedes, the frequentist program for understanding chances is not dead yet, at least in its Lewisian "best-system" variant (p.126), so the project of looking for chances in the *distribution* of events rather than in properties of individual events is a potentially viable one. But it is far from clear whether such a view would cohere with EQM; every frequency is actual in some branch, and there seems little prospect of making branch weight relevant to chance conceived in this way.

Finally, I am not convinced that the interpretation of the quantum state is as straightforward as Wallace suggests -- that EQM is the inevitable result of treating quantum mechanics as we treat any other theory. *If* you interpret the quantum state as representing something like a field, then the ontology of EQM is certainly a natural one. Recall from above, though, that Wallace recognizes additional structure in addition to the bare Hilbert space, structure that determines what kind of system we are dealing with. This recognition of additional structure is significant, as too often it goes unnoticed, but one might worry about the extent to which such a move *presupposes* some pre-quantum description of the world, rather than reading the structure of the world directly off the quantum description (cf. Healey 2012). Furthermore, the sciences employ mathematical structure in a variety of ways, not just as direct representation of physical structure. So the argument for treating the quantum state as Wallace requires is at least in part an eliminative one: we can't interpret the quantum state as a distribution over unrealized possibilities because of interference phenomena, and because mechanisms for picking out one possibility as actual (like Bohm's) violate relativity. These eliminative moves may be old hat, and nowhere near as interesting as Wallace's positive project, but they are still steps in his argument. And what if (as suggested above) the probability problem cannot be solved, and the EQM reading of the quantum formalism is equally problematic? Then we have to go back to the drawing board -- back to the question of what *interpretations* of the quantum formalism are possible. Quantum mechanics has the dubious distinction of being the only theory (as far as I know) which was developed without its developers having any idea what sort of thing the theory is *about*.
EQM is one fascinating option, and Wallace does much to make it more plausible, but I think it is too soon to claim that it is the only possibility.

REFERENCES

Everett, Hugh (1957), "'Relative State' Formulation of Quantum Mechanics", *Reviews of Modern Physics* 29: 454-462.

Healey, Richard (2012), "Quantum Theory: A Pragmatist Approach", *British Journal for the Philosophy of Science* 63: 729-771.

Maudlin, Tim (1994), *Quantum Non-Locality and Relativity*. Oxford: Blackwell.