Stephen Mumford and Matthew Tugby (eds.)

Metaphysics and Science

Stephen Mumford and Matthew Tugby (eds.), Metaphysics and Science, Oxford University Press, 2013, 244pp., $75.00 (hbk), ISBN 9780199674527.

Reviewed by Alastair Wilson, University of Birmingham

This volume has two main aims. One is to collect together high-quality work from the intersection of metaphysics and the philosophy of science. In this, it succeeds admirably; the essays will be essential reading for anyone working seriously on laws of nature, dispositions, natural kinds, or emergence. The second aim is to demarcate and exemplify the discipline of metaphysics of science, which the editors set out to define in their introduction. Success here is less clear-cut, and there is room for doubt about the value of the definition project. I'll start with some remarks about the individual essays, before turning to the introduction and the banner under which it unites them.

The volume begins with three chapters on laws of nature. In the first of these, John Roberts concisely develops a central theme from his The Law-Governed Universe (2008). The package of views defended in that book includes pragmatist and contextualist elements, but here those elements are separated out from the thesis that the facts about what the laws are depend on the facts about which measurement methods are reliable. We usually suppose that the laws determine which measurement methods are reliable. Roberts suggests inverting this direction of explanation: "What it is to be a law of nature is to be one of the general truths that follow from the reliability of the legitimate measurement methods" (p. 34).

While his project is primarily a defensive one, pre-empting a number of objections, Roberts motivates his 'Measurability Account of Laws' (MAL) on the basis that it can explain the counterfactual reliability of the legitimate measurement methods. The explanation offered turns on a notion of 'basic evidence' (evidence produced by a legitimate measurement method), along with a semantics for counterfactuals that takes as input an 'epistemic context' and that will "settle it that those counterfactuals that must be true in order for the basic sources of evidence to be counterfactually reliable are true" (pp. 37-38). The idea is that "the epistemic norms are what ground the counterfactuals" (p. 36). The semantics for counterfactuals that this would involve is merely sketched, and it will not in any case satisfy the anti-Humean who insists that transcendent laws are required to explain the persistent usefulness of our whole practice of assessing counterfactuals and identifying reliable methods. But it nevertheless helps to flesh out an interesting and minimalist Humean viewpoint.

Jim Woodward's chapter is a substantial and systematic critique of the Mill-Ramsey-Lewis 'best-system analysis' (BSA) of laws of nature -- currently the most popular empiricist theory of lawhood. Woodward argues that the balancing of simplicity and strength characteristic of recent Lewisian developments of the BSA is a distortion of actual scientific methods for establishing the truth of claims about laws and causes. According to Woodward, both the identification of laws and the identification of causes are (in different ways) fundamentally a matter of invariance; he argues vigorously that the BSA fails to do justice to this central insight. (Nor, he argues, can it do justice to the role of initial conditions in science.) Perhaps surprisingly, Woodward does not provide an alternative set of criteria for lawhood or conclude that laws fail to supervene on the Humean supervenience base (HSB). Rather, he suggests that "the entire exercise of asking whether or not causal structure supervenes on the HSB is unilluminating and irrelevant to understanding our conception of causation and how it relates to evidence" (p. 58), and shortly afterwards he extends this conclusion to the case of laws. Here Woodward is rejecting not only the thrust of Lewis's proposed analysis of laws, but also the wider program of providing a metaphysical account of what laws consist in. This anti-theoretical stance sits uncomfortably with the pro-metaphysics attitude of most of the other authors in the volume, and to my mind Woodward does not do enough to justify it. His interesting and valuable discussion of current Humean theories of causation and laws should spur us instead to produce better BSA-style accounts that do more justice to scientific practice.

Marc Lange's "How to explain the Lorentz transformations" is perhaps the highlight of the volume. It is a careful and well-crafted study, containing a thorough treatment of the debate's historical background as well as a potent response to influential recent work by Harvey Brown. Lange deftly applies the machinery of his Laws and Lawmakers (2009) to a case study of special recent interest, the striking fact that all known physical interactions are Lorentz-covariant. In Physical Relativity (2005), Harvey Brown argued that Lorentz covariance is best explained by the content of the force laws, rather than by the nature of spacetime. According to Brown, it is simply an "unexplained brute fact" that all of the actual force laws are Lorentz-covariant. Lange presents an important challenge to Brown's view, arguing that it cannot account for the apparent counterfactual stability of Lorentz covariance under changes in the details of the force laws. Lange shows how the notion of 'sub-nomic stability' developed in Laws and Lawmakers allows for a hierarchy of laws; Lorentz covariance is higher up in the hierarchy and hence more counterfactually stable than the individual force laws, though it is still less counterfactually stable than logical or mathematical truths. Lange claims that this hierarchical picture provides a non-causal explanation of the universality of the Lorentz transformations, an explanation that is unavailable to Brown because of his exclusive focus on causal explanation.

Andreas Hüttemann's essay is closer in content and style to the tradition (sometimes called "metaphysics of science") that places central emphasis on the role of dispositions. The core of Hüttemann's theory is the claim that a cause is "an actual disturbing factor (antidote) to the default behaviour that a system is disposed to display (relative to a causal field)" (p. 118). This account seems of limited use. Part of the reason for this is that 'disturbing factor' is worryingly close to 'cause'; what is a disturbing factor if not a factor that causes a disturbance? Some non-reductive theories of causation are arguably substantive and informative: the interventionist approach promoted by Woodward (2003) and Hitchcock (2001) has many admirers. But in Hüttemann's proposal, the connection to dispositions seems to do little work: the theory boils down to the claim that causation involves interactions between objects possessing properties that manifest differently under different conditions. This doesn't tell us much about how to deal with hard cases. Double prevention, for example, is accommodated only by using a causally-rich process as the "default behaviour that a system is disposed to display" along with corresponding causally-rich dispositions. One of Hüttemann's applications of the theory involves a system consisting of an Enemy fighter, a bomber piloted by Suzy, and a target; Enemy is supposed to have the disposition to prevent Suzy from bombing the target. Given that dispositions like these are infused with causal content, it is unclear how much benefit accrues from giving an account of causation in terms of them.

Jennifer McKitrick's chapter also focuses on dispositions, and in particular on the pandispositionalist view that all (sparse) properties are powers. It is by now familiar that pandispositionalists face a regress challenge: if each disposition-manifestation event consists only in the acquiring of new dispositions, in what sense are dispositions ever made manifest? McKitrick highlights an additional potential regress centred on the notion of a trigger, or stimulus. If all activations of a disposition require triggers, and a triggering event is itself the acquisition of a disposition, then why doesn't the process by which the trigger gets the disposition to activate itself require a trigger? McKitrick explores a number of possible ways out of this regress, and argues that pandispositionalism is difficult to reconcile with the "stimulus-manifestation" model of dispositions according to which triggers are a category of special metaphysical interest. She concludes that pandispositionalism fits most naturally with a "mutual manifestation" model of dispositions of the sort defended by Heil (2003) and by Mumford & Anjum (2011).

The volume shifts gears at this stage, with three chapters on natural kinds. Helen Beebee's contribution is a thorough critique of inferences from the Kripke-Putnam causal theory of reference for natural kind terms to metaphysical conclusions about natural kinds. Nathan Salmon has argued that these inferences from semantics to metaphysics presuppose a "non-trivial essentialism"; but Beebee goes further, arguing that some authors (Brian Ellis is the worst offender) smuggle in more potent forms of essentialism while attributing them to the semantics. There follows a helpful discussion and taxonomy of different ways in which scientific terms could "cross-cut"; that is, ways in which the kind structure utilized by one legitimate theory or discipline could overlap with the kind structure of another such that neither is a sub-classification of the other. Beebee considers cross-cutting within a single "classificatory framework", cross-cutting across different frameworks, and cross-cutting across different incommensurable Kuhnian paradigms. Her aim is not to argue for the existence or legitimacy of any of these different kinds of cross-cutting, but to establish that they are not ruled out by the Kripke-Putnam semantics.

Emma Tobin's chapter examines the relationship between natural properties as invoked in the metaphysical tradition of Quine, Lewis and Armstrong, and natural kinds as invoked in the parallel tradition of Strawson, David Wiggins, Lowe and Ellis. Tobin's overall conclusion is that no need has been demonstrated for a thick notion of natural kind that goes beyond that of a set of things sharing a natural property. While her discussion covers plenty of useful ground, it isn't fully satisfying. Tobin doesn't engage the arguments by Strawson and Lowe in favour of a sui generis conception of kinds; and it seems rather uncharitable to interpret their views as so closely tied to sortals that they can be defeated by the observation that 'gold' and 'water' are not count nouns.

L. A. Paul's chapter is squarely in the "natural properties" tradition. Paul aims to resuscitate a familiar challenge to the "semantic realist" program: Putnam's model-theoretic argument for widespread indeterminacy of reference. Her claim is that, while Putnam's challenge can ultimately be resisted, in resisting it the realist must adopt a surprising and apparently contingent premise about the world's structure. In particular, the realist must suppose that there are no "deep-level symmetries" in the patterns of distribution of the various perfectly natural properties, which would lead to systematic indeterminacy in our reference at the higher level. Perhaps Paul's clearest example is that of positive and negative charge; she suggests that "if the world exhibited enough global symmetry" (p. 193) in the charge distribution, our predicates 'positively charged' and 'negatively charged' would refer indeterminately. The idea is roughly this: for any pattern of linguistic use, there will be some world or other where the global pattern of charge distribution lines up with the global pattern of linguistic usage in just the right way as to generate indeterminacy. No explicit recipe is given for generating worlds which give rise to indeterminacy, of the sort found in Williams (2007), and the case remains to be made that the distributions that lead to referential indeterminacy are widespread enough through modal space to be an epistemic threat. Still, Paul may have put her finger on a deep concern for semantic realists.

The volume concludes with a substantial chapter by Jessica Wilson on the notion of metaphysical emergence and its relation to nonlinearity in scientific theories. Wilson distinguishes weak emergence (compatible with physicalism) from strong emergence (incompatible with physicalism), identifying weak emergence as the notion of most contemporary interest. She also surveys a number of candidate accounts of weak metaphysical emergence, rejecting them either as not involving genuine emergence or as involving emergence that is either incompatible with physicalism or not genuinely metaphysical. She goes on to propose a new account of weak metaphysical emergence in terms of the reduced set of "degrees of freedom" typically associated with high-level theoretical models of systems exhibiting non-linear behaviour. Wilson succeeds, I think, in identifying an important phenomenon that does deserve to be called "emergence". However, I expect the jury to remain out on whether this phenomenon is incompatible with "metaphysical reduction", as is claimed here. Wilson's argument for their incompatibility relies on systems having properties of being characterizable by the specification of a set S of degrees of freedom, and on the application of Leibniz's law to these properties. If attributions of these properties are context-dependent or agent-relative, then the argument risks equivocation.

The introduction, by Stephen Mumford and Matthew Tugby, contains useful summaries of the chapters. It also contains an extended discussion of the disciplinary moniker 'metaphysics of science', offering a definition in terms of "the kind of debate with which contemporary metaphysicians of science are typically concerned" (p. 6). These debates turn out to involve "the more general scientific-cum-metaphysical concepts, concepts which are deployed in all the natural sciences" (p. 6). Spoiler alert: the final definition of metaphysics of science offered is as follows:

The metaphysical study of the aspects of reality, such as kindhood, lawhood, causal power, and causation, which impose order on the world and make our scientific disciplines possible (that is, disciplines which are able to provide predictions -- often novel -- and offer explanations for new facts and anomalies within their given domain), and also the study of the metaphysical relationship between the various scientific disciplines. (p. 14)

We should presumably read the definition not as presupposing specific commitments to kinds and laws and causal powers and causes, but only as presupposing that something or other imposes order on the world. This commitment is still an uncomfortably strong one, and I think we should try to avoid making it a criterion for engaging in metaphysics of science. Humeans characteristically reject talk of regularities being imposed by governing laws and also often reject talk of causation, laws, powers, etc. as unhelpful for understanding fundamental science; but they are still thinking about science and doing metaphysics. A definition of metaphysics of science that excludes the Humean perspective runs the risks of weighing the field down with a party-line ideology and of discouraging interest in metaphysics amongst practitioners of the various special sciences.

More concise and more palatable is what Mumford and Tugby call their "key claim": "the metaphysics of science is the metaphysics of order" (p. 16). By this they do not intend to include the metaphysics of the particular relations that in fact order the world -- surprisingly, the metaphysics of space and time lie outside of their envisaged disciplinary boundaries. Rather, they intend that the metaphysics of science is the study of whichever features of reality are present in all metaphysically possible worlds that are sufficiently orderly as to permit of scientific investigation. This general claim is more irenic than the specific definition, but still by no means neutral. If metaphysical modality is an artificial and gerrymandered grade of modality, as many philosophers suspect it to be, then on this definition the discipline of metaphysics of science is likewise gerrymandered. Sceptics about metaphysical modality could perhaps be persuaded back aboard the bandwagon by modifying the "key claim" as follows: the metaphysics of science is the study of whichever features of reality explain the actual world's being sufficiently orderly as to admit of scientific investigation.

Setting aside doubts about the project of disciplinary definition, this is a fine volume. Several of the essays will be indispensable reference points for scholars and students of metaphysics, and the rest are well worth reading.


Brown, H. R. (2005). Physical Relativity. Oxford: Oxford University Press.

Hitchcock, C. (2001). "The Intransitivity of Causation Revealed in Equations and Graphs", Journal of Philosophy 98: 273-99.

Heil, J. (2003). From an Ontological Point of View. Oxford: Oxford University Press.

Lange, M. (2009). Laws and Lawmakers. Oxford: Oxford University Press.

Mumford, S. & Anjum, R. L. (2011). Getting Causes From Powers. Oxford: Oxford University Press.

Williams, R. (2007). "Eligibility and Inscrutability", Philosophical Review 116(3): 361-399.

Woodward, J. (2003). Making Things Happen. Oxford: Oxford University Press.