Imitation of Rigor: An Alternative History of Analytic Philosophy

Mark Wilson, Imitation of Rigor: An Alternative History of Analytic Philosophy, Oxford University Press, 2021, 207pp., $70.00 (hbk), ISBN 9780192896469.

Reviewed by Katherine Brading, Duke University

2023.05.1


Imitation of Rigor is a book about philosophical methods and the misuse of “rigor”, most heinously within some strands of contemporary metaphysics. The subtitle is “an alternative history of analytic philosophy” because one of Mark Wilson’s aims is to “illustrate how our subject [i.e., philosophy] might have evolved if dubious methodological suppositions hadn’t intervened along the way” (xviii). The book is rich in examples, and his argument depends on our attention to the details. I shall try to explain the big picture and hence why investing time in the details matters.

Here is one way to read the argument of the book. In the late nineteenth and early twentieth centuries, philosopher-physicist-mathematicians (Wilson’s main protagonist here is Heinrich Hertz) sought to axiomatize physics as a means of clarifying its content. Carnap and others picked up on this method, and from here was born the notion of “Theory T” (3) as an ideal of both scientific and philosophical theorizing. Contemporary metaphysicians have, in turn, taken up this method of doing philosophy, thereby placing emphasis on a particular form of rigor. This, Wilson argues, is a mistake.

The underlying assumption attributed to contemporary metaphysicians is that the output of science (in the long run) will be an axiomatized theory of everything: a single theory with unlimited scope, unified via its axiomatic structure. The gap between what is achievable in practice and such an ideal outcome is held to be a matter of no metaphysical import.

This underlying assumption has three specific consequences that Wilson challenges. First, it means that, insofar as such metaphysicians engage with the structure of scientific theories, they presume that the important content of scientific theorizing is contained within axiomatic presentations. Wilson argues against this.

Second, metaphysicians have sought axiomatic presentations of metaphysical theories on the grounds that this approach has proven successful in science. As a result, axiomatization has been adopted as a method for achieving precise meaning in metaphysical theorizing (whether of words or concepts). For Wilson, this is “ersatz rigor” (about which more below).

Third, contemporary metaphysicians (or rather, those whom Wilson is targeting) have championed maximal generality of an axiomatized theory as a standard for success in metaphysical theorizing, suggesting that this parallels success in science. Wilson denies the latter, and thereby rejects this justification for such a methodological precept in metaphysics.

To make his case, Wilson begins with Hertz’s attempts to resolve conceptual puzzles in the foundations of classical mechanics by means of formal axiomatization, and more specifically his treatment of the notion of “force” (see Eisenthal 2021). Wilson’s diagnosis of the situation Hertz confronted is that, when pressed into service in different problem-solving contexts, “force” fragments into several distinct and more precise concepts, each appropriate for its own domain of application. Wilson argues that, notwithstanding Hertz’s efforts, mechanics has moved forward not by unifying the various more precise force concepts under a single, general concept, but by explicitly articulating both the boundaries of the domains of application and the means by which we move across those boundaries without disaster (that is, without contradictions and/or failures in applicability).

Through detailed discussion of these developments and more, Wilson builds his evidence for the claim that, given how science has in fact unfolded since Carnap, we have good reason to reject the above underlying assumption about the output of science. Science itself provides no good grounds for presuming an axiomatized general theory as its eventual output, no matter how idealized or how indefinitely “in the long run”. The axiomatization project in science has already failed. Moreover, he argues, it failed in ways highly informative about our epistemic situation in the world and our ability to theorize about portions of it. He offers an alternative model of scientific theorizing and an alternative philosophical method that takes this conclusion seriously.

Wilson’s positive proposal for philosophical methodology suggests setting aside “Theory T thinking” and instead paying attention to the philosophical importance of (i) the way theories fragment under the pressures of application, and (ii) the detailed work that goes into tying different fragments together in order to be able to apply our theories. He builds his position by looking carefully at successful practices of scientific theorizing. If you think, as I do, that what is achievable—and what is not—in practice for us as the kinds of epistemic creatures we are is philosophically informative, then we must take Wilson’s arguments and his evidence seriously. In the end, this requires working through his examples.

The plentiful examples offered by Wilson illustrate the ways our concepts evolve in response to the demands of use in specific contexts, and become fragmented as those contexts diverge. Wilson champions “multiscalar architectures” as a powerful way to understand how we theorize successfully in this situation. At different length scales, different behaviors dominate. The methodological strategy is to treat a complex problem “only to the depth required to extract larger-scale answers that are normally trustworthy” while at the same time investigating the circumstances under which these answers hold and fail (93). Where they fail, we move to a different scale and repeat the process. Wilson emphasizes the “tremendous efficiencies of multiscalar schemes” in which “each computational scale relies upon relatively simple ‘dominant behavior’ rules,” supplemented by submodels targeting circumstances where dominant behavior at a different scale becomes important for the problem at hand (101).

Pursuing this strategy yields a multiscalar patchwork, but not yet a multiscalar architecture. The next crucial observation concerns the behavior of language in this multiscalar theorizing: the same term (e.g., “force”) comes to mean subtly different things in different models. One potential solution would be to enforce semantic identity, and to do this by attempting to axiomatize our collection of models and dominant behavior rules. This is the strategy pursued by Hertz, Wilson argues, and it turned out not to work. We can now get an inkling as to why. Within each domain of modeling, terms such as “force” (see chapter 4) and “cause” (see chapter 6) will “specialize to accommodate parochial demands” and in general the resulting specializations will not be consistent with one another (108). Importantly, these very inconsistencies will be necessary for the success of theorizing in the different domains. As a result, absent semantic identity of key terms, we must develop protocols by which to sew the different models together. Multiscalar architectures succeed in doing just this, and Wilson explains the methods by which they do so.

The pursuit of a unified axiomatization, Wilson argues, has persisted in analytic philosophy today as the “gold standard” for successful methodology despite the fact that, with respect to the foundations of classical mechanics—and indeed physics more generally—it has failed. He writes:

The standards of “ersatz rigor” of which this book complains reflect this excessive reliance upon “epistemologically idealized theories,” which inaccurately presume that the troublesome anomalies of real-life descriptive practice will someday become vanquished with a swift sweep of the axiomatic pen. (151)

He sees this as particularly problematic in the work of some of today’s analytic metaphysicians, identifying views of David Lewis, L. A. Paul, and Ted Sider among his targets.

Wilson objects to a conception of metaphysical practice which suggests the following parallels between metaphysical and scientific theorizing. Metaphysicians investigate our pre-scientific concepts and seek to formalize them into general theories, just as science achieves precision for its concepts via axiomatized theories; and since this method is successful in science, we should also expect it to be successful in metaphysics. Moreover, such metaphysicians presume, implicitly (or sometimes explicitly), that because metaphysical concepts are prior to the concepts of science, they are not subject to revision due to developments in science, wherever science takes us conceptually. Wilson pushes back. Insofar as the metaphysicians claim to be following the methods of science, they fail to do so if they focus on the pursuit of a single, all-encompassing, axiomatized theory. The development of science teaches us two things, he says: axiomatization is an insufficient goal and we should instead expect fragmentation; and this fragmentation will not preserve our pre-scientific conceptual architecture.[1]

The price of excessive focus on axiomatization has been that we have under-valued the philosophical importance of the places where our theories and concepts don’t work as expected, and have therefore under-valued the methods for exploring how we proceed in such circumstances, and the ways we patch our different pieces of theorizing together.

In the final chapter, Wilson describes two approaches to precisification of meaning prominent in the early twentieth century. The first ties meanings to “intuitions”; the second takes an axiomatic approach. What is missing in each case—as the later twentieth century went on to articulate—is the relationship between meaning and practice. As I read it, Wilson’s work (in Wandering Significance and Imitation of Rigor) is about what happens when we recognize that our intuitions are precisified but also complexified, fragmented, and transformed by practice, and that, since in natural science (unlike in mathematics) implicit definition is insufficient for meaning, our axiomatizations are complexified, delimited, and proliferated by practice.

Against the metaphysician who wishes to start and end with intuitions, the lesson is as follows. On the one hand, there is no point in precisifying our “intuitions” about the world where those intuitions have no purchase on the world. What justification could there be for thinking that the result is anything other than a pretty fairy tale? On the other, where they do have purchase, our intuitions will respond: complexity, fragmentation, and transformation will result.

Against the metaphysician who argues that rigor via axiomatization will stabilize and precisify our concepts, an analogous lesson follows. There is no point in precisifying our concepts if those concepts have no applicability to the actual world we find ourselves in. Justification for applicability comes from practice, and once practice enters the process, our axiomatizations yield complexity, proliferation, and limited domains of applicability.

Either way, precisification leads not to a single, unified language of universal scope, but to fragmentation and domain-specificity. Moreover, it is within these processes that we come to learn about the world, iteratively and piecemeal.

Faced with these patches of theory, detailed work is required to tie different fragments together in order to be able to apply our theories. As noted above, Wilson’s book champions a specific methodology for how we connect different domain-dependent theories together, drawing on multiscalar architecture techniques. The justification for this is that it is a specific and theoretically regimented method for tackling patching, and that it has a proven track record of success. Together, these give Wilson reason to argue for its philosophical import going forward (chapter 5).

Wilson’s general philosophical message is independent of the details of either the specific examples that he uses to undermine “Theory T thinking” as an appropriate heuristic, or multiscalar architectures as an appropriate antidote and alternative. Nevertheless, I agree wholeheartedly with Wilson that the value and viability of a philosophical position depend on these specifics and these details. Sophisticated philosophical theories that cannot cope with the demands of our practices as we navigate the world are nothing but “fictions and reveries” (see Du Châtelet, Foundations of Physics, §55). I hope those who champion the kinds of metaphysical project that Wilson challenges will give his objections the serious engagement they deserve.

Wilson locates the history of why we have adopted axiomatization as appropriate methodology for philosophy in the early twentieth century, particularly in Carnap. According to Wilson, Carnap enthusiastically promoted the notion that “any worthy scientific proposal can be fit within the single-leveled contours of a complete and fully axiomatized coverage” (149). And so Carnap comes out as the villain of the piece, but I think that is rather unfair. For, among other reasons, the axiomatic philosophical project might have worked—as Wilson himself says. It seems to me that, rather than warranting the rejection of axiomatization altogether, the ways in which it failed show us that axiomatic methods must be complemented by a second form of investigation. Indeed, for Hilbert himself—who developed and championed the axiomatic method and applied it to physics (among other areas of theorizing) in the late nineteenth and early twentieth centuries—the method has a specific purpose. Hilbert’s method is an epistemological tool: it is a means of uncovering which aspects of a theory have been encoded into its very structure, and which we continue to add “by hand” from intuition (or experience). Viewed in this way—as a tool rather than an end in itself—Hilbert’s axiomatic method remains hugely important for scientific theorizing and for the investigation of scientific theories.

With axiomatization thus understood, the first component of Wilson’s own picture allows for a pluralism of conceptual schemes that lack the universality allegedly dreamt of by Carnap, but where what is true remains relativized to a conceptual scheme (see chapter 9). It’s simply that each conceptual scheme is limited in scope. The second component of Wilson’s picture is the theorization of how we coordinate among those schemes so as to tie them together successfully. That is, we link them in such a way that we can navigate around in the world with a minimum of catastrophic failures. I wholeheartedly agree with Wilson that philosophically important things happen on the way to successful theories (including en route to an axiomatization), and in the interstices between them, whether this is in physics, other areas of science, or any area of analytic philosophy. I think the interstices are of enormous philosophical interest, for these are the spaces where our concepts fail and our familiar patterns of reasoning are as likely as not to lead us astray. If we are interested “merely” in axiomatized theories then we miss a great deal of what philosophy is for.

I cannot end without noting the appearance of Emilie Du Châtelet, on the cover and within. She comes up first in a discussion of the complexities hidden within “billiard ball” models of collision (126). This is a topic dear to my heart: the failure of philosophy to handle collisions is what drove physics and philosophy apart through the course of the eighteenth century (Brading and Stan, 2023), and this has had deep and far-reaching consequences throughout philosophy thereafter. Her second appearance is in connection with the methodological import of continuity as a tool in theorizing (134–8, 176). I have argued that the principle of sufficient reason (by which she justifies continuity) is, for her, primarily methodological rather than metaphysical (Brading, 2019), and if Wilson’s book tempts more people to read Du Châtelet for themselves then that is all to the good.

The power of Wilson’s book lies in the details of his examples, and I have not tried to do justice to those here. Instead, I have focused on the philosophical work his examples do. When it comes to the take-aways, too often for my taste Wilson resorts to metaphor, and to gesturing towards his point, at just the moment when I want an explicit and literal explanation. I suspect this is deliberate, but I nevertheless make no apology for attempting to nail down what he has left open. If this prompts others to tackle the difficult and important issues raised in his book, then I will be satisfied.

REFERENCES

Joshua Eisenthal, “Hertz’s Mechanics and a Unitary Notion of Force,” Studies in History and Philosophy of Science Part A, 90 (2021): 226–234.

Emilie Du Châtelet, Foundations of Physics, §55. In Emilie Du Châtelet: Selected Philosophical and Scientific Writings, I. Bour (trans.) and J. Zinsser (ed. and trans.), University of Chicago Press, 2009.

Katherine Brading and Marius Stan, Philosophical Mechanics in the Age of Reason, Oxford University Press, 2023.

Katherine Brading, Emilie Du Châtelet and the Foundations of Physical Science, Routledge, 2019.

 

[1] For example, Wilson, 181: “In any case, with respect to Sider’s appeals to ‘realism’ [(Sider, 2007, 6) . . .] no reasonable formulation of ‘scientific realism’ should demand that future science will suit the academic dictates of Sider’s armchair metaphysics.” Wilson is surely right about this.