Epistemetrics


Nicholas Rescher, Epistemetrics, Cambridge University Press, 2006, 112pp., $65.00 (hbk), ISBN 0521861209.

Reviewed by Jeffrey Tlumak, Vanderbilt University

2006.10.20


Rescher develops conceptual tools to quantify the results of human inquiry, explain why some items cannot be quantified, and clarify the scope and limits of our cognitive efforts. He does so by marshaling suggestions from an interesting variety of sources, but most centrally from Pierre Duhem, Kant, Herbert Spencer (and Edward Gibbon), and Leibniz.

From Duhem he extracts the thesis that security of knowledge is inversely proportional to detail of knowledge. Science risks insecurity to achieve greater detail. Common sense typically sacrifices detail for the sake of security. When our overriding goal is truth, we have an incentive to be vague. But knowledge is more than mere truth or correctness. We don't just seek truth, but informative truth that enhances understanding. The first moral about our limitations in the development of knowledge is that we can't feasibly achieve both great informative detail and security.

What is informative truth that enhances understanding? From Kant Rescher adopts an honorific conception of knowledge as a function of the extent to which information is cohesively systematized (organically unified), information as structured in an idealized expository treatise. He argues that since cognitive systematization is hierarchical in structure, and structure reflects significance, systemic role mirrors cognitive status. So, for example, the length and detail of an item's treatment correlate with its significance. Cognitive importance (of the item's being the way it is for understanding) is measurable in terms of the space allocated to, and (approximately) the frequency of citations of, the information in a well-designed articulation of the issue (= adequate systematization of the domain). Further, the significance of additional information is determined by its impact on preexisting information; the significance of incrementally new information is a function of how much it adds, and thus of the ratio of the increment of new information to the volume of information already at hand.
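A brief way to make this last proportionality concrete (my formulation, not Rescher's own notation): if the significance dK added by an information increment dI is proportional to that increment's size relative to the stock I already in hand, then

\[ dK \propto \frac{dI}{I} \quad\Longrightarrow\quad K \propto \ln I, \]

which already anticipates the logarithmic relation reported in the next paragraph.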

From Spencer Rescher formulates the insight that progress in the development of knowledge involves not only more, but more complex and deeper knowledge, specifically that (since knowledge is organized and systematized information) this progress is measurable in terms of taxonomic complexity (inner variation and diversification) of the information at hand, and that this complexity is not proportional to the amount of information, but to its logarithm. (That knowledge is proportional only to the logarithm of volume of information traces back at least to Gibbon.) So, to increase knowledge additively we must increase information multiplicatively. To double knowledge we must quadruple relevant, non-redundant information. But the fact that expanding bodies of information contain much unproductive redundancy and irrelevancy heightens the need for an immense amount of information to achieve a modest amount of knowledge.
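To spell out the arithmetic on the logarithmic reading (a sketch under that assumption, with c an arbitrary constant of proportionality):

\[ K = c \ln I \quad\Longrightarrow\quad K + \Delta = c \ln\!\left(I \cdot e^{\Delta/c}\right), \]

so each fixed additive gain \(\Delta\) in knowledge requires multiplying the underlying body of information by the fixed factor \(e^{\Delta/c}\).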

Volume of available information is measurable by opportunities to develop a framework for classifying the features of things -- with n concepts you can make n² two-concept combinations; with m facts you can project m² fact-connecting juxtapositions in each of which some sort of characteristic relation is at issue, etc. In moving from n to n+1 cognitive parameters we expand our information field multiplicatively but our knowledge only additively. Linear growth increases what has gone before by a fixed amount per period; exponential growth increases what has gone before by a fixed percentage per period. Specifically, hierarchic depth of a body of information is measurable by the logarithm of its textual volume -- the more information, the more taxonomic diversification, but at a decelerating pace.
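Put schematically (my notation, a rough sketch rather than anything Rescher formalizes): n classificatory concepts yield on the order of n² two-concept combinations and m facts on the order of m² pairwise juxtapositions, so the field of potential connections grows quadratically with the stock of items; and if the hierarchic depth D of a body of information of textual volume V satisfies

\[ D \propto \log V, \]

then each further unit of depth demands a multiplicative increase in volume.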

A core moral about the development of knowledge is that diminishing returns on effort are an unavoidable facet of inquiry; progress becomes ever more demanding because it requires more and more information, and so an ever-growing investment of resources and effort -- we face escalating costs but diminishing returns, as Peirce recognized. In sum, the more we already know, the slower (by a very rapid decline) will be the rate at which knowledge grows with newly acquired information. The larger this body of information grows, the smaller the proportion that represents real knowledge. In the effort to achieve cognitive mastery over nature, exponential growth in the enterprise is associated with merely linear growth in the discipline, that is, with merely stable growth in knowledge. Among the first to formulate this phenomenon mathematically was Henry Brooks Adams. Powerful confirmation has occurred since, especially beginning with studies in the early 1950s; I omit numerical details over different historical time spans here, and convey only one instance, viz. that the number of practicing scientists (80% of those who ever existed exist now) and the number of scientific papers published have exploded, yet substantive innovation has remained roughly constant. (Throughout the book Rescher offers such empirical data to confirm his epistemetric generalizations.) Further, with cognitive growth, the rate of progress decreases drastically as one raises the standard of quality. The growth of high-quality information isn't proportional to that of information at large, but slower, at a rate decelerating as the standard is raised.
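One way to see why exponential growth in the enterprise goes with merely linear growth in the discipline (a sketch built on the logarithmic assumption above, not a formula Rescher states in this form): if the information base expands exponentially over time,

\[ I(t) = I_0\, e^{rt}, \qquad K(t) = c \ln I(t) = c \ln I_0 + c\,r\,t, \]

then knowledge grows only linearly in t, i.e. at the stable pace the empirical data are said to confirm.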

Rescher explores other variables that affect the growth of knowledge. For example, since, as presuppositions historically change, different questions can and cannot be asked, epistemic change relates to what can be asked as well as what is known. Resolved questions always spawn new questions (Kant again). And we should also be sobered by the fact that while methodologically we aspire to simplicity, economy, and systematicity, the world invariably checks our aspirations: for example, introducing conceptual commonalities generates new differentiations, and the very attempt to counteract fragmentation produces new fragments, new specializations.

The book's last two chapters concern the scope and limits of propositional (not how-to or tacit) knowledge. There are specific limitations, given the nature of science. For example, there are always missed facts as science progresses since needed concepts are not then available -- sometimes the acquisition of scientific information requires conceptual innovation. There are also more ubiquitous, but still distinctively human, restrictions (much of the basis of which was recognized by Leibniz). Key to understanding them and more stringent restrictions to come is a distinction between statements, truths, and facts. Statements are linguistically formulated propositions. Facts are non-linguistic aspects of reality that correspond to potential truths that become actual truths when they are appropriately linguistically embodied. (In this usage, once stated, a fact yields a truth.) So, for example, there are more facts than truths given that there is only so much statable in sentences of intelligible length, and, more generally, given that we think in language; human understanding cannot keep up with reality using symbolic devices or language. Indeed, the inexhaustibility of facts about, and therefore of linguistic characterizations of, any given x is a mark of its reality; being a real thing analytically implies the existence of description-transcending features, whereas being exhaustively characterizable entails being fictional (or an ideal abstraction).

But human restrictions aside, the set of all statements is at most enumerably infinite. Any human language is produced recursively, so has only countably many expressions. But there are uncountably many objects. One particularly interesting phenomenon illustrating the limitation of knowledge is that of applicable but non-instantiable predicates, such as "unknowably true" itself. They are what Rescher calls "vagrant." Indeed, on general principles there are some vagrant predicates that must have non-specifiable application. (Rescher takes this to show that reference or individuation does not entail identification, so that we can know that something is unknowable without knowing what it is, and so, as an aside, he takes this to show the unworkability of the substitutional interpretation of quantifiers and of intuitionistic logic.)
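The cardinality point at the start of this paragraph can be sketched in standard set-theoretic terms (my reconstruction, not Rescher's wording): the statements of a language over a finite alphabet Σ form the countable union

\[ S \;=\; \bigcup_{n \ge 1} \Sigma^{n}, \]

so there are at most countably many statements, and hence at most countably many stated truths; if there is at least one distinct fact per real number (say, for each r, the fact that r has exactly the decimal expansion it has), then by Cantor's diagonal argument the facts are uncountable, and no enumeration of truths can exhaust them.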

One way Rescher characterizes our situation is that we live in an analog, not digital, world. Cognition is bound to language. Language is digital and sequentially linear. So cognition is digital. Since language can't capture the entirety of fact -- for example, there is always the megafact which is the amalgam of all facts (such megafacts are legitimate since whereas a collection of individuals is not an individual, a complex of facts is itself a fact) -- quantitatively there are more facts than truths. There are non-denumerably many facts but true statements are denumerable (since statements are). Also, we can never establish that all the members of an infinite collection share a contingent property. Such properties are only ascribable by (per impossibile) comprehensive examination of all the collection's members. A fortiori, we can never determine that one member of the set uniquely possesses some contingent property. (Also, there are always facts about a list-of-facts-as-a-whole which no member of the list can capture.)

But while human knowledge is inescapably limited and incomplete, it is (following Kant) unbounded, that is, it has no fixed and determinate limits. There is no fixed and determinable proportion of known truth to knowable fact. Necessarily, some fact is unknown, but no particular fact is necessarily unknown.

The above centrally traces (but does not exhaust) Rescher's line of thought. I now sketch some questions about his effort. Rescher stresses that his inchoate epistemetrics is a broader discipline than already-existing scientometrics -- science does not have a monopoly on knowledge. But when he seeks empirical confirmation or illustration of the principles he applies, the data cited are invariably scientific. Do the proposed laws apply to commonsense knowledge? More importantly, it's doubtful that the concept of knowledge he wields even applies to much commonsense knowledge. Does such knowledge form the kind of organic unity allegedly required?

On the other hand, it seems that the cohesive systematicity requirement on knowledge is not operative throughout the book. But this is potentially a bigger problem than the one about its applicability to everyday knowledge. As the book develops, several seemingly non-equivalent conceptions of knowledge (or more accurately, in some cases, conceptions of degrees of knowledge) are used. I already noted the early transition from "informative truth that enhances understanding" to "cohesively systematized information," the official account. I am not sure whether this is intended as explication, but I worry that it is a slide from a weaker to a stricter notion. Moreover, sometimes knowledge is just "high quality information" (a notion susceptible to degrees-of-growth principles). Sometimes it is "adequate understanding." Sometimes unexplicated honorifics like "authentic knowledge" are used, where it's unclear whether "authentic" has descriptive meaning. And is Kantian talk of the requirements of "systematic knowledge" a pleonasm, if knowledge per se requires systematicity? And note that Rescher admits that the idea of a unifying system is only a regulative (aspirational) ideal. If a unifying system is a completed systematization, no tension arises; but if it's something more modest, to say that it's only a regulative ideal might, contrary to intent, affirm a kind of skepticism. But I mainly worry about these possible shifts because I wonder how much plausible quantification depends on adopting a bloated conception of knowledge (what Russell called a "high redefinition"), and also perhaps whether the defenses of the different proposed epistemetric laws require gerrymandering these conceptions of knowledge.

Further, it seems that some of the allegedly quantifiable markers for qualities such as depth and importance of information rest on dubious assumptions needed for quantification. For example, what underlies Rescher's lawlike correlation between frequency of citation and objective importance? The crucial point is that he conceives of concepts and theories as tools, so conceives of their importance as the importance of a tool, and then measures (or at least estimates) the importance of a tool in terms of how much occasion one has to make use of it. But as a generalization about which tools are more or less important this seems stipulatively artificial. For example, is a uniquely suited tool less important than an interchangeable one just because the latter is used more frequently? And as a measure of objective importance, an occasion-for-use criterion ignores various real-world (political, sociological, etc.) influences on who or what is more likely to get cited in what journals, and why it's getting cited in the first place. For example, authors may take pains to discredit a current fad, citing extensively someone they regard as a charlatan. To fend off such contingencies by reverting to the ideal, non-redundant, systematic presentation of the domain in question is to install a priori a qualitative judgment about importance (and depth, etc.) that is immune to empirical disconfirmation. So my main, recurrent concern is that the idealizations needed to make quantification plausible depend on non-quantifiable assumptions about the palpably normative properties of information that make one piece of information qualitatively superior to another.

I even worry about how ostensibly quantitative notions operate. For example, Rescher claims that when new data oblige theoretical revision, the result must be more elaborate, more complex. But on familiar conceptions of complexity this seems wrong, especially in light of Rescher's acknowledgment that cognitive progress sometimes requires conceptual innovation. Entirely new concepts can sweep away old internal complexities. Are watershed breakthroughs such as electromagnetic theory and the wave/particle conception of light complexifications?

Finally -- and this is not a criticism but an indication of the interesting discussions the book provokes -- the richness of the project of epistemetrics, its potential range of applicability, would be bolstered were one to defend some of its historically contested background assumptions. Some are epistemological, some metaphysical. An example of the former is the distinction between oblique knowledge-that and direct knowledge of what. The present doubt is not about whether there is any de dicto-de re distinction at all, but about how widely it legitimately applies in inquiry (as in Berkeley's criticism of representative realism). An example of the latter is that a collection of facts is a fact. Conjoined with the firm realism Rescher embraces throughout, it would seem that the megafact that amalgamates all (other) facts is a new fact that is not explicable in terms of its component facts and their interrelations. But then, to cite one familiar historical debate, someone committed to the principle of sufficient reason could insist that there is no adequate natural explanation of the world (if the world is the totality of facts, not of things), and a cosmological argument is in the offing.

Rescher's book is informative, and it's often most valuable precisely where it prompts resistance, for that's where deeper epistemological (and occasionally metaphysical) commitments get focused.