Events and Semantic Architecture


Paul Pietroski, Events and Semantic Architecture, Oxford University Press, 2005, 276pp, $85.00 (hbk), ISBN 0199244308.

Reviewed by Kent Johnson, University of California, Irvine

2005.08.08


Here's the short story. In Events and Semantic Architecture (ESA), Paul Pietroski presents a linguistically and philosophically highly sophisticated view called "Conjunctivism". Conjunctivism says that absolutely all semantically relevant syntactic concatenation expresses conjunction. Thus, e.g., the meaning of every dog is a formula of the form "P ∧ Q". Moreover, all of these conjuncts contain one free (Boolos-inspired) plural variable which ranges over event-like objects. These free plural variables are bound by existential quantifiers at the level of sentences (or clauses, or syntactic "phases"). Throughout, ESA defends Conjunctivism over "Functionism", which encompasses the standard sorts of extremely powerful semantic architectures whose roots lie in Montague Grammar and categorial grammar. The defense proceeds via detailed examinations of a wide variety of linguistic constructions. Conjunctivism, according to Pietroski, provides a simpler or more natural account of the phenomena. Thus, Occam-like methodological reasons support Conjunctivism over Functionism. This type of project is of a piece with a major movement in syntax of the past decade. Both semantic Conjunctivism and syntactic "Minimalism" attempt to uncover the true structure of language by characterizing it within remarkably weak formal systems.

Now let's consider some details. The guiding thought is that the semantic structures of sentences (or clauses or "phases") are instances of a normal form:

(1) ∃O ∧ᵢ Θ(O, Φᵢ, A).

In (1), O is a plural variable whose value is always some events, Θ is a formula of the metalanguage, Φᵢ is an expression of the object-language, and A is an all-purpose device for associating semantic values with free variables, context-sensitive expressions, etc. Except when Φᵢ is a linguistically primitive element, Θ can be innocently interpreted as saying merely that the Os are semantic values of Φᵢ with respect to A. Thus, (1) is intuitively read as: there are some events such that they are semantic values of Φ₁ (with respect to A), and … and they are semantic values of Φₙ (with respect to A). Semantic composition is thus reduced to an extremely simple scheme (cf. p. 121):

(2) a. Val(O, Φ^Ψ, A) iff Val(O, Φ, A) & Val(O, Ψ, A)

b. Val(t, [S …], A) iff ∃O(Val(O, …, A))

("S" in (2b) goes proxy for a short list of designated syntactic levels; throughout this essay, I will simplify notation for the ease of exposition.) (2b) says that a sentence is true iff there are some events that satisfy the interior structure of the sentence, which in the normal case would be the concatenation NP^VP. On the other hand, when Φi is linguistically primitive, the structure of Θ can be more complex to reflect the lexical structure of the element. For instance, a sentence of the syntactic form [S[NPSue] [VPbutchered [NPchickens]]] is true iff (3), where (3a-d) are equivalent:

(3) a. ∃O Val(O, [S [Subj-NP Sue] [VP butchered [Obj-NP chickens]]], A)

b. ∃O(Val(O, [Subj-NP Sue], A) & Val(O, [VP butchered [Obj-NP chickens]], A))

c. ∃O(Val(O, [Subj-NP Sue], A) & Val(O, [V butchered], A) & Val(O, [Obj-NP chickens], A))

d. ∃O(External(O, Sue) & Past-butchering(O) & Internal(O, chickens))

(3d) is roughly and intuitively read as: there exist some events that were past-butcherings of chickens, and Sue was the agent of these events. (Notice that occupying a particular syntactic position, e.g., Subject-NP, is what triggers the semantic treatment of Sue as the agent of the event.) Throughout ESA, Pietroski details how (1) and (2) form the basis of a compositional semantic theory that can handle a wide variety of constructions, including quantifiers, plurals, causative verbs, sentence-complementation, and serial verb constructions.
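To make the compositional scheme concrete, here is a minimal toy model of (2) and (3), sketched in Python. Everything in it (the event inventory, the representation of pluralities as sets of event identifiers, and the particular predicates standing in for External, Past-butchering, and Internal) is my own illustrative stipulation rather than Pietroski's formalism; the point is only to watch concatenation-as-conjunction plus existential closure at work.

```python
# Toy model of the Conjunctivist scheme in (2): a sketch under my own assumptions.
# The plural variable O is modelled as a frozenset of event identifiers; each
# constituent contributes a predicate over such sets; concatenation is predicate
# conjunction (2a); a sentence is true iff some plurality satisfies the conjoined
# predicate (2b).
from itertools import combinations

# Hypothetical mini-domain: two past butcherings by Sue, one unrelated singing event.
EVENTS = {
    "e1": {"kind": "butchering", "tense": "past", "agent": "Sue", "theme": "chicken1"},
    "e2": {"kind": "butchering", "tense": "past", "agent": "Sue", "theme": "chicken2"},
    "e3": {"kind": "singing",    "tense": "past", "agent": "Al",  "theme": None},
}

def pluralities(domain):
    """All non-empty pluralities (subsets) of the event domain."""
    evs = sorted(domain)
    return [frozenset(c) for r in range(1, len(evs) + 1) for c in combinations(evs, r)]

# Lexically supplied conjuncts, as in (3d).
def external(o, agent):       # the external (subject-linked) argument relation
    return all(EVENTS[e]["agent"] == agent for e in o)

def past_butchering(o):       # the verbal conjunct
    return all(EVENTS[e]["kind"] == "butchering" and EVENTS[e]["tense"] == "past" for e in o)

def internal(o, kind):        # the internal (object-linked) argument relation
    return all(EVENTS[e]["theme"] is not None and EVENTS[e]["theme"].startswith(kind) for e in o)

def conj(*preds):             # (2a): semantic concatenation expresses conjunction
    return lambda o: all(p(o) for p in preds)

def sentence_true(pred):      # (2b): existentially close the plural event variable
    return any(pred(o) for o in pluralities(EVENTS))

# "Sue butchered chickens", composed as in (3b)-(3d):
sue_butchered_chickens = conj(
    lambda o: external(o, "Sue"),
    past_butchering,
    lambda o: internal(o, "chicken"),
)
print(sentence_true(sue_butchered_chickens))  # True: {e1, e2} (for example) witnesses it
```

On this rendering, the steps (3a)-(3d) differ only in how far the concatenation has been unpacked into explicit conjuncts.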

There are many significant and important features of the view developed in ESA. But since serious linguistic theorizing is a game of details, and Pietroski responsibly supplies them, I can only sketch out one instance of a Conjunctivist semantics. I'll focus on the book's most striking study: the treatment of quantification. The treatment is complex, but its most salient features can be outlined as follows. For a sentence like Every bottle fell, the appropriate syntactic object of interpretation is treated as something like [S' [QP Every [int-NP bottle]] [ext-S t1 fell]]. Decomposing this sentence along the lines of (3) eventually yields (4a), which, details aside, is analyzed as (4b):

(4) a. ∃O(Val(O, every, A) & Val(O, [int-NP bottle], A) & Val(O, [ext-S t1 fell], A)).

b. There are some events O such that (i) all the exterior arguments of the Os are the value t, (ii) together, the interior arguments of the Os are all and only the bottles, and (iii) an exterior argument of an O is t iff the interior argument of that O fell.

Three features of this analysis of quantifiers are noteworthy. (Although they play little or no role in ESA, I suspect they will be important to further developments of Conjunctivism.) First, quantifiers are typically thought of as higher-order relations between properties, but this is not how Pietroski treats them. In both (3) and (4), the relational features of transitive verbs and quantifiers are tucked away in the (stipulated) logical properties of events -- events are things that can have an agent and a theme, or an interior argument (a bottle) and an exterior argument (a truth value). Indeed, Pietroski often identifies events, or the items that go proxy for them, as ordered pairs (e.g., pp. 90, 96). Mathematically speaking, though, events function simply as index points, and needn't be thought of as having any particular structure; their interaction with semantic relations like External(⋅, ⋅) does the work. Second, the truly quantificational aspects of quantifier phrases are wholly located in the semantics of the quantifier word itself. E.g., in analogy with every, one can characterize Val(O, most, A) as holding iff most of the Os have t as their exterior argument. The role of the remaining NP and S is only to place further constraints on the nature of the two relata in each of the Os. Third, the syntactic details are crucial. In (4), bottle must be identified as part of an NP that is itself part of a quantifier phrase -- it cannot be merely part of an NP. The extra syntax is what allows for the condition that the Os are all the (relevant) bottles. Such a condition is not present with other uses. E.g., in the theory, Val(O, [nine bottles], A) holds iff there are some things, nine of them, and they are all bottles (but not necessarily all the bottles).
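The quantificational clause (4b) can likewise be put through a toy model. The sketch below is again my own encoding, not Pietroski's: the eventlike entities are simply (interior, exterior) pairs of an individual and a truth value, and the three conjuncts of (4a), contributed by every, [int-NP bottle], and [ext-S t1 fell], are checked directly against candidate pluralities of such pairs.

```python
# Toy rendering of the quantifier analysis in (4b), under my own modelling
# assumptions. Eventlike entities are (interior, exterior) pairs: an individual
# paired with a truth value. "Every bottle fell" is true iff some plurality O of
# such pairs satisfies all three conjuncts.
from itertools import combinations

BOTTLES = {"b1", "b2", "b3"}
FELL    = {"b1", "b2", "b3"}       # set this to {"b1", "b2"} to make the sentence false

def candidate_Os(individuals):
    """All pluralities of (individual, truth-value) pairs over the domain."""
    inds = sorted(individuals)
    for r in range(1, len(inds) + 1):
        for combo in combinations(inds, r):
            for bits in range(2 ** r):
                yield frozenset((x, bool((bits >> i) & 1)) for i, x in enumerate(combo))

def val_every(O):                  # (i): every O has the exterior argument t
    return all(ext for _, ext in O)

def val_int_bottle(O):             # (ii): the interior arguments are all and only the bottles
    return {intr for intr, _ in O} == BOTTLES

def val_ext_fell(O):               # (iii): an exterior argument is t iff its interior argument fell
    return all(ext == (intr in FELL) for intr, ext in O)

def every_bottle_fell():
    return any(val_every(O) and val_int_bottle(O) and val_ext_fell(O)
               for O in candidate_Os(BOTTLES | FELL))

print(every_bottle_fell())         # True on this model; False once some bottle didn't fall

# For 'most', only the quantifier's own conjunct changes, e.g.:
def val_most(O):
    exts = [ext for _, ext in O]
    return sum(exts) > len(exts) / 2
```

As the last definition suggests, swapping every for most touches only the quantifier's own conjunct; the interior and exterior constraints stay put, which is the second feature noted above.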

There's a great deal more to be said about the details of Conjunctivism as developed in ESA. Philosophers and linguists alike will profit by considering these details carefully. However, I'll end by sketching two general issues relevant to further research into a Conjunctivist semantics.

The first issue concerns the overall motivation for a Conjunctivist semantics. The major factor here is the desire to have a very simple theory of language (Pietroski reviews the ample reasons behind this desideratum). Conjunctivism achieves a remarkable degree of simplicity in the purely compositional aspects of language. Composition is in effect reduced to conjunction with occasional existential quantification. Thus, in terms of semantic composition alone, Conjunctivism does appear to be clearly simpler than -- and hence, to that extent, preferable to -- some of its more familiar rivals. However, there is quite a bit of logic and other metaphysical structure lurking below the conjunctions. As I noted above, Conjunctivism doesn't reduce or avoid the relational aspects of language. Rather, it locates these aspects in the properties of events. Indeed, a great deal of structure that Functionists encode directly into the compositional system can be found within the structure of the Conjunctivist's events. For instance, the events (or eventlike entities -- we needn't draw a distinction here; cf. above) that witness sentences formed with sentential connectives are treated as ordered pairs of sentence values (p. 90). Thus, Sam sang or Kevin quilted is witnessed by an event of the form ⟨v₁, v₂⟩, where v₁ and v₂ are the values of the two disjunct sentences and at least one member of this pair is t. In later work, it may be useful to replace this part of the theory with a more general treatment that also subsumes the non-extensional sentence connectives, e.g., Sam sang because Kevin quilted. In any case, as Conjunctivism is further developed to handle an increasingly broad range of constructions and theoretical considerations, it will inevitably become increasingly complex. Notions like the "complexity" or "simplicity" of semantic architectures are highly informal. Suppose we concede that the distinctively compositional aspects of Conjunctivism are indeed simpler than many of its rivals, whatever simplicity amounts to in this context. Nonetheless it still remains an open and interesting question whether Conjunctivism, considered as a whole, is -- and will remain -- "simpler" than its rivals. It may be that enough "complexity" is tucked away into the nature of events that in the end, Conjunctivism won't be appreciably "simpler" than many of its rivals. To my ears, much more work needs to be done to clarify such nebulous notions of "simplicity" before these issues can be usefully addressed.
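For what it's worth, the ordered-pair treatment of connectives is easy to render in the same toy style; the encoding below is mine, and only illustrates the shape of the proposal, not its official statement.

```python
# Sketch (my own encoding): a disjunction is witnessed by a pair of sentence
# values, and the conjunct contributed by 'or' requires at least one member be t.
def val_or(pair):
    v1, v2 = pair
    return v1 or v2

sam_sang, kevin_quilted = True, False    # stipulated sentence values
witness = (sam_sang, kevin_quilted)      # the pair <t, f> that goes proxy for the 'event'
print(val_or(witness))                   # True: "Sam sang or Kevin quilted" is witnessed
```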

The second issue concerns the treatment of certain seemingly highly intentional aspects of grammar, such as "direct causation". It's almost universally recognized that verbs like boil as in (5) have covert syntactic and semantic structure:

(5) Sharon boiled the soup

Thus, the meaning of (5) is something like Sharon caused (in the right way) the soup to boil. Pietroski correctly notes that nobody knows how to specify what counts as "the right way" to cause soup to boil, i.e., the way required for a genuine case of boiling the soup. But he suggests that we should be "unimpressed" with theories that bracket the problem by defining a term, say cause*, that by stipulation denotes the correct relation (p. 196). Instead, he proposes a Conjunctivist solution: (5) can be analyzed as follows (cf. p. 180):

(6) ∃O(Agent(O, Sharon) & ∃O'(Terminater(O, O') & Past-boiling(O') & Theme(O, the soup)))

(6) says that there were some events whose agent was Sharon, and whose theme was the soup, and these events were terminated in events of boiling. But it's hard to see how a relation like Terminater(O, O') is any more illustrative than Cause*(O, O'). (Notice, by the way, that the two-place Terminater relation forces a rather weak reading of the claim that the conjuncts are monadic predicates with respect to the free variables.) Suppose, for instance, that Sharon drops a spoon, and then John walks over and turns the heat up on the soup, causing it to boil. (Sharon's dropping the spoon may or may not have caused John's action -- either case will work here.) In such a circumstance, Sharon didn't boil the soup. But nothing in the logic of Conjunctivism guarantees that Sharon boiled the soup comes out false here. It all depends on what sort of events there are, and what it is for one event to terminate another. So there must be some metaphysical/logical constraints that ensure that there is no event (or process) that begins with Sharon dropping the spoon and terminates with the soup boiling. What kinds of constraints are these? Pietroski's few glosses on the Terminater relation are of little help: "'Terminater' indicates a relation that causal processes can bear to final parts of themselves" (p. 180). But now we need an account of the Terminater relation, and of linguistically appropriate individuations of causal processes and final parts. It's hard to see how any such account won't amount to spelling out the semantic impact of cause*. In short, employing a Terminater relation is precisely parallel to employing a cause* relation. Contrary to Pietroski's remarks, neither appears to present an advantage over the other; both are placeholders for a psychologically rather rich notion. In other words, neither the logic of Conjunctivism nor that of its rivals provides any especially deep insight into this notoriously difficult aspect of human languages.
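To see how much is being carried by the metaphysics rather than the logic, consider a toy model in the spirit of the spoon-dropping scenario. Everything below (the inventory of "processes" and the Agent, Theme, and Terminater stipulations) is my own construction for illustration; the point is just that (6) flips between true and false depending on those stipulations, with the Conjunctivist composition held fixed.

```python
# Toy illustration (my own stipulations throughout) of why the truth of (6) in
# the spoon-dropping scenario turns on the metaphysics of events and of
# Terminater, not on the Conjunctivist logic itself.

DROP = "Sharon drops a spoon"
HEAT = "John turns up the heat"
BOIL = "the soup boils"

def sharon_boiled_the_soup(processes, agent, theme, terminater, is_boiling):
    """(6): some events O with Agent Sharon and Theme the soup, terminating in boiling events O'."""
    return any(
        agent(O) == "Sharon"
        and theme(O) == "the soup"
        and any(terminater(O, Op) and is_boiling(Op) for Op in processes)
        for O in processes
    )

# Stipulated 'metaphysics' 1: permissive. Any run of the basic events counts as a
# process; a process's agent is the agent of its first member; its theme is the
# soup whenever the soup figures in it; it terminates in any proper final part.
AGENT_OF = {DROP: "Sharon", HEAT: "John", BOIL: None}

def perm_agent(O):
    return AGENT_OF[[e for e in (DROP, HEAT, BOIL) if e in O][0]]

def perm_theme(O):
    return "the soup" if (BOIL in O or HEAT in O) else "the spoon"

def perm_terminater(O, Op):
    return Op < O and BOIL in Op        # Op is a proper part of O that includes the boiling

def is_boiling(Op):
    return Op == frozenset({BOIL})

permissive = [frozenset({DROP}), frozenset({HEAT}), frozenset({BOIL}),
              frozenset({DROP, HEAT, BOIL})]
print(sharon_boiled_the_soup(permissive, perm_agent, perm_theme,
                             perm_terminater, is_boiling))   # True (the unwanted result)

# Stipulated 'metaphysics' 2: refuse to recognize the gerrymandered process that
# starts with the spoon-dropping and ends with the boiling.
restrictive = [frozenset({DROP}), frozenset({HEAT}), frozenset({BOIL})]
print(sharon_boiled_the_soup(restrictive, perm_agent, perm_theme,
                             perm_terminater, is_boiling))   # False (the wanted result)
```

Nothing in the logical form of (6) distinguishes the two runs; all the work is done by which processes are taken to exist and what Terminater is taken to require.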

In sum, Conjunctivism is an attractive and interesting view -- a solid understanding of it will surely be mandatory for anyone interested in the rigorous study of meaning in human languages. Pietroski addresses head-on some of the really difficult issues for Conjunctivism. Such issues are of course the real proving ground of a theory. Stepping back a bit, though, one also notices a certain intuitive plausibility to the theory. If the purpose of our words is to convey or express thoughts and information, then when we add words to a sentence, we are (ceteris paribus) increasing the informational content of the sentence. It's entirely plausible that the informational analogue of "adding" semantic content is conjunction. If this is on the right track, Conjunctivism may be just the kind of semantic architecture one would expect.