Degrees of Belief


Franz Huber and Christoph Schmidt-Petri (eds.), Degrees of Belief, Springer, 2009, 354pp., $27.95 (pbk), ISBN 9789048137183.
 

Reviewed by Horacio Arlo-Costa, Carnegie Mellon University

2010.01.26


This collection focuses on the laws that degrees of belief should obey (Part II), as well as the relationship between degrees of belief and qualitative notions of belief like plain or full belief (Part I), concluding with two articles that deploy logical techniques to articulate the notion of levels of belief (Part III). An essay by Franz Huber introduces the collection. The book covers a significant amount of ground both philosophically and formally. The first and second parts are concerned in part with the tenability and scope of contemporary forms of probabilism in epistemology: a version of Bayesian epistemology for which partial beliefs, or credences, play the leading foundational role (instead of full beliefs and an epistemology articulated by the two Jamesian commands: Believe the truth! Avoid error!).

Some authors from Part II seem to endorse a radical form of probabilism of the sort that Richard Jeffrey defends in various important papers (Jim Joyce seems to be an example). This view amounts to a form of eliminativism according to which the traditional notions of qualitative belief play no role in a mature epistemology of a Bayesian variety. None of the three authors in Part I seem to endorse this view. On the contrary, each seems to think that the traditional forms of qualitative belief play a crucial role in Bayesian epistemology.

The first two authors from Part I (Richard Foley and James Hawthorne) investigate acceptance rules articulated in terms of high probability. Of course, this leads to a view of belief that rejects closure under conjunction. Foley offers philosophical arguments for this view, and Hawthorne characterizes a logic of belief along similar lines. These two articles accordingly defend an epistemological position first proposed and articulated by Henry Kyburg in a series of writings.

Foley seems to assume that the only alternative to embracing a notion of belief that rejects closure under conjunction is the type of eliminative and radical probabilism mentioned above. In the last part of his article, he offers various arguments against this form of radical probabilism. One such argument proceeds as follows:

We sometimes welcome and even demand probabilities, but even here, the probabilities are arrived at against a backdrop of black-and-white assumptions -- that is, a backdrop of belief (p. 46).

This idea can be understood in several different ways. Kyburg would understand it in a way that seems close to what Foley suggests; namely, that the beliefs in the background are themselves obtained in terms of high probability. But Foley observes that the particular beliefs (that the die whose probability of landing on a six I just estimated is not weighted; that the deck of cards used to estimate the probability of drawing a heart is a standard deck, etc.) are so close to being certainties that the question of their probability does not seem to arise.

It is important to see that one can view these certainties as primitively given, thereby abandoning probabilism. After all, this was the view of one of the founders of Bayesian epistemology: Bruno de Finetti (perhaps surprisingly for many) cannot be seen as a probabilist. De Finetti held a view according to which a qualitative notion of certainty is assumed as an epistemological primitive alongside the notion of probability. Moreover, the notion of certainty plays a crucial role in his characterization of probability. This is the view that he proposes in his Theory of Probability, Volume I (de Finetti, 1990):

Thinking of a subset of truths as given (knowing, for instance, that certain facts are true, certain quantities have given values, or values between certain limits, certain shapes, bodies or graphs of given phenomena enjoy certain properties, and so on), we will be able to ascertain which conclusions, among those of interest, will turn out to be -- on the basis of the data -- either certain (certainly true), or impossible (certainly false), or else possible (p. 25).

What about probability? According to de Finetti, “probability is something that can be distributed over the field of possibility”:

Using a visual image, which at a later stage could be taken as an actual representation, we could say that the logic of certainty reveals to us a space in which the range of possibilities is seen in outline, whereas the logic of the probable will fill in this blank outline by considering a mass distributed upon it (p. 25).

This is a perfectly legitimate alternative to probabilism that Foley does not seem to consider in his article. As I explained above, he seems to think that the only alternative to a notion of belief that rejects closure under conjunction is a form of radical probabilism that shuns the notion of belief altogether. Yet de Finetti’s articulation of Bayesian epistemology is a clear alternative that relies on two epistemological primitives rather than one. This is a form of epistemological pluralism that apparently is not considered in Part I of this book. According to this pluralistic view, one can adopt a doxastic primitive that is well behaved logically and that therefore is indeed closed under conjunction. Moreover, this underlying notion of belief plays a crucial role in determining what is possible, i.e., which propositions will be considered as probability carriers. In other words, it determines the field over which probability is defined. In philosophy, Isaac Levi has defended this view in a series of writings (Levi, 1983).

Notice that if we adopt the form of probabilism Foley proposes (which Hawthorne also tacitly assumes), then apparently it is impossible to define a notion of full belief closed under conjunction in purely probabilistic terms. Even the notion of certainty would be impossible to capture probabilistically. For consider the obvious option: characterizing certainty as belief to degree (probability) one. As Bas van Fraassen has indicated in his seminal article (van Fraassen, 1995), this characterization of certainty or full belief falls victim to transfinite lottery paradoxes. Consider the mass of the moon reckoned in kilograms. I am sure that it is a number in the real interval [a, b], so if my credence over [a, b] follows (normalized) Lebesgue measure, the probability that the mass is exactly given by any number x in [a, b] is zero. Hence, my probability equals 100% that the number lies in the set [a, b] – {x}. In other words, for every x in the interval the probability that the number is not x is one, but I am certain (fully believe) that the number is in [a, b].
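
The failure of conjunctive closure here can be displayed compactly. Each co-singleton carries probability one, yet the conjunction (intersection) of all of these "certainties" is empty:

For every x in [a, b]: P([a, b] – {x}) = 1, but ⋂{[a, b] – {x} : x in [a, b]} = ∅.

So if certainty is just probability one, each co-singleton is certain, while the conjunction of all of them is the impossible proposition.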

Thus, the form of probabilism Foley endorses is quite limited. It does not seem capable of representing the traditional notion of full belief. We will see below that there are forms of probabilism that are indeed capable of capturing this notion and which are therefore superior to the form of probabilism Foley accepts. In addition, Foley does not seem to contemplate the possibility of non-probabilist understandings of Bayesian epistemology (like the one proposed by de Finetti), which are also capable of accommodating a well-behaved notion of full belief.

Hawthorne’s article is very rich and contains interesting results, but it seems that the same criticism raised against Foley’s view can also be raised against his proposal. Hawthorne first develops a notion of qualitative probability (similar to the notion of qualitative probability used by Savage). He defines a binary relation ≽α. The statement A ≽α B can be read in various ways. One of them is: ‘the agent α is at least as confident that A as that B.’ Another is ‘A is at least as plausible for α as is B.’ The model is weaker than the usual accounts of qualitative probability. Then certainty is defined as follows: Certα[A] if and only if A ≽α (B ∨ ¬B).

Hawthorne proposes some axioms for ≽α and proves an interesting theorem showing that if Pα is any probability function and we define A ≽α B if and only if Pα[A] ≥ Pα[B], then ≽α satisfies all the proposed axioms. Completeness is also proved: if ≽α satisfies the appropriate axioms, then Pα is unique and A ≽α B if and only if Pα[A] ≥ Pα[B]. So as long as ≽α satisfies the appropriate axioms, we have that Certα[A] if and only if Pα[A] = 1, for a uniquely given function Pα.

Now the same criticisms offered above seem to apply here. This notion of certainty equally falls victim to transfinite lottery paradoxes. Nevertheless, most of the interesting work presented in Hawthorne’s article focuses on defining belief rather than certainty. The corresponding notion of belief is not closed under conjunction. This work is related to fascinating formal work by the author and David Makinson characterizing the notion of a probability-based conditional determined by a probability threshold. This notion is worth studying, but it still seems that this form of probabilism does not manage to make contact with the traditional notions of epistemology (certainty, belief, knowledge). In a way, it changes the subject and characterizes a new doxastic notion that is parasitic on the notion of probability.

Is there any form of probabilism immune to the aforementioned problems? Yes. Bas van Fraassen proposed the central idea of such a form of probabilism (van Fraassen, 1995), where he introduces the notion of a probability core. According to van Fraassen, a set C is a probability core if the set C and its complement are normal (i.e., if P(·|C) and P(·|U – C) are probability functions), and for each subset B of C and every set D in its complement, P(B|B ∪ D) = 1. Then one can prove that cores are very well behaved. For example, they are nested, and if one assumes countable additivity one can show that there is a smallest core (see (Arlo-Costa, 1999)). A slight modification of the theory (proposed in (Arlo-Costa and Parikh, 2005) and (Arlo-Costa, 2001)) guarantees that in the presence of countable additivity and for countable spaces, there is always a smallest core and a largest core. The smallest core has measure one. The proposal is to see the largest core as encoding full beliefs and the smallest core as encoding expectations or “almost certainties.” The theory is extendable to the infinite case as well. We can see then that, for example, none of the sets [a, b] – {x} in the example of the mass of the moon constitutes a core (in spite of carrying measure one). But, according to the conception of core offered by Arlo-Costa and Parikh, the interval [a, b] is indeed a core (unfortunately, according to van Fraassen’s original definition of core there are no cores in this example, something that is also undesirable given that pre-systematically [a, b] seems to encode the strongest full belief of the agent).
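
To make the core condition concrete, here is a minimal sketch on a finite space. It assumes, as one convenient way of generating a primitive conditional probability (my illustrative construction, not one taken from the book), that P(·|·) is given lexicographically by ordered layers of worlds; under that assumption the cores come out as nested unions of initial layers:

from itertools import chain, combinations

# Three "layers" of worlds, most plausible first; each carries a
# strictly positive measure. (This lexicographic setup is an
# illustrative assumption, not the book's own construction.)
W = frozenset(range(6))
layers = [{0: 0.6, 1: 0.4}, {2: 0.5, 3: 0.5}, {4: 0.9, 5: 0.1}]

def cond_prob(A, B):
    # P(A|B): evaluate on the first layer that B touches.
    for m in layers:
        mass_B = sum(p for w, p in m.items() if w in B)
        if mass_B > 0:
            return sum(p for w, p in m.items() if w in A and w in B) / mass_B
    return None  # P(.|empty) is undefined

def subsets(S):
    S = sorted(S)
    return [frozenset(c) for c in
            chain.from_iterable(combinations(S, r) for r in range(len(S) + 1))]

def is_core(C):
    # C is a core iff P(B|B U D) = 1 for every nonempty B within C
    # and every nonempty D within the complement of C.
    return bool(C) and all(cond_prob(B, B | D) == 1.0
                           for B in subsets(C) if B
                           for D in subsets(W - C) if D)

print([sorted(C) for C in subsets(W) if is_core(C)])
# -> [[0, 1], [0, 1, 2, 3], [0, 1, 2, 3, 4, 5]]: nested, as the theory predicts

In the spirit of the Arlo-Costa and Parikh reading, the smallest core would here encode the "almost certainties" and the largest core the agent's full beliefs.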

This form of probabilism is capable of offering probabilistic characterizations of the traditional notions of full belief and expectation (“almost certainty”). The main idea of the theory is to offer a unified view of degrees of belief and full belief by appealing to a third primitive, the notion of supposition embedded in primitive conditional probability. Still, the price that one pays for this unification is to assume a pre-Kolmogorovian notion of probability. Unfortunately, this form of probabilism (and the notion of primitive conditional probability that it presupposes) is not represented in this volume. Perhaps this explains why Franz Huber concludes in the introduction that “subjective probabilities do not give rise to a notion of belief that is consistent and deductively closed” (p. 16). I would qualify this statement: monadic subjective probabilities do not give rise to such a notion of belief, but dyadic notions of probability do give rise to a notion of full belief that is consistent and deductively closed.

Part I closes with an essay by Keith Frankish, which is also concerned with the derivation of a notion of flat-out belief or full belief from probability. Frankish’s essay is very rich. Various other decision-theoretic accounts of belief, like the ones proposed by Maher (Maher, 1993) and Kaplan (Kaplan, 1981), are considered and criticized. Frankish concludes:

the relevant attitude is a commitment to using propositions as a premise in reasoning and decision-making for some purposes in some contexts, regardless of the degree of confidence we have in it. (p. 91-92)

In order to make this idea precise, one has to explain exactly what it is “to use propositions as a premise in reasoning and decision-making.” There are many ways of understanding this proposal. The examples that Frankish presents seem to suggest that propositions of this type can be used to determine, for example, which actions are feasible and which are not in a given decision problem. In any case, here we are operating in an area beyond the limits of probabilism. For example, we are supposed to have at hand the rudiments of a decision theory. The previous accounts (including the one in terms of probability cores sketched above) presume that full beliefs carry probability one (although not all events of probability one are full beliefs). Here, apparently, we can have flat-out beliefs carrying low credence. The account is supposed to be descriptive rather than normative, so it is unclear to what extent the corresponding notion of flat-out belief retains its usual normative role.

Part II focuses on the justification of the axioms of the standard notion of probability. It contains as well various articles articulating the notion of degree of belief along different lines than the traditional probability calculus. Standard probability is additive, in the sense that the calculus obeys at least the law of finite additivity (P(A B) = P(A) + P(B), if AB = ). Rolf Haeanni’s article focuses on belief functions which are super-additive (Bel(A) + Bel(B) Bel(AB), if AB = ). Haenni presents as well interesting arguments showing how to combine logic and probability in this framework. Didier Dubois and Henri Prade offer a survey of possibility measures which are sub-additive. There is a qualitative possibility theory and a quantitative version of it, the former closely related to non-monotonic reasoning and the latter a special case of upper and lower probabilities (establishing therefore a link with the theory of imprecise probabilities). The article by Dubois and Prade focuses on qualitative possibility theory. The idea is to offer a survey of how partial belief can be represented in possibility theory, and how related issues like acceptance and belief revision can be handled in this setting.
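
Side by side, for disjoint propositions (A ∩ B = ∅), the three additivity behaviors just mentioned read:

P(A ∪ B) = P(A) + P(B) (probability: additive)

Bel(A ∪ B) ≥ Bel(A) + Bel(B) (belief functions: super-additive)

Π(A ∪ B) ≤ Π(A) + Π(B) (possibility measures: sub-additive)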

Let ℑ be a field of propositions over a universe W. Then we can define a possibility measure as a function Π: ℑ → ℜ such that for all propositions A, B in ℑ:

Π(∅) = 0

Π(W) = 1

Π(A ∪ B) = max{Π(A), Π(B)}

We can then define the dual necessity measure as a function N: ℑ → ℜ defined for all propositions A in ℑ as follows: N(A) = 1 – Π(¬A). Alternatively, one can define necessity measures as functions N: ℑ → ℜ such that for all propositions A, B in ℑ:

N(∅) = 0

N(W) = 1

N(A ∩ B) = min{N(A), N(B)}
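
A small sketch may help fix ideas. Assume (my illustrative choice) that Π is generated by a pointwise possibility distribution π, with Π(A) the maximum of π over A:

# A toy possibility measure generated by a possibility distribution pi.
pi = {'w1': 1.0, 'w2': 0.7, 'w3': 0.2}   # max value 1 guarantees Pi(W) = 1
W = set(pi)

def Pi(A):
    return max((pi[w] for w in A), default=0.0)   # Pi(empty) = 0

def N(A):
    return 1.0 - Pi(W - A)                        # the dual necessity measure

A, B = {'w1', 'w2'}, {'w2', 'w3'}
assert Pi(A | B) == max(Pi(A), Pi(B))   # maxitivity of Pi
assert N(A & B) == min(N(A), N(B))      # minitivity of N
print(Pi(B), N(A))                      # 0.7 and 0.8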

Now let the necessity measures of possibility theory assign natural numbers (together with ∞) instead of real numbers in the unit interval to the various propositions, so that ∞ instead of 1 represents maximal necessity. Then the axioms for necessity measures become:

N(∅) = 0

N(W) = ∞

N(A ∩ B) = min{N(A), N(B)}

Now (as Huber suggests in the introduction) think of the rank of a proposition A as the degree of necessity of its negation: ρ(A) = N(¬A). Then we can introduce finitely minimitive ranking functions (studied by Spohn). In fact, these functions are a mere terminological variation of the variant of necessity measures proposed above:

ρ(W) = N(∅) = 0

ρ(∅) = N(W) = ∞

ρ(A ∪ B) = N(¬A ∩ ¬B) = min{N(¬A), N(¬B)} = min{ρ(A), ρ(B)}

Still, the interpretations of the two types of functions might diverge. This is not so for Shackle’s degrees of potential surprise (Shackle, 1949, 1969), which are interpreted along the same lines as ranks. As both Spohn and Huber explain in their articles, the main interest of ranking functions over and above Shackle’s degrees of potential surprise and Dubois and Prade’s necessity functions is that they offer a nice account of conditional ranking functions (while apparently Shackle struggles with the definition of conditional potential surprise).

Spohn’s definition of conditional ranking functions is simple and elegant: the conditional ranking function ρ(·|·): ℑ × ℑ → N ∪ {∞} is defined for all A, B in ℑ, with A ≠ ∅, as:

ρ(B|A) = ρ(A ∩ B) – ρ(A)

The number ρ(A) represents the degree of disbelief in the proposition A. A further difference between ranking functions and similarly motivated formalisms is that ranking functions can be supported by arguments of the sort offered in probability theory. In fact, there are both pragmatic and non-pragmatic arguments for the thesis that degrees of belief should obey the axioms of the probability calculus. (Some of these arguments are discussed in this volume; see below.) Some arguments of this sort exist for belief functions (Smets, 2002). Such arguments are also available for ranking functions, as the articles of Huber and Spohn explain.
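
These definitions are easy to animate in the finite case. A minimal sketch, under the standard assumption that the rank of a proposition is the rank of its most plausible world (the particular worlds and numbers below are my toy example):

import math

kappa = {'w1': 0, 'w2': 1, 'w3': 3}   # toy pointwise ranks (degrees of disbelief)
W = set(kappa)

def rho(A):
    # rho(A) = minimal rank in A; rho(empty) = infinity
    return min((kappa[w] for w in A), default=math.inf)

def rho_cond(B, A):
    # Spohn's conditional rank rho(B|A), defined for nonempty A
    return rho(A & B) - rho(A)

A = {'w2', 'w3'}
print(rho(A))               # 1
print(rho_cond(A, W))       # 1 - 0 = 1: conditioning on W changes nothing
print(rho_cond({'w3'}, A))  # 3 - 1 = 2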

Two main arguments have been advanced for ranking functions. The first is a non-pragmatic argument offered by Huber (Huber, 2007). To appreciate how the argument works, it is useful to notice first that ranking functions give rise to belief sets that are consistent and logically closed. The idea is to define:

Bρ = {A : ρ(¬A) > 0} = {A : ρ(¬A) > ρ(A)}

The idea of the epistemic argument for ranking functions is that it is epistemically defective to have degrees of disbelief that violate the ranking axioms. Epistemically defective in which way? The argument specifies as a principle of theoretical rationality that it is epistemically defective to have beliefs that are not both consistent and deductively closed. Then the argument appeals to a behavioral specification of degrees of entrenchment and shows that an entrenchment function gives rise to consistent and deductively closed beliefs if and only if it satisfies the ranking axioms. Huber calls this the Consistency Argument.
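
The fact the argument turns on can be checked directly in the finite case: the belief set Bρ defined above is consistent and closed under conjunction and entailment. A sketch (the toy ranks are again mine; propositions are sets of worlds, entailment is inclusion):

from itertools import chain, combinations

kappa = {'w1': 0, 'w2': 2, 'w3': 5}
W = frozenset(kappa)

def rho(A):
    return min((kappa[w] for w in A), default=float('inf'))

def subsets(S):
    S = sorted(S)
    return [frozenset(c) for c in
            chain.from_iterable(combinations(S, r) for r in range(len(S) + 1))]

# B_rho: believe A exactly when its negation is disbelieved to positive degree.
B = {A for A in subsets(W) if rho(W - A) > 0}

assert frozenset.intersection(*B)                            # consistency
assert all(A & C in B for A in B for C in B)                 # closure under conjunction
assert all(D in B for A in B for D in subsets(W) if A <= D)  # closure under entailment
print(sorted(sorted(A) for A in B))   # every belief contains the rank-0 world w1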

Huber concludes (Huber, 2007) that

given a link between degrees of disbelief and degrees of entrenchment, the normative force of the Consistency Argument is then proportional to how odd one takes the possibility of knowingly or believingly having beliefs that are not both consistent and deductively closed.

Two observations are relevant here. First, as we noted above, conditional degrees of belief obeying the usual axioms for primitive conditional probability also give rise to consistent and logically closed beliefs (as a matter of fact, one of the consequences of the definition of core used by Arlo-Costa and Parikh is that a probability function is never coreless: full beliefs are always derivable from coherent conditional probability -- at least this can be shown for spaces containing all unit sets). Perhaps a similar argument can also be marshalled for two-place probability.

The second point is that the normative force of Huber’s argument does not seem capable of convincing skeptics. After all, a philosopher following Kyburg’s main ideas about acceptance is an agent who “knowingly has beliefs that are not both consistent and deductively closed.” Such an agent can be perfectly aware of this fact but refuse to see it as a defect of his belief set. Indeed, an agent can regard it as a virtue. To be sure, Kyburg himself tends to see full logical closure as ‘conjunctivitis’: some sort of defective aspect of most theories of rationality. So an agent who adopts a high probability threshold view of belief will not at all view as odd the possibility of knowingly having beliefs that are not logically closed.

The second, more important, argument for ranking functions is a completeness result offered in (Hild and Spohn, 2008) and rehearsed in Spohn’s article in this volume. Roughly speaking, the theorem asserts the following: Iterated contractions obey certain axioms if and only if differences between ranks behave such and such; if differences between ranks behave such and such, then there is a ranking function measured on the ratio scale, i.e., unique up to a multiplicative constant, which exactly represents these differences.

This is a remarkable and interesting theorem that does various things at once. On the one hand, it offers the most complete axiomatization of iterated contraction available today. On the other hand, the theorem guarantees that if one accepts this axiomatization, then one is bound to accept rankings of the sort that Spohn has proposed. In other words, rankings uniquely represent contraction behavior. The theorem depends therefore on the tenability of the axioms for iterated contraction, an interesting and relatively open topic at the moment. In any case, this is a very important result that puts ranking theory at the same level as other theories of belief change that have been completely characterized, such as the AGM theory.

The remaining articles in Part II focus on arguments justifying some of the main tenets of probabilism. Colin Howson focuses on two controversial issues: the status of countable additivity (CA) as a fundamental axiom of probability, and the status of conditioning as a diachronic constraint on rationality. De Finetti offered well-known arguments against CA, but his arguments have been recently challenged in various ways (among others by extending the Dutch Book argument to infinity). Howson pays special attention to the arguments offered in (Cox, 1961). He concludes: “my personal opinion, for what it is worth, is that de Finetti has made a very persuasive case, which is strongly reinforced by Cox’s result, against CA.”

Regarding conditioning as a diachronic constraint on rationality, we have two opposed views in this volume: Skyrms supports the arguments for conditioning as a dynamic rule, and Howson presents the opposite view (but see also (Skyrms, 1993), which contains a more nuanced position than the one endorsed in this volume). Howson’s argument is as follows. Suppose that Q represents your belief state tomorrow. Suppose also that P(A) = 1, and P(Q(A) = r) > 0, where r < 1. Then it is easy to see that P(A|Q(A) = r) = 1. Suppose, finally, that tomorrow you learn that Q(A) = r by introspection. Hence Q(A) = r < 1 = P(A|Q(A) = r), and you violate conditionalization. The new twist in this story is that the violation is actually required if you are to be consistent: “It follows that, far from being a condition of consistency, ‘dynamic coherence’ will actually lead to inconsistency” (p. 116).
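
Spelled out (assuming conditional probability is given by the ratio formula), the steps are:

P(A|Q(A) = r) = P(A ∩ {Q(A) = r}) / P(Q(A) = r) = P(Q(A) = r) / P(Q(A) = r) = 1,

since P(A) = 1 forces P(¬A ∩ {Q(A) = r}) = 0. Conditionalizing on the introspective evidence Q(A) = r would then require Q(A) = P(A|Q(A) = r) = 1, contradicting Q(A) = r < 1.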

Jim Joyce presents in this volume an important essay offering a “non-pragmatic vindication of probabilism.” We need some definitions to introduce the main idea of Joyce’s argument.

A credence function b assigns degrees of belief to elements of the partition X = {X1, … , XN}. Given a credence function b and a truth-value assignment v for X, an epistemic scoring rule S produces a real number S(b, v) that measures the epistemic disutility of holding the credences b when v is actual. A credence function b is inadmissible relative to a scoring rule S when there is some alternative b* such that S(b, v) > S(b*, v) for all v. With these meager elements we can present the main result offered by Joyce.

If S satisfies two natural properties, and is finite on [0, 1] and continuous for all credence functions and all truth-value assignments, then:

i. Every incoherent credence function is inadmissible relative to S and, moreover, is strictly dominated by some coherent credence function, and

ii. Every coherent credence function is admissible relative to S.
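
Clause (i) can be illustrated with the Brier (quadratic) score, one scoring rule satisfying the theorem’s conditions (the numbers below are my toy example): the incoherent credences b = (0.3, 0.3) over a partition {X, ¬X} are strictly dominated by the coherent c = (0.5, 0.5).

def brier(b, v):
    # S(b, v): sum of squared distances between credences and truth values
    return sum((vi - bi) ** 2 for bi, vi in zip(b, v))

b = (0.3, 0.3)   # incoherent: the credences sum to 0.6
c = (0.5, 0.5)   # coherent: they sum to 1

for v in [(1, 0), (0, 1)]:         # the two truth-value assignments
    assert brier(c, v) < brier(b, v)
    print(v, brier(b, v), brier(c, v))   # roughly 0.58 vs 0.5 in both cases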

The two properties used in the proof are a property called TRUTH DIRECTEDNESS and another called COHERENT ADMISSIBILITY. The latter property is a weakening of propriety, a condition usually imposed on scoring rules. There are two versions of coherent admissibility:

COHERENT ADMISSIBILITY (WEAK VERSION) No coherent credence function b is ever strictly dominated, i.e., if c ≠ b then S(c, v) ≥ S(b, v) for some v.

COHERENT ADMISSIBILITY (STRONG VERSION) No coherent credence function b is ever weakly dominated, i.e., if c ≠ b then S(c, v) > S(b, v) for some v.

As Joyce explains elsewhere (Joyce, 2009), the article that appears in this book states the weak version but argues for, and uses in the proof, the strong version. Joyce thinks that in any case this does not affect the thrust of the argument, because he believes that the strong version is indeed true.

Teddy Seidenfeld pointed out in a recent talk given by Joyce at CMU that the strong form of Coherent Admissibility entails that coherent previsions are Bayes solutions. In view of that, the question arises whether every scoring rule that satisfies Coherent Admissibility must be proper. Joyce recently offered a counterexample establishing that Coherent Admissibility is indeed weaker than propriety.

Alan Hajek offers in this volume various arguments against Joyce’s result. One of them is that Joyce’s argument assumes that “any coherent credence function can be rationally held.” But depending on the content of the elements of the partition X, some credence assignments might be irrational (for example, they might violate principles of Direct Inference from chances). Joyce’s response (Joyce, 2009) is that it is not his purpose to argue that any coherent set of credences is rational tout court: “I only claim that they should not be excluded on grounds of inadmissibility” (p. 27).

The second objection Hajek advances is that the argument presumes that for any partition X with N cells and any non-negative {b1, … , bN} summing to one, ch(X1) = b1, … , ch(XN) = bN is a feasible chance assignment. But depending on the content of the propositions in X, it might be that some chance assignments are ruled out for metaphysical reasons. The response in (Joyce, 2009) is that the only thing that is needed is that, given a non-negative {b1, … , bN} with Σn bn = 1 and any partition X, there is at least one evidential situation in which b(Xn) = bn are the right credences to hold. To achieve this, Joyce assumes a principle of EXTENSIONALITY that says that for any (ordered) N-element partitions X and Y, SX(·, v) = SY(·, v) for every (ordered) truth-value assignment v. So we have that, given any non-negative {b1, … , bN} with Σn bn = 1, there is some partition X with N cells such that b1, … , bN are the right credences to invest in the elements of the partition. To achieve this, we just need an N-sided die loaded so that the chance of side 1 coming up is b1, the chance of side 2 is b2, etc.

The collection closes with two essays that consider qualitative and non-probabilistic approaches to belief. In many logical models of non-monotonic reasoning and belief change researchers have appealed to degrees of belief, or degrees of disbelief, or degrees of non-belief (degrees of expectation). Hans Rott’s paper attempts to show that these various structures can be combined into a unified whole. This yields a hierarchy that spans from guesses and expectations to a priori beliefs. Somewhere between the agent’s expectations and the a priori beliefs, her attitudes might be those of (plain) belief. Where to draw the line? According to Rott, if there is such a threshold for belief, the process of demarcating it is context-dependent. So, belief remains elusive. Of course, all the doxastic notions deployed in the unified model are consistent and logically closed, avoiding the classical problems of some of the versions of probabilism considered in the first part of the book.

David Makinson elaborates on how the notion of levels of belief appears in the usual constructions used in non-monotonic reasoning. For example, we might define A |~K x to hold when x is a classical consequence of K′ ∪ {A} for all maximal A-consistent subsets K′ of K. This definition can be liberalized by requiring that K′ ∪ {A} entail x only for certain selected maximal A-consistent subsets K′ of K. Moreover, these sets are selected by deploying a relation < that prioritizes among the subsets of K, treating some as preferred to others. Makinson offers a philosophical interpretation of this relation <: "it is natural to treat this relation < as representing our confidence in the subset K’, in other words the level of our belief in the truth of its elements" (p. 346). Makinson shows how the standard literature on non-monotonic consequence and belief revision avoided direct reference to probability in articulating the notion of level of belief presupposed in most of the existing accounts of non-monotonic consequence.
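
A small sketch of the skeptical definition (the toy propositional language, the knowledge base, and the brute-force entailment test are my illustrative assumptions):

from itertools import combinations, product

ATOMS = ('p', 'q')

def valuations():
    return [dict(zip(ATOMS, bits))
            for bits in product([True, False], repeat=len(ATOMS))]

def consistent(fs):
    return any(all(f(v) for f in fs) for v in valuations())

def entails(fs, x):
    return all(x(v) for v in valuations() if all(f(v) for f in fs))

def maximal_A_consistent(K, A):
    # Maximal subsets K' of K such that K' together with A is consistent.
    subs = [set(c) for r in range(len(K) + 1)
            for c in combinations(K, r)
            if consistent(list(c) + [A])]
    return [S for S in subs if not any(S < T for T in subs)]

def nm_entails(K, A, x):
    # A |~K x: x follows classically from K' together with A,
    # for every maximal A-consistent subset K' of K.
    return all(entails(list(S) + [A], x) for S in maximal_A_consistent(K, A))

p = lambda v: v['p']
q = lambda v: v['q']
K = [p, lambda v: (not v['p']) or v['q']]   # K = {p, p -> q}
A = lambda v: not v['q']                    # the input A = not-q

print(nm_entails(K, A, lambda v: not v['q']))  # True: not-q survives in both subsets
print(nm_entails(K, A, p))                     # False: p fails on the subset {p -> q}

Prioritizing among the K′ via the relation < then amounts to discarding some of these maximal subsets before intersecting their consequences.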

Despite important lacunae (e.g., recent work on conditional probability and imprecise probabilities), Degrees of Belief rightfully bears its title, offering a comprehensive overview of recent work on quantitative and qualitative models of grades of belief. A detailed introduction and several excellent surveys make the book nearly self-contained, and the various original essays offering new and interesting results make it a showcase for recent work. Thus Degrees of Belief should be received both as an excellent introduction to degrees of belief for non-experts and as a useful resource for philosophers and other scholars working in formal epistemology and related areas. We should expect that Degrees of Belief will find readership for some years to come.

References

Arlo-Costa, Horacio, Bayesian Epistemology and Epistemic Conditionals: On the Status of the Export-Import Laws, Journal of Philosophy, Vol. XCVIII, 11, 555-598, 2001.

Arlo-Costa, Horacio, Qualitative and Probabilistic Models of Full Belief, Proceedings of Logic Colloquium ’98, Lecture Notes in Logic 13, S. Buss, P. Hajek, P. Pudlak (eds.), ASL, A. K. Peters, 1999.

Arlo-Costa, Horacio and Parikh, Rohit, Conditional Probability and Defeasible Inference, Journal of Philosophical Logic 34, 97-119, 2005.

Cox, R.T., The Algebra of Probable Inference (Baltimore, The Johns Hopkins Press, 1961).

De Finetti, Bruno, Theory of Probability, Volume I (New York: Wiley, 1990).

Hild, Matthias and Wolfgang Spohn, The Measurement of Ranks and the Laws of Iterated Contraction, Artificial Intelligence, 172, 1195-1218, 2008.

Huber, Franz, The Consistency Argument for Ranking Functions, Studia Logica, 86, 299-329, 2007.

Joyce, Jim, Scoring Rules and Coherence, Handout of a Talk, CMU, November 19, 2009.

Kaplan, Mark, Rational Acceptance, Philosophical Studies, 40, 129-145, 1981.

Levi, Isaac, The Enterprise of Knowledge: An Essay on Knowledge, Credal Probability and Chance (Cambridge, MIT Press, 1983).

Maher, Patrick, Betting on Theories (Cambridge, Cambridge UP, 1993).

Shackle, George L.S., Expectation in Economics (Cambridge, Cambridge UP, 1949).

Shackle, George L.S., Decision, Order and Time (2nd Edition, Cambridge, Cambridge UP, 1969).

Skyrms, Brian, A Mistake in Dynamic Coherence Arguments? Philosophy of Science, 60, 320-328, 1993.

Smets, Philippe, Showing How Measures of Quantified Belief are Belief Functions, in B. Bouchon, L. Foulloy and R.R. Yager (eds.), Intelligent Systems for Information Processing: From Representation to Applications (Amsterdam, Elsevier, 265-276, 2002).

Van Fraassen, Bas, Fine-Grained Opinion, Probability and the Logic of Full Belief, Journal of Philosophical Logic, 24, 349-377, 1995.