*Plenary speakers: talk outlines*

### Edith Elkind

Title: **Voting as maximum likelihood estimation: revisiting the
rationality assumption**

There are two complementary views of voting. The first view is that voters have genuinely different preferences, and the goal of voting is to reach a compromise among the voters. The second view is that there is an objectively correct choice, and voters have different opinions simply because of errors of judgement. The latter view, which dates back to medieval church elections, gives rise to the maximum likelihood-based approach to voting: we treat votes as noisy estimates of the ground truth, and attempt to identify the ground truth that is most likely to generate the observed votes, under a given model of noise. In this talk, I will give an overview of voting rules that arise in this framework and the relationships among them.
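As a small illustration of the framework (my sketch, not material from the talk): under the simplest noise model, each voter independently names the true alternative with probability p > 1/2 and otherwise picks uniformly among the remaining alternatives. The maximum likelihood estimate of the ground truth then coincides with the plurality winner. The function name `mle_winner` is hypothetical:

```python
import math
from collections import Counter

def mle_winner(votes, candidates, p=0.7):
    """MLE of the ground truth under a toy noise model: each voter
    names the true alternative with probability p, and otherwise
    picks uniformly among the remaining m - 1 alternatives."""
    m = len(candidates)
    counts = Counter(votes)

    def loglik(c):
        hits = counts[c]                    # voters who named c
        misses = len(votes) - hits          # voters who named something else
        return hits * math.log(p) + misses * math.log((1 - p) / (m - 1))

    return max(candidates, key=loglik)

votes = ["a", "b", "a", "c", "a", "b"]
print(mle_winner(votes, ["a", "b", "c"]))   # "a", the plurality winner
```

Since log p > log((1 - p)/(m - 1)) whenever p > 1/2, the log-likelihood is increasing in the vote count, which is why the argmax is the plurality winner under this particular noise model; richer noise models over rankings (e.g. Mallows) yield other rules, such as Kemeny.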

### Teddy Seidenfeld

Title: **A modest proposal to use rates of incoherence as a guide for personal uncertainties about logic and mathematics.**

It is an old and familiar challenge to normative theories of personal probability that they do not make room for non-trivial uncertainties about (the non-controversial parts of) logic and mathematics. Savage (1967) gives a frank presentation of the problem, noting that his own (1954) classic theory of rational preference serves as a poster-child for the challenge.

Here is the outline of this presentation:

- First is a review of the challenge.
- Second, I comment on two approaches that try to solve the challenge by making surgical adjustments to the canonical theory of coherent personal probability. One approach relaxes the Total Evidence Condition: see Good (1971). The other relaxes the closure conditions on a measure space: see Gaifman (2004). Hacking (1967) incorporates both approaches.
- Third, I summarize an account of rates of incoherence, explain how to model uncertainties about logical and mathematical questions with rates of incoherence, and outline how this approach can guide the uncertain agent in using, e.g., familiar numerical Monte Carlo methods to improve her/his credal state about such questions (2012).
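As a toy illustration of the third point (my example, not the speaker's): an agent uncertain about a mathematical fact can run a Monte Carlo experiment and treat its outcome as evidence. Here the question is the limiting probability that two random integers are coprime, whose true value is 6/&pi;&sup2; &asymp; 0.608; the helper name `coprime_frequency` is hypothetical:

```python
import math
import random

random.seed(0)

def coprime_frequency(trials=50_000, n=10**6):
    """Monte Carlo estimate of the chance that two integers drawn
    uniformly from 1..n are coprime; the limiting value is 6/pi^2."""
    hits = sum(
        math.gcd(random.randint(1, n), random.randint(1, n)) == 1
        for _ in range(trials)
    )
    return hits / trials

print(coprime_frequency())   # close to 6 / pi**2 = 0.6079...
```

The agent who is unsure of the exact value of 6/&pi;&sup2; can use the simulated frequency, together with its sampling error, to sharpen her/his credal state about the mathematical question.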

**References**

- Gaifman, H. (2004) Reasoning with Limited Resources and Assigning Probabilities to Arithmetic Statements. Synthese 140: 97-119.
- Good, I.J. (1971) Twenty-seven Principles of Rationality. In Good Thinking, University of Minnesota Press (1983): 15-19.
- Hacking, I. (1967) Slightly More Realistic Personal Probability. Phil. Sci. 34: 311-325.
- Savage, L.J. (1967) Difficulties in the Theory of Personal Probability. Phil. Sci. 34: 305-310.
- Seidenfeld, T., Schervish, M.J., and Kadane, J.B. (2012) What Kind of Uncertainty Is That? J. Phil. 109: 516-533.

### Marco Zaffalon

Title: **On the beauty and the unifying character and power of coherence -
A semi-serious tour on logic, statistics, and decision theory based on desirability.**

Research in uncertainty is not truly subject to refutation by experiments, which instead drives many other research agendas, such as those in physics. Should we take this to mean that our probabilistic theories can be completely unconstrained, or should we instead settle on a few minimal, clear requirements to discipline our process of theory development? In this talk I argue that logical self-consistency, or coherence, of an uncertainty theory should be such a requirement. This leads us directly to desirability. I discuss the tight relations of this simple formalism to the foundations of logic, personal probability and statistics, and decision theory, as well as its impressive modelling power. Applications, problems, and challenges will complement the discussion. Where do we go from here? Or: is it worth taking on the burden of a coherentist discipline? In any case, we are going to make a deep choice (agnosticism does not seem to be a good idea).
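To make the coherence requirement concrete (an illustrative sketch under my own assumptions, not material from the talk): in the theory of desirable gambles, a set of accepted gambles *incurs sure loss* if some non-negative combination of them is strictly negative in every state. For two gambles over a finite state space this can be checked by scanning convex combinations; the helper name `incurs_sure_loss` is hypothetical:

```python
def incurs_sure_loss(gambles, steps=1000):
    """Grid-search check of whether two gambles, both judged
    desirable, admit a convex combination that loses money in
    every state -- the hallmark of incoherence ('sure loss')."""
    g, h = gambles
    for i in range(steps + 1):
        lam = i / steps
        combo = [lam * a + (1 - lam) * b for a, b in zip(g, h)]
        if max(combo) < 0:          # negative in every state
            return True
    return False

# Two states (heads, tails).  Accepting both bets below is incoherent:
# their 50/50 mixture pays -0.5 no matter how the coin lands.
bad = incurs_sure_loss([[1.0, -2.0], [-2.0, 1.0]])
ok = incurs_sure_loss([[1.0, -0.5], [-0.5, 1.0]])
print(bad, ok)   # True False
```

In general the check is a linear-feasibility problem rather than a grid search, and avoiding sure loss is only the weakest of the coherence conditions the desirability formalism imposes; the point here is merely that coherence is a checkable constraint, not an aesthetic preference.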