New Perspectives on Conventionalism
Dates: 29-30 September 2025
Location: Unipark Room 4.101, Erzabt-Klotz-Strasse 1, 5020 Salzburg, Austria
Conventionalist approaches to areas of discourse such as logic and mathematics were for many years almost universally rejected, but this appears to be changing. Recent years have seen the publication of significant works broadly in the conventionalist tradition, works that respond to objections previously thought to be fatal for conventionalism and that develop and defend views that either are explicitly conventionalist or at least are clearly indebted to historical conventionalist approaches. This workshop aims to bring together philosophers working on topics related to conventionalism, its challenges, and its prospects.
Programme
Monday 29 September
| Time | Event | Speaker |
| --- | --- | --- |
| 9:00 | Welcome and coffee | |
| 9:15 | ‘Adoption as formalisation’ | Gil Sagi (Haifa) |
| 10:45 | ‘On logical generality’ | Corine Besson (Sussex) |
| 12:00 | Lunch | |
| 14:00 | ‘Can conventionalism fully account for classical logic?’ | Fredrik Nyseth (Tromsø) |
| 15:30 | ‘Infinitary rules as idealisation of finitary metamathematical inferences’ | Andreas Fjellstad (Padua) |
| 17:00 | ‘Realism, conventionalism, and the determinacy of mathematical language’ | Sebastian G. W. Speitel (Bonn) |
| 19:00 | Dinner | |
Tuesday 30 September
| Time | Event | Speaker |
| --- | --- | --- |
| 9:00 | Coffee | |
| 9:15 | ‘Imperatives and deontic modality: an inferential expressivist perspective’ | Luca Incurvati (ILLC, Amsterdam) |
| 10:45 | ‘Does pragmatic evidence help to solve the problem of existence for mathematical conventionalism?’ | Deborah Kant (Vrije Universiteit Brussel) |
| 12:00 | Lunch | |
| 14:00 | ‘Logical consequence in signalling systems’ | Emelia Stanley (Vienna) |
| 15:30 | ‘On defining’ | Alexander W. Kocurek (UC San Diego) |
| 17:00 | ‘Carnapian logicism and semantic analyticity’ | Hannes Leitgeb (MCMP, LMU Munich) |
| 19:00 | Dinner | |
Abstracts
Corine Besson, ‘On logical generality’
Logical conventionalism is a metaphysical view concerning the nature of logic: logical facts, truths, principles and rules are by-products of general linguistic conventions. Logical conventionalism tends to be more specifically articulated in terms of logical inferentialism, according to which the basic general rules of logic define the meanings of the logical constants. Conventionalism so understood is committed to a view concerning the psychology of deductive reasoning – concerning what it is for someone to be able to reason deductively: it is to somehow apprehend the general rules that define the logical constants and to reason according to them. I argue that this psychological view is untenable. This casts doubt on logical conventionalism in general, since its metaphysical dimension cannot be decoupled from its psychological one. I also sketch my preferred view of the psychology of deductive reasoning, which I call ‘particular first’. This view, I argue, can separate metaphysical claims about the nature of logic – and its generality – from psychological ones about the nature of deductive reasoning. (back to programme)
Andreas Fjellstad, ‘Infinitary rules as idealisation of finitary metamathematical inferences’
As a tool to overcome the limitations of recursively enumerable logico-arithmetical theories uncovered by Gödel’s incompleteness theorems, infinitary inference rules have turned out to be extremely practical from a proof-theoretic perspective and have become an integral part of the logician’s toolbox. However, infinitary rules are also quite problematic from a philosophical perspective. How can we, as finite beings, follow or reason in accordance with a rule involving infinitely many premises?
Adopting terminology from contemporary philosophy of science, I propose that infinitary rules are distorted representations of certain finitary inferences within metamathematics. In analogy with scientific models, the distortion is required for computational purposes; it ensures compatibility with the notion of a derivation as a tree of formulas. This proposal provides an alternative path towards a conventionalist account of arithmetical truth if the finitary inferences modelled by the infinitary rules are seen as constitutive for the universal quantifier within our logico-mathematical practices. (back to programme)
Luca Incurvati, ‘Imperatives and deontic modality: an inferential expressivist perspective’
In the first part of the talk, I will defend a non-cognitivist account of imperatives, starting from Paul Portner’s idea that imperatives serve to manage speakers’ to-do lists. I show that, once we recognize the multiplicity of operations that can be performed on a to-do list, the account has the resources to deal with weak uses of imperatives without postulating an additional list alongside it. In the second part of the talk, I present a logical framework which integrates weak and strong forms of assertion, rejection and imperatives. I use this framework to inferentially explain the meaning of deontic modals such as *must* in terms of imperatives. The resulting inferential expressivist account has the resources to explain performative uses of *must* and hitherto unaccounted-for data about their occurrence pattern. I will end by outlining a number of outstanding issues and directions for future work. (back to programme)
Deborah Kant, ‘Does pragmatic evidence help to solve the problem of existence for mathematical conventionalism?’
Mathematical conventionalism claims that mathematical truth is determined by linguistic conventions, but it faces the problem of explaining the existence of mathematical entities. Jared Warren (2020) considers existence to be trivial: if a conventionally adopted theory of arithmetic contains existence claims, then numbers exist. Zeynep Soysal (2025), drawing on a descriptivist account of set-theoretic expressions, argues that more is required. In her view, not every theory—such as an inconsistent one—describes an existing entity. To address this, she links existence to consistency. Like Warren’s, her account is naturalistic in that it is based on the actual linguistic conventions of mathematical practice.
Using a dataset from interviews with 28 practicing set theorists (over 500 pages of transcripts and a 100-page systematic summary), we studied consistency and existence in set-theoretic practice. Our findings support the core conventionalist thesis: mathematicians can imagine that slightly different axioms could have been adopted. They also confirm Soysal’s view: set theorists reject inconsistent theories, and the consistency of their associated theory is evidence enough to use phrases like “sets exist.”
An unexpected complication emerged, however. Soysal’s descriptivism explicitly includes informal descriptions. Yet our data show that informal ways of talking about sets are not always consistent. We will present examples of this phenomenon and propose an initial specification of the notion of informal description that may resolve the issue. (back to programme)
Alexander W. Kocurek, ‘On defining’
The literature on real definition focuses on the noun ‘definition’, as in, ‘To be F is, by definition, to be G’. The verb ‘define’, as in ‘I define F to be G’ or ‘Let’s define F to be G’, is largely overlooked. I suggest we make progress on understanding the noun by better understanding the verb. Echoing Strawson’s (1950) critique of Russell (1905), “defining” is not something the definiens does: defining is something *we* do. Specifically, I argue ‘define’ is analogous to subjective attitude verbs, like ‘count’ or ‘consider’, which presuppose their complement is, in some sense, not entirely a factual matter. I then develop a conventionalist semantics of defining based on this idea. On this view, defining is an interpretive decision to “analyze” the definiendum “in terms of” the definiens, in a sense to be spelled out. Definition statements involving the noun are thus object-level expressions of these decisions. (back to programme)
Hannes Leitgeb, ‘Carnapian logicism and semantic analyticity’
In my talk I will argue for a (quasi-)Carnapian version of logicism about mathematics: there is a logicist conceptual framework in which (i) all standard mathematical terms are defined by logical terms, and (ii) all standard mathematical theorems are (likely to be) analytic. Along the way, the talk will explain the historical-philosophical background, how the definitions in (i) are to proceed, what the framework and the semantic notion of analyticity-in-a-framework are like, and why the probabilistic qualification ‘likely to be’ is used in (ii). The upshot is not some logicist epistemic foundationalism about mathematics but the insight that mathematics can be rationally reconstructed as being conceptual, i.e., as coming along with a conceptual framework. (back to programme)
Fredrik Nyseth, ‘Can conventionalism fully account for classical logic?’
Logical conventionalism, as usefully characterised by Jared Warren, is the view that facts about logical truth, falsity, necessity and validity are fully explained by linguistic conventions. Most conventionalists, moreover, maintain that classical logic is at least one coherent logical option. It is natural to ask, therefore, whether the relevant facts about classical logic can in fact be fully explained by linguistic conventions. Given a liberal conception of what counts as a legitimate linguistic convention, accounting for the classical consequence relation is relatively straightforward. What is less clear – especially in light of Carnap’s categoricity problem – is whether we can, at the same time, fully account for the truth-functional clauses that govern the logical connectives as they are classically understood. One common strategy here (proposed by Timothy Smiley and endorsed by, among others, Warren) is to invoke rules for rejection alongside rules for assertion/acceptance. In my talk, I will first argue that this “bilateralist” solution does not, by the conventionalist’s standards, ensure that the connectives behave entirely classically. The reason is, essentially, that the bilateralist solution takes bivalence for granted, and this calls for a conventionalist explanation. I then explore whether the conventionalist has the resources to address this. In particular, I ask: 1) whether facts about linguistic conventions can fully explain why no claim can be simultaneously true and false, and 2) whether facts about such conventions could fully explain why every claim must be determinately either true or false. (back to programme)
Gil Sagi, ‘Adoption as formalisation’
The adoption problem (AP) in the philosophy of logic, due to Kripke in unpublished work half a century ago, has seen an upsurge of attention in recent literature, largely due to Romina Birman’s work and the recent publication of Kripke’s “The Question of Logic”. The AP primarily targets Quinean anti-exceptionalism about logic, but Carnapian conventionalism is an obvious (presumably easy) target. In my talk, I intend to defend conventionalism from the AP by providing a novel account of adoption that also builds on recent authors’ responses to the AP. Adoption will be seen as formalisation, conceived as the transition from natural language practice to a formal language by a reasoning subject through the undertaking of linguistic commitments. (back to programme)
Sebastian G. W. Speitel, ‘Realism, conventionalism, and the determinacy of mathematical language’
The goal of a mathematical theory, such as arithmetic, consists in providing a comprehensive and systematic picture of a piece of mathematical discourse and/or the structure(s) ‘talked about’ in such discourse, e.g., the natural number structure. Gödel’s incompleteness theorems establish that many such important theories cannot be descriptively exhaustive while remaining computationally feasible. In devising such theories, then, different desiderata have to be weighed against each other, and choices consonant with the presuppositions of the respective position have to be made: whereas, on a naïve realist picture, a justification of successful reference of mathematical terms is of paramount importance to respond to skeptical challenges, a conventionalist will be more concerned with the well-definedness and applicability of mathematical notions.
These different ambitions notwithstanding, what realists and conventionalists have in common is a concern with the determinacy of mathematical language. For the realist this is essential to managing the underdetermination of mathematical reality by theory. Conventionalists, on the other hand, need to guarantee unproblematic and unambiguous specifications of mathematical notions. Implementations of this common goal of determinacy diverge. In this talk, I want to compare realist and conventionalist approaches to the issue of determinacy of mathematical discourse. As a case study, I will consider three extant theories of arithmetic (first-order Peano-arithmetic with an ω-rule; second-order Peano-arithmetic; and first-order Peano-arithmetic with generalized quantifiers) and assess how they are used and characterized by realists and conventionalists, with particular emphasis on the ways in which they implement core commitments of the respective stances. (back to programme)
Emelia Stanley, ‘Logical consequence in signalling systems’
One articulation of logical conventionalism is that facts about logical validity in a language are wholly explained by linguistic conventions of that language (Warren 2020, p. 33), and are therefore independent of “extralinguistic” facts. While somewhat overlooked in the recent conventionalist literature, David Lewis’ *Convention: A Philosophical Study* (1969) provides a powerful account of conventionality in terms of arbitrary equilibria in coöperative games. Specifically, Lewis articulates the particular sense in which languages are conventional, using the formal framework of signalling games. With the goal of developing an account of conventionalism in Lewis’ game-theoretic terms, I present original research showing that signalling equilibria define a consequence relation, giving rise to facts about logical validity unique to certain linguistic conventions. I show firstly that any coördination equilibrium in a basic signalling game circumscribes a unary consequence relation between its signals. I then introduce a multiple-observer signalling game and a stronger notion of coöperative equilibrium, in order to extend these results to a plural notion of consequence (from multiple signals to another). Finally, I prove that the structure of a consequence relation can be meaningfully distinct for different signalling equilibria for the same game, which I argue vindicates the central logical conventionalist claim that facts about validity are grounded in linguistic conventions, rather than facts “about the situation.” I connect these results to conjectures about logical pluralism and inferentialism, including the (game-theoretic) purpose of logical reasoning. (back to programme)
Funding
This event is supported by the Austrian Science Fund (FWF) through the ‘Categoricity by Convention’ research project (grant no. P33708) and the Knowledge in Crisis Cluster of Excellence.