The purpose of this workshop series is to bring together researchers in Montreal who are interested in the history and philosophy of mathematics, so that they can get to know each other and each other's work better. Occasionally, we also plan to invite speakers from further afield. The talks will be fairly informal: work-in-progress presentations, warm-ups for conference presentations, graduate student presentations, etc., on traditional topics in the history and philosophy of mathematics, as well as on mathematical practice, reasoning with diagrams, mathematical cognition, and education.

To receive current announcements, please subscribe to our mailing list.

For more information on philosophy of mathematics in Montreal, see also http://philomathmontreal.wordpress.com.


Stay tuned for updates.


Past talks

Michael Cuffaro (U of Western Ontario):
Universality, invariance, and the foundations of computational complexity in the light of the quantum computer

Thursday, February 16, 2017
Concordia University, S-05, 2145 Mackay. 4:00-5:30pm

Abstract: Computational complexity theory is a branch of computer science that is dedicated to classifying computational problems in terms of their difficulty. While computability theory tells us what we can compute in principle, complexity theory informs us of our practical limits. It thus provides a bridge between the philosophy of mathematics and the philosophy of technology. In this talk I argue that the science of quantum computing illuminates complexity theory. It does so by emphasising that the fundamental concepts of complexity theory are not model-independent. Yet I argue that this does not, as some have suggested, force us to radically revise the foundations of complexity theory, for model-independence never has been essential to those foundations. Complexity theory is best characterised as a practical science, in the sense that its fundamental aim is to describe what is achievable in practice under various models of computation for our various practical purposes. Model-independence is inessential to this aim, and reflecting on quantum computing illuminates complexity theory by reminding us of this, too often under-emphasised, fact.


Michael Makkai (McGill):
In defense of Bourbaki's structuralism

Thursday, November 17, 2016
Université de Montréal, Dept of Philosophy, 2910 Edouard-Montpetit, Room 422. 2:00-4:00pm

Abstract: I will explain Bourbaki's concept of species of structures, described in their Elements of Mathematics, Volume 1, Set Theory, Chapter IV, "Structures". Bourbaki's definition is irreducibly meta-mathematical. Armed with a meta-mathematical understanding of Bourbaki's notion and its relation to formal languages, we come to new formal languages that support improved and generalized versions of the concept of structure, ones that give rise to a more robust, more defensible, structuralist philosophy of mathematics.


Patrick Girard (U of Auckland):
Impossible modal logic

Tuesday, October 11, 2016
Concordia University, S-05, 2145 Mackay. 4:15-5:45pm

Abstract: I will present semantics for modal logic and definability theorems about it, working entirely within a paraconsistent background logic and inconsistent naive set theory. The goal is to develop a paraconsistent modal logic on its own terms, using a thoroughgoing paraconsistent metatheory, and see its power as metaphysics. There are many paraconsistent logics, but I adopt a strong stance, dialetheism, in which some contradictions are taken to be true. I will present results about modal definability. These are first steps in a non-classical metaphysics. Other expected results (such as completeness) await a more mature theory.


Michael J. Barany (Princeton U):
Foundations, horizons, pasts, and futures in the history and philosophy of modern mathematics

Wednesday, March 23, 2016
McGill University, Leacock Building, Room 927. 5:00-6:30pm

Abstract: Historians and philosophers of mathematics have long recognized that there can be a substantial difference between what makes a mathematical idea right and what makes it meaningful, useful, exciting, or profound. These latter features, which can dominate contextually-sensitive accounts of mathematical history and practice, are often only indirectly related to the foundational desiderata of logical entailment or soundness that have preoccupied so many historians and philosophers of the discipline. Where foundational analyses can treat mathematical ideas as fixed elements of a static body of concepts and implications, to study mathematical meaning, use, and related themes (including their relation to foundational questions) requires regarding mathematical ideas as dynamic and as implicating context-dependent pasts and futures. I will present an approach for such a historicist history and philosophy of modern mathematics and develop some of its consequences through a discussion of two parallel episodes: the historical adoption of the theory of distributions in mathematical analysis from 1946–1950 and the contemporary adoption of perfectoid spaces in arithmetic geometry. While both have been understood as successful interventions in foundational terms, I argue that their rise and importance can be best understood only with reference to how their proponents reframed pasts and futures in their respective contexts of communication and research. In particular, I call attention to aspects of routine mathematical practice such as wordplay, citation, and academic travel that are normally absent in foundational narratives but prove essential here.


Catarina Dutilh Novaes (U Groningen):
Reductio proofs from a dialogical perspective

Wednesday, April 29, 2015
McGill University, Leacock Building, Room 927. 3:30-5:30pm

Abstract: It is well known that reductio proofs pose interesting philosophical questions. What does it mean to assert something with the precise goal of then showing it to be false, i.e. of showing that it leads to absurd conclusions? What kind of speech act is that? Moreover, the mathematics education literature has numerous studies showing how hard it is for students to truly comprehend the idea of reductio proofs, which indicates the cognitive complexity of these constructions. In my talk, I take as a starting point a dialogical conceptualization of deductive proofs, according to which a deductive proof is best understood as a dialogue between two (fictitious) participants — Prover and Skeptic. I argue that many of the philosophical and cognitive difficulties surrounding reductio proofs are dispelled once one adopts a dialogical perspective.


Matt Clemens (U Southern Indiana):
Toward an artifactualist account of mathematics

Friday, March 13, 2015
McGill University, Leacock Building, Room 927. 3:30-5:30pm

Abstract: A weakening of mathematical platonism leads to an anti-nominalist view, which affirms the existence and abstractness of mathematical objects, while denying the independence of such objects from cognitive agents. One way such an anti-nominalist view might be developed is to take mathematical objects to be abstract artifacts, created by the descriptive acts of mathematicians. In this talk, I will sketch a version of this kind of view and distinguish it from a few related approaches in the philosophy of mathematics.


Vincenzo de Risi (MPI Berlin):
The development of Euclidean axiomatics in the early modern age

Friday, February 20, 2015
McGill University, Leacock Building, Room 927. 3:30-5:30pm

Abstract: I will discuss the development of the system of axioms of elementary geometry in the main editions of Euclid in the 16th, 17th and 18th centuries, as well as the underlying epistemology. Several mathematicians, in fact, added, changed, or removed axioms and postulates in the Elements, engendering a wide-ranging discussion on the foundations of geometry. Issues about continuity, parallelism, mereology, irrational magnitudes, licensed geometrical constructions, and many other topics were raised in these treatises and deeply influenced the development of epistemology. I will try to spell out the most important changes, which eventually produced a new understanding of the nature of geometrical principles and of mathematics itself.


Oran Magal (McGill):
Analyticity, triviality, and creativity: the case of mathematics

Friday, January 16, 2015
McGill University, Leacock Building, Room 927. 3:30-5:30pm

Abstract: Poincaré provocatively asked whether mathematics is merely “an immense tautology”. This is a variant of the so-called paradox of analysis, which I argue arises not only in the context of conceptual analysis but also for deductive reasoning quite generally. To resolve it, I suggest that the same reasoning, e.g., that involved in the construction of a proof, can be seen as both creative and analytic at the same time, accounting for its non-triviality despite its being, in some sense, logically ‘given in advance’. This is an instance of a more broadly applicable distinction between two directions of analyticity, so to speak: prospective and retrospective. Finally, I would like to use this idea of analyticity as potentially substantive rather than trivial to bring out some merits of Gödel’s argument that mathematics is analytic: a substantive body of ‘truths by virtue of meaning’. If this view is defensible, it is an interesting alternative to empiricist-‘naturalistic’ philosophical accounts of mathematics.


Stephen Menn (McGill):
Eudoxus' theory of proportion and his method of exhaustion

Friday, November 21, 2014
McGill University, Leacock Building, Room 927. 3:30-5:30pm

Abstract: Euclid in Elements V gives an astonishingly rigorous and logically complex formulation of the theory of proportion, proving such propositions as "alternation" (if A:B::C:D then A:C::B:D) for all magnitudes, before applying them to lines and areas in Elements VI. It is very hard to see what could have motivated, or led to the discovery of, such a complex set of proofs of propositions that might easily be taken for granted (notably "if A>B then A:C>B:C"). (T.L. Heath's story, that this theory was provoked by a foundational crisis caused by the discovery of incommensurables, was refuted 80 years ago by Oskar Becker.) A possible clue comes from an anonymous scholiast who says that much of Elements V goes back to Eudoxus (a collaborator in Plato's Academy, 50–100 years before Euclid). We know, on better grounds, that Eudoxus invented the "method of exhaustion" used by Euclid in Elements XII to prove e.g. that circles are to each other as the squares on their diameters, and that a cone is one-third the volume of a cylinder with the same base and height. It is easier to explain the origin of the method of exhaustion than of the Euclidean theory of proportion, and if, as is often thought, the two theories were somehow linked for Eudoxus, this might help us understand the proportion theory, but it is remarkably difficult to explain how the two theories were connected. Building on work of Wilbur Knorr, which distinguishes an earlier Eudoxian theory of proportion (surviving in Archimedes' Equilibrium of Planes I) from Euclid's theory in Elements V, I offer a reconstruction, first of how Eudoxus could have been led to discover his theory of proportion in connection with the method of exhaustion, and then of how Euclid could have been led to develop his theory of proportion out of Eudoxus'.


Stewart Shapiro (Ohio State):
Frege on the real numbers

Friday, March 28, 2014
McGill University, Arts Building, Room 160. 12:00-2:00pm

Abstract: This paper is concerned with Gottlob Frege's theory of the real numbers as sketched in the second volume of his masterpiece Grundgesetze der Arithmetik. It is clear that Frege's incomplete sketch represents a mathematically significant proposal in its own right, one which purports to have several important advantages over competing contemporary theories. It is perhaps unsurprising that Frege's theory of the real numbers is intimately intertwined with and largely motivated by his metaphysics, something which has of course received a great deal of independent attention. One of Frege's more significant claims in the Grundgesetze is that the cardinal numbers and the real numbers are ontologically distinct, or constitute "completely different domains". Cardinal numbers answer the question "How many things of a certain kind are there?", while real numbers answer the question "How large is a certain magnitude compared to a unit of magnitude of that same kind?" The account raises interesting, and surprisingly underexplored, questions about Frege's metaphysics: Can this metaphysics even accommodate mass quantities like water, gold, light intensity, or charge? Frege's main complaint with his contemporaries Cantor and Dedekind is that their theories of the real numbers do not build the applicability of the real numbers directly into the construction. In taking Cantor's and Dedekind's arithmetical theories to be insufficient, Frege clearly takes it to be a desideratum on a theory of the real numbers that their applicability be essential to their construction. But why? After all, it's not as if we can actually measure magnitudes like weight or density with the kind of infinite precision embodied by the real numbers anyway.
We begin with a detailed review of Frege's theory, one that mirrors Frege's exposition in structure. This is followed by a critique, outlining Frege's linguistic motivation for ontologically distinguishing the cardinal numbers from the real numbers. We briefly consider how Frege's metaphysics might need to be developed, or amended, to accommodate some of the problems. Finally, we offer a detailed examination of Frege's Application Constraint — that the reals ought to have their applicability built directly into their characterization. It bears on deeper questions concerning the relationship between sophisticated mathematical theories and their applications.


Sean Walsh (UC Irvine):
The aims of arithmetization and the analysis of number in Kronecker, Weierstrass, and Frege

Friday, November 1, 2013
McGill University, Leacock Building, Room 927. 4:00-5:30pm

Abstract: Late 19th century philosophy of mathematics is renowned for both the program of arithmetization and various attempts to define or characterize the natural number concept. What were the goals of arithmetization in the eyes of practitioners such as Kronecker, Weierstrass, and Frege, and in what ways did these goals constrain or motivate the attempt to define or characterize the number concept? Focusing even on this small subset of thinkers (and some related figures), the variety of answers that one finds to these most basic questions is surprisingly large. While the chief goal of this talk is conceptual clarification, a subsidiary aim is to better chart the various lines of historical influence.


Eileen Nutting (U Kansas):
Hilbert's geometry and mathematical truth

Friday, October 25, 2013
McGill University, Leacock Building, Room 927. 4:00-5:30pm

Abstract: David Hilbert's primary goal in his work on the foundations of geometry was to demonstrate the consistency of both Euclidean and non-Euclidean geometries. His consistency proofs, however, place significant pressure on the contours of an adequate truth theory. To accommodate these pressures, he developed a novel account of geometrical truth. Understanding this account and its motivations gives us insight into what we ought to expect from a theory of mathematical truth. Notably, the resulting expectations conflict with the ones that Paul Benacerraf assumes in his paper "Mathematical Truth."


Erich Reck (UC Riverside):
The nature and purpose of Dedekind Abstraction

Friday, November 30, 2012
Concordia University, 1515 St-Catherine West, EV 11-705. 4:00-6:00pm

Abstract: While Richard Dedekind's technical contributions to the foundations of mathematics were absorbed into modern logic relatively quickly and almost completely, his philosophical remarks have received a more mixed response. This applies especially to his notion of abstraction, as introduced most explicitly in his well-known booklet, Was sind und was sollen die Zahlen? In this talk I will compare several different ways in which the nature of Dedekind abstraction has been understood. I will then propose a novel approach to it, based on formulating laws or principles that are analogous, at least to some degree, to neo-logicist abstraction principles. Motivating this approach further will involve reflecting on the purpose of Dedekind abstraction, as conceived of by Dedekind himself and as still relevant today.


Konstantinos Nikolantonakis (U Western Macedonia, Greece):
Were there "revolutions" in mathematics? Examples from the history of mathematics in light of T.S. Kuhn's historical philosophy of science

Friday, November 9, 2012
McGill University, Leacock Building, Room 927. 3:30-5:00pm

Abstract: The second half of the 20th century witnessed a kind of revolution in the history and philosophy of science with the publication of T.S. Kuhn's book The Structure of Scientific Revolutions in 1962, which presented a view of science generally labeled "historical philosophy of science". In this talk I will discuss whether or not elements of the "historical philosophy of science" can be applied to the field of mathematics. My interest in whether or not Kuhn's view of scientific revolutions is applicable to mathematics has been inspired by my study of the formation of our ten numerals and of the methods for the operation of multiplication during the Middle Ages in Europe. After presenting the notions of object level and meta-level through a very well known example from the literature concerning non-Euclidean geometry, and using the analyses of Zheng and Dunmore, I shall apply these notions to the field of arithmetic during the Middle Ages in Europe. My argument focuses especially on the way we passed from the arithmetic of pebbles, via Fibonacci and Pacioli, aided by the Latin translation of Al-Khwarizmi's treatise, to the foundation of modern arithmetic.


Andrew Arana (U Illinois, Urbana-Champaign):
Transfer in algebraic geometry

Friday, November 2, 2012
McGill University, Leacock Building, Room 927. 3:30-5:00pm

Abstract: The focal question of this talk is to investigate the value of transfer between algebra and geometry, of the sort exemplified by the Nullstellensatz. Algebraic geometers frequently talk of such transfer principles as a "dictionary" between algebra and geometry, and claim that these dictionaries are fundamental to their practice. We'll first need to get clear on what such transfer consists in. We'll then investigate how such transfer might improve how knowledge is gathered in algebraic geometric practice.


Paolo Mancosu (UC Berkeley):
Axiomatics and purity of methods: On the relationship between plane and solid geometry

With a commentary by Michael Hallett (McGill).

Thursday, April 19, 2012
Salle W-5215, Pavillon Thérèse-Casgrain (455 Boul. René-Lévesque), UQAM. 2:00-4:30pm

Abstract: Traditional geometry concerns itself with planimetric and stereometric considerations, which are at the root of the division between plane and solid geometry. To raise the issue of the relation between these two areas brings with it a host of different problems that pertain to mathematical practice, epistemology, semantics, ontology, methodology, and logic. In addition, issues of psychology and pedagogy are also important here.
In this talk (which is based on joint work with Andy Arana), my major concern is with methodological issues of purity. In the first part I will give a rough sketch of some key episodes in mathematical practice that relate to the interaction between plane and solid geometry. In the second part, I will look at a late nineteenth century debate (on "fusionism") in which for the first time methodological and foundational issues related to aspects of the mathematical practice covered in the first part of the paper came to the fore. I conclude this part of the talk by remarking that only through an axiomatic and analytical effort could the issues raised by the debate on "fusionism" be made precise. The third part of the talk focuses on Hilbert's axiomatic and foundational analysis of the plane version of Desargues' theorem on homological triangles and its implications for the relationship between plane and solid geometry. Finally, building on the foundational case study analyzed in the third section, in the fourth section I point the way to the analytic work necessary for exploring various important claims on "purity", "content" and other relevant notions.


Janet Folina (Macalester):
Is the proof in the picture? Seeing, believing and proving

Thursday, March 22, 2012
McGill University, Leacock Building, Room 927. 3:00-4:30pm

Abstract: What is the role of visual information in mathematics? Can pictures be proofs? This talk will appeal to several basic philosophical distinctions and a few simple examples of mathematical "pictures" in support of a limited role for diagrams in mathematical justification.


Patrick Girard (Auckland):
Being flexible about ceteris paribus reasoning

Monday, November 28, 2011
McGill University, Leacock Building, Room 927. 4:00-5:30pm

Abstract: Ceteris paribus clauses in reasoning are used to allow for defeaters of norms, rules or laws, as in von Wright's example "I prefer my raincoat over my umbrella, everything else being equal". I offer an analysis in which sets of formulas Γ, embedded in modal operators, provide necessary and sufficient conditions for things to be equal in ceteris paribus clauses. For most laws, the set of things allowed to vary is small, often finite, and so Γ is typically infinite. Yet the axiomatisation provided so far can only deal with the special and atypical case in which Γ is finite. I address this problem by being more flexible about ceteris paribus conditions, in two ways. The first is to offer an alternative, slightly more general semantics, in which the sets of formulas give only necessary but not (necessarily) sufficient conditions. This permits a simple axiomatisation. The second is to consider those sets of formulas which are sufficiently flexible to allow the construction of a satisfying model in which the stronger necessary-and-sufficient interpretation is maintained. I finally discuss how this more abstract setting relates to von Wright's initial idea.


Jean-Baptiste Joinet (Paris 1):
Toward protological foundations for logic

Thursday, October 27, 2011
McGill University, Leacock Building, Room 927. 5:30-7:00pm

Abstract: I will try to present and characterize the kind of answer to the traditional question "Which foundations for logic?" that emerges from the work of the contemporary French school of proof theory of Jean-Yves Girard and Jean-Louis Krivine. In particular, I will stress the architectonic role of negation (duality) as a taming agent of the wild, protological world of computational interactions, from which the logical rules (type constructions) emerge. If time allows, I will finally discuss the impact of these ideas on the prospect of a physical foundation for logic.


Göran Sundholm (Leiden):
Three kinds of function

Thursday, October 6, 2011
McGill University, Leacock Building, Room 927. 5:30-7:00pm

Abstract: The development of the notion of function is commonly held to have gone from the idea that functions are (anchored in) expressions with free variables to the idea that they are mappings not tied to expressions, with the "sets of ordered pairs unique in the last component" conception as the precise version of the latter. I shall, to the contrary, distinguish three notions and discuss examples: 1. Euler-Frege functions — dependent objects of lowest level, with substitution taking the role of application; 2. Riemann-Dedekind mappings — independent objects of higher level, with a primitive notion of application; 3. Courses of values ("graphs"), used by Frege, von Neumann, and set theory (Russell, Hausdorff, ...) — independent objects of lowest level, for which one needs a special application function of kind 1 (Frege's curved arch, von Neumann's [x,y], Russell's elevated inverted comma for descriptive functions).


Olivia Caramello (Cambridge):
The idea of bridge and its unifying role in science

Thursday, September 22, 2011
Salle W-5215, Pavillon Thérèse-Casgrain (455 Boul. René-Lévesque), UQAM. 10:30-11:30am

Abstract: In the paper "The unification of Mathematics via Topos Theory" I introduced a new point of view on the concept of Grothendieck topos, namely the idea of a topos as a 'bridge' which can be effectively used for transferring information between distinct mathematical theories. The topos-theoretic techniques resulting from an implementation of this idea have already proved themselves to be very fruitful in Mathematics; indeed, they have generated a great number of non-trivial applications in distinct mathematical fields including Algebra, Topology, Algebraic Geometry, Model Theory and Proof Theory. On the other hand, one can further abstract from these methodologies to try to identify the real essence of the idea of 'bridge', and look for other incarnations of the concept both in Mathematics and in different scientific fields. It turns out that the idea of bridge is intimately tied to that of invariance, and that a suitable combination of these two concepts can play a unifying role in Science as well. In the talk I will begin by reviewing the philosophical principles underlying the unification methodologies and proceed to sketch the general idea of bridge; I will then consider the relationship between this concept and the idea of invariance, and discuss the organizing role of these two notions in Mathematics and Science in general. The analysis will be complemented by analogies with concepts in Linguistics, Physics and Biology.


Sébastien Gandon (Clermont-Ferrand):
Indoor geometry: On the tradition of construction with obstructions in the plane

Wednesday, February 16, 2011
McGill University, Leacock Building, Room 927. 5:00-7:00pm

Abstract: In order to prove a geometric theorem, one often has to extend the lines and to introduce new points in the given figure. But what is one to do if the sheet of paper on which one does the constructions is too small to encompass the extensions? The obvious answers are: "do it again", "take this problem into consideration when beginning your drawing", "take a larger sheet of paper" or "draw a smaller figure". However, there is—and has been—another answer, which consists in attempting to prove the theorem without going over the edge of the sheet of paper. In this talk, I will speak about this tradition of geometrical constructions "with obstructions in the plane". I will claim that it has a long history (one finds some trace of it in Proclus and Hero of Alexandria), and that it always has a double dimension: practical, on the one hand, and foundational, on the other (how to reconcile the infinity of Euclidean space with the finitude of the heavens?). Second, I will claim that this sort of issue played a very important role in the foundational discussions concerning the nature of projective space in Klein and Pasch.


Agustín Rayo (MIT):
An account of possibility

Tuesday, December 21, 2010
McGill University, Leacock Building, Room 927. 3:30-5:30pm

Abstract: I develop an account of the sorts of considerations that should go into determining where the limits of possibility lie.


Jacques Dubucs (IHPST, Paris):
A new interpretation of Kant's philosophy of mathematics

Tuesday, December 21, 2010
Salle W-5215, Pavillon Thérèse-Casgrain (455 Boul. René-Lévesque), UQAM. 11:00-12:30pm

Abstract: The interpretation of Kant's philosophy of mathematics has a long history, alternating between destructive critiques and attempts at rehabilitation. Notable episodes in this saga include Russell's critique and the rehabilitation proposed by Beth and Hintikka. According to the latter, and contrary to what Russell wrote, the famous "construction of concepts in intuition" comes down to the method of "instantiation" well known to logicians, and Kant therefore does not contravene the norms of contemporary mathematical logic in any way. I propose to show that: 1) this interpretation, which fails to distinguish between the problem of generality as posed by Locke and Berkeley and the way the critical philosophy approaches it, is untenable in light of the Kantian texts; 2) an interpretation faithful to the continuity between the Transcendental Aesthetic and the Transcendental Methodology must make room for the Kantian variant of the idea of the anisotropy of possibilities, that is, the distinction between Begriffsmöglichkeiten and Anschauungsmöglichkeiten; 3) such a conception can be explained and, up to a point, defended, with the help of the contemporary notion of the intended model of a mathematical theory.


Colin McLarty (CASE Western):
On the current state and the prospects for philosophy of mathematics

Friday, December 10, 2010
McGill University, Leacock Building, Room 927. 3:30-5:30pm

Abstract: This will be more an essay than an argument. It will address why philosophy as a whole needs philosophy of mathematics, and why philosophy of mathematics needs contact with living mathematics. That could be the mathematics of Euclid, insofar as we can recover the sense of Euclid as alive rather than as a cut-and-dried textbook, but it should include the latest mathematics as well, since we cannot take that as cut and dried. It will discuss philosophy of mathematics in relation to logic, in relation to the history of mathematics, and in relation to philosophy of science and philosophy of language.


Mic Detlefsen (Notre Dame):
Freedom in mathematics

Friday, October 29, 2010
McGill University, Leacock Building, Room 927. 3:30-5:30pm

Abstract: There are different types of freedom that figured in the discussions of foundational thinkers in the nineteenth and early twentieth centuries. Prominent among these was one which was commonly known as freedom of concept-formation (freie Begriffsbildung) or concept-introduction. Freedom of concept-introduction was essentially a negative freedom. Specifically, it was a freedom from the traditional empiricist-constructivist constraint on concept-introduction, a constraint I will generally refer to as the Instantiation Condition. According to this condition, a concept can be legitimately introduced into mathematical practice only if its content is obtainable from that of an intuition or experience by application of an identified process of abstraction. The concern was not ultimately with how, as a matter of human psychology, we manage to form concepts (and/or such linguistic expressions as are generally used to represent them). Rather, it was with what constitutes the admission of a concept into mathematical practice, and the conditions under which such admission is justified. These will therefore be my chief concerns here too.


John Bell (Western Ontario):
The Axiom of Choice in a constructive setting

Monday, April 12, 2010
McGill University, Leacock Building, Room 927. 3:00-5:00pm

Abstract: The talk concerns the status of the Axiom of Choice in various constructive contexts, including intuitionistic set theory, constructive type theory and Hilbert's epsilon calculus.


Marcus Rossberg (Connecticut):
Non-conservativeness in higher-order logic

Friday, March 12, 2010
McGill University, Leacock Building, Room 927. 3:30-5:00pm

Abstract: I present a new difficulty that proof-theoretic approaches to a theory of meaning face. I show that third-order logic is not conservative over second-order logic: there are sentences formulated in pure second-order logic that are theorems of third-order logic, but cannot be proven in second-order logic. The proof is a corollary of the definability of a truth predicate for second-order arithmetic in third-order logic, which has until now escaped attention. The challenge is that this inability to demonstrate the truth of such second-order sentences using the operational rules of second-order logic alone seems to refute the claim of proof-theoretic semantics that the meaning of the quantifiers is determined by their introduction and elimination rules: such sentences — being truths of third-order logic — should be true in virtue of the meaning of the logical vocabulary. An investigation of the Henkin models for higher-order logic suggests, perhaps surprisingly, that the meanings of the second- and higher-order quantifiers are determined by their introduction and elimination rules after all.


Richard Zach (Calgary):
The Decision Problem and the development of metalogic

Friday, December 4, 2009
McGill University, Leacock Building, Room 927. 4:00-5:30pm

Abstract: In parallel with their work on proof theory in the 1920s and early 1930s, Hilbert and his collaborators and students—in particular, Ackermann, Behmann, Bernays, and Schönfinkel—did substantial work towards a positive solution for the decision problem. This begins with an unpublished talk by Behmann in 1921 in which the term "Entscheidungsproblem" first appears, and continues until the early 1930s with a number of published as well as unpublished contributions. Approaches to the decision problem evolved significantly during this time, from a purely algebraic approach in the style of Schröderian algebra of logic to relatively modern proofs which establish the finite controllability of certain prefix classes of formulas. This evolution goes hand-in-hand with an evolution of attendant concepts, in particular, semantic concepts such as satisfiability. An analysis of this work sheds light on the development of the semantics of first-order logic in the 1920s, on changing views as to what constitutes a "decision procedure," and on the connection between the decision problem and the consistency problem.


Gregory Lavers (Concordia):
Frege the conventionalist, Carnap the Fregean

Wednesday, September 30, 2009
McGill University, Leacock Building, Room 927. 5:30-7:00pm

Abstract: In this paper I examine the fundamental views on the nature of logical and mathematical truth of both Frege and Carnap. I argue that their positions are much closer than is standardly assumed. I attempt to establish this point on two fronts. First, I argue that Frege is not the metaphysical realist that he is standardly taken to be. Second, I argue that Carnap, where he does differ from Frege, can be seen to do so because of mathematical results proved in the early twentieth century. The differences in their views are, then, not primarily philosophical differences. Also, it might be thought that Frege was interested in analyzing our ordinary mathematical notions, while Carnap was interested in the construction of arbitrary systems. I argue that this is not the case: our ordinary notions play an even more important role in Carnap's philosophy of mathematics than they do in Frege's.

For more information, please subscribe to our mailing list, or contact:
Gregory Lavers (Concordia University),
Jean-Pierre Marquis (Université de Montréal),
Mathieu Marion (Université du Québec à Montréal),
Dirk Schlimm (McGill University).