The purpose of this workshop series is to bring together researchers in Montreal who are interested in the history and philosophy of mathematics, so that they can get to know each other and each other's work. Occasionally, we also plan to invite speakers from all over the world. The talks will be on traditional topics in the history and philosophy of mathematics, as well as on mathematical practice, reasoning with diagrams, mathematical cognition, and education.

To receive current announcements, please contact one of the organizers listed below.


Alban Da Silva (Paris Cité):
Sand-drawing in Vanuatu: A case in Ethnomathematics

Friday, November 25, 2022
McGill University. Leacock Building, Room 927. 3:30-5:00pm

Abstract: In the central islands of Vanuatu (the former New Hebrides) there exists a practice of drawing symmetrical figures on the ground, according to ‘rules’ that constrain the practice and that allow us to study its ‘mathematical dimension’. Based on a survey carried out between 2016 and 2019 on Pentecost Island within the Raga society, and a re-reading of Marcia Ascher’s ethnomathematical work, I have developed a new mathematical model and computational tools to highlight the operative and algorithmic nature of this practice. The ethnographic data led me to introduce several concepts—cycles in graphs, topological operations, algorithms—that seem to be at the very core of the processes of creation and memorization of these drawings. These modeling tools allow us to study the practice of sand-drawing in Raga society, not as an isolated fact, but as a revelation of their way of ‘experiencing and inhabiting the world’, in the sense in which many anthropologists, like Philippe Descola, now speak of ‘culture’. From Veblen’s theorem to the ontologies of the Raga, this talk will present a case of mathematical otherness that might be of interest to historians of mathematics as well as philosophers of mathematics and anthropologists.
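For readers unfamiliar with the graph theory invoked here: Veblen's theorem states that a graph can be split into edge-disjoint cycles exactly when every vertex has even degree, which is the structural condition behind drawings traceable in closed loops. The following minimal Python sketch (my own toy illustration, not the speaker's model) extracts such closed trails greedily; each closed trail further splits into simple cycles.

```python
from collections import defaultdict

def cycle_decomposition(edges):
    """Split a simple graph into edge-disjoint closed trails.
    By Veblen's theorem this is possible exactly when every
    vertex has even degree."""
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    if any(len(nbrs) % 2 for nbrs in adj.values()):
        raise ValueError("odd-degree vertex: no decomposition into cycles")
    trails = []
    while any(adj.values()):
        # Start anywhere with unused edges and walk until returning:
        # with all degrees even, the walk can only get stuck at its start.
        start = next(v for v, nbrs in adj.items() if nbrs)
        trail, cur = [start], start
        while True:
            nxt = next(iter(adj[cur]))
            adj[cur].discard(nxt)
            adj[nxt].discard(cur)
            trail.append(nxt)
            cur = nxt
            if cur == start:
                break
        trails.append(trail)
    return trails

# Two triangles sharing a vertex: every vertex has even degree,
# so the figure decomposes into closed loops using every edge once.
drawing = [(0, 1), (1, 2), (2, 0), (0, 3), (3, 4), (4, 0)]
loops = cycle_decomposition(drawing)
assert sum(len(t) - 1 for t in loops) == len(drawing)
```

The vertex and edge labels are of course hypothetical; the point is only that the even-degree condition is what makes a figure drawable as a union of closed loops.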


Past talks

Juan Fernández-González and Dirk Schlimm (McGill):
From a doodle to a theorem: a case study in mathematical discovery

Tuesday, December 15, 2020
ZOOM. 2:00-3:00pm

Abstract: In this paper we present the genesis of a theorem in geometry, the Midpoint Path Theorem, from the original idea to the published version. The theorem makes it possible to multiply the length of a line segment by any rational number 0 < r/s < 1 by constructing only midpoints and a straight line; the construction can thus be carried out with compass and straightedge. We explore the narrative behind the discovery, with first-hand insights from its author, and discuss some general aspects of this case study of mathematical practice.
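The published construction uses the auxiliary straight line to reach r/s exactly; midpoints alone yield only dyadic points. As a toy numerical illustration (mine, not the authors' construction), repeated midpointing of already-constructed points brackets any target ratio to arbitrary precision:

```python
from fractions import Fraction

def midpoint(p, q):
    # The only construction step allowed besides the endpoints 0 and 1.
    return (p + q) / 2

def approach(target, steps):
    """Bisect using only midpoints of already-constructed points:
    after n steps the bracketing interval has length 2**-n, so its
    midpoint lies within 2**-(n+1) of the target."""
    a, b = Fraction(0), Fraction(1)
    for _ in range(steps):
        m = midpoint(a, b)
        if m <= target:
            a = m
        else:
            b = m
    return midpoint(a, b)

x = approach(Fraction(1, 3), 20)
assert abs(x - Fraction(1, 3)) <= Fraction(1, 2**21)
```

This only approximates; the interest of the theorem is that one extra straight line turns the approximation into an exact construction.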


Sorin Bangu (Bergen):
Mathematical explanations of physical facts

Friday, February 14, 2020
McGill University, Leacock Building, Room 927. 3:30-5:00pm

Abstract: The paper proposes an account of how mathematical explanations of physical facts might work. I argue that the role of mathematics is explicatory, and give reasons to believe that this explicatory role is also an explanatory one. In more detail, the plan of the talk is as follows. I begin with a close analysis and critique of a recent theory of such explanations, due to Marc Lange (2013; 2017). Then I introduce the explicatory model, which I test on several examples. Some of these examples have already been proposed in the literature, and I argue that the model can accommodate them naturally: in all cases, mathematics is explicatory and, under certain conditions, also explanatory. Time permitting, I’ll end with a discussion of a new example.


Norbert Schappacher (Strasbourg):
Looking for unity. The politics, philosophy, and mathematics of Claude Chevalley in the 1930s

Friday, September 27, 2019
McGill University, Leacock Building, Room 927. 3:00-4:30pm

Abstract: In the 1930s (and once more in the 1970s) the mathematician and cofounder of Bourbaki, Claude Chevalley, engaged himself in what he himself would later call "theoretical politics". In the talk I will try to present the current state of my search for material about Chevalley's activities in the 1930s, with a view to a global appraisal of his thought at the time. Topics will include the extent of Dandieu's "dichotomic method" as well as the dual contacts — mathematical and political — that Chevalley had with German partners.


Yacin Hamami (VU Brussels):
Rigor judgments in mathematical practice

Friday, September 20, 2019
McGill University, Leacock Building, Room 927. 3:30-5:00pm

Abstract: How are mathematical proofs judged to be rigorous in mathematical practice? Traditional answers to this question have usually considered that judging the rigor of a mathematical proof proceeds through some sort of comparisons with the standards of formal proof. Several authors have argued, however, that this kind of view is implausible (see, e.g., Robinson, 1997; Detlefsen, 2009; Antonutti Marfori, 2010), and have thus called for the development of a more realistic account of rigor judgments in mathematical practice. In this talk, I will sketch a framework aiming to move forward in this direction. My starting point is the observation that judging a mathematical proof to be correct or rigorous amounts to judging the validity of each of the inferences that comprise it. Accordingly, the framework focuses on the processes by which mathematical agents identify and judge the validity of inferences when processing the text of an ordinary mathematical proof. From the perspective of the resulting framework, I will then discuss what is sometimes called the standard view of mathematical rigor, by examining whether there is any ground supporting the thesis that whenever a proof has been judged to be rigorous in mathematical practice it can be routinely translated into a formal proof.


Markus Pantsar (Helsinki):
From computational to cognitive complexity: Considerations from mathematical problem solving

Wednesday, April 3, 2019
McGill University, Leacock Building, Room 927. 5:00-6:30pm

Abstract: In research on mathematical cognition, one of the key questions concerns the complexity of the cognitive processes involved in mathematical problem solving tasks. In the commonly used paradigm of computational modelling of cognitive tasks, such tasks are characterized functionally, i.e., purely in terms of their input and output. In the case of mathematical problem solving, this paradigm allows a direct application of computational complexity measures in determining the complexity of cognitive tasks. In computational complexity theory, the complexity of problems is characterized through the concept of a Turing machine. More specifically, the complexity of a problem is defined as the complexity of an optimal algorithm for solving it, i.e., an algorithm run by a Turing machine that takes the least amount of computational resources (time or space).

In this talk, I argue that, while useful, computational complexity measures can be ill-suited for characterizing mathematical problem solving processes in several ways. I will present examples from mathematical practice which show that human problem solvers often use characteristically suboptimal algorithms. These are already present in mental arithmetic and are evident in many important aspects of mathematical problem solving, including diagrams and the spatial arrangement of symbols. I conclude that, in order to respect such integral parts of real-life problem solving processes, we must focus on humanly optimal, rather than computationally optimal, algorithms. However, humanly optimal algorithms are not universal, and thus the resulting concept of cognitive optimality must take into account the culturally determined aspects of mathematical problem solving, ranging from symbol systems to the cognitive tools used in the problem solving processes.
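A toy contrast (my own illustration, not an example from the talk) makes the point concrete: two algorithms that compute the very same input-output function, and are therefore identical under a purely functional characterization, can differ greatly in the resources a human solver would plausibly spend on them.

```python
def repeated_addition(a, b):
    """Multiplication computed by b additions: acceptable as an
    input-output specification, hopeless as mental arithmetic."""
    total, steps = 0, 0
    for _ in range(b):
        total += a
        steps += 1
    return total, steps

def round_and_adjust(a, b):
    """A humanly common shortcut when b is near a round number:
    a*b = a*round(b) - a*(round(b) - b), roughly two 'easy' steps."""
    nearest = round(b, -1)  # nearest multiple of 10
    return a * nearest - a * (nearest - b), 2

# Same function, very different step counts for a human solver.
assert repeated_addition(12, 99) == (1188, 99)
assert round_and_adjust(12, 99)[0] == 1188
```

The step counts here are of course crude stand-ins for genuine cognitive cost; they only dramatize the gap between functional equivalence and cognitive equivalence.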


Yelda Nasifoglu (Oxford):
Reading Euclid's Elements of Geometry in early modern Britain

Wednesday, January 30, 2019
McGill University, Leacock Building, Room 927. 5:00-6:30pm

Abstract: Since its creation c. 300 BCE, Euclid's Elements of Geometry has been a subject of study both as a canonical mathematical text and as a representative of ancient thought. It enjoyed particular popularity during the Early Modern period, when hundreds of editions of the text appeared between 1482 and 1700. Depending on their theoretical and practical functions, they ranged from elaborate folios to pocket-size compendia, and were widely studied by scholars, natural philosophers, mathematical practitioners, and schoolchildren alike. In this presentation, I will discuss some of the research conducted for the AHRC-funded "Reading Euclid's Elements of Geometry in Early Modern Britain" project based at the History Faculty, University of Oxford, paying particular attention to how the project has used Book History to understand the changes in how Euclid was read, printed, collected, annotated, and taught.


Michael Makkai (McGill):
The treatment of Cantorian sets by Bourbakian abstract set-theory

Thursday, December 6, 2018
Université de Montréal, Dept of Philosophy, 2910 Edouard-Montpetit, Room 422. 2:00-4:00pm

Abstract: Abstract sets, that is, sets with "ur-elements", elements with no individuality other than being members of the set in question, have been around from the start, for Dedekind, Cantor, and others. They became important for Bourbaki, who insists that the identity of a mathematical object lies in its structural relations to other objects, rather than in an intrinsic identity. Lawvere's first-order theory of the category of sets (FOTCS) and his subsequent topos theory are decisive steps towards a set-theory that allows only abstract sets. I have introduced a simple formal system of abstract set-theory based on dependent types for which Bourbaki's requirement that "all properties must be invariant under isomorphisms" holds true as a meta-theorem in the strong ("parametric") form demanded by Bourbaki (in Lawvere's FOTCS, as Colin McLarty has shown, only the weaker non-parametric version is true). This time I want to emphasize the inclusiveness of the new abstract set-theory, not shared by topos theory in a sufficiently natural way, by explaining that, in abstract set theory, present-day epsilontic Cantorian set-theory can be formulated as a theory of a particular Bourbakian "species of structures".


Jean-Michel Salanskis (Paris X Nanterre):
Visions of mathematics and philosophy of mathematics

Wednesday, September 19, 2018
McGill University, Leacock Building, Room 927. 5:30-7:00pm

Abstract: This talk gives a brief and synthetic picture of J.-M. Salanskis' views concerning mathematics and the philosophy of mathematics. First, it is argued that we can understand mathematical knowledge along the lines of hermeneutics in the sense of Heidegger and Gadamer. This goes through exhibiting a formal diagram of hermeneutics and showing how hermeneutics works inside mathematics. Then, it is argued that we can discern and distinguish two strata of mathematical objectivity: a constructive stratum and a "correlative" stratum. Both can be defined as intentional types in the sense of Husserl, carrying with them a way of ascertaining truth. It is even suggested that the two have to be conceived as working together: their joint play accounts for mathematical infinitarism as well as for the exceptional robustness of mathematical truth. In such a view, Brouwer appears as Hilbert's best support. Ultimately, what has been presented is synthesized in the framework of a general conception of the philosophy of mathematics, whose agenda is determined by answering five big questions.


Georg Schiemer (U Vienna):
Transfer principles and structural equivalence

Friday, November 24, 2017
Université de Montréal, Dept of Philosophy, 2910 Édouard-Montpetit, Room 422. 3:00-4:30pm

Abstract: Structuralism in the philosophy of mathematics holds that pure mathematics is the science of abstract structures and that mathematical objects are merely positions in such structures. A related characterization states that mathematical theories study only structural properties of such objects, i.e. properties not concerning their intrinsic nature, but rather their interrelations with other objects in a given system. In the mathematical prehistory of modern structuralism, in particular, in nineteenth- and early twentieth-century mathematics, one can identify at least two general methods for characterizing the structural content of theories. According to the first approach, structural properties of mathematical systems (such as number systems or geometrical spaces) are specified axiomatically, based on the notion of 'implicit definition' in a mathematical language. Dedekind's treatment of the natural numbers in Was sind und was sollen die Zahlen (1888) and Hilbert's axiomatization of Euclidean geometry in Grundlagen der Geometrie (1899) are paradigmatic examples of this use of structural axiomatics. According to the second approach, structures or structural properties of objects in a given mathematical field are characterized in terms of the notion of invariance under transformations. Klein's group-theoretic approach to classifying different geometries in terms of their transformation groups in Vergleichende Betrachtungen über neuere geometrische Forschungen (1872) is an important case of such an invariance-based account. In the talk, I want to compare these two ways of thinking about mathematical structure, both from a historical and a conceptual perspective. In particular, my focus will be on the question of how the notion of structural equivalence of mathematical systems (or theories) is characterized in work related to the two traditions.
As will be shown, both in axiomatic and in invariance-theoretic contributions, such equivalence criteria have been specified in terms of so-called `transfer principles', i.e. mappings between systems that preserve their central structural features.


Kerry McKenzie (UC San Diego):
Delusions of a final theory: structuralist metaphysics and the problem of theory change

Friday, October 13, 2017
Université de Montréal, Dept of Philosophy, 2910 Edouard-Montpetit, Room 422. 2:00-4:00pm

Organized in collaboration with the Réseau Montréalais de philosophie des sciences / Montréal Philosophy of Science Network.

Abstract: Structuralist philosophy of science in its contemporary guise is committed to three core theses: first, that science makes progress; second, that it is structure that is ontologically fundamental; and third, that our metaphysics must be informed by science if it is to have any value. But these three theses give rise to an obvious tension, given that we as yet lack a fundamental physics theory that can inform the claims that lie at the heart of its metaphysics. To resolve the tension, one might hope that we can regard metaphysics based on merely pro tem fundamental physics as at least making progress toward the description of the truly fundamental level. But I will argue that any such notion of progress cannot be analogous to that which science enjoys. At the root of this is the fact that structuralist metaphysics, for all its naturalistic credentials, is in fact a form of 'analytic' metaphysics, and the categories of the latter have an all-or-nothing character that makes them intrinsically unreceptive to any meaningful notion of approximation. However, with this now in plain sight, we are better positioned to imagine what a structuralist metaphysics should have looked like all along – a metaphysics, that is, that is tolerant enough to undergo progress as well as merely suffer change.


Dirk Schlimm (McGill):
Frege's Begriffsschrift notation: Design principles and trade-offs

Friday, September 15, 2017
McGill University, Leacock Building, Room 927. 11:30am-1:00pm

Abstract: Well over a century after its introduction, Frege's two-dimensional Begriffsschrift notation is still considered mainly a curiosity that stands out more for its clumsiness than anything else. This talk focuses mainly on the propositional fragment of the Begriffsschrift, because it embodies the characteristic features that distinguish it from other expressively equivalent notations. I argue for the perspicuity and readability of the Begriffsschrift by discussing several idiosyncrasies of the notation, which allow an easy conversion of logically equivalent formulas, and presenting the notation's close connection to syntax trees. Moreover, Frege's considerations regarding the design principles underlying the Begriffsschrift are presented. Frege was quite explicit about these in his replies to early criticisms and unfavorable comparisons with Boole's notation for propositional logic. This discussion reveals that the Begriffsschrift is in fact a well thought-out and carefully crafted notation that intentionally exploits the possibilities afforded by the two-dimensional medium of writing like none other.


Cheryl Misak (Toronto):
Ramsey and the foundations of mathematics

Thursday, September 21, 2017
McGill University, Leacock Building, Room 927. 6:00-8:00pm

Abstract: The meeting will have two parts. First, Prof. Misak will give a talk on "Ramsey and the Vienna Circle". This will provide the context for the second part, namely a discussion of extracts from a draft of Prof. Misak's intellectual biography of Ramsey, which participants should have read in advance.


Michael Cuffaro (U of Western Ontario):
Universality, invariance, and the foundations of computational complexity in the light of the quantum computer

Thursday, February 16, 2017
Concordia University, S-05, 2145 Mackay. 4:00-5:30pm

Abstract: Computational complexity theory is a branch of computer science that is dedicated to classifying computational problems in terms of their difficulty. While computability theory tells us what we can compute in principle, complexity theory informs us with regard to our practical limits. It thus provides a bridge between the philosophy of mathematics and the philosophy of technology. In this talk I argue that the science of quantum computing illuminates complexity theory. It does so by emphasising that the fundamental concepts of complexity theory are not model-independent. Yet I argue that this does not, as some have suggested, force us to radically revise the foundations of complexity theory, for model-independence never has been essential to those foundations. Complexity theory is best characterised as a practical science, in the sense that its fundamental aim is to describe what is achievable in practice under various models of computation for our various practical purposes. Model-independence is inessential to this aim, and reflecting on quantum computing illuminates complexity theory by reminding us of this, too often under-emphasised, fact.


Michael Makkai (McGill):
In defense of Bourbaki's structuralism

Thursday, November 17, 2016
Université de Montréal, Dept of Philosophy, 2910 Edouard-Montpetit, Room 422. 2:00-4:00pm

Abstract: I will explain Bourbaki's concept of species of structures, described in their Elements of Mathematics, Volume 1, Set Theory, Chapter IV, "Structures". Bourbaki's definition is irreducibly meta-mathematical. Armed with a meta-mathematical understanding of Bourbaki's notion and its relation to formal languages, we come to new formal languages that support improved and generalized versions of the concept of structure, ones that give rise to a more robust, more defensible, structuralist philosophy of mathematics.


Patrick Girard (U of Auckland):
Impossible modal logic

Tuesday, October 11, 2016
Concordia University, S-05, 2145 Mackay. 4:15-5:45pm

Abstract: I will present semantics for modal logic and definability theorems about it, working entirely within a paraconsistent background logic and inconsistent naive set theory. The goal is to develop a paraconsistent modal logic on its own terms, using a thoroughgoing paraconsistent metatheory, and see its power as metaphysics. There are many paraconsistent logics, but I adopt a strong stance, dialetheism, in which some contradictions are taken to be true. I will present results about modal definability. These are first steps in a non-classical metaphysics. Other expected results (such as completeness) await a more mature theory.


Michael J. Barany (Princeton U):
Foundations, horizons, pasts, and futures in the history and philosophy of modern mathematics

Wednesday, March 23, 2016
McGill University, Leacock Building, Room 927. 5:00-6:30pm

Abstract: Historians and philosophers of mathematics have long recognized that there can be a substantial difference between what makes a mathematical idea right and what makes it meaningful, useful, exciting, or profound. These latter features, which can dominate contextually-sensitive accounts of mathematical history and practice, are often only indirectly related to the foundational desiderata of logical entailment or soundness that have preoccupied so many historians and philosophers of the discipline. Where foundational analyses can treat mathematical ideas as fixed elements of a static body of concepts and implications, to study mathematical meaning, use, and related themes (including their relation to foundational questions) requires regarding mathematical ideas as dynamic and as implicating context-dependent pasts and futures. I will present an approach for such a historicist history and philosophy of modern mathematics and develop some of its consequences through a discussion of two parallel episodes: the historical adoption of the theory of distributions in mathematical analysis from 1946–1950 and the contemporary adoption of perfectoid spaces in arithmetic geometry. While both have been understood as successful interventions in foundational terms, I argue that their rise and importance can be best understood only with reference to how their proponents reframed pasts and futures in their respective contexts of communication and research. In particular, I call attention to aspects of routine mathematical practice such as wordplay, citation, and academic travel that are normally absent in foundational narratives but prove essential here.


Catarina Dutilh Novaes (U Groningen):
Reductio proofs from a dialogical perspective

Wednesday, April 29, 2015
McGill University, Leacock Building, Room 927. 3:30-5:30pm

Abstract: It is well known that reductio proofs pose interesting philosophical questions. What does it mean to assert something with the precise goal of then showing it to be false, i.e. because it leads to absurd conclusions? What kind of speech act is that? Moreover, the mathematics education literature has numerous studies showing how hard it is for students to truly comprehend the idea of reductio proofs, which indicates the cognitive complexity of these constructions. In my talk, I take as a starting point a dialogical conceptualization of deductive proofs, according to which a deductive proof is best understood as a dialogue between two (fictitious) participants — Prover and Skeptic. I argue that many of the philosophical and cognitive difficulties surrounding reductio proofs are dispelled once one adopts a dialogical perspective.


Matt Clemens (U Southern Indiana):
Toward an artifactualist account of mathematics

Friday, March 13, 2015
McGill University, Leacock Building, Room 927. 3:30-5:30pm

Abstract: A weakening of mathematical platonism leads to an anti-nominalist view, which affirms the existence and abstractness of mathematical objects, while denying the independence of such objects from cognitive agents. One way such an anti-nominalist view might be developed is to take mathematical objects to be abstract artifacts, created by the descriptive acts of mathematicians. In this talk, I will sketch a version of this kind of view and distinguish it from a few related approaches in the philosophy of mathematics.


Vincenzo de Risi (MPI Berlin):
The development of Euclidean axiomatics in the early modern age

Friday, February 20, 2015
McGill University, Leacock Building, Room 927. 3:30-5:30pm

Abstract: I will discuss the development of the system of axioms of elementary geometry in the main editions of Euclid in the 16th, 17th, and 18th centuries, as well as the underlying epistemology. Several mathematicians, in fact, added, changed, or removed axioms and postulates in the Elements, engendering a wide-ranging discussion on the foundations of geometry. Issues about continuity, parallelism, mereology, irrational magnitudes, licensed geometrical constructions, and many other topics were raised in these treatises and deeply influenced the development of epistemology. I will try to spell out the most important changes, which eventually produced a new understanding of the nature of the geometrical principles and of mathematics itself.


Oran Magal (McGill):
Analyticity, triviality, and creativity: the case of mathematics

Friday, January 16, 2015
McGill University, Leacock Building, Room 927. 3:30-5:30pm

Abstract: Poincaré provocatively asked whether mathematics is merely “an immense tautology”. This is a variant of the so-called paradox of analysis, which I argue arises not only in the context of conceptual analysis but also for deductive reasoning quite generally. To resolve it, I suggest that the same reasoning, e.g., that involved in the construction of a proof, can be seen as both creative and analytic at the same time, accounting for its non-triviality despite its being, in some sense, logically 'given in advance'. This is an instance of a more broadly applicable distinction between two directions of analyticity, so to speak: prospective and retrospective. Finally, I would like to use this idea of analyticity as potentially substantive rather than trivial to bring out some merits of Gödel's argument that mathematics is analytic: a substantive body of 'truths by virtue of meaning'. If this view is defensible, it is an interesting alternative to empiricist-'naturalistic' philosophical accounts of mathematics.


Stephen Menn (McGill):
Eudoxus' theory of proportion and his method of exhaustion

Friday, November 21, 2014
McGill University, Leacock Building, Room 927. 3:30-5:30pm

Abstract: Euclid in Elements V gives an astonishingly rigorous and logically complex formulation of the theory of proportion, proving such propositions as "alternation" (if A:B::C:D then A:C::B:D) for all magnitudes, before applying them to lines and areas in Elements VI. It is very hard to see what could have motivated, or led to the discovery of, such a complex set of proofs of propositions that might easily be taken for granted (notably "if A>B then A:C>B:C"). (T.L. Heath's story, that this theory was provoked by a foundational crisis caused by the discovery of incommensurables, was refuted 80 years ago by Oskar Becker.) A possible clue comes from an anonymous scholiast who says that much of Elements V goes back to Eudoxus (a collaborator in Plato's Academy, 50–100 years before Euclid). We know, on better grounds, that Eudoxus invented the "method of exhaustion" used by Euclid in Elements XII to prove e.g. that circles are to each other as the squares on their diameters, and that a cone is one-third the volume of a cylinder with the same base and height. It is easier to explain the origin of the method of exhaustion than of the Euclidean theory of proportion, and if, as is often thought, the two theories were somehow linked for Eudoxus, this might help us understand the proportion theory, but it is remarkably difficult to explain how the two theories were connected. Building on work of Wilbur Knorr, which distinguishes an earlier Eudoxian theory of proportion (surviving in Archimedes Equilibrium of Planes I) from Euclid's theory in Elements V, I offer a reconstruction, first of how Eudoxus could have been led to discover his theory of proportion in connection with the method of exhaustion, and then of how Euclid could have been led to develop his theory of proportion out of Eudoxus'.


Stewart Shapiro (Ohio State):
Frege on the real numbers

Friday, March 28, 2014
McGill University, Arts Building, Room 160. 12:00-2:00pm

Abstract: This paper is concerned with Gottlob Frege's theory of the real numbers as sketched in the second volume of his masterpiece Grundgesetze der Arithmetik. It is clear that Frege's incomplete sketch represents a mathematically significant proposal in its own right, one which purports to have several important advantages over competing contemporary theories. It is perhaps unsurprising that Frege's theory of the real numbers is intimately intertwined with and largely motivated by his metaphysics, something which has of course received a great deal of independent attention. One of Frege's more significant claims in the Grundgesetze is that the cardinal numbers and the real numbers are ontologically distinct, or constitute "completely different domains". Cardinal numbers answer the question "How many things of a certain kind are there?", while real numbers answer the question "How large is a certain magnitude compared to a unit of magnitude of that same kind?" The account raises interesting, and surprisingly underexplored, questions about Frege's metaphysics: Can this metaphysics even accommodate mass quantities like water, gold, light intensity, or charge? Frege's main complaint with his contemporaries Cantor and Dedekind is that their theories of the real numbers do not build the applicability of the real numbers directly into the construction. In taking Cantor and Dedekind's Arithmetic theories to be insufficient, clearly Frege takes it to be a desideratum on a theory of the real numbers that their applicability be essential to their construction. But why? After all, it's not as if we can actually measure magnitudes like weight or density with the kind of infinite precision embodied by the real numbers anyway.
We begin with a detailed review of Frege's theory, one that mirrors Frege's exposition in structure. This is followed by a critique, outlining Frege's linguistic motivation for ontologically distinguishing the cardinal numbers from the real numbers. We briefly consider how Frege's metaphysics might need to be developed, or amended, to accommodate some of the problems. Finally, we offer a detailed examination of Frege's Application Constraint — that the reals ought to have their applicability built directly into their characterization. It bears on deeper questions concerning the relationship between sophisticated mathematical theories and their applications.


Sean Walsh (UC Irvine):
The aims of arithmetization and the analysis of number in Kronecker, Weierstrass, and Frege

Friday, November 1, 2013
McGill University, Leacock Building, Room 927. 4:00-5:30pm

Abstract: Late 19th-century philosophy of mathematics is renowned both for the program of arithmetization and for various attempts to define or characterize the natural number concept. What were the goals of arithmetization in the eyes of practitioners such as Kronecker, Weierstrass, and Frege, and in what ways did these goals constrain or motivate the attempt to define or characterize the number concept? Focusing even on this small subset of thinkers (and some related figures), the variety of answers that one finds to these most basic questions is surprisingly large. While the chief goal of this talk is conceptual clarification, a subsidiary aim is to better chart the various lines of historical influence.


Eileen Nutting (U Kansas):
Hilbert's geometry and mathematical truth

Friday, October 25, 2013
McGill University, Leacock Building, Room 927. 4:00-5:30pm

Abstract: David Hilbert's primary goal in his work on the foundations of geometry was to demonstrate the consistency of both Euclidean and non-Euclidean geometries. His consistency proofs, however, place significant pressure on the contours of an adequate truth theory. To accommodate these pressures, he developed a novel account of geometrical truth. Understanding this account and its motivations gives us insight into what we ought to expect from a theory of mathematical truth. Notably, the resulting expectations conflict with the ones that Paul Benacerraf assumes in his paper "Mathematical Truth."


Erich Reck (UC Riverside):
The nature and purpose of Dedekind Abstraction

Friday, November 30, 2012
Concordia University, 1515 St-Catherine West, EV 11-705. 4:00-6:00pm

Abstract: While Richard Dedekind's technical contributions to the foundations of mathematics were absorbed into modern logic relatively quickly and almost completely, his philosophical remarks have received a more mixed response. This applies especially to his notion of abstraction, as introduced most explicitly in his well-known booklet, Was sind und was sollen die Zahlen? In this talk I will compare several different ways in which the nature of Dedekind abstraction has been understood. I will then propose a novel approach to it, based on formulating laws or principles that are analogous, at least to some degree, to neo-logicist abstraction principles. Motivating this approach further will involve reflecting on the purpose of Dedekind abstraction, as conceived of by Dedekind himself and as still relevant today.


Konstantinos Nikolantonakis (U Western Macedonia, Greece):
Were there "revolutions" in mathematics? Examples from the history of mathematics in light of T.S. Kuhn's historical philosophy of science

Friday, November 9, 2012
McGill University, Leacock Building, Room 927. 3:30-5:00pm

Abstract: The second half of the 20th century witnessed a kind of revolution in the history and philosophy of science with the publication in 1962 of T.S. Kuhn's book The Structure of Scientific Revolutions, presenting a view of science that is generally labeled "historical philosophy of science". In this talk I will discuss whether elements of the "historical philosophy of science" can be applied to the field of mathematics. My addressing the issue of whether Kuhn's view of scientific revolutions is applicable to mathematics has been inspired by my study of the formation of our ten numerals and of the methods for the operation of multiplication during the Middle Ages in Europe. After presenting the relevant notions (object level and meta-level) through a very well-known example from the literature concerning non-Euclidean geometry, and drawing on the analyses of Zheng and Dunmore, we shall apply these notions to the field of arithmetic during the Middle Ages in Europe. Our argument focuses especially on the way we have passed from the arithmetic of pebbles, via Fibonacci and Pacioli, aided by the Latin translation of Al-Khwarizmi's treatise, to the foundation of modern arithmetic.


Andrew Arana (U Illinois, Urbana-Champaign):
Transfer in algebraic geometry

Friday, November 2, 2012
McGill University, Leacock Building, Room 927. 3:30-5:00pm

Abstract: The focal question of this talk is the value of transfer between algebra and geometry, of the sort exemplified by the Nullstellensatz. Algebraic geometers frequently talk of such transfer principles as a "dictionary" between algebra and geometry, and claim that these dictionaries are fundamental to their practice. We'll first need to get clear on what such transfer consists in. We'll then investigate how such transfer might improve the way knowledge is gathered in algebraic geometric practice.


Paolo Mancosu (UC Berkeley):
Axiomatics and purity of methods: On the relationship between plane and solid geometry

With a commentary by Michael Hallett (McGill).

Thursday, April 19, 2012
Salle W-5215, Pavillon Thérèse-Casgrain (455 Boul. René-Lévesque), UQAM. 2:00-4:30pm

Abstract: Traditional geometry concerns itself with planimetric and stereometric considerations, which are at the root of the division between plane and solid geometry. To raise the issue of the relation between these two areas brings with it a host of different problems that pertain to mathematical practice, epistemology, semantics, ontology, methodology, and logic. In addition, issues of psychology and pedagogy are also important here.
In this talk (which is based on joint work with Andy Arana), my major concern is with methodological issues of purity. In the first part I will give a rough sketch of some key episodes in mathematical practice that relate to the interaction between plane and solid geometry. In the second part, I will look at a late nineteenth century debate (on "fusionism") in which for the first time methodological and foundational issues related to aspects of the mathematical practice covered in the first part of the paper came to the fore. I conclude this part of the talk by remarking that only through an axiomatic and analytical effort could the issues raised by the debate on "fusionism" be made precise. The third part of the talk focuses on Hilbert's axiomatic and foundational analysis of the plane version of Desargues' theorem on homological triangles and its implications for the relationship between plane and solid geometry. Finally, building on the foundational case study analyzed in the third section, in the fourth section I point the way to the analytic work necessary for exploring various important claims on "purity", "content" and other relevant notions.


Janet Folina (Macalester):
Is the proof in the picture? Seeing, believing and proving

Thursday, March 22, 2012
McGill University, Leacock Building, Room 927. 3:00-4:30pm

Abstract: What is the role of visual information in mathematics? Can pictures be proofs? This talk will appeal to several basic philosophical distinctions and a few simple examples of mathematical "pictures" in support of a limited role for diagrams in mathematical justification.


Patrick Girard (Auckland):
Being flexible about ceteris paribus reasoning

Monday, November 28, 2011
McGill University, Leacock Building, Room 927. 4:00-5:30pm

Abstract: Ceteris paribus clauses in reasoning are used to allow for defeaters of norms, rules or laws, as in von Wright's example "I prefer my raincoat over my umbrella, everything else being equal". I offer an analysis in which sets of formulas Γ, embedded in modal operators, provide necessary and sufficient conditions for things to be equal in ceteris paribus clauses. For most laws, the set of things allowed to vary is small, often finite, and so Γ is typically infinite. Yet the axiomatisation provided so far can only deal with the special and atypical case in which Γ is finite. I address this problem by being more flexible about ceteris paribus conditions, in two ways. The first is to offer an alternative, slightly more general semantics, in which the sets of formulas give only necessary but not (necessarily) sufficient conditions. This permits a simple axiomatisation. The second is to consider those sets of formulas which are sufficiently flexible to allow the construction of a satisfying model in which the stronger necessary-and-sufficient interpretation is maintained. I finally discuss how this more abstract setting relates to von Wright's initial idea.


Jean-Baptiste Joinet (Paris 1):
Toward protological foundations for logic

Thursday, October 27, 2011
McGill University, Leacock Building, Room 927. 5:30-7:00pm

Abstract: I will try to present and characterize the kind of answer to the traditional question "Which foundations for logic?" that emerges from the work of the contemporary French school of proof theory of Jean-Yves Girard and Jean-Louis Krivine. In particular, I will stress the architectonic role of negation (duality) as a taming agent of the wild, protological world of computational interactions, from which the logical rules (type constructions) emerge. If time allows, I will finally discuss the impact of these ideas on the prospect of a physical foundation for logic.


Göran Sundholm (Leiden):
Three kinds of function

Thursday, October 6, 2011
McGill University, Leacock Building, Room 927. 5:30-7:00pm

Abstract: The development of the notion of function is commonly held to have gone from the idea that functions are (anchored in) expressions with free variables to the idea that they are mappings not tied to expressions, with the "sets of ordered pairs unique in the last component" conception as the precise version of the latter. I shall, to the contrary, distinguish three notions and discuss examples: 1. Euler-Frege functions: dependent objects of lowest level, with substitution taking the role of application; 2. Riemann-Dedekind mappings: independent objects of higher level, with a primitive notion of application; 3. Courses of values ("graphs"), used by Frege, von Neumann, and set theory (Russell, Hausdorff, ...): independent objects of lowest level, where one needs a special application function of kind 1 (Frege's curved arch, von Neumann's [x,y], Russell's elevated inverted comma for descriptive functions).


Olivia Caramello (Cambridge):
The idea of bridge and its unifying role in science

Thursday, September 22, 2011
Salle W-5215, Pavillon Thérèse-Casgrain (455 Boul. René-Lévesque), UQAM. 10:30-11:30am

Abstract: In the paper "The unification of Mathematics via Topos Theory" I introduced a new point of view on the concept of Grothendieck topos, namely the idea of a topos as a 'bridge' which can be effectively used for transferring information between distinct mathematical theories. The topos-theoretic techniques resulting from an implementation of this idea have already proved themselves to be very fruitful in Mathematics; indeed, they have generated a great number of non-trivial applications in distinct mathematical fields including Algebra, Topology, Algebraic Geometry, Model Theory and Proof Theory. On the other hand, one can further abstract from these methodologies to try to identify the real essence of the idea of 'bridge', and look for other incarnations of the concept both in Mathematics and in different scientific fields. It turns out that the idea of bridge is intimately tied to that of invariance, and that a suitable combination of these two concepts can play a unifying role in Science as well. In the talk I will begin by reviewing the philosophical principles underlying the unification methodologies and proceed to sketch the general idea of bridge; I will then consider the relationship between this concept and the idea of invariance, and discuss the organizing role of these two notions in Mathematics and Science in general. The analysis will be complemented by analogies with concepts in Linguistics, Physics and Biology.


Sébastien Gandon (Clermont-Ferrand):
Indoor geometry: On the tradition of construction with obstructions in the plane

Wednesday, February 16, 2011
McGill University, Leacock Building, Room 927. 5:00-7:00pm

Abstract: In order to prove a geometric theorem, one often has to extend the lines and to introduce new points in the given figure. But, what to do if the sheet of paper on which one does the constructions is too small to encompass the extensions? The obvious answers are: "do it again", "take this problem into consideration when beginning your drawing", "take a larger sheet of paper" or "draw a smaller figure". However, there is—and has been—another answer, which consists in attempting to prove the theorem without going over the edge of the sheet of paper. In this talk, I will speak about this tradition of geometrical constructions "with obstructions in the plane". I will claim that it has a long history (one finds some trace of it in Proclus and Hero of Alexandria), and that it always has a double dimension: practical, on the one hand, and foundational, on the other (how to reconcile the infinity of Euclidean space with the finitude of the heavens?). I will secondly claim that this sort of issue has played a very important role in the foundational discussions concerning the nature of projective space in Klein and Pasch.


Agustín Rayo (MIT):
An account of possibility

Tuesday, December 21, 2010
McGill University, Leacock Building, Room 927. 3:30-5:30pm

Abstract: I develop an account of the sorts of considerations that should go into determining where the limits of possibility lie.


Jacques Dubucs (IHPST, Paris):
A new interpretation of Kant's philosophy of mathematics

Tuesday, December 21, 2010
Salle W-5215, Pavillon Thérèse-Casgrain (455 Boul. René-Lévesque), UQAM. 11:00-12:30pm

Abstract: The interpretation of Kant's philosophy of mathematics has a long history, alternating between destructive critiques and attempts at rehabilitation. Landmark episodes in this saga include Russell's critique and the subsequent rehabilitation proposed by Beth and Hintikka. According to the latter, and contrary to what Russell wrote, the famous "construction of concepts in intuition" amounts to the method of "instantiation" well known to logicians, so that Kant does not in any way contravene the norms of contemporary mathematical logic. I propose to show that (1) this interpretation, which does not distinguish between the problem of generality as posed by Locke and Berkeley and the way the critical philosophy approaches it, is untenable in light of the Kantian texts; (2) an interpretation faithful to the continuity between the Transcendental Aesthetic and the Transcendental Methodology must do justice to the Kantian variant of the idea of the anisotropy of possibilities, that is, to the distinction between Begriffsmöglichkeiten and Anschauungsmöglichkeiten; and (3) such a conception can be explained and, up to a point, defended with the help of the contemporary notion of the intended model of a mathematical theory.


Colin McLarty (CASE Western):
On the current state and the prospects for philosophy of mathematics

Friday, December 10, 2010
McGill University, Leacock Building, Room 927. 3:30-5:30pm

Abstract: This will be more an essay than an argument. It will address why philosophy as a whole needs philosophy of mathematics, and why philosophy of mathematics needs contact with living mathematics. That could be the mathematics of Euclid insofar as we can recover the sense of Euclid as alive rather than a cut and dried textbook, but should include the latest mathematics as well since we cannot take that as cut and dried. It will talk about philosophy of mathematics in relation to logic, in relation to history of mathematics, and to philosophy of science and philosophy of language.


Mic Detlefsen (Notre Dame):
Freedom in mathematics

Friday, October 29, 2010
McGill University, Leacock Building, Room 927. 3:30-5:30pm

Abstract: There are different types of freedom that figured in the discussions of foundational thinkers in the nineteenth and early twentieth centuries. Prominent among these was one which was commonly known as freedom of concept-formation (freie Begriffsbildung) or concept-introduction. Freedom of concept-introduction was essentially a negative freedom. Specifically, it was a freedom from the traditional empiricist-constructivist constraint on concept-introduction, a constraint I will generally refer to as the Instantiation Condition. According to this condition, a concept can be legitimately introduced into mathematical practice only if its content is obtainable from that of an intuition or experience by application of an identified process of abstraction. The concern was not ultimately with how, as a matter of human psychology, we manage to form concepts (and/or such linguistic expressions as are generally used to represent them). Rather, it was with what constitutes the admission of a concept into mathematical practice, and the conditions under which such admission is justified. These will therefore be my chief concerns here too.


John Bell (Western Ontario):
The Axiom of Choice in a constructive setting

Monday, April 12, 2010
McGill University, Leacock Building, Room 927. 3:00-5:00pm

Abstract: The talk concerns the status of the Axiom of Choice in various constructive contexts, including intuitionistic set theory, constructive type theory and Hilbert's epsilon calculus.


Marcus Rossberg (Connecticut):
Non-conservativeness in higher-order logic

Friday, March 12, 2010
McGill University, Leacock Building, Room 927. 3:30-5:00pm

Abstract: I present a new difficulty that proof-theoretic approaches to a theory of meaning face. I show that third-order logic is not conservative over second-order logic: there are sentences formulated in pure second-order logic that are theorems of third-order logic, but cannot be proven in second-order logic. The proof is a corollary of the definability of a truth predicate for second-order arithmetic in third-order logic, which has until now escaped attention. The challenge is that this inability to demonstrate the truth of such second-order sentences using the operational rules of second-order logic alone seems to refute the claim of proof-theoretic semantics that the meaning of the quantifiers is determined by their introduction and elimination rules: such sentences, being truths of third-order logic, should be true in virtue of the meaning of the logical vocabulary. An investigation of the Henkin models for higher-order logic suggests, perhaps surprisingly, that the meaning of the second- and higher-order quantifiers is determined by their introduction and elimination rules after all.


Richard Zach (Calgary):
The Decision Problem and the development of metalogic

Friday, December 4, 2009
McGill University, Leacock Building, Room 927. 4:00-5:30pm

Abstract: In parallel with their work on proof theory in the 1920s and early 1930s, Hilbert and his collaborators and students—in particular, Ackermann, Behmann, Bernays, and Schönfinkel—did substantial work towards a positive solution for the decision problem. This begins with an unpublished talk by Behmann in 1921 in which the term "Entscheidungsproblem" first appears, and continues until the early 1930s with a number of published as well as unpublished contributions. Approaches to the decision problem evolved significantly during this time, from a purely algebraic approach in the style of Schröderian algebra of logic to relatively modern proofs which establish the finite controllability of certain prefix classes of formulas. This evolution goes hand-in-hand with an evolution of attendant concepts, in particular, semantic concepts such as satisfiability. An analysis of this work sheds light on the development of the semantics of first-order logic in the 1920s, on changing views as to what constitutes a "decision procedure," and on the connection between the decision problem and the consistency problem.


Gregory Lavers (Concordia):
Frege the conventionalist, Carnap the Fregean

Wednesday, September 30, 2009
McGill University, Leacock Building, Room 927. 5:30-7:00pm

Abstract: In this paper I examine the fundamental views on the nature of logical and mathematical truth of both Frege and Carnap. I argue that their positions are much closer than is standardly assumed. I attempt to establish this point on two fronts. First, I argue that Frege is not the metaphysical realist that he is standardly taken to be. Second, I argue that Carnap, where he does differ from Frege, can be seen to do so because of mathematical results proved in the early twentieth century. The differences in their views are, then, not primarily philosophical differences. Also, it might be thought that Frege was interested in analyzing our ordinary mathematical notions, while Carnap was interested in the construction of arbitrary systems. I argue that this is not the case: our ordinary notions play an even more important role in Carnap's philosophy of mathematics than they do in Frege's.

For more information, please contact:
Gregory Lavers (Concordia University),
Jean-Pierre Marquis (Université de Montréal),
Mathieu Marion (Université du Québec à Montréal),
Dirk Schlimm (McGill University).