French-Japanese Workshop on Philosophy of Proofs



We will hold the above workshop with a delegation of five visitors from the Department of Philosophy of the University of Paris 1 and the Paris Institute for the History and Philosophy of Science. Attendance is open to all.
(For the latest version of the program, please see the URL below.)
http://abelard.flet.keio.ac.jp/seminar/frjp16jan.html
In addition, an informal research meeting will be held in the same room on the morning of the 15th.

Date:
January 15th–16th, 2016
Place:
Distance Learning Room (B4F), South Building, Mita Campus, Keio University

Speakers and discussants:

From France:

From Japan:


Program (tentative):

Friday, January 15th

10:30
Closed Discussion Session
11:15
Guest speaker: Claude Kirchner (INRIA-National Research Institute of Digital Science) “Deduction modulo computation: main ideas and consequences”
14:00
Session I
17:00
Close

Saturday, January 16th

10:30
Session II
12:30
Lunch Break
14:00
Session III
17:00
Close

Abstracts:

Claude Kirchner “Deduction modulo computation: main ideas and consequences”
The complementarity of computation and deduction has been known and used since the rise of logic. It has not always been well understood or accepted, as in the now settled controversy about the use of computations in the proof of the four color theorem. A formal treatment of this combination, called deduction modulo and initiated by Dowek, Hardin and Kirchner, allows one to define deduction systems, such as the sequent calculus or natural deduction, modulo some congruence defined by a theory implemented by a computational system such as rewriting. The consequences of reasoning modulo are deep. Typically, cut elimination does not hold for arbitrary congruences, but sufficient conditions on the theories can be given that allow one to work modulo important theories like HOL. The second fundamental consequence is that logical systems in deduction modulo prove exactly the same theorems as the systems obtained by integrating the modulo part into the theory, but the proofs are very different: in some cases they can be arbitrarily shorter.
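To give a brief, informal sketch of the idea (the notation below is a standard presentation of deduction modulo, not material from the talk itself): in sequent calculus modulo a congruence \(\equiv\) generated by a rewrite system, the axiom rule may close a branch as soon as the two formulas are congruent, so that computation absorbs deduction steps:
\[
\frac{}{\Gamma, P \vdash_{\equiv} Q, \Delta}\ (\mathrm{axiom}) \qquad \text{provided } P \equiv Q
\]
For instance, if \(\equiv\) is generated by the rewrite rules \(0 + y \rightarrow y\) and \(S(x) + y \rightarrow S(x + y)\), then the sequent \(\forall x\,(x = x) \vdash_{\equiv} 2 + 2 = 4\) is proved by instantiating the hypothesis at \(4\) and closing with the axiom rule, since \(2 + 2 = 4 \equiv 4 = 4\).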

Yuta Takahashi “Normalization in Gentzen’s 1935 Consistency Proof”
Gentzen gave an interpretation to the formulas of first-order arithmetic in his first (1935) consistency proof. In this talk, we aim to show that this interpretation formed part of his response to a Brouwer-style objection against the significance of consistency proofs. In accomplishing this aim, we shed light on the philosophical significance of normalization in Gentzen’s 1935 consistency proof. Specifically, we claim that, by means of the normalization procedure found in his response, Gentzen gave a method for understanding the theorems of first-order classical arithmetic from his finitist standpoint.

Koji Mineshima “Combining a type-logical semantics and a wide-coverage statistical parser”
We present a type-logical semantics for wide-coverage statistical parsers based on Combinatory Categorial Grammar (CCG), developed for English and Japanese. The system we have been developing makes it possible to map open-domain texts into formulas of higher-order logic that capture a variety of semantic information such as quantification and intensionality. We also discuss how a robust model of lexical knowledge can be integrated into our type-theoretical framework. (Joint work with Pascual Martínez-Gómez, Yusuke Miyao and Daisuke Bekki)
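As a schematic illustration of the kind of compositional mapping involved (the lexical entries below are simplified assumptions for exposition, not the actual lexicon or output of the system): each word is assigned a CCG category together with a lambda term, and the derivation composes these terms into a higher-order logic formula.
\[
\begin{array}{lll}
\textit{every} & (S/(S\backslash NP))/N & \lambda F\,\lambda G.\ \forall x\,(F(x) \rightarrow G(x)) \\
\textit{student} & N & \lambda x.\ \mathbf{student}(x) \\
\textit{runs} & S\backslash NP & \lambda x.\ \mathbf{run}(x)
\end{array}
\]
Two forward applications then yield \(\forall x\,(\mathbf{student}(x) \rightarrow \mathbf{run}(x))\) as the semantic representation of “Every student runs.”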

Florencia Di Rocco “Logic and Mathematics of Japanese Counters”
Whether treated as a feature of a “conceptual scheme” (Quine) or as a triviality of “syntax” (Peyraube, Thekla), the function of counters is traditionally thought of as that of getting “individuals” out of the noun they apply to. I will challenge this classical approach from a contextualist position in the philosophy of language. Extending linguistic work on Chinese (Hashimoto, Chao) to Japanese, I will present counters as operators that highlight contextually relevant features. Through everyday examples in Japanese, I will show how counters work as Austin’s “adjuster words”, their “logic” being ascribed to the dynamics of “language games” (Wittgenstein) or to “rules of adjustment of salience” in a “well-run conversation” (David Lewis). This position will progressively lead to the idea that counting operations do not necessarily deal with “individuals”. I will thus raise a number of problems in the philosophy of mathematics, logic and pragmatics concerning the use of Japanese counters (such as the link between unities and individuals, between counting and measuring, and between numbers, counters and ordinary concepts) and sketch a Wittgensteinian type of answer. By showing their contextual plasticity, I will challenge the very idea of the existence of rigid “counter words”.

Marco Panza “The twofold role of diagrams in Euclid’s geometry”
Proposition I.1 of the Elements is, by far, the most popular example used to justify the thesis that many of Euclid’s geometric arguments are diagram-based. Many scholars have articulated this thesis in different ways and argued for it. I suggest reformulating it in a quite general way, by describing what I take to be the twofold role that diagrams play in Euclid’s plane geometry (EPG). Euclid’s arguments are object-dependent: they are about geometric objects. Hence, they cannot be diagram-based unless diagrams are supposed to have an appropriate relation to these objects. I take this relation to be a quite peculiar sort of representation. Its peculiarity depends on the two following claims, which I shall argue for: (i) the identity conditions of EPG objects are provided by the identity conditions of the diagrams that represent them; (ii) EPG objects inherit some properties and relations from these diagrams.

Alberto Naibo “Representing inferences and proofs: the case of harmony and conservativity”
Traditionally, proof-theoretic semantics focuses on the study of logical theories from a general point of view, rather than on specific mathematical theories. Yet, when mathematical theories are analyzed, they seem to behave quite differently from purely logical theories. A well-known example was given by Prawitz (1994): adding a set of inferentially harmonious rules to arithmetic does not always guarantee that the resulting theory is a conservative extension of arithmetic itself. This means that outside logic the nice correspondence between harmony and conservativity (advocated for example by Dummett (1991)) seems to break down. However, as Sundholm (1998) has pointed out, this is not necessarily a consequence of the passage from a logical setting to a mathematical one. It could also depend on the way in which proofs are represented. In particular, if proofs are seen as composed of rules that act on judgments involving proof-objects, rather than of rules that act on propositions, then the aforementioned correspondence can in fact be reestablished. An analysis of this phenomenon is proposed. In particular, two different ways of representing proof-objects are taken into consideration: the Church-style presentation and the Curry-style presentation (see the sketch after the references below). It is then shown that a crucial difference emerges when the first is chosen rather than the second.
Bibliographical references:
Dummett, M. (1991). The Logical Basis of Metaphysics. London: Duckworth.
Prawitz, D. (1994). Review of ‘The Logical Basis of Metaphysics’ by Michael Dummett. Mind, NS, 103 (411): 373–376.
Sundholm, G. (1998). Proofs as acts and proofs as objects: Some questions for Dag Prawitz. Theoria, 64 (2-3): 187–216.
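For readers unfamiliar with the terminology, the contrast at issue can be sketched as follows (generic notation, not necessarily that used in the talk): in a Church-style presentation the proof-object carries its typing annotations, while in a Curry-style presentation the same untyped term may inhabit many different propositions.
\[
\text{Church-style:}\quad \lambda x{:}A.\,x \;:\; A \rightarrow A
\qquad\qquad
\text{Curry-style:}\quad \lambda x.\,x \;:\; A \rightarrow A,\ \ B \rightarrow B,\ \dots
\]
Thus, in the Church-style case the proposition proved can be read off from the proof-object itself, whereas in the Curry-style case it cannot.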

Andrew Arana “Complexity of proof and purity of methods”
Roughly, a proof of a theorem is “pure” if it draws only on what is “close” or “intrinsic” to that theorem. Mathematicians employ a variety of terms to identify pure proofs, saying that a pure proof is one that avoids what is “extrinsic”, “extraneous”, “distant”, “remote”, “alien”, or “foreign” to the problem or theorem under investigation. In the background of these attributions is the view that there is a distance measure (or a variety of such measures) between mathematical statements and proofs. Mathematicians have paid little attention to specifying such distance measures precisely, because in practice certain methods of proof have seemed self-evidently impure by design: think for instance of analytic geometry and analytic number theory. By contrast, mathematicians have paid considerable attention to whether such impurities are a good thing or are to be avoided, and some have claimed that they are valuable because impure proofs are generally simpler than pure proofs. This talk is an investigation of this claim, formulated more precisely by proof-theoretic means. Our thesis is that the evidence from proof theory does not support this claim.

Secretariat:
Mita Logic Seminar Lecture Office, c/o Okada Laboratory, Faculty of Letters, Keio University
E-mail:
logic [AT] abelard.flet.keio.ac.jp

Organized by: Global Research Center for Logic and Sensibility, Keio University (Grant-in-Aid for Scientific Research on Innovative Areas, project “Logic and Computation of Prediction, Judgment, and Decision Making”)
Supported by: Keio University Next-Generation Research Project “Next-generation research on logical thinking and its application to supporting the development of logical thinking ability”

