We will hold a France-Japan workshop on the philosophy and history of logic and mathematics.
Each year this workshop series discusses argumentation and proof from logical, philosophical, and computational perspectives, promoting France-Japan collaboration and cooperation in the field. This year's meeting features four invited speakers from France and three from Japan. (This year we have notably invited three researchers working on the history, linguistic analysis, and proofs of Euclid's Elements.)
And others to be announced
Click a title to view its abstract.
In this talk I will differentiate between the ways constructions are used in problems and in theorems in the plane books of Euclid's Elements: problem-constructions and proof-constructions. I will then show how the concept of given, as developed in the Data, can be used to explicate all five Euclidean postulates, and to show how these postulates are used in problems. Finally, I will discuss the detailed syntax of a Euclidean problem-construction and show that it is rather different from what is usually described as construction with compass and straightedge.
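As a minimal concrete illustration of what the stepwise syntax of a problem-construction looks like (a sketch with hypothetical data types, not taken from the talk), here is the construction of Elements I.1, the equilateral triangle on a given segment AB, written as a sequence of postulate applications:

    # A minimal sketch (hypothetical types, not from the talk) of the stepwise
    # syntax of a Euclidean problem-construction: Elements I.1, constructing an
    # equilateral triangle on a given segment AB.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Step:
        postulate: str   # the postulate (or practice) licensing the step
        produces: str    # the object the step makes "given"
        uses: tuple      # previously given objects the step relies on

    elements_I_1 = [
        Step("Post. 3", "circle BCD", ("centre A", "distance AB")),
        Step("Post. 3", "circle ACE", ("centre B", "distance BA")),
        # The intersection point is simply taken as given from the two circles,
        # one respect in which the text differs from the usual
        # compass-and-straightedge picture.
        Step("(intersection)", "point C", ("circle BCD", "circle ACE")),
        Step("Post. 1", "segment CA", ("point C", "point A")),
        Step("Post. 1", "segment CB", ("point C", "point B")),
    ]

    for s in elements_I_1:
        print(f"{s.postulate}: from {s.uses}, obtain {s.produces}")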
I present an ongoing project of syntactic analysis of the whole Greek text of Euclid's Elements (about 150,000 words). Its goal is the construction of a stemma (a dependency tree) for each sentence of the work.
The Elements are written in a natural language (ancient Greek). Although the text is quite monotonous, one can sometimes discern subtle stylistic oscillations, so a thoroughgoing syntactic analysis may be expected to reveal the different origins of different parts of the work. The grammatical rules underlying the analysis are modelled after Tesnière's dependency grammar. A computer program is under development which assists the analysis by suggesting rules applicable to the word the user selects, and saves the results.
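A minimal sketch of how such a program might suggest applicable rules for a selected word (the token and rule formats here are hypothetical, not the project's actual ones):

    # A minimal sketch, with hypothetical token and rule formats, of a helper
    # that suggests dependency rules applicable to the word the user selects.
    from dataclasses import dataclass

    @dataclass
    class Token:
        form: str      # the word as it appears in the text
        pos: str       # part of speech, e.g. "verb", "noun", "article"

    @dataclass
    class Rule:
        name: str      # label of the dependency rule
        head_pos: str  # part of speech of the governing word
        dep_pos: str   # part of speech of the dependent word

    RULES = [
        Rule("verb-governs-object", "verb", "noun"),
        Rule("noun-governs-article", "noun", "article"),
        Rule("preposition-governs-noun", "preposition", "noun"),
    ]

    def suggest_rules(selected: Token) -> list[Rule]:
        """Rules in which the selected word could occur as the dependent."""
        return [r for r in RULES if r.dep_pos == selected.pos]

    # Example: selecting a noun suggests the rules that could attach it.
    print([r.name for r in suggest_rules(Token("γραμμή", "noun"))])
    # ['verb-governs-object', 'preposition-governs-noun']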
The Elements have been the best-known, most-read, and most-commented-on work of Greek mathematics since antiquity, and it is now difficult to distinguish the original parts from later additions, for our sources are limited to Byzantine manuscripts (9th century onward) and Arabo-Latin translations (although the latter sometimes reveal later interventions not discernible in the Greek tradition).
A syntactically analyzed text of the whole Elements (searchable under any conceivable combination of conditions) would greatly advance our understanding of ancient Greek mathematics, helping to distinguish the various components of different origins mixed together in the extant Greek text of the Elements.
A classical problem about Euclid's geometry concerns the contrast between the apparent particularity of its proofs and the apparent generality of its propositions.
It will be argued that:
1) no philological enquiry can alone solve the problem, since what is needed for this purpose is a reconstruction of the internal logic of Euclid's theory;
2) despite appearances, Euclid's proofs are, in fact, as general as they could have been, in agreement with this logic.
In Frege and Russell's line, the main tool for propagating ontology toward higher orders is the quotienting of sets by equivalence relations. Indeed, this produces, at the next order, separated individuals (disjoint non-empty subsets covering the original set), and does so in such a way that the separation is conceptually generated from a relation at the initial order.
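As a minimal concrete illustration (a sketch in elementary set-theoretic terms, not drawn from the talk), quotienting a finite set by an equivalence relation yields exactly such disjoint non-empty classes covering the original set:

    # A minimal sketch: quotient a finite set by an equivalence relation.
    # Comparing against one representative per class suffices because a
    # genuine equivalence relation is transitive.
    def quotient(universe, equiv):
        """Partition `universe` into the equivalence classes of `equiv`."""
        classes = []
        for x in universe:
            for c in classes:
                if equiv(x, next(iter(c))):
                    c.add(x)         # x belongs to an existing class
                    break
            else:
                classes.append({x})  # x starts a new class
        return classes

    # Example: integers 0..9 identified when congruent modulo 3.
    print(quotient(range(10), lambda a, b: a % 3 == b % 3))
    # [{0, 3, 6, 9}, {1, 4, 7}, {2, 5, 8}]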
This way of relating ontology and equivalence relations is Parmenidean in inspiration, not only because it considers individuals independently of time, but also because it is deeply founded upon the idea of sameness. The important point in this respect is not so much that individualities at the new order reflect a similarity at the previous order, but that a given individuality at the new order depends only on itself (i.e., on the relational commitments of its own members) and in particular is independent of other individualities (i.e., of the relational commitments of members of other classes).
In my last talk at Keio (in April 2017), I defended the view that "Denotational semantics of programs" (the research programme launched by Strachey and Scott around 1969) can be seen as a first step toward substituting a Heraclitean ontology for the Parmenidean one, in the sense that it addresses the first point (individuals in time): in a computational universe, transformation comes first, and separation phenomena are generated by (and are compatible with) these transformations.
In the present talk, my aim is to present a complementary programme toward a Heraclitean ontological point of view, agonal ontology, which addresses the second point.
We discuss how a proof can be accepted or rejected as a normal proof. From this viewpoint we consider the role of the normalization of proofs, as well as their re-normalization. We relate this consideration to Cartesian normalization, Husserlian normalization, Hilbert-Gentzen re-normalization, and Wittgensteinian normalization of proofs/demonstrations.
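As a standard textbook illustration of what normalization removes (not specific to the talk's broader senses of the term): in natural deduction a detour introduces a connective and immediately eliminates it, and under the Curry-Howard correspondence removing the detour is beta reduction:

    (λx. M) N  ⟶  M[N/x]

For instance, a proof that introduces A ∧ B from proofs of A and of B and then immediately projects out A normalizes to the original proof of A.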
Schwichtenberg has developed the programme called “Proofs as Programs” by measuring the “complexity as programs” of proofs in an arithmetical system (with recursion). Technically, he used a slow-growing hierarchy to “climb down” the tree ordinals introduced by Buchholz. This observation is due to Arai and goes back to Girard’s “Hierarchy comparison theorem”.
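For reference, the slow-growing hierarchy is the standard one defined along fundamental sequences (a textbook definition, not a claim about the talk's exact setup):

    G_0(n) = 0
    G_{a+1}(n) = G_a(n) + 1
    G_l(n) = G_{l[n]}(n)   for a limit l, with l[n] the n-th element of the fundamental sequence assigned to l

Unlike the fast-growing hierarchy, G_a(n) increases only slowly as a grows, which is what makes it suitable for “climbing down” tree ordinals.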
In this talk, we sketch a new approach to this programme by focusing on parameter-free subsystems of Girard’s System F. There are at least two advantages to this approach. (i) It is simpler and smoother than the original one. (ii) Our results extend relatively easily and uniformly to stronger fragments (corresponding to iterated inductive definitions), whereas the theory analysed by Schwichtenberg has the strength of Peano Arithmetic.
This talk is based on joint work with Schwichtenberg.
In this talk, we consider the validity of cut. One natural criterion for the validity of a cut containing a connective C is that the main reduction step for C is defined. Danos and Regnier (1989) introduced generalized multiplicative connectives that cannot be defined by combinations of binary multiplicative connectives in a fragment of Linear Logic. The validity of cut for these connectives is defined by their cut-eliminability. We show that validity of cut via cut-eliminability does not guarantee the compositionality of validity in a model (i.e., soundness of the cut rule). We also show that if a connective C satisfies the deducibility-of-identicals condition introduced by Hacking (1979) (or the uniqueness condition introduced by Belnap (1962)) in addition to cut-eliminability, then the validity of cut does guarantee the compositionality of validity in a model.
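For reference, the cut rule under discussion is the usual sequent-calculus rule (standard notation, not specific to the fragment studied here):

    Γ ⊢ Δ, A     A, Σ ⊢ Π
    ───────────────────────  (cut)
         Γ, Σ ⊢ Δ, Π

The main reduction step for a connective C rewrites a cut whose cut formula has C as its main connective into cuts on the immediate subformulas; cut-eliminability requires that such reductions eventually remove every cut.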