Intersection Types for Normalization and Verification (Invited Talk)

Author: Kazushige Terui




Cite As

Kazushige Terui. Intersection Types for Normalization and Verification (Invited Talk). In IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2013). Leibniz International Proceedings in Informatics (LIPIcs), Volume 24, pp. 41-42, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2013) https://doi.org/10.4230/LIPIcs.FSTTCS.2013.41

Abstract

One of the basic principles in typed lambda calculi is that typable lambda terms are normalizable. Since the converse direction does not hold for the simply typed lambda calculus, people have been studying its extensions. This gave birth to intersection type systems, which exactly characterize various classes of lambda terms, such as strongly/weakly normalizable terms and solvable ones (see e.g. [van Bakel/TCS/1995] for a survey).
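As a minimal illustration (our own sketch, not from the talk): the self-application \x. x x has no simple type, yet it carries the intersection type ((a -> b) /\ a) -> b, since x is used both as a function and as its own argument. The following Python sketch models an intersection as the set of types a variable carries; the term encoding and helper names are assumptions for illustration only.

```python
# A tiny intersection-type sketch (illustrative, not the talk's system):
# an intersection is modelled as the *set* of types a variable carries,
# so a variable may be used at several types within one term.

def types_of(t, ctx):
    """Types derivable for term t under context ctx.
    Terms: ("var", name) | ("app", fun, arg).
    Arrow domains are single types here -- enough for this example."""
    kind = t[0]
    if kind == "var":
        return set(ctx[t[1]])
    if kind == "app":
        fun_types = types_of(t[1], ctx)
        arg_types = types_of(t[2], ctx)
        results = set()
        for f in fun_types:
            if isinstance(f, tuple) and f[0] == "->":
                dom, cod = f[1], f[2]
                if dom in arg_types:      # application rule
                    results.add(cod)
        return results
    raise ValueError("only variables and applications in this sketch")

# x : (a -> b) /\ a   allows typing the self-application  x x : b
ctx = {"x": [("->", "a", "b"), "a"]}
self_app = ("app", ("var", "x"), ("var", "x"))
print(types_of(self_app, ctx))   # {'b'}
```

Under a context assigning x only the single simple type a -> b, the same application rule would fail, which is exactly the gap that intersection types close.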

More recently, a new trend has emerged: intersection types are not only useful for extending simple types but also for refining them [Salvati/JoLLI/2010]. One thus obtains finer information on simply typed terms by assigning intersection types. This in particular leads to the concept of normalization by typing, which turns out to be quite efficient in some situations [Terui/RTA/2012]. Moreover, intersection types are invariant under beta-equivalence, so they constitute a denotational semantics in a natural way [Ehrhard/CSL/2012].

Finally, intersection types also work in an infinitary setting, where terms may represent infinite trees and types play the role of automata. This leads to a model checking framework for higher-order recursion schemes via intersection types [Kobayashi/POPL/2009, Kobayashi+Luke Ong/LICS/2009].
 
The purpose of this talk is to outline the recent development of intersection types described above. In particular, we explain how an efficient evaluation algorithm is obtained by combining normalization by typing, beta-reduction and Krivine's abstract machine, resulting in the following complexity characterization. Consider simply typed lambda terms of boolean type o -> o -> o and of order r. Then the problem of deciding whether a given term evaluates to "true" is complete for n-EXPTIME if r = 2n + 2, and complete for n-EXPSPACE if r = 2n + 3 [Terui/RTA/2012].
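The Krivine machine mentioned above fits in a few lines. The following Python sketch is our own minimal encoding (de Bruijn indices, closures), not the optimized algorithm of [Terui/RTA/2012]: it decides whether a closed Church-boolean term evaluates to "true" by applying it to two constants and reducing to weak head normal form.

```python
# Krivine-machine sketch (illustrative; de Bruijn indices, our own encoding).
# Terms: ("var", n) | ("lam", body) | ("app", fun, arg) | ("const", name)
# A state is (term, env, stack); env and stack hold closures (term, env).

def eval_bool(m):
    """Evaluate a closed term m of boolean type o -> o -> o by applying
    it to the constants "true" and "false"; the surviving constant at
    the head position is the answer."""
    t = ("app", ("app", m, ("const", "true")), ("const", "false"))
    env, stack = [], []
    while True:
        kind = t[0]
        if kind == "app":                 # push the argument closure
            stack.append((t[2], env))
            t = t[1]
        elif kind == "lam":               # pop it into the environment
            env = [stack.pop()] + env
            t = t[1]
        elif kind == "var":               # jump to the stored closure
            t, env = env[t[1]]
        else:                             # a constant: the machine halts
            return t[1]

TRUE  = ("lam", ("lam", ("var", 1)))      # Church true:  \t. \f. t
FALSE = ("lam", ("lam", ("var", 0)))      # Church false: \t. \f. f
NOT   = ("lam", ("app", ("app", ("var", 0), FALSE), TRUE))

print(eval_bool(TRUE))                    # true
print(eval_bool(("app", NOT, TRUE)))      # false
```

Note that the machine performs weak head reduction only and never copies the term, which is what makes it a natural building block for the evaluation algorithm sketched in the talk.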


Keywords
  • simply typed lambda calculus
  • computational complexity
  • denotational semantics
  • intersection types
