Separating Terms by Means of Multi Types, Coinductively
Abstract
Intersection type systems, as adequate models of the λ-calculus, induce an equational theory on terms, which we refer to as type equivalence. We give a new proof technique to coinductively characterize type equivalence. To do so, we explore a simple setting, namely weak head type equivalence, which is the equational theory induced by a weak head non-idempotent intersection type system.
We prove a folklore result: weak head type equivalence coincides with Sangiorgi’s normal form bisimilarity. What is new in our development is that we only rely on coinductive program equivalences, bypassing the need to introduce term approximants, which were used in previous works characterizing type equivalence.
The crucial part of this characterization is to show that type equivalent terms are normal form bisimilar: we do so by constructing shape typings that can only type terms with a specific normal form structure. Shape typings are a light form of principal types, a technique often used in intersection types to generate, from one or a few principal typings, all possible typings of a term.
Keywords and phrases: lambda calculus, intersection types, program equivalence
2012 ACM Subject Classification: Theory of computation → Lambda calculus
Acknowledgements: I am grateful to the reviewers for their careful read of the paper and helpful comments, and to Beniamino Accattoli for providing me feedback on a first draft.
Editors: Rasmus Ejlers Møgelberg and Benno van den Berg
Series and Publisher: Leibniz International Proceedings in Informatics, Schloss Dagstuhl – Leibniz-Zentrum für Informatik
1 Introduction
Intersection types were developed to study the theory of programming languages and lie at the intersection of operational and denotational techniques. While only certain terminating programs are typable in a simple type system, all terminating programs are typable in an intersection type system. As they can give a meaning to all terminating terms, intersection type systems may provide syntactic presentations of denotational models [21, 12].
Non-idempotent intersection types, here referred to as multi types, have been especially successful syntactic presentations of the relational model, both in Head Call-by-Name (CbN) [14] and Call-by-Value (CbV) [17]. In this work we focus on another variant, namely Weak Head Call-by-Name (further restricting head reduction to forbid reduction under lambdas).
In this work, we syntactically characterize the equational theory induced on terms by multi type systems. Such a characterization is already known for CbN: Böhm tree equivalence exactly represents the equational theory induced by the CbN idempotent intersection type system [22] and by the CbN non-idempotent variant [13]. Both developments introduce term approximants to carefully define the syntactic program equivalence that matches the equational theory induced by the type system. Term approximants are a powerful but heavy tool in the study of program equivalences; in particular, they require extending the calculus to a calculus of approximants. The aim of this paper is to show that approximants can be avoided and replaced by a simple use of coinduction, namely normal form bisimulations [23, 19].
Syntactic Equality and Normal Form Bisimilarities.
Denotational models give rise to mathematical interpretations of the λ-calculus, where different λ-terms may be sent to the same mathematical object. Equivalence classes of λ-terms in these models should always include syntactic equality, which only equates programs that have exactly the same syntax (up to α-equivalence), and β-equivalence; indeed, terms with different syntax may represent the same identity function. The computational theory that only quotients by β-equivalence is still far from reaching most denotational models' theories: some terms are not β-equivalent but have the same behavior, the paradigmatic example being fixpoint combinators. To be able to relate those terms, one needs to define syntactical theories based on infinite trees, namely normal form bisimilarities.
Normal form bisimilarity is an intensional equivalence that compares the structure of the terms' (possibly infinite) normal forms. The normal form bisimilarity of interest here is Sangiorgi's normal form bisimilarity, obtained from the study of bisimulations in the π-calculus and the translation of the λ-calculus into the π-calculus [23]. Later, Lassen related this bisimilarity, there called weak head normal form bisimilarity, to Lévy-Longo tree equivalence [19].
Weak Head Multi Types.
Adequate intersection types for weak head call-by-name evaluation were first introduced in an idempotent setting by Dezani-Ciancaglini et al. [16] and then refined by Bucciarelli et al. [15] in a non-idempotent setting, allowing for quantitative information and lighter proofs. We shall use the latter presentation, via multi types defined later in Section 3. The quantitative aspect will play a crucial role, as explained in Section 5.
Type Equivalence.
Given an intersection type system, one can define what we call here type equivalence, which is the equational theory induced by the model syntactically represented by the type system. Two terms are type equivalent if they are typable by exactly the same pairs of typing contexts and types. While it is easier to prove type equivalence than contextual equivalence on certain pairs of terms, it is still universally quantified and, unlike normal form bisimilarities, does not provide a straightforward proof technique. Type equivalence is included in contextual equivalence, but this inclusion is in general strict.
This Paper: Coinductively Characterizing Type Equivalence.
Our main result is that type equivalence induced by the weak head multi type system coincides with Sangiorgi’s normal form bisimilarity. This characterization may be folklore for experts of the topic but we provide here a clean development of the coinductive proof technique.
There are various related works on characterizing syntactically the equational theory of a model. To the best of our knowledge, characterizations regarding intersection types all mention term approximants. Approximants are an inductive way to look at infinitary behaviors and were the core of the original definition of Böhm trees. Coinduction is a more modern way to define infinitary computation and it is particularly successful in defining (Böhm) tree equivalence as normal form bisimilarity. By avoiding approximants, we do not need to complicate theorem statements by including approximants (on top of terms), and the proof technique breaks more easily into sub-lemmas, as it rests on a coinductive argument.
We split the characterization proof in two directions:
Typing transfer shows that normal form bisimilar terms are typable by the same pairs of typing contexts and types (in fact, sometimes even using the same type derivations). This first direction is the easy one, done by induction, exploiting quantitative arguments of multi types. The other direction uses coinduction and is shown by building shape typings, which explicitly specify that type equivalent terms have matching normal forms. Shape typings are reminiscent of principal types and share some of their properties. They are however lighter; principal types for intersection/multi types are a quite technical topic beyond the scope of this paper.
Going back to finitary normal forms, we give an intersection type variant of Böhm's theorem: for any two distinct β-normal forms, there exist a typing context and a type that type one of the normal forms and not the other (in the weak head intersection type system).
Monotonicity of Normal Form Bisimilarity, for free.
Once Sangiorgi’s normal form bisimilarity is proven equivalent to type equivalence, we can deduce the monotonicity of Sangiorgi’s normal form bisimilarity without much work. A relation is monotone if it is stable by context closure. Such a property is not trivial for intensional equivalences like Böhm tree equivalence or normal form bisimilarity, as they are not defined compositionally. On the other hand, proving monotonicity for type equivalence is an immediate induction on derivations, as derivations are built compositionally.
While there exist techniques to show that normal form bisimilarities are monotone, they do not scale up so easily. Lassen's adaptation of Howe's method for normal form bisimilarity [19] is a tedious but efficient process for Böhm tree equivalence and Sangiorgi's normal form bisimilarity. It does not scale up to extensional call-by-value bisimilarities, where Biernacki et al. need to introduce an extension of the proof method [11]. These limits justify looking for alternative proof techniques for monotonicity.
On Related Separation Constructions.
The crucial part of the proof lies in the separation construction, where it is shown that type equivalent terms are normal form bisimilar. Our construction of shape typings is not so different from the separation described in the book of Barendregt and Manzonetto [10], which is exactly the proof from Breuvart et al. [13].
Separation theorems are not a new concept; see Barendregt's book about the λ-calculus [9] for an overview. These techniques are also reminiscent of the well-known Böhm out technique [9], where from two terms that are not normal form bisimilar one builds a context separating them for contextual equivalence. However, these techniques are usually presented via the contrapositive: from a difference in Böhm trees, one can extract a context/type that separates them. In our case, we do not explicitly argue by contrapositive but specify principal types, closer to Ronchi Della Rocca's development [22] where principal types are carefully defined and used.
A Single Ground Type is Enough.
We delve into a more technical aspect of multi types. Intersection types for call-by-name evaluation, whether head or weak head, require ground types, that is, atomic types for the variables of a term. In general, it is sufficient to have only one ground type, as the properties of multi types are still preserved; hence one sometimes uses only one ground type for simplicity [4]. To characterize type equivalence, however, Breuvart et al. deliberately use infinitely many ground types to ease the separation of terms [13]. We follow at first their simplification and clearly outline where these ground types are needed. We are then able to give an alternative construction that only requires a single ground type.
2 Weak Head Reduction
We briefly survey the definition of weak head reduction and its associated contextual equivalence.
Weak Head Reduction.
We focus on a specific reduction of the λ-calculus, weak head reduction, the variant of head reduction that does not reduce under lambdas.
Weak head reduction is the closure of the beta rule under applicative weak head contexts. In essence, all steps are of the shape (λx.t) u s₁ … sₖ → t{x:=u} s₁ … sₖ, with k ≥ 0.
Normal forms with respect to weak head reduction are exactly characterized by the following grammar, separating normal forms into rigid terms and abstractions: rigid terms are of the form r ::= x | r t, and weak head normal forms are of the form n ::= r | λx.t.
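To fix intuitions, the reduction and the grammar of normal forms can be sketched concretely; the following Haskell fragment is our own illustration (the datatype, the naive substitution, and the function names are assumptions of the sketch, and the substitution assumes bound names never capture the substituted term).

```haskell
-- A minimal sketch of weak head reduction and of its normal forms.
data Term = Var String | Lam String Term | App Term Term
  deriving (Eq, Show)

-- Naive substitution of u for the free occurrences of x in a term.
-- It assumes bound names never clash with the free names of u.
subst :: String -> Term -> Term -> Term
subst x u (Var y)     = if x == y then u else Var y
subst x u (Lam y b)   = if x == y then Lam y b else Lam y (subst x u b)
subst x u (App t1 t2) = App (subst x u t1) (subst x u t2)

-- One weak head step: contract the head redex, never under a lambda.
whStep :: Term -> Maybe Term
whStep (App (Lam x b) u) = Just (subst x u b)
whStep (App t u)         = fmap (\t' -> App t' u) (whStep t)
whStep _                 = Nothing  -- variables and abstractions

-- Weak head normal forms: abstractions, or rigid terms x t1 ... tk.
isWhnf :: Term -> Bool
isWhnf (Lam _ _) = True
isWhnf t         = isRigid t
  where isRigid (Var _)   = True
        isRigid (App r _) = isRigid r
        isRigid _         = False
```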
We say that a term is weak head normalizing if it weak head reduces to a weak head normal form. Note that, by well-known normalization theorems for weak head reduction, a term β-reduces to a weak head normal form if and only if it weak head reduces to one. We define the contextual preorder and equivalence based on weak head termination.
Definition 1 (Contextual Preorder and Equivalence).
The weak head contextual preorder and contextual equivalence are defined as follows:
-
A term t is below a term u in the weak head contextual preorder if, for all contexts C such that C⟨t⟩ and C⟨u⟩ are closed terms, whenever C⟨t⟩ is weak head normalizing, so is C⟨u⟩.
-
Contextual equivalence is the equivalence relation induced by the contextual preorder: two terms are contextually equivalent when each is below the other.
Contrary to the usual call-by-name contextual equivalence (based on head termination), a non-terminating term such as the paradigmatic looping term Ω and the abstraction λx.Ω are not contextually equivalent. Indeed, in the empty context, the former does not terminate whereas the latter is in weak head normal form.
On Contextual Equivalence and η-equivalence.
As a consequence, the η rule does not hold in general: a term t and its η-expansion λx.t x need not be contextually equivalent, the paradigmatic counterexample being t = Ω. In fact, with respect to the weak head contextual preorder, a term always refines its η-expansion, but the η-expansion does not always refine the term. If t converges to an abstraction, there is no difference between the two terms, as this instance of η-equivalence is actually included in β-equivalence. Otherwise, it is not true that λx.t x refines t, as t can be non-terminating whereas λx.t x is always a weak head normal form.
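Reusing the Term, whStep and isWhnf sketch above, here is how the looping term and its η-expansion behave; taking Ω = (λx.x x)(λx.x x) as the looping term is our assumption of the standard example.

```haskell
-- The looping term and its eta-expansion, under the sketch above.
delta, omega, etaOmega :: Term
delta    = Lam "x" (App (Var "x") (Var "x"))
omega    = App delta delta                -- whStep omega == Just omega: it loops
etaOmega = Lam "y" (App omega (Var "y"))  -- a weak head normal form

-- isWhnf omega    == False  (and omega never reaches a whnf)
-- isWhnf etaOmega == True   (weak head reduction does not go under the lambda)
```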
Issues with η-equivalence and weak head reduction actually go beyond this first observation: there exist η-equivalent terms that are contextually equivalent even though some of their subterms are not [1, 23]. Contextual inequivalence is somehow non-compositional in the presence of η.
3 Weak Head Multi Types
We follow the definitions of multi types for weak head reduction as designed by Bucciarelli et al. [15], recalled in Fig. 1.
Multi Types.
Multi types are finite multisets of linear types; linear types are either ground types (base types) or arrow types, whose left-hand side is a multi type and whose right-hand side is a linear type. In the original presentation of Bucciarelli et al. [15], there is a distinction between ground types and an abstraction type: we unify both for simplicity. Note that there is a countable number of distinct ground types, following Breuvart et al. [13], to simplify separating terms by types.
Typing contexts are maps from variables to multi types with finite support. Any variable omitted from a typing context is implicitly typed with the empty multi type; the empty typing context is the one typing all variables with the empty multi type.
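The following Haskell fragment is our own encoding of these grammars of types, used in later sketches; ground types are indexed by natural numbers and multisets are represented as lists considered up to reordering.

```haskell
-- Linear types and multi types (finite multisets of linear types).
data LinType
  = Ground Int               -- the n-th ground type
  | Arrow MultiType LinType  -- a multi type in argument position, a linear type as target
  deriving (Eq, Show)

type MultiType = [LinType]   -- a finite multiset, represented as a list

-- A typing context with finite support: variables not listed are
-- implicitly mapped to the empty multi type [].
type Context = [(String, MultiType)]
```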
Typing Rules.
The typing rules, given in Fig. 1, are the rules used to infer the types of a term.
There are two axiom rules, one for variables and one for abstractions. The latter is specific to weak head call-by-name, as it allows typing all abstractions.
There is another rule to type abstractions, whose only premise gives a type for the body of the abstraction, given a multi type for the bound variable, which then moves to the typing context.
The application rule is the only way to type terms of the form t u. Intuitively, this rule requires the argument u to be typed once for every occurrence of the argument that will be used in typing the function t. The premises of the rule indeed require typing t with an arrow linear type and u with as many linear types as appear in the argument position of the type of t, giving a different type derivation for each of these linear types. In particular, if t is typed with an arrow whose argument is the empty multi type, then one does not have to provide any type derivation for u to apply the application rule.
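Since Fig. 1 is not reproduced in this text, the following display is a sketch of the four rules just described, in the standard style of [15] and [13]; the rule names and the exact notation are ours and may differ from the figure, and the abstraction axiom uses the unified ground/abstraction type discussed above.

```latex
% Sketch of the weak head multi type rules; names and notation are ours.
\[
\frac{}{x : [\sigma] \vdash x : \sigma}\ \mathsf{ax}
\qquad
\frac{}{\vdash \lambda x.t : \mathsf{a}}\ \mathsf{ax}_\lambda
\qquad
\frac{\Gamma, x : M \vdash t : \sigma}{\Gamma \vdash \lambda x.t : M \to \sigma}\ \lambda
\qquad
\frac{\Gamma \vdash t : [\sigma_1, \dots, \sigma_n] \to \tau \qquad (\Delta_i \vdash u : \sigma_i)_{1 \le i \le n}}{\Gamma \uplus \Delta_1 \uplus \dots \uplus \Delta_n \vdash t\, u : \tau}\ \mathsf{app}
\]
```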
Definition 2 (Typing).
A typing of a term t is a pair (Γ, σ) of a typing context and a linear type such that there exists a derivation with final judgment Γ ⊢ t : σ.
We say that Π is a derivation of this typing when Π is a type derivation with final judgment Γ ⊢ t : σ.
Results about Multi Types.
Intersection types are in particular known for the fact that typability may exactly capture normalizable terms. This system is an example of such a characterization: a term is typable in it if and only if it is weak head normalizable.
Theorem 3 (Characterization of Termination, [15]).
A term is typable if and only if it weak head reduces to a weak head normal form.
Proving the characterization of termination for idempotent intersection types generally requires reducibility arguments, but this proof technique can be avoided by switching to multi types, where the characterization of termination can be shown with combinatorial arguments. The main selling point of multi types, the non-idempotent variant of intersection types, is indeed to obtain quantitative results from typing judgments. The most famous example is the following statement of quantitative subject reduction, which shows that any typing of a term is stable by reduction and that the derivation of the reduct is of a strictly smaller size. The size of a derivation is its total number of rules.
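As a small illustration of this size measure, here is a sketch of derivations as trees, with one constructor per rule of Fig. 1; the constructor names and the exact counting are our own assumptions.

```haskell
-- Derivations as trees: axiom, abstraction axiom, abstraction rule
-- (one premise), application rule (one premise for the function and
-- one per linear type used for the argument).
data Deriv = DAx | DLamAx | DLam Deriv | DApp Deriv [Deriv]
  deriving Show

-- Size of a derivation: its total number of rules.
size :: Deriv -> Int
size DAx         = 1
size DLamAx      = 1
size (DLam d)    = 1 + size d
size (DApp d ds) = 1 + size d + sum (map size ds)
```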
Proposition 4 (Quantitative Subject Reduction, [15]).
If there is a derivation of Γ ⊢ t : σ and t weak head reduces to t′, then there is a derivation of Γ ⊢ t′ : σ of strictly smaller size.
From quantitative subject reduction, it is typical to deduce (without reducibility arguments!) that the intersection type system is correct, i.e. that all typable terms are normalizing. The converse implication, which completes the characterization of termination, is easily deduced from subject expansion. Note that the following statement of subject expansion contains quantitative information about the size of derivations, but the quantitative argument plays no role in proving that all normalizing terms are typable (the induction is done on the number of steps to normalization).
Proposition 5 (Quantitative Subject Expansion, [15]).
If t weak head reduces to t′ and there is a derivation of Γ ⊢ t′ : σ, then there is a derivation of Γ ⊢ t : σ of strictly greater size.
Quantitative arguments are also sometimes taken a step further, introducing alternative type systems that exactly measure the length of the reduction sequence to the normal form; see Accattoli et al. [3].
The Equational Theory Induced by the Multi Type System.
In this work, we focus on the program equivalence induced by the multi type system of Fig. 1. Indeed, multi type systems induce a model, which we shall not discuss here, and every model induces an equivalence relation on terms (by relating terms with the same interpretation in the model). We introduce the type preorder, which exactly rephrases the (in)equational theory induced by the weak head multi type system.
Definition 6 (Type Preorder).
The type preorder and type equivalence are relations on terms defined as follows:
-
A term t is below a term u in the type preorder if, for every typing context Γ and linear type σ, whenever there exists a derivation of Γ ⊢ t : σ, there exists a derivation of Γ ⊢ u : σ.
-
Type equivalence is defined by symmetry: t and u are type equivalent iff each is below the other in the type preorder.
It is easy to check that the type preorder is indeed a preorder (reflexive and transitive). Furthermore, by subject reduction and subject expansion, it is invariant by reduction and adequate. As it is also stable by contexts, the type preorder satisfies the usual definition of an (in)equational theory [9, 10].
We say that a term type-improves another if it is above it in the type preorder, that is, if all typings of the smaller term are also typings of the larger one. Symmetrically, two terms are type equivalent if they have the same set of typings.
Proposition 7.
The type preorder enjoys the following properties:
- 1. Reduction invariance: if t reduces to t′ then t and t′ are type equivalent;
- 2. Adequacy: if t is below u in the type preorder and t is weak head normalizing, then u is weak head normalizing;
- 3. Monotonicity: if t is below u in the type preorder then C⟨t⟩ is below C⟨u⟩ for any context C.
As a direct corollary of Points 2 and 3 of Proposition 7, the type preorder is included in the contextual preorder.
Corollary 8.
The type preorder is included in the contextual preorder. Similarly for equivalences: type equivalence is included in contextual equivalence.
4 Weak Head Normal Form Bisimilarity
In this section, we recall the definition of Sangiorgi’s normal form (bi)similarity and show the correspondence with the type preorder.
First, from a relation R on terms we define its closure on weak head normal forms. Formally, the closure of R on normal forms is defined, by induction, as the smallest relation closed under the three following rules: it relates a variable to itself; it relates a rigid application r t to a rigid application r′ u whenever it relates r to r′ and t R u; and it relates an abstraction λx.t to an abstraction λx.u whenever t R u.
Definition 9 (Weak head normal form simulation).
A relation R on terms is a weak head normal form (whnf) simulation if, for every pair of terms t and u related by R, either t does not weak head terminate, or t weak head reduces to a normal form t′ and u weak head reduces to a normal form u′ such that t′ and u′ are related by the closure of R on normal forms.
Whnf similarity is defined, by coinduction, as the largest whnf simulation.
Note that the definition here is slightly different from Sangiorgi's and Lassen's well-known presentations [23, 19], as we make explicit an inductive definition for rigid terms, via the closure on normal forms.
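Whnf similarity is coinductive and undecidable, but a bounded approximation may help fix the definition. The following Haskell sketch reuses Term, subst, whStep and isWhnf from Section 2; the fuel bound on reduction and the depth bound on observation are our own additions, so the function can at best fail to distinguish two terms, never certify their similarity.

```haskell
-- Reduce to a weak head normal form within a bounded number of steps.
whnfWithin :: Int -> Term -> Maybe Term
whnfWithin fuel t
  | isWhnf t  = Just t
  | fuel <= 0 = Nothing
  | otherwise = whStep t >>= whnfWithin (fuel - 1)

-- Bounded unfolding of the whnf simulation clause.
similarUpTo :: Int -> Int -> Term -> Term -> Bool
similarUpTo 0 _ _ _ = True                 -- out of observation depth
similarUpTo depth fuel t u =
  case whnfWithin fuel t of
    Nothing -> True                        -- t not seen to terminate
    Just t' -> case whnfWithin fuel u of
      Nothing -> False
      Just u' -> closure t' u'
  where
    -- closure on normal forms: same variable, matching rigid structure with
    -- arguments compared by the relation, or abstractions with related bodies
    -- (bodies are compared after a naive renaming of the bound variable)
    closure (Var x)   (Var y)     = x == y
    closure (App r s) (App r' s') = closure r r' && similarUpTo (depth - 1) fuel s s'
    closure (Lam x b) (Lam y c)   = similarUpTo (depth - 1) fuel b (subst y (Var x) c)
    closure _ _                   = False
```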
We package one unfolding of this definition as an operator on relations: it maps a relation R to the relation relating t and u whenever either t does not weak head terminate, or t and u weak head reduce to normal forms related by the closure of R on normal forms. We then have that R is a whnf simulation iff R is included in its image under the operator. To ensure that whnf similarity can be coinductively defined, we prove that the operator is monotone.
Proposition 10 (Monotonicity of whnf simulations).
If a relation R is included in a relation S, then the image of R under the operator is included in the image of S.
Proof.
Let t and u be related by the image of R. We have two cases:
-
If t does not weak head terminate, then t and u are related by the image of S.
-
Otherwise, t weak head reduces to a normal form t′ and u weak head reduces to a normal form u′ such that t′ and u′ are related by the closure of R on normal forms. As R is included in S, t′ and u′ are also related by the closure of S on normal forms (by an easy induction on the closure). We conclude that t and u are related by the image of S, by definition.
Type equivalence exactly matches normal form bisimilarity.
In the two following sections, we exactly characterize the type preorder by whnf similarity. We call soundness the inclusion of whnf similarity in the type preorder (the easy part), and completeness the converse inclusion (the difficult part).
Monotonicity, for free.
An interesting corollary of this characterization is an alternative proof of monotonicity for whnf similarity (i.e. of the fact that it is stable by contexts). Such a property is often also called compatibility and is the main technical point when showing soundness of a similarity with respect to contextual equivalence. The proof of such a property is not always complex, as there is a general method described by Lassen, but it can be quite lengthy. Note that in some cases Lassen's method does not directly apply and requires adjustments [11], thus motivating the need for alternative and lighter proof techniques.
Proposition 11 (Monotonicity of ).
If t is whnf similar to u then, for every context C, C⟨t⟩ is whnf similar to C⟨u⟩.
5 Soundness via Typing Transfer
We first show soundness, and we do so via the following typing transfer proposition. It says that if two terms are related by a whnf simulation, then any typing of the first transfers to the second (note that if both terms are normal forms, even part of the structure of the derivation transfers). We show such a result by induction on the size of the derivation. A crucial ingredient is that subject reduction does not increase the size of the derivation; otherwise the induction argument would not work. The quantitative aspect of the subject reduction proof is therefore crucial and justifies the need for multi types.
Proposition 12 (Typing Transfer).
Let R be a whnf simulation. If t and u are related by R and there exists a derivation of Γ ⊢ t : σ, then there exists a derivation of Γ ⊢ u : σ.
Proof.
By induction on the size of the derivation of Γ ⊢ t : σ.
The term t is typable, therefore it is normalizable by Theorem 3. Hence t weak head reduces to a normal form t′ and, since R is a simulation, u weak head reduces to a normal form u′ such that t′ and u′ are related by the closure of R on normal forms. By quantitative subject reduction (Proposition 4), there is a derivation of Γ ⊢ t′ : σ whose size is at most the size of the original derivation. Instead of looking for a derivation of Γ ⊢ u : σ, we can look for a derivation of Γ ⊢ u′ : σ and conclude by (qualitative) subject expansion (Proposition 5).
Now, from the fact that t′ and u′ are related by the closure of R on normal forms and from the derivation of Γ ⊢ t′ : σ, we build a derivation of Γ ⊢ u′ : σ. By case analysis on the last rule of that derivation:
- 1. Axiom rule. Then t′ is a variable, hence u′ is the same variable by the closure on normal forms, and the axiom rule types u′ accordingly.
- 2. Abstraction axiom rule. Then t′ is an abstraction, hence u′ is an abstraction as well, and the abstraction axiom types u′ accordingly.
- 3. Abstraction rule. Then t′ and u′ are abstractions whose bodies s and s′ are related by R. The sub-derivation typing the body s is of a strictly smaller size than the original derivation. By induction, since s and s′ are related by R, there is a derivation typing s′ with the same typing. We build the derivation for u′ by applying the abstraction typing rule to it.
- 4. Application rule. Then t′ and u′ are rigid terms whose heads are related by the closure on normal forms and whose arguments are related by R. By induction on the sub-derivations, we get the appropriate derivation for u′.
From the fact that typings transfer along any whnf simulation, we easily deduce soundness, as typings transfer, in particular, along the largest whnf simulation, that is, whnf similarity.
Theorem 13.
For all terms t and u, if t is whnf similar to u then t is below u in the type preorder.
Proof.
Let t and u be terms such that t is whnf similar to u, and let (Γ, σ) be a typing of t. As whnf similarity is a whnf simulation, by Proposition 12, (Γ, σ) is also a typing of u. Hence t is below u in the type preorder.
6 Completeness via Shape Typings
It remains now to show that if two terms are typable by the same intersection types, then they have to be syntactically similar, specifically that they have to be weak head normal form bisimilar.
Coinductive Argument.
To prove completeness, i.e. that the type preorder is included in whnf similarity, we shall use coinduction. We first show that the type preorder is a whnf simulation, which implies that it is included in the largest whnf simulation, namely whnf similarity.
Building Shape Typings, Principally.
To prove that the type preorder is a whnf simulation, we build specific shape typings that specify the shape of the normal form of a term. As an example, there exists a typing that ensures that any term typable with it weak head normalizes to a normal form of a prescribed shape.
These shape typings specify the structure of the normal form of any term they type. We choose them with some flexibility in mind so that they may be used compositionally. In that sense, they are principal typings, in that they generate a number of other possible typings for the terms they type: for instance, a ground type occurring in a shape typing may be replaced by any linear type. The resulting typing is still correct, but it is in general not separating anymore, as it may also type other terms.
A first definition of shape typings can be seen with the following (sub-)type system (only typing weak head normal forms):
where denotes the substitution of a type variable (ground type) by a linear type .
Dry typing systems follow similar definitions (see [2], where dry typings are defined in a more complex setting than weak head reduction). Coming up with these first shape typings resembles finding a typing inhabited by only one or a few terms (up to α-conversion). Arrial's implementation of the inhabitation algorithm [7] was helpful to check ideas of shape typings (the algorithm is developed and proven correct in [8]).
This system is an outer shape typing system in the following sense:
- 1. Any normal form can be typed in the system;
- 2. The system is sound for the weak head multi type system: any judgment derivable in the shape typing system is also derivable in the weak head multi type system;
- 3. These shape typings allow us to distinguish terms with different outer shape normal forms.
Point 1 is immediate given the definition of the system. Point 2, which is the fact that this transformation is sound with respect to the original weak head type system, is formalized with Lemma 16 (note that the lemma is more general, somehow allowing the substitution of one linear type for another). Point 3 is the starting point of our separation construction, formally specified in Proposition 17.
Proposition 17 (and the shape typing system) is not enough to complete the proof that type equivalent terms generate the same (infinitary) Lévy-Longo tree (i.e. are normal form bisimilar), as some terms have the same outer shape but differ at a deeper syntactic level. We therefore need to generalize our shape typing technique to be able to detect deep differences (formally proven in the so-called normal inversion lemma, Lemma 23).
In a nutshell, the proof technique of this section is decomposed into two steps:
- 1. Look for a possible outer difference with the shape typings of Proposition 17;
- 2. If the outer syntax matches, go deeper with the inversion lemma.
We (coinductively) repeat the process to show that terms related by the type preorder are whnf similar.
Building Shape Typings, Formally.
Now, we formally prove completeness, by exhibiting our shape typings. We first specify a family of types, erasing types, that will be helpful to discriminate between terms and act as shape types for rigid normal forms.
Definition 14 (Erasing Type).
An erasing type represents a computation that erases a given number of arguments and returns a given ground type. Precisely, the erasing type that erases k arguments and returns the n-th ground type is the arrow type with k empty multi types in argument position and the n-th ground type as target.
Erasing types can be used to type any rigid term: assigning to the head variable the erasing type with k arrows, where k is the number of arguments applied to the head variable, results in the rigid term being typed with the corresponding ground type.
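The following sketch encodes erasing types and the resulting outer shape typing of a rigid term, anticipating Lemma 15 below; it reuses the Term and type encodings of the earlier sketches, and the function names are ours.

```haskell
-- erasing k n is the linear type [] -> ... -> [] -> a_n, with k empty
-- multi types in argument position: it erases k arguments and returns
-- the n-th ground type.
erasing :: Int -> Int -> LinType
erasing 0 n = Ground n
erasing k n = Arrow [] (erasing (k - 1) n)

-- Head variable of a rigid term together with its number of arguments.
headVarAndArgs :: Term -> Maybe (String, Int)
headVarAndArgs (Var x)   = Just (x, 0)
headVarAndArgs (App r _) = fmap (\(x, k) -> (x, k + 1)) (headVarAndArgs r)
headVarAndArgs _         = Nothing   -- not a rigid term

-- The (outer) shape typing of a rigid term: the head variable gets the
-- erasing type with one arrow per argument, the term gets the ground type.
shapeTyping :: Int -> Term -> Maybe (Context, LinType)
shapeTyping n r = do
  (x, k) <- headVarAndArgs r
  return ([(x, [erasing k n])], Ground n)
```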
To prove this formally, the following lemma strengthens the inductive hypothesis: assigning to the head variable an erasing type with additional arrows results in the rigid term being typed with the erasing type obtained by consuming one arrow per argument.
Lemma 15 (Shape Typings for Rigid Terms).
Let r be a rigid term with head variable x, where the head variable is applied to k arguments. Then, for every ground type and every natural number j, assigning to x the erasing type with k + j arrows over that ground type yields a typing of r with the erasing type with j arrows over the same ground type.
In particular, for any ground type, assigning to x the erasing type with k arrows is an (outer) shape typing for the rigid term r, typing r with that ground type. (Note that this is the same shape typing as the one provided in the shape type system, but written in a direct way.)
Proof.
By induction on rigid terms (equivalently, on the number k of arguments).
-
Variable, i.e. r = x. Then the head variable of r is x and it is applied to 0 arguments. We conclude, as the axiom rule alone gives a correct derivation for every ground type and every number of arrows.
-
Application, i.e. r = r′ t. The term r has the same head variable as r′, and that head variable is applied to one more argument in r than in r′. By induction, r′ can be typed with the erasing type with one more arrow than the one required for r. We conclude by building the derivation that applies the application rule with the empty multi type in argument position, so that no derivation is needed for the argument t.
The second property we shall use about shape typings is the following: all typings of rigid terms can be seen as extensions of shape typings, as there is (1) a type for the head variable that (2) determines the type of the rigid term.
Lemma 16 (Typings of Rigid Terms follow Shape Typings).
Let r be a rigid term with head variable x, where the head variable is applied to k arguments.
For any typing (Γ, τ) of r, there exist multi types M₁, …, Mₖ such that:
- 1. the multi type assigned to x by Γ contains the linear type M₁ → ⋯ → Mₖ → τ;
- 2. for any linear type τ′, replacing this occurrence of M₁ → ⋯ → Mₖ → τ in Γ by M₁ → ⋯ → Mₖ → τ′ yields a typing of r with the type τ′.
Proof.
By induction on .
-
Variable i.e. . The last rule of any derivation must be the axiom rule, which ensures that . For the second point, it suffices to see that is valid for any .
-
Application i.e. . The last rule of any derivation must be the application rule.
By induction there exists multi types such that . Set then as .
For the second point, by induction, for all linear type , we have that , hence we conclude by applying the application rule.
A first step towards showing that the type preorder is a whnf simulation is to show that the structure of normal forms is preserved (the outermost syntax), as specified in the following proposition. In the proof, we use shape typings for rigid terms, as well as a shape typing for abstractions.
Proposition 17 (Type preorder preserves the shape of normal forms).
Let n be a weak head normal form, t a term, and r a rigid term.
- 1. Variable preservation: if a variable x is below n in the type preorder, then n is the variable x.
- 2. Abstraction preservation: if the abstraction λx.t is below n in the type preorder, then n is an abstraction λx.t′ for some term t′.
- 3. Rigid preservation: if the non-variable rigid term r t is below n in the type preorder, then n is an application r′ t′ for some rigid term r′ and some term t′.
Proof.
-
1.
Suppose . Cases of :
-
Different Variable: if , then , but is not typable by .
-
Abstraction: if , then , but is not typable by , since typing by requires an empty context for abstractions.
-
Rigid Term: if , then , but , as a non variable rigid term needs an arrow type in the context for its head variable (per Lemma 16).
-
-
2.
We know that however for all rigid term (at least one variable has to appear with a non-empty type in the context by Lemma 16). Thus, is an abstraction of the form for some term .
-
3.
There exists a derivation where by Lemma 15 but for all variables and for all abstraction . Hence is a non-variable rigid term.
The main properties of shape typings (Lemma 16) are used to prove the two following lemmas, which shall be key to knowing the structure of typing derivations.
Lemma 18.
Let be a rigid term of head variable such that . If for some and does not appear in other types in , then .
Proof.
By induction on rigid terms:
-
. Then the only derivation possible for is , for which the statement holds.
-
. Then the derivation must start with a typing rule :
with .
By Lemma 16, is included in (as it is the only occurrence of the type).
By induction hypothesis, . Then by Lemma 16, must be empty (otherwise it would reflect in the type of ). Hence .
Lemma 19.
Let be a rigid term of head variable such that for some . If for some and does not appear in other types in , then for .
Proof.
By induction on rigid terms:
-
. Then the only derivation possible for is , for which the statement holds.
-
. Then the derivation starts with a typing rule :
with .
By Lemma 16, is included in (as it is the only occurrence of the type).
By induction hypothesis, and for , which concludes the proof.
The following proposition is the only point where we use the fact that there are countably many distinct ground types: in all other proofs, we either rely on this proposition or only use the first ground type. This assumption is actually not needed, but it eases the proof. In fact, it is enough to have only one ground type and to be able to specify large enough types. We show after the proof how to choose the right typings when there is only a single ground type.
The proposition states that normal forms related by the type preorder may be broken apart into subterms that are still related by the type preorder. In this sense, the type preorder is decompositional on normal forms.
Proposition 20 (Decompositionality of ).
Let , be terms and , be rigid terms.
-
1.
Left preservation: if then .
-
2.
Right preservation: if and then .
Proof.
-
1.
Let such that . We shall show that as well.
Let be the maximum index of that appears in (if it does not appear, ).
By Lemma 16, we can set and (where ) and we still have that . It is clear that types :
By hypothesis, we have that is typable by . The derivation starts as:
By Lemma 16, is included in (as is the only type where occurs in ). By Lemma 19 applied on , we know . Hence is empty and .
Hence, we have a derivation of final judgment . By Lemma 16 and as only appears in one linear type in , we have that as well.
-
2.
Let such that . We shall show that .
Let be the maximum index of that appears in (if it does not appear, ).
Let such that ( is then the head variable of ).
Then, as , we also have that .
By hypothesis, we have that . The derivation starts with the application rule:
By Lemma 16, must be included in (as it is the only occurrence of the type). The only derivation of containing in the context must be by Lemma 18. Hence only contains one element and (by Lemma 16, as the type must appear in ).
Hence, we have a derivation of final judgment which concludes the proof.
A Single Ground Type is Enough.
The proof above uses the fact that there is a countable number of distinct ground types in our multi types, but this is not needed. Alternatively, the key point is to set up a notion of size of linear types by counting the number of top-most arrows (counting only arrows appearing outside of multi types). Types of a given size, that is, with a given number of top-most arrows, then play the role of the corresponding ground type. One then has to look up the largest linear type appearing in the given typing (the maximum number of top-most arrows of a type appearing in it, where such a type can appear anywhere, even inside multi types) and pick a strictly larger linear type (that is, with strictly more top-most arrows) as the replacement for a fresh ground type. The proof then follows by using Lemma 16, Lemma 18 and Lemma 19 similarly (these lemmas are not specific to countably many ground types but mention linear types appearing only once).
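The size measure used in this alternative argument can be made concrete as follows (a sketch over the LinType encoding of Section 3; the function name is ours).

```haskell
-- Number of top-most arrows of a linear type: arrows occurring inside
-- multi types (argument positions) are not counted.
topArrows :: LinType -> Int
topArrows (Ground _)  = 0
topArrows (Arrow _ t) = 1 + topArrows t
```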
Back to the Completeness Proof.
The following technical lemma is straightforward and used in subsequent proofs of other lemmas.
Lemma 21.
If then .
Proof.
By induction on rigid terms:
-
, then by we have that . It is trivial to show that .
-
, then by we have that such that and . By induction we have that , and hence by compositionality of .
Another technical lemma is the reverse implication of Lemma 21, which is slightly less straightforward. We separate them as this second lemma’s proof is more intricate, relying on Proposition 20.
Lemma 22.
If then .
Proof.
By induction on :
Proposition 17 states that normal forms related by the type preorder must have the same normal form structure. The next lemma completes this by stating that normal forms with the same structure and related by the type preorder have their inner sub-terms related by the type preorder as well. This lemma is similar to an inversion lemma, but it only works on normal forms, hence its name of normal inversion lemma.
Lemma 23 (Normal Inversion Lemma for ).
Let , be terms and , be rigid terms.
-
1.
Body of Abstractions: if then .
-
2.
Rigid Head: if then .
-
3.
Rigid Arguments: if and then
Proof.
-
1.
Let such that . We shall show that . It is easy to see that by applying rule . By hypothesis, we have that the same typing works for , i.e. . Then the typing derivation starts by the rule and the premise will read , which concludes the proof.
- 2.
- 3.
Now we are able to prove that, for all normal forms, if they are related by the type preorder then they are related by one step of simulation over the type preorder. The proof uses, first, the fact that the outer shapes of normal forms match (Proposition 17) and, second, the normal inversion lemma (Lemma 23).
Theorem 24.
Let t and u be two weak head normal forms such that t is below u in the type preorder. Then t and u are related by one step of simulation over the type preorder.
Proof.
By case analysis on :
-
1.
If , then by Point 1 of Prop. 17 we have that .
- 2.
- 3.
The previous theorem is enough to prove that the type preorder is a whnf simulation, as whnf simulations only compare normal forms. We are then able to prove our final theorem that concludes the syntactic characterization of the type preorder.
Theorem 25.
-
1.
The type preorder is a whnf simulation.
-
2.
Completeness: the type preorder is included in whnf similarity.
Proof.
-
1.
Let t and u be such that t is below u in the type preorder. If t does not weak head terminate, we have nothing else to check. Otherwise, t weak head reduces to a normal form and, by adequacy and reduction invariance (Proposition 7), so does u, with the two normal forms still related by the type preorder; Theorem 24 then provides the required step of simulation.
-
2.
We can finally apply the coinductive argument: the type preorder is a whnf simulation by Point 1, and whnf similarity is the largest whnf simulation by definition, hence the type preorder is included in whnf similarity.
By combining soundness (Theorem 13) and completeness (Theorem 25), we get the complete syntactic characterization of the weak head type preorder.
Theorem 26.
Weak head normal form similarity coincides with the weak head type preorder, that is, for all terms t and u, t is whnf similar to u iff t is below u in the type preorder. By symmetry, whnf bisimilarity coincides with type equivalence as well.
Type-Böhm Theorem for Weak Head.
A corollary is a separation theorem (akin to Böhm's theorem, but adapted for types) for β-normal forms. The standard Böhm theorem gives separating evaluation contexts for distinct βη-normal forms. Here, we give separating typing contexts and types for distinct β-normal forms: our weak head types are even able to distinguish η-equivalent terms.
Corollary 27 (Type-Böhm Theorem).
Let t and u be two syntactically distinct β-normal forms. There exist a typing context Γ and a type σ such that Γ ⊢ t : σ is derivable but Γ ⊢ u : σ is not.
Proof.
It is clear that two syntactically distinct β-normal forms are not whnf bisimilar. Then one can apply Theorem 25 to deduce a separating typing context and type.
7 Conclusions
We study the equational theory induced by multi types and focus, in particular, on the proof technique to coinductively characterize it. We are able to avoid introducing term approximants, refining previous works by Breuvart et al. [13] and Ronchi Della Rocca [22].
We apply the proof technique to a simple case, where one considers weak head multi types, the appropriate multi types for weak head evaluation. We construct shape typings in order to show that terms that are typable by the same types must have the same normal forms. Building these shape typings is done explicitly, and they resemble the types built by Breuvart et al. [13] in the case of head reduction in order to show that different approximants cannot be type equivalent. After carefully examining the proof, we are able to clearly state that a countable number of ground types is not needed, a single ground type being enough. We also go for a more direct proof, without considering the contrapositive statement (that syntactically different terms imply the existence of a separating typing), closer to Ronchi Della Rocca's separation [22], which is also based on head reduction.
This work bridges two programming language approaches: the study of the untyped λ-calculus with its models, and bisimilarity, a technique imported from process calculi. While the main result is folklore for experts of the λ-calculus (Lévy-Longo tree equivalence coincides with type equivalence), the coinductive flavor of bisimilarity appears to be new in this setting, and it makes for quite independent lemmas and easy proofs.
Future Work.
The work presented here focuses on non-idempotent intersection types, which clearly play a critical role in the soundness proof but appear to be less critical in our completeness development. Techniques with term approximants have been successful in the idempotent setting to characterize the equational theory, which raises the question of whether our coinductive presentation can also work with idempotent intersection type systems.
We are working on reusing the proof technique described in this paper in a trickier setting, namely the call-by-value λ-calculus. It would be interesting to give a syntactical characterization of the equational theory induced by Ehrhard's call-by-value relational semantics [17], for which there is none. A first (wrong) conjecture appeared in [18]. Preliminary results containing only typing transfer are already available on arXiv [5].
Recent work on CbN idempotent intersection types by Polonsky and Statman discusses uniqueness typings to separate terms up to equivalence [20]. It would be interesting to see if their results generalize to normal form bisimilarity.
Recent developments have been able to refine contextual equivalence to a quantitative contextual equivalence, while crucially retaining the fact that this equivalence is invariant by reduction (and therefore an equational theory) [6]. This work focuses on call-by-name and head reduction, where it is shown that Böhm tree equivalence coincides with this new quantitative equivalence. The authors conjecture that Lévy-Longo tree equivalence could match a weak head quantitative contextual equivalence.
As said in the introduction, we do not use a contrapositive statement to show that type equivalence implies normal form bisimilarity. In intermediate lemmas, we may use classical reasoning techniques. It would be interesting to check (possibly via a proof assistant) which steps can be done with only constructive/intuitionistic reasoning. In the case of the Böhm out technique, we are not aware of a result that does not use the contrapositive, even though the result is deeply constructive (explicitly constructing a separating context).
References
- [1] S. Abramsky and C. H. L. Ong. Full Abstraction in the Lazy Lambda Calculus. Information and Computation, 105(2):159–267, 1993. doi:10.1006/inco.1993.1044.
- [2] Beniamino Accattoli. Semantic Bounds and Multi Types, Revisited. In Aniello Murano and Alexandra Silva, editors, 32nd EACSL Annual Conference on Computer Science Logic (CSL 2024), volume 288 of Leibniz International Proceedings in Informatics (LIPIcs), pages 7:1–7:24, Dagstuhl, Germany, 2024. Schloss Dagstuhl – Leibniz-Zentrum für Informatik. doi:10.4230/LIPIcs.CSL.2024.7.
- [3] Beniamino Accattoli, Stéphane Graham-Lengrand, and Delia Kesner. Tight typings and split bounds. PACMPL, 2(ICFP):94:1–94:30, 2018. doi:10.1145/3236789.
- [4] Beniamino Accattoli, Giulio Guerrieri, and Maico Leberle. Types by need. In Luís Caires, editor, Programming Languages and Systems, pages 410–439, Cham, 2019. Springer International Publishing. doi:10.1007/978-3-030-17184-1_15.
- [5] Beniamino Accattoli, Adrienne Lancelot, and Claudia Faggian. Normal form bisimulations by value, 2023. doi:10.48550/arXiv.2303.08161.
- [6] Beniamino Accattoli, Adrienne Lancelot, Giulio Manzonetto, and Gabriele Vanoni. Interaction equivalence. Proc. ACM Program. Lang., 9(POPL), January 2025. doi:10.1145/3704891.
- [7] Victor Arrial, Giulio Guerrieri, and Delia Kesner. An implementation of the quantitative inhabitation for different lambda calculi in a unifying framework, 2023. doi:10.1145/3554339.
- [8] Victor Arrial, Giulio Guerrieri, and Delia Kesner. Quantitative inhabitation for different lambda calculi in a unifying framework. Proc. ACM Program. Lang., 7(POPL):1483–1513, 2023. doi:10.1145/3571244.
- [9] Hendrik Pieter Barendregt. The Lambda Calculus – Its Syntax and Semantics, volume 103 of Studies in logic and the foundations of mathematics. North-Holland, 1984.
- [10] Henk Barendregt and Giulio Manzonetto. A Lambda Calculus Satellite. College Publications, 2022. URL: https://www.collegepublications.co.uk/logic/mlf/?00035.
- [11] Dariusz Biernacki, Sergueï Lenglet, and Piotr Polesiuk. Proving soundness of extensional normal-form bisimilarities. Electronic Notes in Theoretical Computer Science, 336:41–56, 2018. The Thirty-third Conference on the Mathematical Foundations of Programming Semantics (MFPS XXXIII). doi:10.1016/j.entcs.2018.03.015.
- [12] Viviana Bono and Mariangiola Dezani-Ciancaglini. A tale of intersection types. In Proceedings of the 35th Annual ACM/IEEE Symposium on Logic in Computer Science, LICS ’20, pages 7–20, New York, NY, USA, 2020. Association for Computing Machinery. doi:10.1145/3373718.3394733.
- [13] Flavien Breuvart, Giulio Manzonetto, and Domenico Ruoppolo. Relational Graph Models at Work. Logical Methods in Computer Science, Volume 14, Issue 3, July 2018. doi:10.23638/LMCS-14(3:2)2018.
- [14] Antonio Bucciarelli, Thomas Ehrhard, and Giulio Manzonetto. Not enough points is enough. In Jacques Duparc and Thomas A. Henzinger, editors, Computer Science Logic, pages 298–312, Berlin, Heidelberg, 2007. Springer Berlin Heidelberg. doi:10.1007/978-3-540-74915-8_24.
- [15] Antonio Bucciarelli, Delia Kesner, and Daniel Ventura. Non-idempotent intersection types for the lambda-calculus. Logic Journal of the IGPL, 25(4):431–464, 2017. doi:10.1093/jigpal/jzx018.
- [16] Mariangiola Dezani-Ciancaglini, Silvia Ghilezan, and Silvia Likavec. Behavioural inverse limit λ-models. Theoretical Computer Science, 316(1):49–74, 2004. Recent Developments in Domain Theory: A collection of papers in honour of Dana S. Scott. doi:10.1016/j.tcs.2004.01.023.
- [17] Thomas Ehrhard. Collapsing non-idempotent intersection types. In Patrick Cégielski and Arnaud Durand, editors, Computer Science Logic (CSL’12) - 26th International Workshop/21st Annual Conference of the EACSL, CSL 2012, September 3-6, 2012, Fontainebleau, France, volume 16 of LIPIcs, pages 259–273. Schloss Dagstuhl – Leibniz-Zentrum für Informatik, 2012. doi:10.4230/LIPIcs.CSL.2012.259.
- [18] Axel Kerinec, Giulio Manzonetto, and Michele Pagani. Revisiting call-by-value Böhm trees in light of their Taylor expansion. Log. Methods Comput. Sci., 16(3), 2020. URL: https://lmcs.episciences.org/6638.
- [19] Søren B. Lassen. Bisimulation in untyped lambda calculus: Böhm trees and bisimulation up to context. In Stephen D. Brookes, Achim Jung, Michael W. Mislove, and Andre Scedrov, editors, Fifteenth Conference on Mathematical Foundations of Progamming Semantics, MFPS 1999, Tulane University, New Orleans, LA, USA, April 28 - May 1, 1999, volume 20 of Electronic Notes in Theoretical Computer Science, pages 346–374. Elsevier, 1999. doi:10.1016/S1571-0661(04)80083-5.
- [20] Andrew Polonsky and Richard Statman. On sets of terms having a given intersection type. Logical Methods in Computer Science, Volume 18, Issue 3, September 2022. doi:10.46298/lmcs-18(3:35)2022.
- [21] Simona Ronchi Della Rocca. Intersection Types and Denotational Semantics: An Extended Abstract. In Silvia Ghilezan, Herman Geuvers, and Jelena Ivetic, editors, 22nd International Conference on Types for Proofs and Programs (TYPES 2016), volume 97 of Leibniz International Proceedings in Informatics (LIPIcs), pages 2:1–2:7, Dagstuhl, Germany, 2018. Schloss Dagstuhl – Leibniz-Zentrum für Informatik. doi:10.4230/LIPIcs.TYPES.2016.2.
- [22] Simonetta Ronchi Della Rocca. Characterization theorems for a filter lambda model. Information and Control, 54(3):201–216, 1982. doi:10.1016/S0019-9958(82)80022-3.
- [23] Davide Sangiorgi. The lazy lambda calculus in a concurrency scenario. Information and Computation, 111(1):120–153, 1994. doi:10.1006/inco.1994.1042.
