OPUS 4 Latest Documents RSS Feed (Latest documents)
http://publikationen.ub.uni-frankfurt.de/index/index/
Mon, 29 Oct 2012 15:45:08 +0100

Correctness of an STM Haskell implementation
http://publikationen.ub.uni-frankfurt.de/frontdoor/index/index/docId/27007
A concurrent implementation of software transactional memory in Concurrent Haskell using a call-by-need functional language with processes and futures is given. The description of the small-step operational semantics is precise and explicit, and employs an early abort of conflicting transactions. A proof of correctness of the implementation is given for a contextual semantics with may- and should-convergence.
This implies that our implementation is a correct evaluator for an abstract specification equipped with a big-step semantics.
Manfred Schmidt-Schauß; David Sabel (working paper, Mon, 29 Oct 2012 15:45:08 +0100)

Simulation in the call-by-need lambda-calculus with letrec, case, constructors, and seq
http://publikationen.ub.uni-frankfurt.de/frontdoor/index/index/docId/27005
This paper shows equivalence of applicative similarity and contextual approximation, and hence also of bisimilarity and contextual equivalence, in LR, the deterministic call-by-need lambda calculus with letrec extended by data constructors, case-expressions and Haskell's seq operator. LR models an untyped version of the core language of Haskell. Bisimilarity simplifies equivalence proofs in the calculus and opens a way for more convenient correctness proofs for program transformations.
The proof is by a fully abstract and surjective transfer of the contextual approximation into a call-by-name calculus, which is an extension of Abramsky's lazy lambda calculus. In the latter calculus equivalence of similarity and contextual approximation can be shown by Howe's method. Using an equivalent but inductive definition of behavioral preorder we then transfer similarity back to the calculus LR.
The translation from the call-by-need letrec calculus into the extended call-by-name lambda calculus is the composition of two translations. The first translation replaces the call-by-need strategy by a call-by-name strategy; its correctness is shown by exploiting infinite trees, which emerge by unfolding the letrec expressions. The second translation encodes letrec-expressions by using multi-fixpoint combinators; its correctness is shown syntactically by comparing reductions of both calculi. A further result of this paper is an isomorphism between the mentioned calculi, and also with a call-by-need letrec calculus with a less complex definition of reduction than LR.
Manfred Schmidt-Schauß; David Sabel; Elena Machkasova (working paper, Mon, 29 Oct 2012 15:35:55 +0100)

On conservativity of concurrent Haskell
http://publikationen.ub.uni-frankfurt.de/frontdoor/index/index/docId/24252
The calculus CHF models Concurrent Haskell extended by concurrent, implicit futures. It is a process calculus with concurrent threads, monadic concurrent evaluation, and includes a pure functional lambda-calculus which comprises data constructors, case-expressions, letrec-expressions, and Haskell’s seq. Futures can be implemented in Concurrent Haskell using the primitive unsafeInterleaveIO, which is available in most implementations of Haskell. Our main result is conservativity of CHF, that is, all equivalences of pure functional expressions are also valid in CHF. This implies that compiler optimizations and transformations from pure Haskell remain valid in Concurrent Haskell even if it is extended by futures. We also show that this is no longer valid if Concurrent Haskell is extended by the arbitrary use of unsafeInterleaveIO.
David Sabel; Manfred Schmidt-Schauß (working paper, Tue, 07 Feb 2012 16:16:23 +0100)

Pattern matching of compressed terms and contexts and polynomial rewriting
http://publikationen.ub.uni-frankfurt.de/frontdoor/index/index/docId/22717
A generalization of the compressed string pattern match that applies to terms with variables is investigated: Given terms s and t compressed by singleton tree grammars, the task is to find an instance of s that occurs as a subterm in t. We show that this problem is in NP and that the task can be performed in time O(n^{c·|Var(s)|}), including the construction of the compressed substitution and a representation of all occurrences. We show that the special case where s is uncompressed can be performed in polynomial time. As a nice application we show that for an equational deduction of t to t' by an equality axiom l = r (a rewrite), a single step can be performed in polynomial time in the size of the compression of t and l, r if the number of variables in l is fixed. We also show that n rewriting steps can be performed in polynomial time, if the equational axioms are compressed and assumed to be constant for the rewriting sequence. Another potential application is querying mechanisms on compressed XML databases.
Manfred Schmidt-Schauß (working paper, Mon, 19 Sep 2011 12:50:52 +0200)

Computing overlappings by unification in the deterministic lambda calculus LR with letrec, case, constructors, seq and variable chains
http://publikationen.ub.uni-frankfurt.de/frontdoor/index/index/docId/22714
Correctness of program transformations in extended lambda calculi with a contextual semantics is usually based on reasoning about the operational semantics, which is a rewrite semantics. A successful approach to proving correctness is the combination of a context lemma with the computation of overlaps between program transformations and the reduction rules. The method is similar to the computation of critical pairs for the completion of term rewriting systems. We describe an effective unification algorithm to determine all overlaps of transformations with reduction rules for the lambda calculus LR, which comprises recursive let-expressions, constructor applications, case expressions and a seq construct for strict evaluation. The unification algorithm employs many-sorted terms, the equational theory of left-commutativity modeling multi-sets, context variables of different kinds, and a mechanism for compactly representing binding chains in recursive let-expressions. As a result the algorithm computes a finite set of overlappings for the reduction rules of the calculus LR that serve as a starting point for the automation of the analysis of program transformations.
Conrad Rau; Manfred Schmidt-Schauß (working paper, Thu, 15 Sep 2011 14:20:40 +0200)

Fast equality test for straight-line compressed strings
http://publikationen.ub.uni-frankfurt.de/frontdoor/index/index/docId/22713
The paper describes a simple and fast randomized test for equality of grammar-compressed strings. The thorough running-time analysis is done by applying a logarithmic cost measure.
Keywords: randomized algorithms, straight line programs, grammar-based compression
Manfred Schmidt-Schauß; Georg Schnitger (working paper, Thu, 15 Sep 2011 14:16:18 +0200)

A contextual semantics for concurrent Haskell with futures
http://publikationen.ub.uni-frankfurt.de/frontdoor/index/index/docId/22712
In this paper we analyze the semantics of a higher-order functional language with concurrent threads, monadic IO and synchronizing variables as in Concurrent Haskell. To assure declarativeness of concurrent programming we extend the language by implicit, monadic, and concurrent futures. As semantic model we introduce and analyze the process calculus CHF, which represents a typed core language of Concurrent Haskell extended by concurrent futures. Evaluation in CHF is defined by a small-step reduction relation. Using contextual equivalence based on may- and should-convergence as program equivalence, we show that various transformations preserve program equivalence. We establish a context lemma easing those correctness proofs. An important result is that call-by-need and call-by-name evaluation are equivalent in CHF, since they induce the same program equivalence. Finally we show that the monad laws hold in CHF under mild restrictions on Haskell’s seq-operator, which for instance justifies the use of the do-notation.
David Sabel; Manfred Schmidt-Schauß (working paper, Thu, 15 Sep 2011 14:04:58 +0200)

A termination proof of reduction in a simply typed calculus with constructors
http://publikationen.ub.uni-frankfurt.de/frontdoor/index/index/docId/22711
The well-known proof of termination of reduction in simply typed calculi is adapted to a monomorphically typed lambda-calculus with case, constructors and recursive data types. The proof differs at several places from the standard proof. Perhaps it is useful and can also be extended to more complex calculi.
Manfred Schmidt-Schauß; David Sabel (working paper, Thu, 15 Sep 2011 13:56:24 +0200)

Towards correctness of program transformations through unification and critical pair computation
http://publikationen.ub.uni-frankfurt.de/frontdoor/index/index/docId/7940
Correctness of program transformations in extended lambda-calculi with a contextual semantics is usually based on reasoning about the operational semantics which is a rewrite semantics. A successful approach is the combination of a context lemma with the computation of overlaps between program transformations and the reduction rules, which results in so-called complete sets of diagrams. The method is similar to the computation of critical pairs for the completion of term rewriting systems. We explore cases where the computation of these overlaps can be done in a first order way by variants of critical pair computation that use unification algorithms. As a case study of an application we describe a finitary and decidable unification algorithm for the combination of the equational theory of left-commutativity modelling multi-sets, context variables and many-sorted unification. Sets of equations are restricted to be almost linear, i.e. every variable and context variable occurs at most once, where we allow one exception: variables of a sort without ground terms may occur several times. Every context variable must have an argument-sort in the free part of the signature. We also extend the unification algorithm by the treatment of binding-chains in let- and letrec-environments and by context-classes. This results in a unification algorithm that can be applied to all overlaps of normal-order reductions and transformations in an extended lambda calculus with letrec that we use as a case study.
Conrad Rau; Manfred Schmidt-Schauß (working paper, Sat, 04 Sep 2010 13:04:42 +0200)

Simulation in the call-by-need lambda-calculus with letrec
http://publikationen.ub.uni-frankfurt.de/frontdoor/index/index/docId/7828
This paper shows the equivalence of applicative similarity and contextual approximation, and hence also of bisimilarity and contextual equivalence, in the deterministic call-by-need lambda calculus with letrec. Bisimilarity simplifies equivalence proofs in the calculus and opens a way for more convenient correctness proofs for program transformations. Although this property may be a natural one to expect, to the best of our knowledge, this paper is the first one providing a proof. The proof technique is to transfer the contextual approximation into Abramsky's lazy lambda calculus by a fully abstract and surjective translation. This also shows that the natural embedding of Abramsky's lazy lambda calculus into the call-by-need lambda calculus with letrec is an isomorphism between the respective term-models. We show that the equivalence property proven in this paper transfers to a call-by-need letrec calculus developed by Ariola and Felleisen.
Manfred Schmidt-Schauß; David Sabel; Elena Machkasova (working paper, Tue, 22 Jun 2010 17:47:16 +0200)

Counterexamples to simulation in non-deterministic call-by-need lambda-calculi with letrec
http://publikationen.ub.uni-frankfurt.de/frontdoor/index/index/docId/7827
This note shows that in non-deterministic extended lambda calculi with letrec, the tool of applicative (bi)simulation is in general not usable for contextual equivalence, by giving a counterexample adapted from data flow analysis. It is also shown that there is a flaw in a lemma and a theorem concerning finite simulation in a conference paper by the first two authors.
Manfred Schmidt-Schauß; Elena Machkasova; David Sabel (working paper, Tue, 22 Jun 2010 17:45:50 +0200)

Reconstruction of a logic for inductive proofs of properties of functional programs
http://publikationen.ub.uni-frankfurt.de/frontdoor/index/index/docId/7826
A logical framework consisting of a polymorphic call-by-value functional language and a first-order logic on the values is presented, which is a reconstruction of the logic of the verification system VeriFun. The reconstruction uses contextual semantics to define the logical value of equations. It equates undefinedness and non-termination, which is a standard semantical approach. The main results of this paper are: meta-theorems about the globality of several classes of theorems in the logic, and proofs of global correctness of transformations and deduction rules. The deduction rules of VeriFun are globally correct if rules depending on termination are appropriately formulated. The reconstruction also gives hints on generalizations of the VeriFun framework: reasoning on nonterminating expressions and functions, mutually recursive functions and abstractions in the data values, and formulas with arbitrary quantifier prefix could be allowed.
David Sabel; Manfred Schmidt-Schauß (working paper, Tue, 22 Jun 2010 17:44:13 +0200)

Contextual equivalence in lambda-calculi extended with letrec and with a parametric polymorphic type system
http://publikationen.ub.uni-frankfurt.de/frontdoor/index/index/docId/6667
This paper describes a method to treat contextual equivalence in polymorphically typed lambda-calculi, and also how to transfer equivalences from the untyped versions of lambda-calculi to their typed variant, where our specific calculus has letrec, recursive types and is nondeterministic. An addition of a type label to every subexpression is all that is needed, together with some natural constraints for the consistency of the type labels and well-scopedness of expressions. One result is that an elementary but typed notion of program transformation is obtained and that untyped contextual equivalences also hold in the typed calculus as long as the expressions are well-typed. In order to have a nice interaction between reduction and typing, some reduction rules have to be accompanied with a type modification by generalizing or instantiating types.
Manfred Schmidt-Schauß; David Sabel; Frederik Harwath (working paper, Tue, 07 Jul 2009 13:30:55 +0200)

On correctness of buffer implementations in a concurrent lambda calculus with futures
http://publikationen.ub.uni-frankfurt.de/frontdoor/index/index/docId/6666
Motivated by the question of correctness of a specific implementation of concurrent buffers in the lambda calculus with futures underlying Alice ML, we prove that concurrent buffers and handled futures can correctly encode each other. Correctness means that our encodings preserve and reflect the observations of may- and must-convergence. This also shows correctness wrt. program semantics, since the encodings are adequate translations wrt. contextual semantics. While these translations encode blocking into queuing and waiting, we also provide an adequate encoding of buffers in a calculus without handles, which is more low-level and uses busy-waiting instead of blocking. Furthermore we demonstrate that our correctness concept applies to the whole compilation process from high-level to low-level concurrent languages, by translating the calculus with buffers, handled futures and data constructors into a small core language without those constructs.
Jan Schwinghammer; David Sabel; Joachim Niehren; Manfred Schmidt-Schauß (working paper, Tue, 07 Jul 2009 13:22:26 +0200)

Closures of may and must convergence for contextual equivalence
http://publikationen.ub.uni-frankfurt.de/frontdoor/index/index/docId/6176
We show on an abstract level that contextual equivalence in non-deterministic program calculi defined by may- and must-convergence is maximal in the following sense. Using also all the test predicates generated by the Boolean, forall- and existential closure of may- and must-convergence does not change the contextual equivalence. The situation is different if may- and total must-convergence is used, where an expression totally must-converges if all reductions are finite and terminate with a value: There is an infinite sequence of test-predicates generated by the Boolean, forall- and existential closure of may- and total must-convergence, which also leads to an infinite sequence of different contextual equalities.
Manfred Schmidt-Schauß; David Sabel (working paper, Thu, 29 Jan 2009 09:16:03 +0100)

On proving the equivalence of concurrency primitives
http://publikationen.ub.uni-frankfurt.de/frontdoor/index/index/docId/6036
Various concurrency primitives have been added to sequential programming languages, in order to turn them concurrent. Prominent examples are concurrent buffers for Haskell, channels in Concurrent ML, joins in JoCaml, and handled futures in Alice ML. Even though one might conjecture that all these primitives provide the same expressiveness, proving this equivalence is an open challenge in the area of program semantics. In this paper, we establish a first instance of this conjecture. We show that concurrent buffers can be encoded in the lambda calculus with futures underlying Alice ML. Our correctness proof results from a systematic method, based on observational semantics with respect to may and must convergence.
Jan Schwinghammer; David Sabel; Joachim Niehren; Manfred Schmidt-Schauß (working paper, Thu, 20 Nov 2008 16:20:55 +0100)

Adequacy of compositional translations for observational semantics
http://publikationen.ub.uni-frankfurt.de/frontdoor/index/index/docId/192
We investigate methods and tools for analysing translations between programming languages with respect to observational semantics. The behaviour of programs is observed in terms of may- and must-convergence in arbitrary contexts, and adequacy of translations, i.e., the reflection of program equivalence, is taken to be the fundamental correctness condition. For compositional translations we propose a notion of convergence equivalence as a means for proving adequacy. This technique avoids explicit reasoning about contexts, and is able to deal with the subtle role of typing in implementations of language extension.
Manfred Schmidt-Schauß; Joachim Niehren; Jan Schwinghammer; David Sabel (working paper, Fri, 07 Mar 2008 16:07:51 +0100)

A finite simulation method in a non-deterministic call-by-need calculus with letrec, constructors and case
http://publikationen.ub.uni-frankfurt.de/frontdoor/index/index/docId/191
The paper proposes a variation of simulation for checking and proving contextual equivalence in a non-deterministic call-by-need lambda-calculus with constructors, case, seq, and a letrec with cyclic dependencies. It also proposes a novel method to prove its correctness. The calculus' semantics is based on a small-step rewrite semantics and on may-convergence. The cyclic nature of letrec bindings, as well as non-determinism, makes known approaches to prove that simulation implies contextual equivalence, such as Howe's proof technique, inapplicable in this setting. The basic technique for the simulation as well as the correctness proof is called pre-evaluation, which computes a set of answers for every closed expression. If simulation succeeds in finite computation depth, then it is guaranteed to show contextual preorder of expressions.
Manfred Schmidt-Schauß; Elena Machkasova (working paper, Fri, 07 Mar 2008 16:05:42 +0100)

On equivalences and standardization in a non-deterministic call-by-need lambda calculus
http://publikationen.ub.uni-frankfurt.de/frontdoor/index/index/docId/510
The goal of this report is to prove correctness of a considerable subset of transformations w.r.t. contextual equivalence in an extended lambda-calculus with case, constructors, seq, let, and choice, with a simple set of reduction rules. Unfortunately, a direct proof appears to be impossible. The correctness proof is by defining another calculus comprising the complex variants of copy, case-reduction and seq-reductions that use variable-binding chains. This complex calculus has well-behaved diagrams and allows a proof of correctness of the transformations, and also a proof that the simple calculus defines an equivalent contextual order.
Manfred Schmidt-Schauß; Matthias Mann (working paper, Mon, 22 Oct 2007 12:53:00 +0200)

Program transformation for functional circuit descriptions
http://publikationen.ub.uni-frankfurt.de/frontdoor/index/index/docId/1569
We model sequential synchronous circuits on the logical level by signal-processing programs in an extended lambda calculus Lpor with letrec, constructors, case and parallel or (por), employing contextual equivalence. The model describes gates as (parallel) boolean operators, describes memory using a delay, which in turn is modeled as a shift of the list of signals, and permits constructive cycles due to the parallel or. It opens the possibility of a large set of program transformations that correctly transform the expressions and thus the represented circuits, and provides basic tools for equivalence testing and optimizing circuits. A further application is the correct manipulation by transformations of software components combined with circuits. The main part of our work consists of proof methods for correct transformations of expressions in the lambda calculus Lpor, and of proposing the appropriate program transformations.
Manfred Schmidt-Schauß; David Sabel (working paper, Wed, 28 Feb 2007 16:45:20 +0100)

Correctness of copy in calculi with letrec, case, constructors and por
http://publikationen.ub.uni-frankfurt.de/frontdoor/index/index/docId/1568
This paper extends the internal Frank report 28 as follows: It is shown that for the call-by-need lambda calculus LRCCP-Lambda, which extends the calculus LRCC-Lambda by por, i.e. a lambda-calculus with letrec, case, constructors, seq and por, copying can be done without restrictions, and also that the call-by-need and call-by-name strategies are equivalent w.r.t. contextual equivalence.
Manfred Schmidt-Schauß (working paper, Wed, 28 Feb 2007 16:43:27 +0100)

Correctness of copy in calculi with letrec, case and constructors
http://publikationen.ub.uni-frankfurt.de/frontdoor/index/index/docId/1567
Call-by-need lambda calculi with letrec provide a rewriting-based operational semantics for (lazy) call-by-name functional languages. These calculi model the sharing behavior during evaluation more closely than let-based calculi that use a fixpoint combinator. In a previous paper we showed that the copy-transformation is correct for the small calculus LR-Lambda. In this paper we demonstrate that the proof method based on a calculus on infinite trees for showing correctness of instantiation operations can be extended to the calculus LRCC-Lambda with case and constructors, and show that copying at compile-time can be done without restrictions. We also show that the call-by-need and call-by-name strategies are equivalent w.r.t. contextual equivalence. A consequence is correctness of all the transformations like instantiation, inlining, specialization and common subexpression elimination in LRCC-Lambda. We are confident that the method scales up for proving correctness of copy-related transformations in non-deterministic lambda calculi if restricted to "deterministic" subterms.
Manfred Schmidt-Schauß (working paper, Wed, 28 Feb 2007 16:41:03 +0100)

On generic context lemmas for lambda calculi with sharing
http://publikationen.ub.uni-frankfurt.de/frontdoor/index/index/docId/1573
This paper proves several generic variants of context lemmas and thus contributes to improving the tools to develop observational semantics that is based on a reduction semantics for a language. The context lemmas are provided for may- as well as two variants of must-convergence and a wide class of extended lambda calculi, which satisfy certain abstract conditions. The calculi must have a form of node sharing, e.g. plain beta reduction is not permitted. There are two variants: weakly sharing calculi, where the beta-reduction is only permitted for arguments that are variables, and strongly sharing calculi, which roughly correspond to call-by-need calculi, where beta-reduction is completely replaced by a sharing variant. The calculi must obey three abstract assumptions, which are in general easily recognizable given the syntax and the reduction rules. The generic context lemmas have as instances several context lemmas already proved in the literature for specific lambda calculi with sharing. The scope of the generic context lemmas comprises not only call-by-need calculi, but also call-by-value calculi with a form of built-in sharing. Investigations in other, new variants of extended lambda-calculi with sharing, where the language or the reduction rules and/or strategy varies, will be simplified by our result, since specific context lemmas are immediately derivable from the generic context lemma, provided our abstract conditions are met.
Manfred Schmidt-Schauß; David Sabel (working paper, Wed, 28 Feb 2007 16:37:52 +0100)

Program Equivalence for a Concurrent Lambda Calculus with Futures
http://publikationen.ub.uni-frankfurt.de/frontdoor/index/index/docId/2181
Reasoning about the correctness of program transformations requires a notion of program equivalence. We present an observational semantics for the concurrent lambda calculus with futures Lambda(fut), which formalizes the operational semantics of the programming language Alice ML. We show that natural program optimizations, as well as partial evaluation with respect to deterministic rules, are correct for Lambda(fut). This relies on a number of fundamental properties that we establish for our observational semantics.
Joachim Niehren; David Sabel; Manfred Schmidt-Schauß; Jan Schwinghammer (working paper, Mon, 30 Oct 2006 08:54:30 +0100)

Equivalence of Call-By-Name and Call-By-Need for Lambda-Calculi with Letrec
http://publikationen.ub.uni-frankfurt.de/frontdoor/index/index/docId/2222
We develop a proof method to show that in a (deterministic) lambda calculus with letrec and equipped with contextual equivalence the call-by-name and the call-by-need evaluation are equivalent, and also that the unrestricted copy-operation is correct. Given a let-binding x = t, the copy-operation replaces an occurrence of the variable x by the expression t, regardless of the form of t. This gives an answer to unresolved problems in several papers, it adds a strong method to the tool set for reasoning about contextual equivalence in higher-order calculi with letrec, and it enables a class of transformations that can be used as optimizations. The method can be used in different kinds of lambda calculi with cyclic sharing. Probably it can also be used in non-deterministic lambda calculi if the variable x is "deterministic", i.e., has no interference with non-deterministic executions. The main technical idea is to use a restricted variant of the infinitary lambda-calculus, whose objects are the expressions unrolled w.r.t. let, to define the infinite developments as a reduction calculus on the infinite trees, and to show a standardization theorem.
Manfred Schmidt-Schauß (working paper, Fri, 29 Sep 2006 12:13:44 +0200)
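The copy-operation described in this last abstract can be illustrated with a small Haskell sketch (a hypothetical example of ours, not taken from the paper). Under call-by-need, a let-bound expression is evaluated at most once and shared among its occurrences; the copy-transformation duplicates the bound expression at each occurrence. The paper's correctness result says both forms denote the same value:

```haskell
-- Hypothetical illustration of the copy-operation: replacing an occurrence
-- of a let-bound variable x by its right-hand side t preserves the value,
-- even though call-by-need shares the work of evaluating t.

sharedSum :: Int
sharedSum = let x = sum [1 .. 1000] in x + x   -- x is evaluated once and shared

copiedSum :: Int
copiedSum = sum [1 .. 1000] + sum [1 .. 1000]  -- after copying x's binding into both occurrences

main :: IO ()
main = print (sharedSum == copiedSum)          -- prints True: same denoted value
```

Of course this only illustrates the claim on one trivial instance; the paper's contribution is proving equivalence for unrestricted copying under letrec with cyclic bindings, where naive syntactic arguments fail.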