Technical report Frank / Johann-Wolfgang-Goethe-Universität, Fachbereich Informatik und Mathematik, Institut für Informatik
24
We present a higher-order call-by-need lambda calculus enriched with constructors, case-expressions, recursive letrec-expressions, a seq-operator for sequential evaluation, and a non-deterministic operator amb, which is locally bottom-avoiding. We use a small-step operational semantics in the form of a normal order reduction. As equational theory we use contextual equivalence, i.e. two terms are equal if, plugged into an arbitrary program context, their termination behaviour is the same. We use a combination of may- and must-convergence, which is appropriate for non-deterministic computations. We develop several proof tools for proving correctness of program transformations. We provide a context lemma for may- as well as must-convergence which restricts the number of contexts that need to be examined for proving contextual equivalence. In combination with so-called complete sets of commuting and forking diagrams we show that all the deterministic reduction rules and also some additional transformations preserve contextual equivalence. In contrast to other approaches, neither our syntax nor our semantics uses a heap for sharing expressions; instead, shared expressions are represented explicitly via letrec-bindings.
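As a rough illustration only (the datatype and names below are ours, not the report's notation), the syntax of such a calculus can be sketched as a Haskell datatype:

    -- Sketch of the calculus syntax: variables, abstractions, applications,
    -- constructor applications, case-expressions, recursive letrec,
    -- seq for sequential evaluation, and the locally bottom-avoiding
    -- non-deterministic choice amb.
    type Var = String
    type ConName = String

    data Expr
      = V Var                       -- variable
      | Lam Var Expr                -- lambda abstraction
      | App Expr Expr               -- application
      | Con ConName [Expr]          -- constructor application
      | Case Expr [(Pat, Expr)]     -- case-expression over constructors
      | Letrec [(Var, Expr)] Expr   -- mutually recursive bindings (sharing)
      | Seq Expr Expr               -- evaluate first argument, then second
      | Amb Expr Expr               -- non-deterministic, bottom-avoiding choice

    data Pat = Pat ConName [Var]    -- flat constructor pattern

The letrec-bindings are what represent shared expressions in place of a heap.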
23 [version 2.0]
Various static analyses of functional programming languages that permit infinite data structures make use of set constants like Top, Inf, and Bot, denoting all terms, all lists not eventually ending in Nil, and all non-terminating programs, respectively. We use a set language that permits union, constructors and recursive definition of set constants with a greatest fixpoint semantics in the set of all (also infinite) computable trees, where all term constructors are non-strict. This internal report proves decidability, in particular DEXPTIME-completeness, of inclusion of co-inductively defined sets by using algorithms and results from tree automata and set constraints, and contains detailed proofs. The test for set inclusion is required by certain strictness analysis algorithms for lazy functional programming languages and could also be the basis for further set-based analyses.
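A minimal sketch of such a set language in Haskell, under the assumption of a fixed signature of non-strict constructors (all names below are ours; Bot is treated as a built-in constant for the non-terminating programs):

    type SetName = String
    type ConName = String

    data SetExpr
      = Bot                        -- built-in: all non-terminating programs
      | Union SetExpr SetExpr      -- union of two sets
      | Apply ConName [SetExpr]    -- non-strict constructor application
      | Ref SetName                -- reference to a defined set constant

    -- Recursive definitions are read co-inductively (greatest fixpoint),
    -- here restricted to the list signature Nil/Cons for illustration:
    type Defs = [(SetName, SetExpr)]

    exampleDefs :: Defs
    exampleDefs =
      [ ("Top", Union (Apply "Nil" []) (Apply "Cons" [Ref "Top", Ref "Top"]))
      , ("Inf", Apply "Cons" [Ref "Top", Ref "Inf"])   -- never reaches Nil
      ]

Under the greatest fixpoint reading, the equation for Inf never forces a Nil, so it describes the lists that do not eventually end in Nil.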
23
Static analysis of different non-strict functional programming languages makes use of set constants like Top, Inf, and Bot, denoting all expressions, all lists without a last Nil as tail, and all non-terminating programs, respectively. We use a set language that permits union, constructors and recursive definition of set constants with a greatest fixpoint semantics. This paper proves decidability, in particular EXPTIME-completeness, of the subset relationship of co-inductively defined sets by using algorithms and results from tree automata. This shows decidability of the test for set inclusion, which is required by certain strictness analysis algorithms in lazy functional programming languages.
22
Extending the method of Howe, we establish a large class of untyped higher-order calculi, in particular ones with call-by-need evaluation, for which similarity, also called applicative simulation, can be used as a proof tool for showing contextual preorder. The paper also demonstrates that Mann's approach using an intermediate "approximation" calculus scales up well from a basic call-by-need non-deterministic lambda-calculus to more expressive lambda calculi. That is, we demonstrate that after transferring the contextual preorder of a non-deterministic call-by-need lambda calculus to its corresponding approximation calculus, it is possible to apply Howe's method to show that similarity is a precongruence. The transfer itself is not treated in this paper. The paper also proposes an optimization of the similarity test that cuts off redundant computations. Our results also apply to deterministic or non-deterministic call-by-value lambda-calculi, and improve upon previous work insofar as we prove that only closed values are required as arguments for similarity testing, instead of all closed expressions.
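To make the shape of a similarity test concrete, here is a fuel-bounded approximation for a tiny call-by-value calculus in Haskell. It is only a sketch of the co-inductive definition (fuel exhaustion is conservatively treated as non-convergence, and arguments are drawn from a finite candidate list), but it mirrors the point of the result: the argument terms w range over closed values only.

    -- De Bruijn terms; closed terms only.
    data Tm = Var Int | Lam Tm | App Tm Tm

    -- Substitution of a *closed* term, so no index shifting is needed.
    subst :: Int -> Tm -> Tm -> Tm
    subst k v (Var i)   = if i == k then v else Var i
    subst k v (Lam b)   = Lam (subst (k + 1) v b)
    subst k v (App f a) = App (subst k v f) (subst k v a)

    -- Call-by-value evaluation with fuel; Nothing = no value found yet.
    eval :: Int -> Tm -> Maybe Tm
    eval 0 _       = Nothing
    eval _ (Lam b) = Just (Lam b)
    eval n (App f a) = do
      Lam b <- eval (n - 1) f
      v     <- eval (n - 1) a
      eval (n - 1) (subst 0 v b)
    eval _ (Var _) = Nothing

    -- e1 is (approximately) similar to e2: if e1 converges then e2
    -- converges, and the resulting abstractions stay similar under
    -- application to every candidate closed value.
    similar :: Int -> [Tm] -> Tm -> Tm -> Bool
    similar 0 _ _ _ = True
    similar n vals e1 e2 =
      case eval n e1 of
        Nothing -> True
        Just v1 -> case eval n e2 of
          Nothing -> False
          Just v2 -> and [ similar (n - 1) vals (App v1 w) (App v2 w)
                         | w <- vals ]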
21
Sharing of substructures like subterms and subcontexts is a common method for space-efficient representation of terms; it allows, for example, exponentially large terms to be represented in polynomial space, and terms with iterated substructures to be represented in compact form. We present singleton tree grammars as a general formalism for the treatment of sharing in terms. Singleton tree grammars (STGs) are recursion-free context-free tree grammars without alternatives for nonterminals and with at most unary second-order nonterminals. STGs generalize Plandowski's singleton context-free grammars to terms (trees). We show that the test whether two different nonterminals in an STG generate the same term can be performed in polynomial time, which implies that the equality test for terms with shared subterms and contexts, where composition of contexts is permitted, can be done in polynomial time in the size of the representation. This allows polynomial-time algorithms for terms that exploit sharing. We hope that this technique will lead to improved upper complexity bounds for variants of second-order unification algorithms, in particular for variants of context unification and bounded second-order unification.
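The space gain from sharing can be made concrete with a first-order fragment of STGs (context nonterminals are omitted; all names below are ours). Each nonterminal has exactly one production, so it generates exactly one term, and properties like the generated term's size can be computed in time polynomial in the grammar even when the term itself is exponentially large:

    import qualified Data.Map as M

    type NT = String
    data Rhs = Node String [NT]    -- f(N1,...,Nk); a constant if k = 0
    type STG = M.Map NT Rhs

    -- Size of the generated term per nonterminal, memoized via a lazy map:
    -- linear in the grammar for recursion-free (acyclic) grammars.
    sizes :: STG -> M.Map NT Integer
    sizes g = table
      where
        table = M.map sizeOf g
        sizeOf (Node _ args) = 1 + sum [ table M.! a | a <- args ]

    -- a30 generates a complete binary tree with 2^30 leaves using only
    -- 31 rules; sizes example M.! "a30" == 2^31 - 1.
    example :: STG
    example = M.fromList $
      ("a0", Node "c" []) :
      [ ("a" ++ show i, Node "f" [p, p])
      | i <- [1 .. 30 :: Int], let p = "a" ++ show (i - 1) ]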
20
This paper proves correctness of Nöcker's method of strictness analysis, implemented in the Clean compiler, which is an effective way of performing strictness analysis in lazy functional languages based on their operational semantics. We improve upon the work of Clark, Hankin and Hunt on the correctness of the abstract reduction rules. Our method fully covers the cycle detection rules, which are the main strength of Nöcker's strictness analysis. Our algorithm SAL is a reformulation of Nöcker's strictness analysis algorithm in a higher-order call-by-need lambda-calculus with case, constructors, letrec, and seq, extended by set constants like Top or Inf denoting sets of expressions. It is also possible to define new set constants by recursive equations with a greatest fixpoint semantics. The operational semantics is a small-step semantics. Equality of expressions is defined by a contextual semantics that observes termination of expressions. Basically, SAL is a non-termination checker. The proof of its correctness, and hence of Nöcker's strictness analysis, is based mainly on an exact analysis of the lengths of normal order reduction sequences, the main measure being the number of 'essential' reductions in a normal order reduction sequence. Our tools and results provide new insights into call-by-need lambda-calculi, the role of sharing in functional programming languages, and strictness analysis in general. The correctness result provides a foundation for Nöcker's strictness analysis in Clean, and also for its use in Haskell.
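The connection between strictness and non-termination checking can be seen directly in Haskell itself (this is only the classical definition of strictness, not the SAL algorithm): f is strict iff f applied to a diverging argument diverges, so certifying strictness amounts to certifying non-termination of f applied to bot.

    bot :: a
    bot = bot                  -- a diverging expression

    strictF :: Int -> Int
    strictF x = x + 1          -- strict: strictF bot diverges

    lazyF :: Int -> Int
    lazyF _ = 0                -- non-strict: lazyF bot = 0

    -- A compiler that has proved strictF strict may evaluate the argument
    -- eagerly, e.g. via seq, without changing the observable semantics:
    strictF' :: Int -> Int
    strictF' x = x `seq` (x + 1)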
19
This paper proves correctness of Nöcker's method of strictness analysis, implemented for Clean, which is an effective way for strictness analysis in lazy functional languages based on their operational semantics. We improve upon the work of Clark, Hankin and Hunt, which addresses correctness of the abstract reduction rules. Our method also addresses the cycle detection rules, which are the main strength of Nöcker's strictness analysis. We reformulate Nöcker's strictness analysis algorithm in a higher-order lambda-calculus with case, constructors, letrec, and a non-deterministic choice operator used as a union operator. Furthermore, the calculus is expressive enough to represent abstract constants like Top or Inf. The operational semantics is a small-step semantics and equality of expressions is defined by a contextual semantics that observes termination of expressions. The correctness of several reductions is proved using a context lemma and complete sets of forking and commuting diagrams. The proof is based mainly on an exact analysis of the lengths of normal order reductions. However, there remains a small gap: currently, the proof of correctness of strictness analysis requires the conjecture that our behavioural preorder is contained in the contextual preorder. The proof is valid without referring to the conjecture if no abstract constants are used in the analysis.
18
Work on proving congruence of bisimulation in functional programming languages often refers to [How89,How96], where Howe gave a highly general account of this topic in terms of so-called 'lazy computation systems'. Particularly in implementations of lazy functional languages, sharing plays a prominent role. In this paper we show how the original work of Howe can be extended to cope with sharing. Moreover, we demonstrate the application of our approach to the call-by-need lambda-calculus lambda-ND, which provides an erratic non-deterministic operator pick and a non-recursive let. A definition of a bisimulation is given, which has to be based on a further calculus named lambda-~, since the naïve bisimulation definition is useless. The main result is that this bisimulation is a congruence and is contained in the contextual equivalence. This might be a step towards defining useful bisimulation relations and proving them to be congruences in calculi that extend the lambda-ND-calculus.
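The flavour of the erratic pick operator can be sketched by collecting all possible results of an expression; the toy model below uses the Haskell list monad and is ours, not the lambda-ND calculus itself (in particular, it does not model the call-by-need sharing that makes the actual congruence proof hard):

    data T = Num Int | Pick T T | Add T T

    evalND :: T -> [Int]
    evalND (Num n)    = [n]
    evalND (Pick a b) = evalND a ++ evalND b   -- erratic binary choice
    evalND (Add a b)  = [ x + y | x <- evalND a, y <- evalND b ]

    -- evalND (Add (Pick (Num 1) (Num 2)) (Num 10)) == [11, 12]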
17
In this paper we demonstrate how to relate the semantics given by the non-deterministic call-by-need calculus FUNDIO [SS03] to Haskell. After introducing new correct program transformations for FUNDIO, we translate the core language used in the Glasgow Haskell Compiler into the FUNDIO language, where the IO construct of FUNDIO corresponds to direct-call IO-actions in Haskell. We sketch the investigations of [Sab03b], where many of the program transformations performed by the compiler have been shown to be correct w.r.t. the FUNDIO semantics. This enabled us to obtain a FUNDIO-compatible Haskell compiler by turning off the not yet investigated transformations and the small set of incompatible transformations. With this compiler, Haskell programs which use the extension unsafePerformIO in arbitrary contexts can be compiled in a "safe" manner.
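A small Haskell example shows why the correctness of individual compiler transformations is the crux when unsafePerformIO may occur in arbitrary contexts: transformations that duplicate or merge bindings change how often the hidden action runs (the example is ours, using the standard IORef API):

    import Data.IORef
    import System.IO.Unsafe (unsafePerformIO)

    counter :: IORef Int
    counter = unsafePerformIO (newIORef 0)
    {-# NOINLINE counter #-}

    tick :: Int
    tick = unsafePerformIO $ do
      modifyIORef counter (+ 1)
      readIORef counter
    {-# NOINLINE tick #-}

If the compiler inlines tick at two use sites, the action may run twice; if common-subexpression elimination merges two occurrences, it runs once. A FUNDIO-compatible compiler applies only transformations proved correct w.r.t. the non-deterministic FUNDIO semantics, so such programs keep a well-defined set of meanings.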
16
This paper proposes a non-standard way to combine lazy functional languages with I/O. In order to demonstrate the usefulness of the approach, a tiny lazy functional core language FUNDIO, which is also a call-by-need lambda calculus, is investigated. The syntax of FUNDIO has case, letrec, constructors and an IO-interface; its operational semantics is described by small-step reductions. A contextual approximation and equivalence depending on the input-output behavior of normal order reduction sequences is defined, and a context lemma is proved. This makes it possible to study the semantics of FUNDIO and its semantic properties. The paper demonstrates that the technique of complete reduction diagrams makes it possible to show that a considerable set of program transformations is correct. Several optimizations of evaluation are given, including strictness optimizations and an abstract machine, and shown to be correct w.r.t. contextual equivalence. Correctness of strictness optimizations also justifies correctness of parallel evaluation. Thus this calculus has the potential to integrate non-strict functional programming with a non-deterministic approach to input-output, and also to provide a useful semantics for this combination. It is argued that monadic IO and unsafePerformIO can be combined in Haskell, and that the result is reliable if all reductions and transformations are correct w.r.t. the FUNDIO semantics. Of course, we do not address the typing problems that are involved in the usage of Haskell's unsafePerformIO. The semantics can also be used as a novel semantics for strict functional languages with IO, where the sequence of IOs is not fixed.
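The IO-interface can be given a toy denotation that records the non-determinism: reducing an IO construct outputs its argument's value and receives an arbitrary input. The model below (entirely ours, not FUNDIO's reduction semantics) collects the possible runs of a small expression language as pairs of an input/output trace and a result, with inputs drawn from a finite set:

    data E = Val Int | IOp E | Plus E E

    -- runs ins e: all possible (trace of (output, input) pairs, result).
    runs :: [Int] -> E -> [([(Int, Int)], Int)]
    runs _   (Val n) = [([], n)]
    runs ins (IOp e) =
      [ (tr ++ [(v, i)], i)                 -- output v, then any input i
      | (tr, v) <- runs ins e, i <- ins ]
    runs ins (Plus a b) =
      [ (t1 ++ t2, x + y)
      | (t1, x) <- runs ins a, (t2, y) <- runs ins b ]

Contextual equivalence in this spirit compares the sets of possible input-output runs rather than a single result.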