OPUS 4 Latest Documents RSS Feed: Latest documents
http://publikationen.ub.uni-frankfurt.de/index/index/
Thu, 07 Aug 2014 15:04:26 +0200

Correctness of an STM Haskell implementation
http://publikationen.ub.uni-frankfurt.de/frontdoor/index/index/docId/34440
A concurrent implementation of software transactional memory in Concurrent Haskell using a call-by-need functional language with processes and futures is given. The description of the small-step operational semantics is precise and explicit, and employs an early abort of conflicting transactions. A proof of correctness of the implementation is given for a contextual semantics with may- and should-convergence. This implies that our implementation is a correct evaluator for an abstract specification equipped with a big-step semantics.
Manfred Schmidt-Schauß; David Sabel (working paper)
Tue, 08 Jul 2014 15:04:26 +0200

Reconstructing a logic for inductive proofs of properties of functional programs
http://publikationen.ub.uni-frankfurt.de/frontdoor/index/index/docId/34438
David Sabel; Manfred Schmidt-Schauß (working paper)
Tue, 08 Jul 2014 14:44:41 +0200

Reconstruction of a logic for inductive proofs of properties of functional programs
http://publikationen.ub.uni-frankfurt.de/frontdoor/index/index/docId/34437
The interactive verification system VeriFun is based on a polymorphic call-by-value functional language and on a first-order logic with initial model semantics w.r.t. constructors. It is designed to perform automatic induction proofs and can also deal with partial functions. This paper provides a reconstruction of the corresponding logic and semantics using the standard treatment of undefinedness, which adapts and improves the VeriFun logic by allowing reasoning on nonterminating expressions and functions. Equality of expressions is defined as contextual equivalence based on observing termination in all closing contexts. The reconstruction shows that several restrictions of the VeriFun framework can easily be removed by natural generalizations: mutually recursive functions, abstractions in the data values, and formulas with arbitrary quantifier prefixes can be formulated. The main results of this paper are: an extended set of deduction rules usable in VeriFun under the adapted semantics is proved to be correct, i.e. the rules respect observational equivalence in all extensions of a program. We also show that certain classes of theorems, such as universally quantified equations, are conservative under extensions. Other special classes of theorems are analyzed for conservativity as well.
David Sabel; Manfred Schmidt-Schauß (working paper)
Tue, 08 Jul 2014 14:35:53 +0200

Reconstruction of a logic for inductive proofs of properties of functional programs
http://publikationen.ub.uni-frankfurt.de/frontdoor/index/index/docId/34436
The interactive verification system VeriFun is based on a polymorphic call-by-value functional language and on a first-order logic with initial model semantics w.r.t. constructors. This paper provides a reconstruction of the corresponding logic when partial functions are permitted. Typing is polymorphic for the definition of functions but monomorphic for terms in formulas. Equality of terms is defined as contextual equivalence based on observing termination in all contexts. The reconstruction also allows several generalizations of the functional language, like mutually recursive functions and abstractions in the data values. The main results are: correctness of several program transformations for all extensions of a program, which have a potential usage in a deduction system. We also prove that universally quantified equations are conservative, i.e. if a universally quantified equation is valid w.r.t. a program P, then it remains valid if the program is extended by new functions and/or new data types.
David Sabel; Manfred Schmidt-Schauß (working paper)
Tue, 08 Jul 2014 14:25:24 +0200

On correctness of buffer implementations in a concurrent lambda calculus with futures
http://publikationen.ub.uni-frankfurt.de/frontdoor/index/index/docId/34435
Motivated by the question of correctness of a specific implementation of concurrent buffers in the lambda calculus with futures underlying Alice ML, we prove that concurrent buffers and handled futures can correctly encode each other. Correctness means that our encodings preserve and reflect the observations of may- and must-convergence, which as a consequence also yields soundness of the encodings with respect to a contextually defined notion of program equivalence. While these translations encode blocking into queuing and waiting, we also describe an adequate encoding of buffers in a calculus without handles, which is more low-level and uses busy-waiting instead of blocking. Furthermore, we demonstrate that our correctness concept applies to the whole compilation process from high-level to low-level concurrent languages, by translating the calculus with buffers, handled futures and data constructors into a small core language without those constructs.
Jan Schwinghammer; David Sabel; Joachim Niehren; Manfred Schmidt-Schauß (working paper)
Tue, 08 Jul 2014 14:12:40 +0200

Adequacy of compositional translations for observational semantics
http://publikationen.ub.uni-frankfurt.de/frontdoor/index/index/docId/34433
We investigate methods and tools for analyzing translations between programming languages with respect to observational semantics. The behavior of programs is observed in terms of may- and must-convergence in arbitrary contexts, and adequacy of translations, i.e., the reflection of program equivalence, is taken to be the fundamental correctness condition. For compositional translations we propose a notion of convergence equivalence as a means for proving adequacy. This technique avoids explicit reasoning about contexts, and is able to deal with the subtle role of typing in implementations of language extensions.
Manfred Schmidt-Schauß; Joachim Niehren; Jan Schwinghammer; David Sabel (working paper)
Tue, 08 Jul 2014 14:02:58 +0200

Adequacy of compositional translations for observational semantics
http://publikationen.ub.uni-frankfurt.de/frontdoor/index/index/docId/34432
We investigate methods and tools for analyzing translations between programming languages with respect to observational semantics. The behavior of programs is observed in terms of may- and must-convergence in arbitrary contexts, and adequacy of translations, i.e., the reflection of program equivalence, is taken to be the fundamental correctness condition. For compositional translations we propose a notion of convergence equivalence as a means for proving adequacy. This technique avoids explicit reasoning about contexts, and is able to deal with the subtle role of typing in implementations of language extensions.
Manfred Schmidt-Schauß; Joachim Niehren; Jan Schwinghammer; David Sabel (working paper)
Tue, 08 Jul 2014 13:55:58 +0200

Adequacy of compositional translations for observational semantics
http://publikationen.ub.uni-frankfurt.de/frontdoor/index/index/docId/34431
We investigate methods and tools for analysing translations between programming languages with respect to observational semantics. The behaviour of programs is observed in terms of may- and must-convergence in arbitrary contexts, and adequacy of translations, i.e., the reflection of program equivalence, is taken to be the fundamental correctness condition. For compositional translations we propose a notion of convergence equivalence as a means for proving adequacy. This technique avoids explicit reasoning about contexts, and is able to deal with the subtle role of typing in implementations of language extensions.
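The notion of a compositional translation can be illustrated in miniature. The following is a toy sketch (my own, not the paper's construction): a source language with let-bindings is translated, construct by construct, into a target lambda language without them, and one can then check on examples that the translation preserves the observable result. All names here (Src, Tgt, trans, evalS, evalT) are hypothetical.

```haskell
-- Source language with let; target language with lambda and application.
data Src = SNum Int | SAdd Src Src | SVar String | SLet String Src Src

data Tgt = TNum Int | TAdd Tgt Tgt | TVar String
         | TLam String Tgt | TApp Tgt Tgt
  deriving Eq

data Val = VNum Int | VClo String Tgt [(String, Val)]
  deriving Eq

-- Compositional translation: each construct is mapped using only the
-- translations of its immediate subterms; let x = e in b becomes (\x -> b) e.
trans :: Src -> Tgt
trans (SNum n)     = TNum n
trans (SAdd a b)   = TAdd (trans a) (trans b)
trans (SVar x)     = TVar x
trans (SLet x e b) = TApp (TLam x (trans b)) (trans e)

-- Call-by-value evaluators; Nothing plays the role of non-convergence.
evalS :: [(String, Int)] -> Src -> Maybe Int
evalS _   (SNum n)     = Just n
evalS env (SAdd a b)   = (+) <$> evalS env a <*> evalS env b
evalS env (SVar x)     = lookup x env
evalS env (SLet x e b) = evalS env e >>= \v -> evalS ((x, v) : env) b

evalT :: [(String, Val)] -> Tgt -> Maybe Val
evalT _   (TNum n)   = Just (VNum n)
evalT env (TAdd a b) = do VNum x <- evalT env a
                          VNum y <- evalT env b
                          Just (VNum (x + y))
evalT env (TVar x)   = lookup x env
evalT env (TLam x b) = Just (VClo x b env)
evalT env (TApp f a) = do VClo x b cenv <- evalT env f
                          v <- evalT env a
                          evalT ((x, v) : cenv) b
```

In this tiny deterministic fragment, source and translated term converge to the same number; the paper's point is what must be proved so that such agreement, phrased as convergence equivalence, yields adequacy for full languages.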
Manfred Schmidt-Schauß; Joachim Niehren; Jan Schwinghammer; David Sabel (working paper)
Tue, 08 Jul 2014 13:48:09 +0200

Adequacy of compositional translations for observational semantics
http://publikationen.ub.uni-frankfurt.de/frontdoor/index/index/docId/34430
We investigate methods and tools for analysing translations between programming languages with respect to observational semantics. The behaviour of programs is observed in terms of may- and must-convergence in arbitrary contexts, and adequacy of translations, i.e., the reflection of program equivalence, is taken to be the fundamental correctness condition. For compositional translations we propose a notion of convergence equivalence as a means for proving adequacy. This technique avoids explicit reasoning about contexts, and is able to deal with the subtle role of typing in implementations of language extensions.
Manfred Schmidt-Schauß; Joachim Niehren; Jan Schwinghammer; David Sabel (working paper)
Tue, 08 Jul 2014 13:41:34 +0200

On generic context lemmas for lambda calculi with sharing
http://publikationen.ub.uni-frankfurt.de/frontdoor/index/index/docId/34428
This paper proves several generic variants of context lemmas and thus contributes to improving the tools for observational semantics of deterministic and non-deterministic higher-order calculi that use a small-step reduction semantics. The generic (sharing) context lemmas are provided for may- as well as two variants of must-convergence, which hold in a broad class of extended process and extended lambda calculi, if the calculi satisfy certain natural conditions. As a guideline, the proofs of the context lemmas are valid in call-by-need calculi, in call-by-value calculi if substitution is restricted to variable-by-variable substitution, and in process calculi like variants of the π-calculus. For calculi employing beta-reduction using a call-by-name or call-by-value strategy or similar reduction rules, some iu-variants of ciu-theorems are obtained from our context lemmas. Our results reestablish several context lemmas already proved in the literature, and also provide some new context lemmas as well as some new variants of the ciu-theorem. To make the results widely applicable, we use a higher-order abstract syntax that allows untyped calculi as well as certain simple typing schemes. The approach may lead to a unifying view of higher-order calculi, reduction, and observational equality.
Manfred Schmidt-Schauß; David Sabel (working paper)
Tue, 08 Jul 2014 13:09:22 +0200

On generic context lemmas for lambda calculi with sharing
http://publikationen.ub.uni-frankfurt.de/frontdoor/index/index/docId/34427
This paper proves several generic variants of context lemmas and thus contributes to improving the tools for developing an observational semantics that is based on a reduction semantics for a language. The context lemmas are provided for may- as well as two variants of must-convergence and a wide class of extended lambda calculi, which satisfy certain abstract conditions. The calculi must have a form of node sharing, e.g. plain beta reduction is not permitted. There are two variants: weakly sharing calculi, where beta-reduction is only permitted for arguments that are variables, and strongly sharing calculi, which roughly correspond to call-by-need calculi, where beta-reduction is completely replaced by a sharing variant. The calculi must obey three abstract assumptions, which are in general easily recognizable given the syntax and the reduction rules. The generic context lemmas have as instances several context lemmas already proved in the literature for specific lambda calculi with sharing. The scope of the generic context lemmas comprises not only call-by-need calculi, but also call-by-value calculi with a form of built-in sharing. Investigations into other, new variants of extended lambda calculi with sharing, where the language or the reduction rules and/or strategy varies, will be simplified by our result, since specific context lemmas are immediately derivable from the generic context lemma, provided our abstract conditions are met.
Manfred Schmidt-Schauß; David Sabel (working paper)
Tue, 08 Jul 2014 13:02:37 +0200

A call-by-need lambda-calculus with locally bottom-avoiding choice: context lemma and correctness of transformations
http://publikationen.ub.uni-frankfurt.de/frontdoor/index/index/docId/34426
We present a higher-order call-by-need lambda calculus enriched with constructors, case-expressions, recursive letrec-expressions, a seq-operator for sequential evaluation, and a non-deterministic operator amb that is locally bottom-avoiding. We use a small-step operational semantics in the form of a single-step rewriting system that defines a (non-deterministic) normal order reduction. This strategy can be made fair by adding resources for bookkeeping. As equational theory we use contextual equivalence, i.e. terms are equal if, plugged into any program context, their termination behaviour is the same, where we use a combination of may- as well as must-convergence, which is appropriate for non-deterministic computations. We show that we can drop the fairness condition for equational reasoning, since the valid equations w.r.t. normal order reduction are the same as for fair normal order reduction. We develop several proof tools for proving correctness of program transformations; in particular, a context lemma for may- as well as must-convergence is proved, which restricts the number of contexts that need to be examined for proving contextual equivalence. In combination with so-called complete sets of commuting and forking diagrams we show that all the deterministic reduction rules and also some additional transformations preserve contextual equivalence. We also prove a standardisation theorem for fair normal order reduction. The structure of the contextual ordering <=c is also analysed: Ω is not a least element, and <=c already implies contextual equivalence w.r.t. may-convergence.
David Sabel; Manfred Schmidt-Schauß (working paper)
Tue, 08 Jul 2014 12:52:05 +0200

A call-by-need lambda-calculus with locally bottom-avoiding choice: context lemma and correctness of transformations
http://publikationen.ub.uni-frankfurt.de/frontdoor/index/index/docId/34425
We present a higher-order call-by-need lambda calculus enriched with constructors, case-expressions, recursive letrec-expressions, a seq-operator for sequential evaluation, and a non-deterministic operator amb that is locally bottom-avoiding. We use a small-step operational semantics in the form of a single-step rewriting system that defines a (non-deterministic) normal order reduction. This strategy can be made fair by adding resources for bookkeeping. As equational theory we use contextual equivalence, i.e. terms are equal if, plugged into any program context, their termination behaviour is the same, where we use a combination of may- as well as must-convergence, which is appropriate for non-deterministic computations. We show that we can drop the fairness condition for equational reasoning, since the valid equations w.r.t. normal order reduction are the same as for fair normal order reduction. We develop several proof tools for proving correctness of program transformations; in particular, a context lemma for may- as well as must-convergence is proved, which restricts the number of contexts that need to be examined for proving contextual equivalence. In combination with so-called complete sets of commuting and forking diagrams we show that all the deterministic reduction rules and also some additional transformations preserve contextual equivalence. We also prove a standardisation theorem for fair normal order reduction. The structure of the contextual ordering <=c is also analysed: Ω is not a least element, and <=c already implies contextual equivalence w.r.t. may-convergence.
David Sabel; Manfred Schmidt-Schauß (working paper)
Tue, 08 Jul 2014 12:38:20 +0200

On conservativity of concurrent Haskell
http://publikationen.ub.uni-frankfurt.de/frontdoor/index/index/docId/34312
The calculus CHF models Concurrent Haskell extended by concurrent, implicit futures. It is a process calculus with concurrent threads, monadic concurrent evaluation, and includes a pure functional lambda-calculus which comprises data constructors, case-expressions, letrec-expressions, and Haskell’s seq. Futures can be implemented in Concurrent Haskell using the primitive unsafeInterleaveIO, which is available in most implementations of Haskell. Our main result is conservativity of CHF, that is, all equivalences of pure functional expressions are also valid in CHF. This implies that compiler optimizations and transformations from pure Haskell remain valid in Concurrent Haskell even if it is extended by futures. We also show that this is no longer valid if Concurrent Haskell is extended by the arbitrary use of unsafeInterleaveIO.
David Sabel; Manfred Schmidt-Schauß (working paper)
Thu, 03 Jul 2014 14:49:03 +0200

Simulation in the call-by-need lambda-calculus with letrec, case, constructors, and seq
http://publikationen.ub.uni-frankfurt.de/frontdoor/index/index/docId/34234
This paper shows equivalence of applicative similarity and contextual approximation, and hence also of bisimilarity and contextual equivalence, in LR, the deterministic call-by-need lambda calculus with letrec extended by data constructors, case-expressions and Haskell's seq-operator. LR models an untyped version of the core language of Haskell. Bisimilarity simplifies equivalence proofs in the calculus and opens a way for more convenient correctness proofs for program transformations.
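The seq-operator mentioned above is semantically significant: it distinguishes terms that are contextually equal in its absence. A minimal sketch (my own illustration, not taken from the paper):

```haskell
-- In a pure lazy calculus, a diverging term and a lambda that diverges
-- only when applied cannot be told apart by any context. Haskell's seq,
-- which evaluates its first argument to weak head normal form before
-- returning the second, separates them.
diverge :: a
diverge = diverge

-- Converges: a lambda is already a value (WHNF), so seq succeeds.
observeLam :: ()
observeLam = ((\_ -> diverge) :: Int -> Int) `seq` ()

-- Would loop forever if evaluated: seq forces the diverging term itself.
-- observeDiv = (diverge :: Int -> Int) `seq` ()
```

This is why equivalence proofs for a calculus with seq, such as LR, must account for strictness contexts that the pure lazy lambda calculus does not have.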
The proof is by a fully abstract and surjective transfer of the contextual approximation into a call-by-name calculus, which is an extension of Abramsky's lazy lambda calculus. In the latter calculus equivalence of similarity and contextual approximation can be shown by Howe's method. Using an equivalent but inductive definition of behavioral preorder we then transfer similarity back to the calculus LR.
The translation from the call-by-need letrec calculus into the extended call-by-name lambda calculus is the composition of two translations. The first translation replaces the call-by-need strategy by a call-by-name strategy, and its correctness is shown by exploiting infinite trees, which emerge by unfolding the letrec expressions. The second translation encodes letrec-expressions by using multi-fixpoint combinators, and its correctness is shown syntactically by comparing reductions of both calculi. A further result of this paper is an isomorphism between the mentioned calculi, and also with a call-by-need letrec calculus with a less complex definition of reduction than LR.
Manfred Schmidt-Schauß; David Sabel; Elena Machkasova (working paper)
Thu, 03 Jul 2014 14:36:22 +0200

Applicative may- and should-simulation in the call-by-value lambda calculus with amb
http://publikationen.ub.uni-frankfurt.de/frontdoor/index/index/docId/33609
Motivated by the question whether sound and expressive applicative similarities for program calculi with should-convergence exist, this paper investigates expressive applicative similarities for the untyped call-by-value lambda-calculus extended with McCarthy's ambiguous choice operator amb. Soundness of the applicative similarities w.r.t. contextual equivalence based on may- and should-convergence is proved by adapting Howe's method to should-convergence. As usual for nondeterministic calculi, similarity is not complete w.r.t. contextual equivalence; showing this requires a rather complex counterexample as a witness. Also the call-by-value lambda-calculus with the weaker nondeterministic construct erratic choice is analyzed and sound applicative similarities are provided. This justifies the expectation that also for more expressive and call-by-need higher-order calculi there are sound and powerful similarities for should-convergence.
Manfred Schmidt-Schauß; David Sabel (working paper)
Tue, 24 Jun 2014 16:53:05 +0200

Contextual equivalence for the pi-calculus that can stop
http://publikationen.ub.uni-frankfurt.de/frontdoor/index/index/docId/33608
The pi-calculus is a well-analyzed model for mobile processes and mobile computations.
While many other process and lambda calculi that serve as core languages of higher-order concurrent and/or functional programming languages use a contextual semantics, which observes the termination behavior of programs in all program contexts, the traditional program equivalences in the pi-calculus are bisimulations and barbed testing equivalences, which observe the communication capabilities of processes under reduction and in contexts.
There is a distance between these two approaches to program equivalence which makes it hard to compare the pi-calculus with other languages. In this paper we contribute to bridging this gap by investigating a contextual semantics of the synchronous pi-calculus with replication and without sums.
To transfer contextual equivalence to the pi-calculus, we add a process Stop as a constant which indicates success, and use it as the base to define and analyze a contextual equivalence that observes may- and should-convergence of processes.
We show as a main result that contextual equivalence in the pi-calculus with Stop conservatively extends barbed testing equivalence in the (Stop-free) pi-calculus. This implies that results on contextual equivalence can be directly transferred to the (Stop-free) pi-calculus with barbed testing equivalence.
We analyze the contextual ordering, prove some nontrivial process equivalences, and provide proof tools for showing contextual equivalences, among them a context lemma and new notions of sound applicative similarities for may- and should-convergence.
David Sabel; Manfred Schmidt-Schauß (working paper)
Tue, 24 Jun 2014 16:37:11 +0200

Observational program calculi and the correctness of translations
http://publikationen.ub.uni-frankfurt.de/frontdoor/index/index/docId/33607
Motivated by our experience in analyzing properties of translations between programming languages with observational semantics, this paper clarifies the relevant notions, questions, and methods, constructs a general framework, and provides several tools for proving various correctness properties of translations, such as adequacy and full abstraction. The presented framework can directly be applied to the observational equivalences derived from the operational semantics of programming calculi, and also to other situations, and thus has a wide range of applications.
Manfred Schmidt-Schauß; David Sabel; Joachim Niehren; Jan Schwinghammer (working paper)
Tue, 24 Jun 2014 16:18:13 +0200

Extending Abramsky's lazy lambda calculus: (non)-conservativity of embeddings
http://publikationen.ub.uni-frankfurt.de/frontdoor/index/index/docId/33606
Our motivation is the question whether the lazy lambda calculus (a pure lambda calculus with the leftmost-outermost rewriting strategy, considered under observational semantics) or extensions thereof are an adequate model for semantic equivalences in real-world purely functional programming languages, in particular for a pure core language of Haskell. We explore several extensions of the lazy lambda calculus: addition of a seq-operator, addition of data constructors and case-expressions, and their combination, focusing on conservativity of these extensions. In addition to untyped calculi, we study their monomorphically and polymorphically typed versions. For most of the extensions we obtain non-conservativity, which we prove by providing counterexamples. However, we prove conservativity of the extension by data constructors and case in the monomorphically typed scenario.
Manfred Schmidt-Schauß; Elena Machkasova; David Sabel (working paper)
Tue, 24 Jun 2014 16:06:24 +0200

Extending Abramsky's lazy lambda calculus: (non)-conservativity of embeddings
http://publikationen.ub.uni-frankfurt.de/frontdoor/index/index/docId/33605
Our motivation is the question whether the lazy lambda calculus (a pure lambda calculus with the leftmost-outermost rewriting strategy, considered under observational semantics) or extensions thereof are an adequate model for semantic equivalences in real-world purely functional programming languages, in particular for a pure core language of Haskell. We explore several extensions of the lazy lambda calculus: addition of a seq-operator, addition of data constructors and case-expressions, and their combination, focusing on conservativity of these extensions. In addition to untyped calculi, we study their monomorphically and polymorphically typed versions. For most of the extensions we obtain non-conservativity, which we prove by providing counterexamples. However, we prove conservativity of the extension by data constructors and case in the monomorphically typed scenario.
Manfred Schmidt-Schauß; Elena Machkasova; David Sabel (working paper)
Tue, 24 Jun 2014 15:58:44 +0200

Correctness of an STM Haskell implementation
http://publikationen.ub.uni-frankfurt.de/frontdoor/index/index/docId/27007
A concurrent implementation of software transactional memory in Concurrent Haskell using a call-by-need functional language with processes and futures is given. The description of the small-step operational semantics is precise and explicit, and employs an early abort of conflicting transactions. A proof of correctness of the implementation is given for a contextual semantics with may- and should-convergence. This implies that our implementation is a correct evaluator for an abstract specification equipped with a big-step semantics.
Manfred Schmidt-Schauß; David Sabel (working paper)
Mon, 29 Oct 2012 15:45:08 +0100

Simulation in the call-by-need lambda-calculus with letrec, case, constructors, and seq
http://publikationen.ub.uni-frankfurt.de/frontdoor/index/index/docId/27005
This paper shows equivalence of applicative similarity and contextual approximation, and hence also of bisimilarity and contextual equivalence, in LR, the deterministic call-by-need lambda calculus with letrec extended by data constructors, case-expressions and Haskell's seq-operator. LR models an untyped version of the core language of Haskell. Bisimilarity simplifies equivalence proofs in the calculus and opens a way for more convenient correctness proofs for program transformations.
The proof is by a fully abstract and surjective transfer of the contextual approximation into a call-by-name calculus, which is an extension of Abramsky's lazy lambda calculus. In the latter calculus equivalence of similarity and contextual approximation can be shown by Howe's method. Using an equivalent but inductive definition of behavioral preorder we then transfer similarity back to the calculus LR.
The translation from the call-by-need letrec calculus into the extended call-by-name lambda calculus is the composition of two translations. The first translation replaces the call-by-need strategy by a call-by-name strategy, and its correctness is shown by exploiting infinite trees, which emerge by unfolding the letrec expressions. The second translation encodes letrec-expressions by using multi-fixpoint combinators, and its correctness is shown syntactically by comparing reductions of both calculi. A further result of this paper is an isomorphism between the mentioned calculi, and also with a call-by-need letrec calculus with a less complex definition of reduction than LR.
Manfred Schmidt-Schauß; David Sabel; Elena Machkasova (working paper)
Mon, 29 Oct 2012 15:35:55 +0100

On conservativity of concurrent Haskell
http://publikationen.ub.uni-frankfurt.de/frontdoor/index/index/docId/24252
The calculus CHF models Concurrent Haskell extended by concurrent, implicit futures. It is a process calculus with concurrent threads, monadic concurrent evaluation, and includes a pure functional lambda-calculus which comprises data constructors, case-expressions, letrec-expressions, and Haskell’s seq. Futures can be implemented in Concurrent Haskell using the primitive unsafeInterleaveIO, which is available in most implementations of Haskell. Our main result is conservativity of CHF, that is, all equivalences of pure functional expressions are also valid in CHF. This implies that compiler optimizations and transformations from pure Haskell remain valid in Concurrent Haskell even if it is extended by futures. We also show that this is no longer valid if Concurrent Haskell is extended by the arbitrary use of unsafeInterleaveIO.
David Sabel; Manfred Schmidt-Schauß (working paper)
Tue, 07 Feb 2012 16:16:23 +0100

An abstract machine for concurrent Haskell with futures
http://publikationen.ub.uni-frankfurt.de/frontdoor/index/index/docId/24253
We show how Sestoft’s abstract machine for lazy evaluation of purely functional programs can be extended to evaluate expressions of the calculus CHF – a process calculus that models Concurrent Haskell extended by imperative and implicit futures. The abstract machine is modularly constructed by first adding monadic IO-actions to the machine and then, in a second step, adding concurrency. Our main result is that the abstract machine coincides with the original operational semantics of CHF, w.r.t. may- and should-convergence.
David Sabel (working paper)
Tue, 07 Feb 2012 16:14:01 +0100

A contextual semantics for concurrent Haskell with futures
http://publikationen.ub.uni-frankfurt.de/frontdoor/index/index/docId/22712
In this paper we analyze the semantics of a higher-order functional language with concurrent threads, monadic IO and synchronizing variables as in Concurrent Haskell. To assure declarativeness of concurrent programming we extend the language by implicit, monadic, and concurrent futures. As semantic model we introduce and analyze the process calculus CHF, which represents a typed core language of Concurrent Haskell extended by concurrent futures. Evaluation in CHF is defined by a small-step reduction relation. Using contextual equivalence based on may- and should-convergence as program equivalence, we show that various transformations preserve program equivalence. We establish a context lemma easing those correctness proofs. An important result is that call-by-need and call-by-name evaluation are equivalent in CHF, since they induce the same program equivalence. Finally we show that the monad laws hold in CHF under mild restrictions on Haskell’s seq-operator, which for instance justifies the use of the do-notation.
David Sabel; Manfred Schmidt-Schauß (working paper)
Thu, 15 Sep 2011 14:04:58 +0200
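Several of the abstracts above note that futures can be implemented in Concurrent Haskell using the primitive unsafeInterleaveIO. The following is one plausible encoding along those lines, a sketch only and not the CHF calculus itself:

```haskell
import Control.Concurrent (forkIO, newEmptyMVar, putMVar, takeMVar)
import System.IO.Unsafe (unsafeInterleaveIO)

-- A concurrent future: the action runs in its own thread, and the
-- result is returned as an ordinary lazy value. The first demand on
-- the value blocks on the MVar until the worker thread has written
-- its result; the thunk then caches the value, so the takeMVar runs
-- at most once even under repeated demand.
future :: IO a -> IO a
future act = do
  slot <- newEmptyMVar
  _ <- forkIO (act >>= putMVar slot)
  unsafeInterleaveIO (takeMVar slot)
```

The unrestricted use of unsafeInterleaveIO is exactly what the conservativity results above caution against; the futures encoding works because the deferred action is a single synchronizing read.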