Publications List: Jesper Jørgensen

Abstracts


  • Efficient analyses for realistic off-line partial evaluation (D-153).
    Journal version of D-164. Written together with Anders Bondorf.

    Abstract
    Three efficient analyses for partial evaluation preprocessing are described (the evaluation order dependency analysis is excluded). The analyses are based on constraint solving and are shown to have almost-linear complexity. Benchmarks are given.
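
    A minimal sketch, not from the paper, of the simplest kind of constraint such an analysis solves: ``if a is dynamic then b must be dynamic''. The naive propagation below just computes the least solution; the papers obtain the almost-linear bound by normalizing such constraint systems with union-find style techniques. All names are illustrative (Haskell):

      import qualified Data.Set as Set
      import Data.Set (Set)

      type Var = String

      -- A constraint  a :=>: b  reads: if a is dynamic then b is dynamic.
      data Constraint = Var :=>: Var

      -- Least solution: start from the variables known to be dynamic and
      -- propagate along the constraints until a fixed point is reached.
      solve :: [Constraint] -> Set Var -> Set Var
      solve cs dyn
        | dyn' == dyn = dyn
        | otherwise   = solve cs dyn'
        where
          dyn' = foldr step dyn cs
          step (a :=>: b) s
            | a `Set.member` s = Set.insert b s
            | otherwise        = s

      -- solve ["x" :=>: "y", "y" :=>: "z"] (Set.fromList ["x"])
      --   ==> fromList ["x","y","z"]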

  • Generating a Pattern Matching Compiler by Partial Evaluation (D-59).
    Presented at Glasgow Workshop on Functional Programming 1990 (Ullapool).

    Abstract
    The paper describes the development of a compiler for a small strict combinator language with pattern matching. The compiler is generated by partial evaluation and the focus is on generating efficient code for the pattern matching.
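
    A minimal illustration, not code from the paper, of what the generated compiler achieves: partially evaluating a generic matcher with respect to a static pattern yields direct nested-case code in which the pattern has been compiled away (hypothetical Haskell names):

      data Term = Leaf Int | Node Term Term
      data Pat  = PAny | PLeaf Int | PNode Pat Pat

      -- Generic matcher: interprets the pattern at run time.
      match :: Pat -> Term -> Bool
      match PAny        _          = True
      match (PLeaf n)   (Leaf m)   = n == m
      match (PNode p q) (Node l r) = match p l && match q r
      match _           _          = False

      -- Residual program obtained by specializing `match' with respect to
      -- the static pattern  PNode (PLeaf 0) PAny:
      matchP :: Term -> Bool
      matchP (Node (Leaf 0) _) = True
      matchP _                 = False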

  • Generating a Compiler for a Lazy Language by Partial Evaluation (D-98).
    Presented at POPL'92. An abstract of parts of my master's thesis.

    Abstract
    The paper describes an implementation of a lazy functional language by partial evaluation. It describes the methods used, especially the binding time improvements. The resulting compiler is compared with a compiler for Miranda and a compiler for LML.

  • Formally Optimal Boxing (D-179).
    Presented at POPL'94. Written together with Fritz Henglein.

    Abstract
    Representation analysis seeks to optimize the run-time representation of elements of data types in high-level programming languages. Using a language with explicit representation types and boxing/unboxing operations, we axiomatize equationally the set of all explicitly boxed versions, called completions. We give the equality axioms a rewriting interpretation that captures eliminating boxing/unboxing operations without relying on a specific implementation or even semantics of the underlying language. The rewriting system produces a unique ``optimal'' completion.
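
    An illustrative sketch, not the paper's calculus: the kind of local step such a rewriting interpretation captures is the cancellation of adjacent boxing/unboxing coercions, independent of how boxing is actually implemented (hypothetical Haskell names):

      -- Tiny expression language with explicit coercions.
      data E = Var String | Box E | Unbox E | App E E
        deriving Show

      -- Cancel  Unbox (Box e)  and  Box (Unbox e)  bottom-up.
      simplify :: E -> E
      simplify (Box e)   = case simplify e of
                             Unbox e' -> e'
                             e'       -> Box e'
      simplify (Unbox e) = case simplify e of
                             Box e' -> e'
                             e'     -> Unbox e'
      simplify (App f x) = App (simplify f) (simplify x)
      simplify e         = e

      -- simplify (App (Var "f") (Unbox (Box (Var "x"))))
      --   ==> App (Var "f") (Var "x")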

  • Generating Optimizing Specializers (D-186).
    Presented at ICCL'94. Written together with Robert Glück.

    Abstract
    We propose a new method for improving the specialization of programs by inserting an interpreter between a subject program and a specializer. We demonstrate that this interpretive approach may achieve stronger specializations than using a partial evaluator alone. Furthermore, we formulate three specializer projections which enable us to generate specializers from interpreters and self-applicable program specializers.
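
    For orientation only (classical background, not the paper's formulation): with mix a self-applicable specializer, int an interpreter and p a source program, the Futamura projections, on which such specializer projections are modelled, read

      target   = [[mix]] (int, p)      (a compiled program)
      compiler = [[mix]] (mix, int)    (a compiler generated from int)
      cogen    = [[mix]] (mix, mix)    (a compiler generator)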

  • Generating Transformers for Deforestation and Supercompilation (D-195).
    Presented at SAS'94. Written together with Robert Glück.

    Abstract
    The paper uses a new interpretive approach - inserting an interpreter between a source program and a program specializer - to improve the transformation of programs and to automatically generate program transformers. As defined by the specializer projections, one may generate stand-alone transformers by self-application of a program specializer. We show that one can generate Wadler's deforestation algorithm and a version of Turchin's supercompiler this way.
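
    The kind of result such a generated transformer produces, shown on a standard deforestation example in Haskell (illustrative only, not taken from the paper):

      -- Before: map builds an intermediate list that sum immediately consumes.
      sumSquares :: [Int] -> Int
      sumSquares xs = sum (map (\x -> x * x) xs)

      -- After deforestation: producer and consumer are fused into one
      -- recursion, so no intermediate list is allocated.
      sumSquares' :: [Int] -> Int
      sumSquares' []       = 0
      sumSquares' (x : xs) = x * x + sumSquares' xs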

  • Efficient Multi-level Generating Extensions for Program Specialization.
    Presented at PLILP'95. Written together with Robert Glück.

    Abstract
    The paper presents an effective solution to the problem of automatically generating multi-level generating extensions by generalizing conventional off-line partial evaluation and integrating the cogen approach with a multi-level binding-time analysis. We demonstrate that efficient multi-level generators can be built that automatically produce multi-level generating extensions from arbitrary programs. This new multi-cogen approach solves two fundamental problems of self-applicable partial evaluation: the run-time problem and the generator-size problem. Last but not least, the multi-level generator has been implemented and used for automatically generating compiler generators from meta-interpreters (impossible with conventional partial evaluators).
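
    A minimal one-level sketch of what a generating extension is, using the classic power example with hypothetical Haskell names (the paper generalizes this to several levels): where the original program computes with its dynamic argument, the generating extension emits residual code for it.

      data Exp = Lit Int | VarX | Mul Exp Exp
        deriving Show

      -- Ordinary power function:  pow n x = x^n
      pow :: Int -> Int -> Int
      pow 0 _ = 1
      pow n x = x * pow (n - 1) x

      -- Its generating extension: the exponent n is static, the base is
      -- dynamic, so code is generated instead of a value being computed.
      powGen :: Int -> Exp
      powGen 0 = Lit 1
      powGen n = Mul VarX (powGen (n - 1))

      -- powGen 3  ==>  Mul VarX (Mul VarX (Mul VarX (Lit 1)))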

  • Efficiently Generating Efficient Generating Extensions in Prolog.
    To be presented at the Dagstuhl Seminar: "Partial Evaluation". Written together with Michael Leuschel.

    Abstract
    The so-called ``cogen approach'' to program specialisation, writing a compiler generator instead of a specialiser, has been used with considerable success in partial evaluation of both functional and imperative languages. This paper demonstrates that this approach is also applicable to partial evaluation of logic programming languages, also called partial deduction. Self-application has not been as much in focus in partial deduction as in partial evaluation of functional and imperative languages, and the attempts to self-apply partial deduction systems have, as yet, not been altogether successful. Especially for partial deduction, the cogen approach could therefore prove to be of considerable importance for practical applications. It is demonstrated that using the cogen approach one gets very efficient compiler generators which generate very efficient generating extensions, which in turn yield very good and non-trivial specialisation.

  • Fast Multi-Level Binding-Time Analysis for Multiple Program Specialization.
    To be presented at PSI'96, Andrei Ershov Second International Conference "Perspectives of System Informatics", June 1996, Novosibirsk, Russia. Written together with Robert Glück.

    Abstract
    Multiple program specialization can divide a computation into several computation stages. In this paper we present the key ingredient of our approach: an accurate and fast multi-level binding-time analysis. Three efficient program analyses for higher-order, functional languages are presented, based on constraint systems that run in almost-linear time in the size of the analysed programs. The three constraint normalizations are proved correct (soundness, completeness, termination, existence of a best solution). The analyses have all been implemented for a substantial, higher-order subset of Scheme (augmented with user-defined constructors). Experiments with widely-available example programs confirm the excellent run-time behavior of the normalization algorithms.

  • Controlling Conjunctive Partial Deduction.
    To be presented at PLILP'96. Written together with Robert Glück, Bern Martens and Morten H. Sørensen.

    Abstract
    ``Classical'' partial deduction, within the framework by Lloyd and Shepherdson, computes partial deduction for separate atoms independently. As a consequence, a number of program optimisations, known from unfold/fold transformations and supercompilation, cannot be achieved.
    In this paper, we show that this restriction can be lifted through (polygenetic) specialisation of entire atom conjunctions. We present a generic algorithm for such partial deduction and discuss its correctness in an extended formal framework. We concentrate on novel control challenges specific to this ``conjunctive'' partial deduction. We refine the generic algorithm into a fully automatic concrete one that registers partially deduced conjunctions in a global tree, and prove its termination and correctness. We discuss some further control refinements and illustrate the operation of the concrete algorithm and/or some of its possible variants on interesting transformation examples.
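
    A functional rendering of the classic double-append example that conjunctive partial deduction handles (illustrative only; the paper works with definite logic programs): specializing the conjunction as a whole removes the intermediate data structure.

      -- Before: the first append builds an intermediate list that the
      -- second append immediately traverses.
      doubleApp :: [a] -> [a] -> [a] -> [a]
      doubleApp xs ys zs = (xs ++ ys) ++ zs

      -- After specializing the two calls together: each element of xs is
      -- handled only once and no intermediate list is built.
      doubleApp' :: [a] -> [a] -> [a] -> [a]
      doubleApp' []       ys zs = ys ++ zs
      doubleApp' (x : xs) ys zs = x : doubleApp' xs ys zs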

  • Conjunctive Partial Deduction in Practice (extended abstract).
    To be presented at LOPSTR'96. Written together with Michael Leuschel and Bern Martens.

    Abstract
    Recently, partial deduction of logic programs has been extended to conceptually embed folding. To this end, partial deductions are no longer computed of single atoms, but rather of entire conjunctions; hence the term ``conjunctive partial deduction''. Conjunctive partial deduction aims at achieving unfold/fold-like program transformations such as tupling and deforestation within fully automated partial deduction. However, its merits greatly surpass that limited context: other major efficiency improvements are also obtained through considerably improved sideways information propagation. In this extended abstract, we investigate conjunctive partial deduction in practice. We describe the concrete options used in the implementation(s), point out a tricky aspect of abstraction, and include and discuss an extensive set of benchmark results. From these, we can conclude that conjunctive partial deduction indeed pays off in practice, thoroughly beating its conventional precursor on a wide range of small to medium-size programs. However, controlling it in a perfect way proves far from obvious, and a range of challenging open problems remain as topics for further research.

  • Efficient Multi-level Generating Extensions.
    An abstract of the paper above. Presented at the Atlantique Post-PEPM'95 Workshop, La Jolla, June 1995.

  • Efficient analyses for realistic off-line partial evaluation: Extended version (D-164).
    Technical report. Written together with Anders Bondorf. For inexplicable reasons it is split into two parts (sorry):

  • DVI part: the whole document except page 9.

  • PS part: page 9 of the document.

    Abstract
    Four efficient analyses for partial evaluation preprocessing are described. The analyses are based on constraint solving and are shown to have almost-linear complexity. Benchmarks are given. The analyses are used in the partial evaluator Similix. Extended version of D-153.

  • Efficiently Generating Efficient Generating Extensions in Prolog.
    Report CW 221, K.U.Leuven, February 1996. Written together with Michael Leuschel.

    Abstract
    The so-called ``cogen approach'' to program specialisation, writing a compiler generator instead of a specialiser, has been used with considerable success in partial evaluation of both functional and imperative languages. This paper demonstrates that this approach is also applicable to partial evaluation of logic programming languages, also called partial deduction. Self-application has not been as much in focus in partial deduction as in partial evaluation of functional and imperative languages, and the attempts to self-apply partial deduction systems have, as yet, not been altogether successful. Especially for partial deduction, the cogen approach could therefore prove to be of considerable importance for practical applications. It is demonstrated that using the cogen approach one gets very efficient compiler generators which generate very efficient generating extensions, which in turn yield very good and non-trivial specialisation.

  • Controlling Conjunctive Partial Deduction of Definite Logic Programs.
    Report CW 226, K.U.Leuven, February 1996. Written together with Robert Glück, Bern Martens and Morten H. Sørensen.

    Abstract
    ``Classical'' partial deduction, within the framework by Lloyd and Shepherdson, computes partial deduction for separate atoms independently. As a consequence, a number of program optimisations, known from unfold/fold transformations and supercompilation, cannot be achieved.
    In this paper, we show that this restriction can be lifted through (polygenetic) specialisation of entire atom conjunctions. We present a generic algorithm for such partial deduction and discuss its correctness in an extended formal framework. We concentrate on novel control challenges specific to this ``conjunctive'' partial deduction. We refine the generic algorithm into a fully automatic concrete one that registers partially deduced conjunctions in a global tree, and prove its termination and correctness. We discuss some further control refinements and illustrate the operation of the concrete algorithm and/or some of its possible variants on interesting transformation examples.

  • Fast Multi-Level Binding-Time Analysis for Multiple Program Specialization.
    Report CW 228, K.U.Leuven, April 1996. Written together with Robert Glück.

    Abstract
    Multiple program specialization can divide a computation into several computation stages. We present the key ingredient of our approach to multiple specialization: an accurate and fast multi-level binding-time analysis. Three efficient program analyses for higher-order, functional languages are presented, based on constraint systems that run in almost-linear time in the size of the analyzed programs. The three constraint normalizations are proved correct (soundness, completeness, termination, existence of a best solution). The analyses have all been implemented for a substantial, higher-order subset of Scheme (augmented with user-defined constructors). Experiments with widely-available example programs confirm the excellent run-time behavior of the normalization algorithms.

  • A Program Generator for Multi-Level Specialization
    Report CW 230, K.U.Leuven, July 1996. Written together with Robert Glück.

    Abstract
    Program specialization can divide a computation into several computation stages. This paper presents the key ingredients for efficient multi-level specialization: a generalization of standard techniques for offline partial evaluation and their integration with the ``cogen approach''. The program generator which we have designed and implemented converts programs into very compact multi-level generating extensions, a generalization of Ershov's generating extension that guarantees fast successive specializations. The theoretical limitations and practical problems of standard specialization tools are studied and the basic methods involved in constructing the program generator are presented. Then a multi-level binding-time analysis for an untyped higher-order functional language is defined and the conversion of annotated programs into executable multi-level generating extensions is described. Finally, we assess experimental results and conclude with comparative remarks.

  • Compiler Generation by Partial Evaluation (D-95).
    My master's thesis.

    Abstract
    Describes various aspects of compiler generation using partial evaluation, together with a major example: the generation of a compiler for a lazy functional language. Also describes how to generate optimizing compilers, all using partial evaluation.

  • A Calculus for Boxing Analysis of Polymorphically Typed Languages.
    My PhD thesis.

    Abstract
    An important decision when implementing languages with polymorphic types, such as Standard ML or Haskell, is whether to represent data in boxed or unboxed form and when to transform them from one representation to the other. Using a language with explicit representation types and boxing/unboxing operations we axiomatize equationally the set of all explicitly boxed versions, called completions, of a given source program. In a two-stage process we give some of the equations a rewriting interpretation that captures eliminating boxing/unboxing operations without relying on a specific implementation or even the semantics of the underlying language. The resulting reduction systems operate on congruence classes of completions defined by the remaining equations E, which can be understood as moving boxing/unboxing operations along data flow paths in the source program. We call a completion e formally optimal if every other completion for the same program (and at the same representation type) reduces to e under this two-stage reduction. We show that every source program has formally optimal completions, which are unique modulo E. This is accomplished by first "polarizing" the equations in E and orienting them to obtain two canonical (confluent and strongly normalizing) rewriting systems.

  • J. Jørgensen - K.U.Leuven - (jesper@cs.kuleuven.ac.be) [Last changed: 24/7-1996]