Archive for the 'Languages and libraries' Category

Unification of simply-typed λ-terms as logic programming

By Dale Miller, from ICLP 1991, available from CiteSeerX:

The unification of simply typed λ-terms modulo the rules of β- and η-conversions is often called “higher-order” unification because of the possible presence of variables of functional type. This kind of unification is undecidable in general and, if unifiers exist, most general unifiers may not exist. In this paper, we show that such unification problems can be coded as a query of the logic programming language Lλ in a natural and clear fashion. In a sense, the translation only involves explicitly axiomatizing in Lλ the notions of equality and substitution of the simply typed λ-calculus: the rest of the unification process can be viewed as simply an interpreter of Lλ searching for proofs using those axioms.
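The “equality and substitution” in question are just βη-equality and capture-avoiding substitution for λ-terms. As a rough point of reference only (not the paper’s Lλ encoding, and untyped for brevity; all names here are illustrative), a de Bruijn-style Haskell sketch of substitution and β-normalization might look like this:

```haskell
-- Illustrative sketch only: λ-terms in de Bruijn form with capture-avoiding
-- substitution and β-normalization. The paper instead axiomatizes these
-- notions declaratively inside the logic programming language Lλ.
data Tm = Var Int | App Tm Tm | Lam Tm
  deriving (Eq, Show)

-- Shift free indices at or above cutoff c by d.
shift :: Int -> Int -> Tm -> Tm
shift d c (Var i)   = Var (if i >= c then i + d else i)
shift d c (App s t) = App (shift d c s) (shift d c t)
shift d c (Lam t)   = Lam (shift d (c + 1) t)

-- Capture-avoiding substitution of u for index j.
subst :: Int -> Tm -> Tm -> Tm
subst j u (Var i)   = if i == j then u else Var i
subst j u (App s t) = App (subst j u s) (subst j u t)
subst j u (Lam t)   = Lam (subst (j + 1) (shift 1 0 u) t)

-- β-normalization (terminates on simply typed terms; may loop on untyped ones).
nf :: Tm -> Tm
nf (Var i)   = Var i
nf (Lam t)   = Lam (nf t)
nf (App s t) = case nf s of
  Lam b -> nf (shift (-1) 0 (subst 0 (shift 1 0 (nf t)) b))
  s'    -> App s' (nf t)

-- Example: (λx. x) (λx y. x) β-reduces to λx y. x.
example :: Tm
example = nf (App (Lam (Var 0)) (Lam (Lam (Var 1))))
```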


Relating nominal and higher-order abstract syntax specifications

By Andrew Gacek, from PPDP 2010, available from Andrew Gacek’s website:

Nominal abstract syntax and higher-order abstract syntax provide a means for describing binding structure which is higher-level than traditional techniques. These approaches have spawned two different communities which have developed along similar lines but with subtle differences that make them difficult to relate. The nominal abstract syntax community has devices like names, freshness, name-abstractions with variable capture, and the new-quantifier, whereas the higher-order abstract syntax community has devices like lambda-binders, lambda conversion, raising, and the nabla-quantifier. This paper aims to unify these communities and provide a concrete correspondence between their different devices. In particular, we develop a semantics-preserving translation from alpha-Prolog, a nominal abstract syntax based logic programming language, to G⁻, a higher-order abstract syntax based logic programming language. We also discuss higher-order judgments, a common and powerful tool for specifications with higher-order abstract syntax, and we show how these can be incorporated into G⁻. This establishes G⁻ as a language with the power of higher-order abstract syntax, the fine-grained variable control of nominal specifications, and the desirable properties of higher-order judgments.
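For readers who have not seen the two styles side by side, the contrast is easy to sketch in Haskell (purely illustrative: alpha-Prolog and G⁻ are logic programming languages, and the type names below are made up for this sketch):

```haskell
-- Nominal-style syntax: binders carry concrete names, so definitions can
-- compare names, test freshness, and capture variables; substitution must
-- rename binders to stay capture-avoiding (omitted here).
data NTm = NVar String | NApp NTm NTm | NLam String NTm
  deriving Show

-- HOAS-style syntax: object-level binding is represented by a Haskell
-- function, so α-conversion and substitution come for free, but there are
-- no names to compare.
data HTm = HApp HTm HTm | HLam (HTm -> HTm)

-- The K combinator, once per style.
nK :: NTm
nK = NLam "x" (NLam "y" (NVar "x"))

hK :: HTm
hK = HLam (\x -> HLam (\_ -> x))

-- In the HOAS style, substituting into a binder's body is just application
-- of the meta-level function.
hSubst :: HTm -> (HTm -> HTm) -> HTm
hSubst arg body = body arg
```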

The nabla-calculus. Functional programming with higher-order encodings.

By Carsten Schürmann, Adam Poswolsky and Jeffrey Sarnat, from TLCA 2005, available from the Elphin website:

Higher-order encodings use functions provided by one language to represent variable binders of another. They lead to concise and elegant representations, which historically have been difficult to analyze and manipulate.
In this paper we present the nabla-calculus, a calculus for defining general recursive functions over higher-order encodings. To avoid problems commonly associated with using the same function space for representations and computations, we separate one from the other. The simply-typed λ-calculus plays the role of the representation level. The computation level contains not only the usual computational primitives but also an embedding of the representation-level. It distinguishes itself from similar systems by allowing recursion under representation-level λ-binders while permitting a natural style of programming which we believe scales to other logical frameworks. Sample programs include bracket abstraction, parallel reduction, and an evaluator for a simple language with first-class continuations.
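Bracket abstraction, one of the sample programs mentioned above, is easy to state over ordinary first-order named syntax; the point of the nabla-calculus is to write this kind of function directly over a higher-order encoding, recursing under representation-level λ-binders. Purely as a reminder of what bracket abstraction computes (a textbook first-order version, not code from the paper), a Haskell sketch is:

```haskell
-- Untyped λ-terms plus the combinators S, K, I (first-order, named syntax).
data Tm = Var String | App Tm Tm | Lam String Tm | S | K | I
  deriving Show

-- Bracket abstraction: eliminate every λ-binder in favour of S, K and I.
bracket :: Tm -> Tm
bracket (Lam x b) = abstract x (bracket b)
bracket (App s t) = App (bracket s) (bracket t)
bracket t         = t

-- [x]t for a λ-free body t (the naive rules, without the usual optimisations).
abstract :: String -> Tm -> Tm
abstract x (Var y) | x == y = I
abstract x (App s t)        = App (App S (abstract x s)) (abstract x t)
abstract _ t                = App K t  -- x does not occur: other variables, S, K, I

-- Example: λx. λy. x becomes an SKI term extensionally equivalent to K.
example :: Tm
example = bracket (Lam "x" (Lam "y" (Var "x")))
```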

Higher-order Lazy Narrowing Calculus: a Solver for Higher-order Equations

By Tetsuo Ida, Mircea Marin, and Taro Suzuki, from EUROCAST 2001, available from the SCORE website:

This paper introduces a higher-order lazy narrowing calculus (HOLN for short) that solves higher-order equations over the domain of simply typed λ-terms. HOLN is an extension and refinement of Prehofer’s higher-order narrowing calculus LN using the techniques developed in the refinement of a first-order lazy narrowing calculus LNC. HOLN is defined to deal with both unoriented and oriented equations. It keeps track of the variables which are to be bound to normalized answers. We discuss the operating principle of HOLN, its main properties, i.e. soundness and completeness, and its further refinements. The solving capability of HOLN is illustrated with an example of program calculation.
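Narrowing calculi like HOLN interleave unification steps with rewriting by the program’s rules, here modulo βη over simply typed λ-terms. The degenerate first-order case with no rewrite rules is ordinary syntactic unification; as background only (this is not the HOLN calculus, and the names are illustrative), a minimal Haskell unifier looks like this:

```haskell
import Control.Monad (foldM)
import qualified Data.Map.Strict as M

-- First-order terms: variables and function symbols applied to arguments.
data Term = V String | F String [Term]
  deriving (Eq, Show)

-- A substitution in "triangular" form: bindings may mention other bound variables.
type Subst = M.Map String Term

-- Chase variable bindings until a non-variable or an unbound variable appears.
walk :: Subst -> Term -> Term
walk s (V x) = maybe (V x) (walk s) (M.lookup x s)
walk _ t     = t

occurs :: Subst -> String -> Term -> Bool
occurs s x t = case walk s t of
  V y    -> x == y
  F _ ts -> any (occurs s x) ts

-- Syntactic unification, extending an existing substitution.
unify :: Subst -> Term -> Term -> Maybe Subst
unify s a b = case (walk s a, walk s b) of
  (V x, V y) | x == y     -> Just s
  (V x, t) | occurs s x t -> Nothing            -- occurs check
           | otherwise    -> Just (M.insert x t s)
  (t, V x)                -> unify s (V x) t
  (F f as, F g bs)
    | f == g && length as == length bs ->
        foldM (\s' (u, v) -> unify s' u v) s (zip as bs)
    | otherwise -> Nothing

-- Example: unify add(X, 0) with add(s(Y), Z), giving X ↦ s(Y), Z ↦ 0.
example :: Maybe Subst
example = unify M.empty (F "add" [V "X", F "0" []])
                        (F "add" [F "s" [V "Y"], V "Z"])
```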

Resolving Inductive Definitions with Binders in Higher-Order Typed Functional Programming

By Matthew R. Lakin and Andrew M. Pitts, from ESOP 2009, available from Andrew M. Pitts’ website:

This paper studies inductive definitions involving binders, in which aliasing between free and bound names is permitted. Such aliasing occurs in informal specifications of operational semantics, but is excluded by the common representation of binding as meta-level λ-abstraction. Drawing upon ideas from functional logic programming, we represent such definitions with aliasing as recursively defined functions in a higher-order typed functional programming language that extends core ML with types for name-binding, a type of “semi-decidable propositions” and existential quantification for types with decidable equality. We show that the representation is sound and complete with respect to the language’s operational semantics, which combines the use of evaluation contexts with constraint programming. We also give a new and simple proof that the associated constraint problem is NP-complete.
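The aliasing at issue shows up in even the simplest inductive definitions over named syntax: “x occurs free in t” has to compare the candidate free name against the bound name at a λ, a test that has no direct counterpart when binding is encoded as meta-level λ-abstraction. A small first-order Haskell illustration of the phenomenon (not code in the language described in the paper) is:

```haskell
-- Untyped λ-terms with string names.
data Tm = Var String | App Tm Tm | Lam String Tm
  deriving Show

-- "x occurs free in t": the Lam case compares the free name with the bound
-- name, exactly the sort of aliasing between free and bound names that the
-- paper's definitions permit.
freeIn :: String -> Tm -> Bool
freeIn x (Var y)   = x == y
freeIn x (App s t) = freeIn x s || freeIn x t
freeIn x (Lam y t) = x /= y && freeIn x t

-- Example: x is free in λy. x y, but not in λx. x.
examples :: (Bool, Bool)
examples = ( freeIn "x" (Lam "y" (App (Var "x") (Var "y")))  -- True
           , freeIn "x" (Lam "x" (Var "x")) )                -- False
```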

Unembedding domain specific languages

By Robert Atkey, Sam Lindley and Jeremy Yallop, from ACM SIGPLAN Haskell Symposium 2009, available from Robert Atkey’s website:

Higher-order abstract syntax provides a convenient way of embedding domain-specific languages, but is awkward to analyse and manipulate directly.

We explore the boundaries of higher-order abstract syntax. Our key tool is the unembedding of embedded terms as de Bruijn terms, enabling intensional analysis. As part of our solution we present techniques for separating the definition of an embedded program from its interpretation, giving modular extensions of the embedded language, and different ways to encode the types of the embedded language.
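The core idea can be sketched in an untyped, stripped-down form (the paper’s development is typed and considerably richer; the names below are only for this sketch): the embedded language is presented as a type class whose lam takes a Haskell function, and one instance unembeds terms into de Bruijn syntax by threading a binding depth.

```haskell
{-# LANGUAGE RankNTypes #-}

-- The embedded (object) language, presented as a type class: HOAS on the
-- surface, so users write binders with ordinary Haskell functions.
class Hoas exp where
  lam :: (exp -> exp) -> exp
  app :: exp -> exp -> exp

-- First-order de Bruijn terms, suitable for intensional analysis.
data DB = DVar Int | DLam DB | DApp DB DB
  deriving Show

-- Unembedding: interpret the HOAS term at a type that computes a de Bruijn
-- term given the current binding depth.
newtype UnEmbed = UnEmbed { unEmbed :: Int -> DB }

instance Hoas UnEmbed where
  lam f = UnEmbed $ \d ->
    -- The bound variable remembers the depth at which it was introduced and
    -- turns that into a de Bruijn index at its use sites.
    DLam (unEmbed (f (UnEmbed (\d' -> DVar (d' - d - 1)))) (d + 1))
  app a b = UnEmbed $ \d -> DApp (unEmbed a d) (unEmbed b d)

toDB :: (forall exp. Hoas exp => exp) -> DB
toDB t = unEmbed t 0

-- Example: the K combinator, written in HOAS, unembeds to λλ.1.
kDB :: DB
kDB = toDB (lam (\x -> lam (\_ -> x)))   -- DLam (DLam (DVar 1))
```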

Associated slides are available here, a video is available here, and Haskell code is available here.

LNgen: Tool Support for Locally Nameless Representations

By Brian Aydemir and Stephanie Weirich, preprint, available from Brian Aydemir’s website:

Given the complexity of the metatheoretic reasoning about current programming languages and their type systems, techniques for mechanical formalization and checking of such metatheory have received much recent attention. In previous work, we advocated a combination of locally nameless representation and cofinite quantification as a lightweight style for carrying out such formalizations in the Coq proof assistant. As part of the presentation of that methodology, we described a number of operations associated with variable binding and listed a number of properties, called “infrastructure lemmas”, about those operations that needed to be shown. The proofs of these infrastructure lemmas are straightforward but tedious.
In this work, we present LNgen, a prototype tool for automatically generating statements and proofs of infrastructure lemmas from Ott language specifications. Furthermore, the tool also generates a recursion scheme for defining functions over syntax, which was not available in our previous work. LNgen works in concert with Ott to effectively alleviate much of the tedium of working with locally nameless syntax. For the case of untyped lambda terms, we show that the combined output from the two tools is sound and complete, with LNgen automatically proving many of the key lemmas. We prove the soundness of our representation with respect to a fully concrete representation, and we argue that the representation is complete—that we generate the right set of lemmas—with respect to Gordon and Melham’s “Five Axioms of Alpha-Conversion.”
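In the locally nameless style, the “operations associated with variable binding” are chiefly opening (replacing a bound index by a free name when going under a binder) and closing (the inverse), and the infrastructure lemmas relate them. LNgen generates these definitions and proofs in Coq from an Ott specification; purely to illustrate the operations themselves (a Haskell sketch, not LNgen output), for untyped λ-terms they look like this:

```haskell
-- Locally nameless λ-terms: bound variables are de Bruijn indices, free
-- variables are names; Lam carries no name at all.
data Tm = BVar Int | FVar String | Lam Tm | App Tm Tm
  deriving (Eq, Show)

-- Opening: replace the bound index for the outermost binder with the free
-- variable x, used when descending under a Lam.
open :: String -> Tm -> Tm
open x = go 0
  where
    go k (BVar i)  = if i == k then FVar x else BVar i
    go _ (FVar y)  = FVar y
    go k (Lam t)   = Lam (go (k + 1) t)
    go k (App s t) = App (go k s) (go k t)

-- Closing: abstract the free variable x back into a bound index.
close :: String -> Tm -> Tm
close x = go 0
  where
    go k (FVar y)  = if y == x then BVar k else FVar y
    go _ (BVar i)  = BVar i
    go k (Lam t)   = Lam (go (k + 1) t)
    go k (App s t) = App (go k s) (go k t)

-- A typical infrastructure lemma: open x (close x t) == t for locally closed t.
-- Facts of this routine kind are what LNgen states and proves automatically
-- (in Coq, not Haskell).
identityExample :: Bool
identityExample = open "x" (close "x" t) == t
  where t = Lam (App (BVar 0) (FVar "x"))
```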

Associated Coq code can be found here.