Reflections on monadic lenses

Bidirectional transformations (bx) have primarily been modeled as pure
functions, an approach that does not account for the side-effects
available in most programming languages. Recently, several
formulations of bx that use monads to account for effects have been
proposed, both among practitioners and in academic research. The
combination of bx with effects turns out to be surprisingly subtle,
leading to problems with some of these proposals and increasing the
complexity of others. This paper reviews the proposals for monadic
lenses to date, and offers some improved definitions, paying
particular attention to the obstacles to naively adding monadic
effects to existing definitions of pure bx such as lenses and
symmetric lenses, and the subtleties of equivalence of symmetric
bidirectional transformations in the presence of effects.
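
The subtlety begins with the definition itself. Below is a minimal Haskell sketch of one natural design, a lens whose put runs in a monad while get stays pure; the names MLens, mget and mput are illustrative, and the Maybe monad stands in for a validating, possibly failing update:

```haskell
-- A lens whose put may perform effects in a monad m:
-- get stays pure, but put can fail, log, or consult state.
data MLens m s a = MLens
  { mget :: s -> a           -- extract the view from the source
  , mput :: s -> a -> m s    -- update the source, with effects in m
  }

-- Example: a lens onto the first component of a pair whose
-- put validates the new view, failing (in Maybe) when negative.
fstNonNeg :: MLens Maybe (Int, String) Int
fstNonNeg = MLens fst put
  where
    put (_, y) x
      | x >= 0    = Just (x, y)
      | otherwise = Nothing    -- effectful put: validation failure

main :: IO ()
main = do
  print (mget fstNonNeg (3, "hi"))        -- 3
  print (mput fstNonNeg (3, "hi") 7)      -- Just (7,"hi")
  print (mput fstNonNeg (3, "hi") (-1))   -- Nothing
```

Even this simple design raises the questions the paper examines: which lens laws survive once put is effectful, and in which monads they can hold at all.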

Conflation confers concurrency

Session types provide a static guarantee that concurrent programs
respect communication protocols. Recent work has explored a
correspondence between proof rules and cut reduction in linear logic
and typing and evaluation of process calculi. This paper considers two
approaches to extend logically-founded process calculi. First, we
consider extensions of the process calculus to more closely resemble
pi-calculus. Second, inspired by denotational models of process
calculi, we consider conflating dual types. Most interestingly, we
observe that these approaches coincide: conflating tensor and par
allows processes to share multiple channels; conflating plus and with
provides nondeterminism; and finally, conflating of-course and why-not
yields access points, a rendezvous mechanism for initiating
session-typed communication. Access points are particularly
expressive: for example, they are sufficient to encode concurrent
state and general recursion.

Counting successes: effects and transformations for non-deterministic programs

We give a simple effect system for non-deterministic programs,
tracking static approximations to the number of results that may be
produced by each computation. A relational semantics for the effect
system establishes the soundness of both the analysis and its use in
effect-based program transformations.
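
The kind of approximation involved can be sketched concretely. The Haskell fragment below is an illustrative reconstruction, not the paper's exact analysis: counts are finite intervals, sequencing multiplies them, and non-deterministic choice adds them.

```haskell
-- An abstract "count" effect: a finite interval approximating how many
-- results a non-deterministic computation may produce. (An illustrative
-- reconstruction; a full effect lattice would also handle unbounded counts.)
data Count = Count { low :: Int, high :: Int }
  deriving (Eq, Show)

pureC :: Count      -- a deterministic computation: exactly one result
pureC = Count 1 1

failC :: Count      -- failure: no results at all
failC = Count 0 0

seqC, orC :: Count -> Count -> Count
seqC (Count a b) (Count c d) = Count (a * c) (b * d)  -- sequencing multiplies
orC  (Count a b) (Count c d) = Count (a + c) (b + d)  -- choice adds

main :: IO ()
main = do
  -- two choices followed by two choices: exactly four results,
  -- matching the concrete list-monad computation below
  print (seqC (Count 2 2) (Count 2 2))
  print (length ([1, 2] >>= \x -> [x, x + 1]))
```

A sound analysis in this style licenses transformations such as eliminating a computation whose count is exactly zero, or replacing one whose count is exactly one by a deterministic equivalent.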

The essence of inheritance

Programming languages serve a dual purpose: to communicate programs to
computers, and to communicate programs to human beings. Indeed, it is
this dual purpose that makes programming language design a constrained
and challenging problem. The thesis of this paper is that inheritance
is an essential aspect of that second purpose: it is a tool to improve
communication. Our argument is that most humans understand new
concepts most readily by first looking at a number of concrete
examples, and later abstracting over those examples. Inheritance
mirrors this process: it provides a formal mechanism for moving from
the concrete to the abstract.

Pointlessness is better than listlessness

Extensible sparse functional arrays (ESFA) form a persistent data
structure with an implementation that performs every operation in O(1)
time and space. There is no requirement for single threading of
arrays, and no performance penalty for sharing. The implementation is
an example of hardware/software co-design and also of active data
structures. This makes the work interdisciplinary, and it builds on
many ideas from functional programming. These include the definition
of new array operations, observations about mutability and purity,
monads, program transformations to remove unnecessary data structures,
equational reasoning, and fixing space leaks in the garbage
collector. This paper summarises a recently published exposition of
the system, focusing on drawing connections with some of these
foundational ideas. It also presents some new results on a simulation
of ESFA on a GPU. A surprising result is that although the simulation
has high overheads and limited parallelism, it can outperform a
state-of-the-art red-black tree implementation when there is a large
amount of sharing between arrays.

The lambda calculus: practice and principle

The Lambda Calculus has perplexed students of computer science for
millennia, rendering many incapable of understanding even the most
basic precepts of functional programming. This paper gently introduces
the core concepts to the lay reader, assuming only a minimum of
background knowledge in category theory, quantum chromodynamics, and
paleomagnetism.

In addition, this paper goes on to its main results, showing how the Lambda Calculus can be used to easily prove the termination of Leibniz' Hailstone numbers for all $n > 0$, to show that matrix multiplication is possible in linear time, and to guarantee Scottish independence.

Subtyping supports safe session substitution

Session types describe the structure of bi-directional point-to-point
communication channels by specifying the sequence and format of
messages on each channel. A session type defines a communication
protocol. Type systems that include session types are able to
statically verify that communication-based code generates, and
responds to, messages according to a specified protocol. It is natural
to consider subtyping for session types, but the literature contains
conflicting definitions. It is now folklore that one definition is
based on safe substitutability of channels, while the other is based
on safe substitutability of processes. We explain this point in more
detail, and show how to unify the two views.

Proof relevant parametricity

Parametricity is one of the foundational principles which underpin our
understanding of modern programming languages. Roughly speaking,
parametricity expresses the hidden invariants that programs satisfy by
formalising the intuition that programs map related inputs to related
outputs. Traditionally parametricity is formulated with
proof-irrelevant relations, but programming in Type Theory requires an
extension to proof-relevant relations. But then one might ask … can
our proofs be parametric? This paper shows how this can be done and,
excitingly, our answer requires a trip into the world of higher
dimensional parametricity.

Comprehending ringads

List comprehensions are a widely used programming construct, in
languages such as Haskell and Python and in technologies such as
Microsoft's Language Integrated Query. They generalize from lists to
arbitrary monads, yielding a lightweight idiom of imperative
programming in a pure functional language. When the monad has the
additional structure of a so-called ringad, corresponding to "empty"
and "union" operations, then it can be seen as some kind of collection
type, and the comprehension notation can also be extended to
incorporate aggregations. Ringad comprehensions represent a convenient
notation for expressing database queries. The ringad structure alone
does not provide a good explanation or an efficient implementation of
relational joins; but by allowing heterogeneous comprehensions,
involving both bag and indexed table ringads, we show how to
accommodate these too.
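
The desugaring involved is short enough to sketch. In the Haskell fragment below, Ringad, emptyR and unionR are illustrative names for the "empty" and "union" structure; a comprehension with a filter desugars into bind and emptyR, after which an aggregation is an ordinary fold:

```haskell
-- A "ringad": a monad equipped with empty and union operations,
-- making it a collection type. (Names here are illustrative.)
class Monad m => Ringad m where
  emptyR :: m a
  unionR :: m a -> m a -> m a

instance Ringad [] where
  emptyR = []
  unionR = (++)

-- The comprehension [ f x | x <- xs, p x ] desugars into the ringad ops:
comp :: Ringad m => (a -> b) -> (a -> Bool) -> m a -> m b
comp f p xs = xs >>= \x -> if p x then return (f x) else emptyR

main :: IO ()
main = do
  print (comp (* 2) even [1 .. 6 :: Int])        -- [4,8,12]
  -- an aggregation folds the resulting collection, here by summing:
  print (sum (comp (* 2) even [1 .. 6 :: Int]))  -- 24
```

The same desugaring works for bags, sets, and indexed tables, which is what makes heterogeneous comprehensions over several ringads possible.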

Dragging proofs out of pictures

String diagrams provide category theory with a new and very
distinctive visual flavour. We demonstrate that they are an effective
tool for equational reasoning using a variety of examples evolving
around the composition of monads. A deductive approach is followed,
discovering necessary concepts as we go along. In particular, we show
that the Yang--Baxter equation arises naturally when composing three
monads. We are also concerned with the pragmatics of string
diagrams. Diagrams are carefully arranged to provide geometric
intuition for the proof steps being followed. We introduce thick wires
to distinguish composite functors and suggest a two-dimensional
analogue of parentheses to partition diagrams.

Experiences with QuickCheck: testing the hard stuff and staying sane

QuickCheck, well known in the Haskell community, generates random
tests from formal specifications, and simplifies failing tests to
minimal counterexamples. Almost ten years ago, I founded Quviq AB to
build a commercial tool based on the same ideas, and use it to help
improve the quality of customers' testing. The need to test different
kinds of customer software has forced a number of extensions, many
based on state machine models. In this paper I explain how these
models work using a simple example in C, and describe how they were
used in a large project for the car industry, and to track down a race
condition that was plaguing one of Sweden's fastest growing
companies. These are war stories that both entertain and inform!

Cutting out continuations

In the field of program transformation, one often transforms programs
into continuation-passing style to make their flow of control
explicit, and then immediately removes the resulting continuations
using defunctionalisation to make the programs first-order. In this
article, we show how these two transformations can be fused together
into a single transformation step that cuts out the need to first
introduce and then eliminate continuations. Our approach is
calculational, uses standard equational reasoning techniques, and is
widely applicable.
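
The effect of the fused transformation can be seen on the standard example of an arithmetic evaluator; the sketch below is an illustration in this spirit, not the paper's full calculational development. The defunctionalised continuations form a first-order stack of pending work:

```haskell
-- A small expression language and its direct-style evaluator.
data Expr = Val Int | Add Expr Expr

eval :: Expr -> Int
eval (Val n)   = n
eval (Add x y) = eval x + eval y

-- After CPS conversion and defunctionalisation, each continuation
-- constructor records what remains to be done:
data Cont = Halt             -- nothing left to do
          | EvalR Expr Cont  -- evaluate the right operand next
          | AddL Int Cont    -- add the saved left result

evalK :: Expr -> Cont -> Int
evalK (Val n)   k = apply k n
evalK (Add x y) k = evalK x (EvalR y k)

apply :: Cont -> Int -> Int
apply Halt        n = n
apply (EvalR y k) n = evalK y (AddL n k)
apply (AddL m k)  n = apply k (m + n)

main :: IO ()
main = do
  let e = Add (Val 1) (Add (Val 2) (Val 3))
  print (eval e, evalK e Halt)   -- (6,6)
```

The fused transformation produces evalK and apply in one step, without ever materialising higher-order continuations.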

I got plenty o’ nuttin’

Work to date on combining linear types and dependent types has
deliberately and successfully avoided doing so. Entirely fit for their
own purposes, such systems wisely insist that types depend only on the
replicable sublanguage, thus sidestepping the issue of counting uses
of limited-use data either within types or in ways which are only
really needed to shut the typechecker up. As a result, the linear
implication (‘lollipop’) stubbornly remains a non-dependent S -o
T. This paper defines and establishes the basic metatheory of a type
theory supporting a ‘dependent lollipop’ (x:S) -o T[x], where what the
input used to be is in some way commemorated by the type of the
output. Usage is tracked with resource annotations belonging to an
arbitrary rig, or ‘riNg without Negation’. The key insight is to use
the rig’s zero to mark information in contexts which is present for
purposes of contemplation rather than consumption, like a meal we
remember fondly but cannot eat twice. We can have plenty of nothing
with no additional runtime resource, and nothing is plenty for the
construction of dependent types.

The essence of dependent object types

Focusing on path dependent types, the paper develops foundations for
Scala from first principles. Starting from a simple calculus D$_{<:}$
of dependent functions, it adds records, intersections and recursion
to arrive at DOT, a calculus for dependent object types. The paper
shows an encoding of System F with subtyping in D$_{<:}$ and
demonstrates the expressiveness of DOT by modeling a range of Scala
constructs in it.

Linear lambda-mu is CP (more or less)

In this paper we compare Wadler's CP calculus for classical linear
processes to a linear version of Parigot's lambda-mu calculus for
classical logic. We conclude that linear lambda-mu is ``more or less''
CP, in that it equationally corresponds to a polarized version of
CP. The comparison is made by extending a technique from Melliès and
Tabareau's tensor logic that correlates negation with
polarization. The polarized CP, which is written CP+- and pronounced
``CP more or less,'' is an interesting bridge in the landscape of
Curry-Howard interpretations of logic.

A reflection on types

The ability to perform type tests at runtime blurs the line between
statically-typed and dynamically-typed languages. Recent developments
in Haskell's type system allow even programs that use reflection to
themselves be statically typed, using a type-indexed runtime
representation of types called TypeRep.
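
The flavour of statically typed reflection can be seen with the long-standing Data.Typeable interface in base, which the TypeRep work described here refines; describe below is an illustrative function, while cast and typeOf are real library operations:

```haskell
import Data.Typeable (Typeable, cast, typeOf)

-- A runtime type test that remains statically typed: `cast` compares
-- the runtime representation of its argument's type with that of the
-- requested result type, succeeding only when they coincide.
describe :: Typeable a => a -> String
describe x = case (cast x :: Maybe Int) of
  Just n  -> "an Int: " ++ show n
  Nothing -> "something of type " ++ show (typeOf x)

main :: IO ()
main = do
  putStrLn (describe (42 :: Int))   -- an Int: 42
  putStrLn (describe True)          -- something of type Bool
```

The type-indexed TypeRep goes further, letting such comparisons refine the type checker's knowledge rather than merely returning a Maybe.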

The essence of multi-stage evaluation in LMS

Embedded domain-specific languages (DSLs) are the subject of
wide-spread interest, and a variety of implementation techniques
exist. Some of them have been invented, and some of them
discovered. Many are based on a form of \emph{generative} or
\emph{multi-stage} programming, where the host language program builds
up DSL terms during its evaluation. In this paper, we examine the
execution model of LMS (Lightweight Modular Staging), a framework for
embedded DSLs in Scala, and link it to evaluation in a two-stage
lambda calculus. This clarifies the semantics of certain ad-hoc
implementation choices, and provides guidance for implementing similar
multi-stage evaluation facilities in other languages.

1ML with special effects

We take another look at 1ML, a language in the ML tradition, but with
core and modules merged into one unified language, rendering all
modules first-class values. 1ML already comes with a simple form of
effect system that distinguishes pure from impure computations. Now we
enrich it with effect polymorphism: by introducing effect declarations
and, more interestingly, abstract effect specifications, effects can
be parameterised over, and treated as abstract or concrete in the type
system, very much like types themselves. Because type generativity in
1ML is controlled by (im)purity effects, this yields a somewhat exotic
novel notion of _generativity polymorphism_ -- that is, a given
functor can be asked to behave as either ``generative'' or
``applicative''. And this time, we even get to define an interesting
(poly)monad for that!

The computer scientist nightmare

This text relates the story of a particularly shocking bug that
occurred during the development of a Web application. The bug is first
described. The battle to understand and fix it is then presented. This
experience report concludes with some questions about the way we
conceive of programming languages and programming environments.

A branding strategy for business types

In the course of building a compiler from business rules to a database
run-time, we encounter the need for a type system that includes a
class hierarchy and subtyping in the presence of complex record
operations. Since our starting point is based on structural typing and
targets a data-centric language, we develop an approach inspired by
Wadler's work on XML types. Our proposed type system has strong
similarities with brand objects, combining nominal and structural
typing, and is designed to support a rich set of operations on records
commonly found in query languages. We show soundness of the type
system and illustrate its use on two of the languages involved in our
compiler for business rules: a calculus for pattern matching with
aggregation (CAMP) that captures rules semantics, and the nested
relational algebra (NRA) used for optimization and code generation. We
show type soundness for both languages. The approach and correctness
proofs are fully mechanized using the Coq proof-assistant.

The recursive union of some gradual types

We study union types and recursive types in the setting of a gradually
typed lambda calculus. Our goal is to obtain a foundational account
for languages that enable recursively defined data structures to be
passed between static and dynamically typed regions of a program. We
discuss how traditional sum types are not appropriate for this purpose
and instead study a form of true union in the tradition of soft typing
[Cartwright and Fagan, 1991] and occurrence typing [Tobin-Hochstadt
and Felleisen, 2008]. Regarding recursive types, our formulation is
based on the axiomatization of subtyping by Brandt and Henglein
[1998].

This paper defines three artifacts. First, in the context of the simply typed lambda calculus, we define the semantics of our unions and integrate them with equi-recursive types. Second, we add a dynamic type to obtain a gradually typed lambda calculus. The semantics of this calculus is defined by translation to the third artifact, a blame calculus [Wadler and Findler, 2009] extended with unions and equi-recursive types.

Certifying data in multiparty session types

Multiparty session types (MPST) are a typing discipline for ensuring
the coordination and orchestration of multi-agent communication in
concurrent and distributed programs. However, by mostly focusing on
the communication aspects of concurrency, MPST are often unable to
capture important data invariants in programs. In this work we propose
to increase the expressiveness of MPST by considering a notion of
value dependencies in order to certify invariants of exchanged data in
concurrent and distributed settings.

A delta for hybrid type checking

A hybrid type checker defers parts of the type checking to run
time. The static part of the checker performs a best effort at
verifying subtyping constraints. Constraints that cannot be proved by
the static checker are reified as run-time casts in an intermediate
blame calculus.

The goal of this work is to simplify these casts by exploiting context information. To this end, we develop a coercion calculus that corresponds to the blame calculus via a pair of translations and we define the formal framework to simplify these coercions. We give a concrete instance of the calculus and demonstrate that simplification can be regarded as a synthesis problem.

Recursion equations as a programming language

This paper was written in 1981 and published in “Functional
Programming and its Applications”, Cambridge University Press, January
1982, editors Darlington, Henderson and Turner. The volume comprised
the lecture notes from a summer school on Functional Programming held
at Newcastle University in July 1981, attended by 80 people. The paper
includes an overview of Kent Recursive Calculator, a simple functional
programming system based on higher order recursion equations and a
series of programming examples. It is probably the earliest paper
using list comprehensions applied to lazy lists, and has the first
published account of the “list of successes” method of eliminating
backtracking, here applied to the eight queens problem. The method
didn’t yet have a name. It was Phil Wadler who saw its importance and
coined the phrase “list of successes” in his 1985 paper. It was also
Phil who invented the term “list comprehensions” for what are here
called “ZF expressions”.
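
The method the paper pioneered can be illustrated in modern notation: a backtracking search returns the list of all successes, so failure is simply the empty list and no explicit backtracking machinery is needed. A sketch of the queens problem in this style (here in Haskell rather than KRC):

```haskell
-- "List of successes": place n queens on an n-by-n board, returning
-- the list of all solutions; an unplaceable queen contributes no
-- solutions, so backtracking falls out of list comprehension semantics.
queens :: Int -> [[Int]]
queens n = place n
  where
    place 0 = [[]]
    place k = [ q : qs | qs <- place (k - 1)
                       , q  <- [1 .. n]
                       , safe q qs ]
    -- q is safe if it shares no row or diagonal with earlier queens
    safe q qs = and [ q /= p && abs (q - p) /= d
                    | (d, p) <- zip [1 ..] qs ]

main :: IO ()
main = print (length (queens 8))   -- 92
```

Each solution lists the row chosen for each column; the eight-queens problem has 92 solutions in all.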