History of logic and programming languages

Philip Wadler


Papers we love: John Reynolds, Definitional Interpreters for Higher-Order Programming Languages

Papers We Love, Skills Matter, London, 7 June 2016

Certain papers change your life. McCarthy's 'Recursive Functions of Symbolic Expressions and their Computation by Machine (Part I)' (1960) changed mine, and so did Landin's 'The Next 700 Programming Languages' (1966). And I remember the moment, halfway through my graduate career, when Guy Steele handed me Reynolds's 'Definitional Interpreters for Higher-Order Programming Languages' (1972).

It is now common to explicate the structure of a programming language by presenting an interpreter for that language. If the language interpreted is the same as the language doing the interpreting, the interpreter is called meta-circular.

Interpreters may be written at differing levels of detail, to explicate different implementation strategies. For instance, the interpreter may be written in continuation-passing style; or some of the higher-order functions may be represented explicitly as data structures, via defunctionalisation.

More elaborate interpreters may be derived from simpler versions, thus providing a methodology for discovering an implementation strategy and showing it correct. Each of these techniques has become a mainstay of the study of programming languages, and all of them were introduced in this single paper by Reynolds.
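To make the techniques concrete, here is a small sketch (mine, in Python rather than the dialects Reynolds used; the names `eval_direct` and `eval_cps` are my own) of a definitional interpreter for the untyped lambda calculus, first in direct style and then rewritten in continuation-passing style:

```python
# Terms: ("var", name) | ("lam", name, body) | ("app", fun, arg)

def eval_direct(term, env):
    """Direct style: higher-order functions of the defining language
    (Python) stand in for functions of the defined language."""
    tag = term[0]
    if tag == "var":
        return env[term[1]]
    if tag == "lam":
        _, x, body = term
        return lambda v: eval_direct(body, {**env, x: v})
    if tag == "app":
        _, f, a = term
        return eval_direct(f, env)(eval_direct(a, env))

def eval_cps(term, env, k):
    """Continuation-passing style: evaluation order is made explicit
    by threading a continuation k through every recursive call."""
    tag = term[0]
    if tag == "var":
        return k(env[term[1]])
    if tag == "lam":
        _, x, body = term
        return k(lambda v, k2: eval_cps(body, {**env, x: v}, k2))
    if tag == "app":
        _, f, a = term
        return eval_cps(f, env,
                        lambda fv: eval_cps(a, env,
                                            lambda av: fv(av, k)))

# Example: apply the identity function to a free variable bound to 42.
identity = ("lam", "x", ("var", "x"))
print(eval_direct(("app", identity, ("var", "y")), {"y": 42}))            # 42
print(eval_cps(("app", identity, ("var", "y")), {"y": 42}, lambda v: v))  # 42
```

Defunctionalising `eval_cps` would replace each `lambda` continuation with a tagged record plus an `apply` function, turning the higher-order interpreter into a first-order one, which is the derivation path the paper explores.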

Related material

# Available in: pdf, keynote, video.


Propositions as Types

Philip Wadler. Communications of the ACM, 58(12):75–84, December, 2015.

The principle of Propositions as Types links logic to computation. At first sight it appears to be a simple coincidence---almost a pun---but it turns out to be remarkably robust, inspiring the design of theorem provers and programming languages, and continuing to influence the forefronts of computing. Propositions as Types has many names and many origins, and is a notion with depth, breadth, and mystery.
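A small illustration of the principle (mine, not from the paper): read a function type as an implication, and a program of that type is a proof of it. The type of function composition, (A → B) → (B → C) → (A → C), states that implication is transitive, and the program inhabiting it is the proof.

```python
from typing import Callable, TypeVar

A = TypeVar("A")
B = TypeVar("B")
C = TypeVar("C")

# Type read as a proposition: (A -> B) -> (B -> C) -> (A -> C),
# i.e. "if A implies B and B implies C, then A implies C".
# The function body is the proof: it inhabits the type.
def compose(f: Callable[[A], B], g: Callable[[B], C]) -> Callable[[A], C]:
    return lambda a: g(f(a))

print(compose(len, str)("hello"))  # "5": str after len, hello -> 5 -> "5"
```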

# Available in: pdf, doi, video.


Church's Coincidences

Philip Wadler. Turing Centennial Celebration, Princeton, 10–12 May 2012, and keynote at the SICSA PhD Conference, Glasgow, 20–22 June 2012.

The foundations of computing lie in a coincidence: Church's lambda calculus (1933), Herbrand and Gödel's recursive functions (1934), and Turing's machines (1935) all define the same model of computation. Another coincidence: Gentzen's intuitionistic natural deduction (1935) and Church's simply-typed lambda calculus (1940) define isomorphic systems. We review the history and significance of these coincidences, with an eye to Turing's role.

# Available in: pdf, video.


The unreasonable effectiveness of logic

Philip Wadler. Inaugural Lecture, University of Edinburgh, 16 November 2004.

People might be forgiven for thinking that computing is not so much a science as an industry. Ask someone to name a prominent computer scientist and you are more likely to hear the name Bill Gates than Alan Turing. In fact, computing is both a science and an industry, each stimulating the other.

Everyone knows that logic and computing have something to do with each other, but few understand the remarkable correspondence that links them. A model of logic and a model of computing, each published at the dawn of the computer era, turned out, half a century later, to coincide exactly. We will follow this correspondence through three strands of work, connecting three varieties of logic, three researchers at Edinburgh's Laboratory for the Foundations of Computer Science, and three applications to web technology.

Computing is in its infancy, laying the foundations for the work to come. Two hundred and fifty years ago, Edinburgh nurtured the work of David Hume, James Hutton, and Adam Smith. Two hundred and fifty years from now, to what extent will Edinburgh be recognized as founding the theory and practice of Informatics?

# Available in: pdf.


As Natural as 0,1,2

Philip Wadler. Edinburgh University Informatics Jamboree, 20 May 2004; Bard College Distinguished Scientist Lecture, 10 April 2003; and University of Utah Evans and Sutherland Distinguished Lecture, 20 November 2002.

Whether a visitor comes from another place, another planet, or another plane of being we can be sure that he, she, or it will count just as we do: though their symbols vary, the numbers are universal. The history of logic and computing suggests a programming language that is equally natural. The language, called lambda calculus, is in exact correspondence with a formulation of the laws of reason, called natural deduction. Lambda calculus and natural deduction were devised, independently of each other, around 1930, just before the development of the first stored program computer. Yet the correspondence between them was not recognized until decades later, and not published until 1980. Today, languages based on lambda calculus have a few thousand users. Tomorrow, reliable use of the Internet may depend on languages with logical foundations.

# Available in: pdf, (older versions: pdf, pdf, pdf)


From Frege to Gosling: 19th century logic and 21st century programming languages

Philip Wadler. Alan J. Perlis Symposium, Programming Languages: Theory Meets the Real World, Yale University, 27 April 2000 (invited talk).

Logics of the 19th century, developed by philosophers like Gottlob Frege concerned with the nature of truth, have a direct impact on programming languages for the 21st century, designed by engineers like James Gosling concerned with programming the web. The type systems developed by logicians underlie the safety properties of Java that make the exchange of programs possible. One of the most fascinating tales of the 20th century is how Gentzen's natural deduction (a model of logic) and Church's lambda calculus (a model of programming) were discovered, half a century after their invention, to be in precise correspondence. One aspect of the correspondence is an association between second-order quantification and generic types. The talk describes GJ, which is based on this correspondence and serves as the basis of Sun's plans to add generics to Java. The talk also describes Featherweight Java, a model of Java that is comparable in simplicity and power to lambda calculus, and serves as a tool for designing the generation of languages to follow Java.

(Based on work with Martin Odersky, Gilad Bracha, David Stoutamire, Benjamin Pierce, and Atsushi Igarashi.)
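By way of analogy (a sketch in Python rather than GJ or Java, and my own example rather than one from the talk): a generic type variable corresponds to a second-order universal quantifier, so the type of a generic function reads as a proposition quantified over all types.

```python
from typing import List, TypeVar

T = TypeVar("T")

# The type below reads as the second-order proposition
# "for all T, List[T] implies List[T]".  Because T is quantified,
# the function cannot depend on what the elements are; it can only
# rearrange, drop, or duplicate them.  (Python's hints are unchecked;
# in GJ/Java the compiler enforces this discipline.)
def rev(xs: List[T]) -> List[T]:
    return xs[::-1]

print(rev([1, 2, 3]))        # [3, 2, 1]
print(rev(["a", "b"]))       # ["b", "a"] -- same code at every type T
```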

# Available in: pdf.


Proofs are Programs: 19th Century Logic and 21st Century Computing

Philip Wadler. (This is a variant of 'New Languages, Old Logic', which appeared in Dr. Dobb's Journal, special supplement on Software in the 21st Century, December 2000.)

As the 19th century drew to a close, logicians formalized an ideal notion of proof. They were driven by nothing other than an abiding interest in truth, and their proofs were as ethereal as the mind of God. Yet within decades these mathematical abstractions were realized by the hand of man, in the digital stored-program computer. How it came to be recognized that proofs and programs are the same thing is a story that spans a century, a chase with as many twists and turns as a thriller. At the end of the story is a new principle for designing programming languages that will guide computers into the 21st century.

# Available in: pdf.
