Learning Continuous Semantic Representations of Symbolic Expressions

Slides, papers, code for Charles Sutton’s talk at the
International Conference on Machine Learning (ICML) 2017
Sydney, Australia, August 2017

Citation

Learning Continuous Semantic Representations of Symbolic Expressions. Miltiadis Allamanis, Pankajan Chanthirasegaran, Pushmeet Kohli and Charles Sutton. In International Conference on Machine Learning (ICML). 2017.

[ .pdf | bib ]

TL;CE (too long; checked email)

We train neural networks that map symbolic expressions to continuous vectors, so that expressions which are semantically equivalent, even when syntactically very different, end up with nearby representations.

Formal Abstract

Combining abstract, symbolic reasoning with continuous neural reasoning is a grand challenge of representation learning. As a step in this direction, we propose a new architecture, called neural equivalence networks, for the problem of learning continuous semantic representations of algebraic and logical expressions. These networks are trained to represent semantic equivalence, even of expressions that are syntactically very different. The challenge is that semantic representations must be computed in a syntax-directed manner, because semantics is compositional, but at the same time, small changes in syntax can lead to very large changes in semantics, which can be difficult for continuous neural architectures. We perform an exhaustive evaluation on the task of checking equivalence on a highly diverse class of symbolic algebraic and boolean expression types, showing that our model significantly outperforms existing architectures.
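
As a rough illustration of the syntax-directed computation the abstract describes, the sketch below encodes a Boolean expression tree bottom-up, so that each subexpression's vector is a function of its children's vectors, with outputs normalized to unit length as in the paper. This is a minimal sketch, not the released model: the operator set, dimensionality, initialization, and all names are illustrative, and a trained network would learn the parameters so that equivalent expressions map to nearby points.

```python
# Minimal sketch (not the authors' released code) of a syntax-directed
# encoder: the embedding of an expression is computed recursively from
# the embeddings of its subexpressions. All names here are illustrative.
import numpy as np

rng = np.random.default_rng(0)
DIM = 8  # embedding dimensionality (illustrative)

# One randomly initialized parameter matrix per operator. In a trained
# model these would be learned so that semantically equivalent
# expressions receive nearby embeddings.
params = {
    "var": rng.standard_normal((DIM, DIM)) / np.sqrt(DIM),
    "not": rng.standard_normal((DIM, DIM)) / np.sqrt(DIM),
    "and": rng.standard_normal((DIM, 2 * DIM)) / np.sqrt(2 * DIM),
    "or":  rng.standard_normal((DIM, 2 * DIM)) / np.sqrt(2 * DIM),
}
leaf_embeddings = {v: rng.standard_normal(DIM) for v in "abc"}

def encode(tree):
    """Embed a tree of the form ("and", left, right), ("not", child),
    or a bare variable name, bottom-up over the syntax."""
    if isinstance(tree, str):                 # leaf: a variable
        child_vec = leaf_embeddings[tree]
        op = "var"
    else:
        op, *children = tree
        child_vec = np.concatenate([encode(c) for c in children])
    h = np.tanh(params[op] @ child_vec)       # combine subexpressions
    return h / np.linalg.norm(h)              # unit-norm output

# Equivalence of two expressions is then scored by the similarity of
# their embeddings, e.g. the dot product of the unit-norm vectors.
e1 = ("and", "a", ("not", "b"))
e2 = ("and", ("not", "b"), "a")
print(float(encode(e1) @ encode(e2)))
```

Keeping the representations on the unit sphere, as the paper does, means the similarity between two expressions is simply the cosine of the angle between their vectors, which stays well-scaled as trees get deep.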

Slides

Slides as pdf

Visualizations and Data

We have another web page that contains our data sets and an interactive visualization of our learned representations.

Code

Why am I doing this?

I’ve written a bit about the philosophy of these pages on my talks page.