Learning to discover efficient mathematical identities

Wojciech Zaremba, Karol Kurach, Rob Fergus

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

In this paper we explore how machine learning techniques can be applied to the discovery of efficient mathematical identities. We introduce an attribute grammar framework for representing symbolic expressions. Given a grammar of math operators, we build trees that combine them in different ways, looking for compositions that are analytically equivalent to a target expression but of lower computational complexity. However, as the space of trees grows exponentially with the complexity of the target expression, brute-force search is impractical for all but the simplest expressions. Consequently, we introduce two novel learning approaches that are able to learn from simpler expressions to guide the tree search. The first is a simple n-gram model; the second is a recursive neural network. We show how these approaches enable us to derive complex identities beyond the reach of brute-force search or human derivation.
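The abstract's core loop can be sketched in a few lines: compose operators from a grammar, then test each composition for analytic equivalence to a target by evaluating on random inputs. Everything below (the four-operator mini-grammar, the operator names, and the specific target, the sum of all entries of A·Aᵀ) is an illustrative assumption for this sketch, not the paper's actual attribute grammar or search procedure.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)

# Target: the sum of all entries of A @ A.T, which costs O(n^2 m) naively.
def target(A):
    return (A @ A.T).sum()

# Hypothetical mini-grammar of operators (the paper's grammar is far richer).
GRAMMAR = {
    "colsum": lambda X: X.sum(axis=0),
    "rowsum": lambda X: X.sum(axis=1),
    "square": lambda X: X ** 2,
    "sum":    lambda X: np.asarray(X).sum(),
}

def equivalent(expr, trials=5):
    # Test analytic equivalence numerically on random matrices.
    for _ in range(trials):
        A = rng.standard_normal((4, 3))
        try:
            val = np.asarray(expr(A))
        except Exception:            # operator not applicable at this shape
            return False
        if val.ndim != 0 or not np.isclose(float(val), target(A)):
            return False
    return True

def search(max_depth=3):
    # Brute force: try every composition of grammar ops up to max_depth.
    for depth in range(1, max_depth + 1):
        for names in itertools.product(GRAMMAR, repeat=depth):
            def expr(A, names=names):
                val = A
                for name in names:   # names[0] is applied first
                    val = GRAMMAR[name](val)
                return val
            if equivalent(expr):
                return names
    return None

found = search()
print(found)  # a composition computing the sum of squared column sums, O(nm)
```

The discovered identity, sum over all entries of A·Aᵀ equals the sum of squared column sums of A, drops the cost from O(n²m) to O(nm). The paper's contribution is replacing this exhaustive enumeration with learned models (n-gram and recursive neural network) that rank which subtrees to expand.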

Original language: English (US)
Title of host publication: Advances in Neural Information Processing Systems
Publisher: Neural Information Processing Systems Foundation
Pages: 1278-1286
Number of pages: 9
Volume: 2
Edition: January
State: Published - 2014
Event: 28th Annual Conference on Neural Information Processing Systems 2014, NIPS 2014 - Montreal, Canada
Duration: Dec 8 2014 - Dec 13 2014



ASJC Scopus subject areas

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing

Cite this

Zaremba, W., Kurach, K., & Fergus, R. (2014). Learning to discover efficient mathematical identities. In Advances in Neural Information Processing Systems (January ed., Vol. 2, pp. 1278-1286). Neural Information Processing Systems Foundation.