Learning to discover efficient mathematical identities

Wojciech Zaremba, Karol Kurach, Rob Fergus

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

In this paper we explore how machine learning techniques can be applied to the discovery of efficient mathematical identities. We introduce an attribute grammar framework for representing symbolic expressions. Given a grammar of math operators, we build trees that combine them in different ways, looking for compositions that are analytically equivalent to a target expression but of lower computational complexity. However, as the space of trees grows exponentially with the complexity of the target expression, brute-force search is impractical for all but the simplest of expressions. Consequently, we introduce two novel learning approaches that are able to learn from simpler expressions to guide the tree search: the first is a simple n-gram model, the other a recursive neural network. We show how these approaches enable us to derive complex identities, beyond the reach of brute-force search or human derivation.
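
To make the kind of rewrite the search targets concrete, below is a minimal, illustrative sketch (an assumption for exposition, not an example quoted from the paper): for matrices A (n×m) and B (m×p), the sum of all entries of AB can be rewritten as the dot product of A's column sums with B's row sums, cutting the cost from O(nmp) for the naive product to O(nm + mp). A guided tree search over grammar operators (matrix multiply, element-wise operations, row/column sums) is what would discover rewrites of this form, with the n-gram or recursive neural network model scoring partial trees so promising compositions are explored first.

```python
import numpy as np

# Illustrative identity of the kind the guided tree search looks for
# (hypothetical example, not taken verbatim from the paper):
#
#   sum_{i,j} (A B)_{ij} = sum_k (sum_i A_{ik}) * (sum_j B_{kj})
#
# The naive evaluation forms the full n x p product first, O(n*m*p) work;
# the rewritten expression needs only column/row sums and a dot product,
# O(n*m + m*p) work.

rng = np.random.default_rng(0)
n, m, p = 50, 40, 60
A = rng.standard_normal((n, m))
B = rng.standard_normal((m, p))

naive = np.sum(A @ B)                       # O(n*m*p): build the product, then sum
efficient = A.sum(axis=0) @ B.sum(axis=1)   # O(n*m + m*p): column sums of A dot row sums of B

assert np.allclose(naive, efficient)
print(naive, efficient)
```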

Original language: English (US)
Title of host publication: Advances in Neural Information Processing Systems
Publisher: Neural Information Processing Systems Foundation
Pages: 1278-1286
Number of pages: 9
Volume: 2
Edition: January
State: Published - 2014
Event: 28th Annual Conference on Neural Information Processing Systems 2014, NIPS 2014 - Montreal, Canada
Duration: Dec 8, 2014 - Dec 13, 2014

Other

Other: 28th Annual Conference on Neural Information Processing Systems 2014, NIPS 2014
Country: Canada
City: Montreal
Period: 12/8/14 - 12/13/14

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing

Cite this

Zaremba, W., Kurach, K., & Fergus, R. (2014). Learning to discover efficient mathematical identities. In Advances in Neural Information Processing Systems (January ed., Vol. 2, pp. 1278-1286). Neural Information Processing Systems Foundation.
