Spectral networks and deep locally connected networks on graphs

Joan Bruna Estrach, Wojciech Zaremba, Arthur Szlam, Yann LeCun

Research output: Contribution to conference › Paper

Abstract

Convolutional Neural Networks are extremely efficient architectures in image and audio recognition tasks, thanks to their ability to exploit the local translational invariance of signal classes over their domain. In this paper we consider possible generalizations of CNNs to signals defined on more general domains without the action of a translation group. In particular, we propose two constructions, one based upon a hierarchical clustering of the domain, and another based on the spectrum of the graph Laplacian. We show through experiments that for low-dimensional graphs it is possible to learn convolutional layers with a number of parameters independent of the input size, resulting in efficient deep architectures.
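The spectral construction summarized above learns filters as multipliers on the low-frequency eigenvectors of the graph Laplacian, so the parameter count depends on the number of retained eigenvectors rather than on the number of graph nodes. The sketch below is a minimal illustration of that idea and is not the authors' reference implementation; the names (SpectralConv, laplacian_eigenbasis, n_eig) and the use of PyTorch are assumptions made for this example.

# Minimal sketch (assumed names, not the authors' code) of a spectral
# "convolution" layer: filters are learned diagonal multipliers in the
# eigenbasis of the graph Laplacian, keeping only n_eig eigenvectors.
import numpy as np
import torch
import torch.nn as nn


def laplacian_eigenbasis(W, n_eig):
    """First n_eig eigenvectors of the combinatorial Laplacian L = D - W.

    W: symmetric (n, n) adjacency / similarity matrix.
    """
    d = W.sum(axis=1)
    L = np.diag(d) - W
    eigvals, eigvecs = np.linalg.eigh(L)  # eigenvalues sorted ascending
    return torch.tensor(eigvecs[:, :n_eig], dtype=torch.float32)


class SpectralConv(nn.Module):
    """One spectral layer: y = V diag(theta) V^T x, per input/output filter pair.

    Keeping n_eig low-frequency eigenvectors gives O(f_in * f_out * n_eig)
    parameters, independent of the number of nodes n.
    """

    def __init__(self, V, f_in, f_out):
        super().__init__()
        self.register_buffer("V", V)  # (n, n_eig), fixed eigenbasis
        n_eig = V.shape[1]
        self.theta = nn.Parameter(0.1 * torch.randn(f_in, f_out, n_eig))

    def forward(self, x):
        # x: (batch, n, f_in) signals living on the graph nodes
        xh = torch.einsum('nk,bnf->bkf', self.V, x)        # project to spectral domain
        yh = torch.einsum('bkf,fgk->bkg', xh, self.theta)  # multiply per frequency
        y = torch.einsum('nk,bkg->bng', self.V, yh)        # back to vertex domain
        return torch.relu(y)


# Toy usage on a random symmetric graph with 50 nodes.
n, f_in, f_out, n_eig = 50, 3, 8, 10
W = np.random.rand(n, n); W = (W + W.T) / 2; np.fill_diagonal(W, 0.0)
V = laplacian_eigenbasis(W, n_eig)
layer = SpectralConv(V, f_in, f_out)
out = layer(torch.randn(4, n, f_in))
print(out.shape)  # torch.Size([4, 50, 8])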

Original language: English (US)
State: Published - Jan 1 2014
Event: 2nd International Conference on Learning Representations, ICLR 2014 - Banff, Canada
Duration: Apr 14 2014 → Apr 16 2014

Conference

Conference: 2nd International Conference on Learning Representations, ICLR 2014
Country: Canada
City: Banff
Period: 4/14/14 → 4/16/14

ASJC Scopus subject areas

  • Linguistics and Language
  • Language and Linguistics
  • Education
  • Computer Science Applications

Cite this

Bruna Estrach, J., Zaremba, W., Szlam, A., & LeCun, Y. (2014). Spectral networks and deep locally connected networks on graphs. Paper presented at 2nd International Conference on Learning Representations, ICLR 2014, Banff, Canada.

Spectral networks and deep locally connected networks on graphs. / Bruna Estrach, Joan; Zaremba, Wojciech; Szlam, Arthur; LeCun, Yann.

2014. Paper presented at 2nd International Conference on Learning Representations, ICLR 2014, Banff, Canada.

Research output: Contribution to conference › Paper

Bruna Estrach, J, Zaremba, W, Szlam, A & LeCun, Y 2014, 'Spectral networks and deep locally connected networks on graphs', Paper presented at 2nd International Conference on Learning Representations, ICLR 2014, Banff, Canada, 4/14/14 - 4/16/14.
Bruna Estrach J, Zaremba W, Szlam A, LeCun Y. Spectral networks and deep locally connected networks on graphs. 2014. Paper presented at 2nd International Conference on Learning Representations, ICLR 2014, Banff, Canada.
Bruna Estrach, Joan; Zaremba, Wojciech; Szlam, Arthur; LeCun, Yann. / Spectral networks and deep locally connected networks on graphs. Paper presented at 2nd International Conference on Learning Representations, ICLR 2014, Banff, Canada.
@conference{e5c314849c2746efa998257d80cb248e,
title = "Spectral networks and deep locally connected networks on graphs",
abstract = "Convolutional Neural Networks are extremely efficient architectures in image and audio recognition tasks, thanks to their ability to exploit the local translational invariance of signal classes over their domain. In this paper we consider possible generalizations of CNNs to signals defined on more general domains without the action of a translation group. In particular, we propose two constructions, one based upon a hierarchical clustering of the domain, and another based on the spectrum of the graph Laplacian. We show through experiments that for low-dimensional graphs it is possible to learn convolutional layers with a number of parameters independent of the input size, resulting in efficient deep architectures.",
author = "{Bruna Estrach}, Joan and Wojciech Zaremba and Arthur Szlam and Yann LeCun",
year = "2014",
month = "1",
day = "1",
language = "English (US)",
note = "2nd International Conference on Learning Representations, ICLR 2014 ; Conference date: 14-04-2014 Through 16-04-2014",

}

TY - CONF
T1 - Spectral networks and deep locally connected networks on graphs
AU - Bruna Estrach, Joan
AU - Zaremba, Wojciech
AU - Szlam, Arthur
AU - LeCun, Yann
PY - 2014/1/1
Y1 - 2014/1/1
N2 - Convolutional Neural Networks are extremely efficient architectures in image and audio recognition tasks, thanks to their ability to exploit the local translational invariance of signal classes over their domain. In this paper we consider possible generalizations of CNNs to signals defined on more general domains without the action of a translation group. In particular, we propose two constructions, one based upon a hierarchical clustering of the domain, and another based on the spectrum of the graph Laplacian. We show through experiments that for low-dimensional graphs it is possible to learn convolutional layers with a number of parameters independent of the input size, resulting in efficient deep architectures.
AB - Convolutional Neural Networks are extremely efficient architectures in image and audio recognition tasks, thanks to their ability to exploit the local translational invariance of signal classes over their domain. In this paper we consider possible generalizations of CNNs to signals defined on more general domains without the action of a translation group. In particular, we propose two constructions, one based upon a hierarchical clustering of the domain, and another based on the spectrum of the graph Laplacian. We show through experiments that for low-dimensional graphs it is possible to learn convolutional layers with a number of parameters independent of the input size, resulting in efficient deep architectures.
UR - http://www.scopus.com/inward/record.url?scp=85070855328&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85070855328&partnerID=8YFLogxK
M3 - Paper
ER -