Density modeling of images using a generalized normalization transformation

Johannes Ballé, Valero Laparra, Eero Simoncelli

Research output: Contribution to conference › Paper

Abstract

We introduce a parametric nonlinear transformation that is well-suited for Gaussianizing data from natural images. The data are linearly transformed, and each component is then normalized by a pooled activity measure, computed by exponentiating a weighted sum of rectified and exponentiated components and a constant. We optimize the parameters of the full transformation (linear transform, exponents, weights, constant) over a database of natural images, directly minimizing the negentropy of the responses. The optimized transformation substantially Gaussianizes the data, achieving a significantly smaller mutual information between transformed components than alternative methods including ICA and radial Gaussianization. The transformation is differentiable and can be efficiently inverted, and thus induces a density model on images. We show that samples of this model are visually similar to samples of natural image patches. We demonstrate the use of the model as a prior probability density that can be used to remove additive noise. Finally, we show that the transformation can be cascaded, with each layer optimized using the same Gaussianization objective, thus offering an unsupervised method of optimizing a deep network architecture.
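The transformation described in the abstract — a linear transform followed by division of each component by a pooled activity measure, invertible by fixed-point iteration — can be sketched in NumPy as follows. All parameter values below (`H`, `beta`, `gamma`, `alpha`, `eps`) are illustrative placeholders, not the optimized values from the paper; in the paper these parameters are jointly learned by minimizing the negentropy of the responses over natural images.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 4  # number of components (illustrative)

# Illustrative parameters; the paper optimizes all of these jointly.
H = np.eye(N) + 0.1 * rng.standard_normal((N, N))  # linear transform
beta = np.full(N, 1.0)           # per-component constants (> 0)
gamma = np.full((N, N), 0.05)    # non-negative pooling weights
alpha = np.full((N, N), 2.0)     # exponents on the rectified components
eps = np.full(N, 0.5)            # outer exponents

def pooled_activity(z):
    # pool_i = (beta_i + sum_j gamma_ij * |z_j| ** alpha_ij) ** eps_i
    return (beta + (gamma * np.abs(z)[None, :] ** alpha).sum(axis=1)) ** eps

def gdn_forward(x):
    """Linear transform, then divide each component by its pooled activity."""
    z = H @ x
    return z / pooled_activity(z)

def gdn_inverse(y, n_iter=50):
    """Invert the normalization by fixed-point iteration z <- y * pool(z),
    then undo the linear transform."""
    z = y.copy()
    for _ in range(n_iter):
        z = y * pooled_activity(z)
    return np.linalg.solve(H, z)

x = rng.standard_normal(N)
x_rec = gdn_inverse(gdn_forward(x))
print(np.max(np.abs(x - x_rec)))  # round-trip residual is tiny
```

With these placeholder parameters the pooled divisor is at least 1 and varies slowly in `z`, so the fixed-point iteration is a contraction and the round trip recovers `x` to numerical precision; this is the general mechanism the abstract refers to when it says the transformation "can be efficiently inverted."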

Original language: English (US)
State: Published - Jan 1 2016
Event: 4th International Conference on Learning Representations, ICLR 2016 - San Juan, Puerto Rico
Duration: May 2 2016 – May 4 2016

Conference

Conference: 4th International Conference on Learning Representations, ICLR 2016
Country: Puerto Rico
City: San Juan
Period: 5/2/16 – 5/4/16

ASJC Scopus subject areas

  • Education
  • Computer Science Applications
  • Linguistics and Language
  • Language and Linguistics

Cite this

Ballé, J., Laparra, V., & Simoncelli, E. (2016). Density modeling of images using a generalized normalization transformation. Paper presented at 4th International Conference on Learning Representations, ICLR 2016, San Juan, Puerto Rico.

@conference{c7634985295f47f6849c7dd3ccee313b,
title = "Density modeling of images using a generalized normalization transformation",
abstract = "We introduce a parametric nonlinear transformation that is well-suited for Gaussianizing data from natural images. The data are linearly transformed, and each component is then normalized by a pooled activity measure, computed by exponentiating a weighted sum of rectified and exponentiated components and a constant. We optimize the parameters of the full transformation (linear transform, exponents, weights, constant) over a database of natural images, directly minimizing the negentropy of the responses. The optimized transformation substantially Gaussianizes the data, achieving a significantly smaller mutual information between transformed components than alternative methods including ICA and radial Gaussianization. The transformation is differentiable and can be efficiently inverted, and thus induces a density model on images. We show that samples of this model are visually similar to samples of natural image patches. We demonstrate the use of the model as a prior probability density that can be used to remove additive noise. Finally, we show that the transformation can be cascaded, with each layer optimized using the same Gaussianization objective, thus offering an unsupervised method of optimizing a deep network architecture.",
author = "Johannes Ball{\'e} and Valero Laparra and Eero Simoncelli",
year = "2016",
month = "1",
day = "1",
language = "English (US)",
note = "4th International Conference on Learning Representations, ICLR 2016 ; Conference date: 02-05-2016 Through 04-05-2016",

}

TY - CONF

T1 - Density modeling of images using a generalized normalization transformation

AU - Ballé, Johannes

AU - Laparra, Valero

AU - Simoncelli, Eero

PY - 2016/1/1

Y1 - 2016/1/1

N2 - We introduce a parametric nonlinear transformation that is well-suited for Gaussianizing data from natural images. The data are linearly transformed, and each component is then normalized by a pooled activity measure, computed by exponentiating a weighted sum of rectified and exponentiated components and a constant. We optimize the parameters of the full transformation (linear transform, exponents, weights, constant) over a database of natural images, directly minimizing the negentropy of the responses. The optimized transformation substantially Gaussianizes the data, achieving a significantly smaller mutual information between transformed components than alternative methods including ICA and radial Gaussianization. The transformation is differentiable and can be efficiently inverted, and thus induces a density model on images. We show that samples of this model are visually similar to samples of natural image patches. We demonstrate the use of the model as a prior probability density that can be used to remove additive noise. Finally, we show that the transformation can be cascaded, with each layer optimized using the same Gaussianization objective, thus offering an unsupervised method of optimizing a deep network architecture.

AB - We introduce a parametric nonlinear transformation that is well-suited for Gaussianizing data from natural images. The data are linearly transformed, and each component is then normalized by a pooled activity measure, computed by exponentiating a weighted sum of rectified and exponentiated components and a constant. We optimize the parameters of the full transformation (linear transform, exponents, weights, constant) over a database of natural images, directly minimizing the negentropy of the responses. The optimized transformation substantially Gaussianizes the data, achieving a significantly smaller mutual information between transformed components than alternative methods including ICA and radial Gaussianization. The transformation is differentiable and can be efficiently inverted, and thus induces a density model on images. We show that samples of this model are visually similar to samples of natural image patches. We demonstrate the use of the model as a prior probability density that can be used to remove additive noise. Finally, we show that the transformation can be cascaded, with each layer optimized using the same Gaussianization objective, thus offering an unsupervised method of optimizing a deep network architecture.

UR - http://www.scopus.com/inward/record.url?scp=85071016603&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85071016603&partnerID=8YFLogxK

M3 - Paper

ER -