Stochastic pooling for regularization of deep convolutional neural networks

Matthew D. Zeiler, Robert Fergus

Research output: Contribution to conference › Paper

Abstract

We introduce a simple and effective method for regularizing large convolutional neural networks. We replace the conventional deterministic pooling operations with a stochastic procedure, randomly picking the activation within each pooling region according to a multinomial distribution, given by the activities within the pooling region. The approach is hyper-parameter free and can be combined with other regularization approaches, such as dropout and data augmentation. We achieve state-of-the-art performance on four image datasets, relative to other approaches that do not utilize data augmentation.
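The abstract describes the pooling rule precisely: within each pooling region, normalize the (non-negative, e.g. post-ReLU) activations into a multinomial distribution and sample one of them; at test time the paper instead uses the probability-weighted expectation over the same distribution. A minimal sketch of both operations for a single pooling region, in plain Python (the function names `stochastic_pool` and `prob_weighted_pool` are illustrative, not from the paper):

```python
import random

def stochastic_pool(region, rng=random):
    """Training-time stochastic pooling over one pooling region.

    Each activation a_i is selected with probability p_i = a_i / sum(a),
    i.e. a multinomial distribution given by the activities in the region.
    Activations are assumed non-negative (e.g. after a ReLU).
    """
    total = sum(region)
    if total == 0:                      # degenerate all-zero region
        return 0.0
    return rng.choices(region, weights=region, k=1)[0]

def prob_weighted_pool(region):
    """Test-time variant: the expectation sum_i p_i * a_i under the
    same distribution, used instead of sampling at inference."""
    total = sum(region)
    if total == 0:
        return 0.0
    return sum(a * a for a in region) / total

rng = random.Random(0)
region = [1.0, 3.0, 0.0, 0.0]           # a 2x2 pooling region
print(stochastic_pool(region, rng))     # returns 1.0 or 3.0, never 0.0
print(prob_weighted_pool(region))       # (1 + 9) / 4 = 2.5
```

Unlike max pooling, small activations can win the sample occasionally, which is the source of the regularizing noise; unlike average pooling, zeros in the region do not dilute the output.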

Original language: English (US)
State: Published - Jan 1, 2013
Event: 1st International Conference on Learning Representations, ICLR 2013 - Scottsdale, United States
Duration: May 2, 2013 – May 4, 2013

Conference

Conference: 1st International Conference on Learning Representations, ICLR 2013
Country: United States
City: Scottsdale
Period: 5/2/13 – 5/4/13

ASJC Scopus subject areas

  • Education
  • Computer Science Applications
  • Linguistics and Language
  • Language and Linguistics

Cite this

APA

Zeiler, M. D., & Fergus, R. (2013). Stochastic pooling for regularization of deep convolutional neural networks. Paper presented at 1st International Conference on Learning Representations, ICLR 2013, Scottsdale, United States.

Standard

Stochastic pooling for regularization of deep convolutional neural networks. / Zeiler, Matthew D.; Fergus, Robert.

2013. Paper presented at 1st International Conference on Learning Representations, ICLR 2013, Scottsdale, United States.

Research output: Contribution to conference › Paper

Harvard

Zeiler, MD & Fergus, R 2013, 'Stochastic pooling for regularization of deep convolutional neural networks', Paper presented at 1st International Conference on Learning Representations, ICLR 2013, Scottsdale, United States, 5/2/13 - 5/4/13.
Vancouver

Zeiler MD, Fergus R. Stochastic pooling for regularization of deep convolutional neural networks. 2013. Paper presented at 1st International Conference on Learning Representations, ICLR 2013, Scottsdale, United States.
Author

Zeiler, Matthew D.; Fergus, Robert. / Stochastic pooling for regularization of deep convolutional neural networks. Paper presented at 1st International Conference on Learning Representations, ICLR 2013, Scottsdale, United States.
BIBTEX

@conference{18649f4a350c46ea966eae5bb59d3623,
title = "Stochastic pooling for regularization of deep convolutional neural networks",
abstract = "We introduce a simple and effective method for regularizing large convolutional neural networks. We replace the conventional deterministic pooling operations with a stochastic procedure, randomly picking the activation within each pooling region according to a multinomial distribution, given by the activities within the pooling region. The approach is hyper-parameter free and can be combined with other regularization approaches, such as dropout and data augmentation. We achieve state-of-the-art performance on four image datasets, relative to other approaches that do not utilize data augmentation.",
author = "Zeiler, {Matthew D.} and Robert Fergus",
year = "2013",
month = "1",
day = "1",
language = "English (US)",
note = "1st International Conference on Learning Representations, ICLR 2013 ; Conference date: 02-05-2013 Through 04-05-2013",

}

RIS

TY - CONF

T1 - Stochastic pooling for regularization of deep convolutional neural networks

AU - Zeiler, Matthew D.

AU - Fergus, Robert

PY - 2013/1/1

Y1 - 2013/1/1

N2 - We introduce a simple and effective method for regularizing large convolutional neural networks. We replace the conventional deterministic pooling operations with a stochastic procedure, randomly picking the activation within each pooling region according to a multinomial distribution, given by the activities within the pooling region. The approach is hyper-parameter free and can be combined with other regularization approaches, such as dropout and data augmentation. We achieve state-of-the-art performance on four image datasets, relative to other approaches that do not utilize data augmentation.

AB - We introduce a simple and effective method for regularizing large convolutional neural networks. We replace the conventional deterministic pooling operations with a stochastic procedure, randomly picking the activation within each pooling region according to a multinomial distribution, given by the activities within the pooling region. The approach is hyper-parameter free and can be combined with other regularization approaches, such as dropout and data augmentation. We achieve state-of-the-art performance on four image datasets, relative to other approaches that do not utilize data augmentation.

UR - http://www.scopus.com/inward/record.url?scp=85061138694&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85061138694&partnerID=8YFLogxK

M3 - Paper

AN - SCOPUS:85061138694

ER -