Regularization of neural networks using DropConnect

Li Wan, Matthew Zeiler, Sixin Zhang, Yann LeCun, Rob Fergus

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

We introduce DropConnect, a generalization of Dropout (Hinton et al., 2012), for regularizing large fully-connected layers within neural networks. When training with Dropout, a randomly selected subset of activations are set to zero within each layer. DropConnect instead sets a randomly selected subset of weights within the network to zero. Each unit thus receives input from a random subset of units in the previous layer. We derive a bound on the generalization performance of both Dropout and DropConnect. We then evaluate DropConnect on a range of datasets, comparing to Dropout, and show state-of-the-art results on several image recognition benchmarks by aggregating multiple DropConnect-trained models.
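
The masking scheme described in the abstract can be illustrated in a few lines of code. The sketch below (NumPy, written for this page rather than taken from the paper) contrasts a fully-connected layer trained with Dropout, which zeroes a random subset of activations, with one trained with DropConnect, which zeroes a random subset of individual weights. The layer sizes, ReLU nonlinearity, and keep probability p = 0.5 are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout_layer(x, W, b, p=0.5, train=True):
    """Fully-connected ReLU layer with Dropout: mask activations."""
    a = np.maximum(0.0, x @ W + b)
    if train:
        mask = rng.random(a.shape) < p   # keep each activation with probability p
        return a * mask / p              # "inverted" scaling so the test-time pass is unchanged
    return a

def dropconnect_layer(x, W, b, p=0.5, train=True):
    """Fully-connected ReLU layer with DropConnect: mask individual weights."""
    if train:
        mask = rng.random(W.shape) < p   # keep each weight with probability p
        return np.maximum(0.0, x @ (W * mask) + b)
    # Simple mean-field shortcut at test time (scale weights by p); the paper
    # instead uses a Gaussian moment-matching approximation of the masked
    # pre-activation at inference.
    return np.maximum(0.0, x @ (W * p) + b)

# Toy usage: a batch of 4 examples through a 10 -> 8 layer.
x = rng.standard_normal((4, 10))
W = 0.1 * rng.standard_normal((10, 8))
b = np.zeros(8)
print(dropout_layer(x, W, b).shape)      # (4, 8)
print(dropconnect_layer(x, W, b).shape)  # (4, 8)
```

Note that with DropConnect each output unit sees a different random subset of incoming weights, which is why each unit "receives input from a random subset of units in the previous layer" rather than from a globally masked activation vector.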

Original language: English (US)
Title of host publication: 30th International Conference on Machine Learning, ICML 2013
Publisher: International Machine Learning Society (IMLS)
Pages: 2095-2103
Number of pages: 9
Edition: PART 3
State: Published - 2013
Event: 30th International Conference on Machine Learning, ICML 2013 - Atlanta, GA, United States
Duration: Jun 16, 2013 - Jun 21, 2013

Other

Other: 30th International Conference on Machine Learning, ICML 2013
Country: United States
City: Atlanta, GA
Period: 6/16/13 - 6/21/13

ASJC Scopus subject areas

  • Human-Computer Interaction
  • Sociology and Political Science

Cite this

Wan, L., Zeiler, M., Zhang, S., LeCun, Y., & Fergus, R. (2013). Regularization of neural networks using DropConnect. In 30th International Conference on Machine Learning, ICML 2013 (PART 3 ed., pp. 2095-2103). International Machine Learning Society (IMLS).

@inproceedings{922d9db80bf74ae99882cd46ba5887eb,
title = "Regularization of neural networks using DropConnect",
abstract = "We introduce DropConnect, a generalization of Dropout (Hinton et al., 2012), for regularizing large fully-connected layers within neural networks. When training with Dropout, a randomly selected subset of activations are set to zero within each layer. DropConnect instead sets a randomly selected subset of weights within the network to zero. Each unit thus receives input from a random subset of units in the previous layer. We derive a bound on the generalization performance of both Dropout and DropConnect. We then evaluate DropConnect on a range of datasets, comparing to Dropout, and show state-of-the-art results on several image recognition benchmarks by aggregating multiple DropConnect-trained models.",
author = "Li Wan and Matthew Zeiler and Sixin Zhang and Yann LeCun and Rob Fergus",
year = "2013",
language = "English (US)",
pages = "2095--2103",
booktitle = "30th International Conference on Machine Learning, ICML 2013",
publisher = "International Machine Learning Society (IMLS)",
edition = "PART 3",

}

TY - GEN

T1 - Regularization of neural networks using DropConnect

AU - Wan, Li

AU - Zeiler, Matthew

AU - Zhang, Sixin

AU - LeCun, Yann

AU - Fergus, Rob

PY - 2013

Y1 - 2013

N2 - We introduce DropConnect, a generalization of Dropout (Hinton et al., 2012), for regularizing large fully-connected layers within neural networks. When training with Dropout, a randomly selected subset of activations are set to zero within each layer. DropConnect instead sets a randomly selected subset of weights within the network to zero. Each unit thus receives input from a random subset of units in the previous layer. We derive a bound on the generalization performance of both Dropout and DropConnect. We then evaluate DropConnect on a range of datasets, comparing to Dropout, and show state-of-the-art results on several image recognition benchmarks by aggregating multiple DropConnect-trained models.

AB - We introduce DropConnect, a generalization of Dropout (Hinton et al., 2012), for regularizing large fully-connected layers within neural networks. When training with Dropout, a randomly selected subset of activations are set to zero within each layer. DropConnect instead sets a randomly selected subset of weights within the network to zero. Each unit thus receives input from a random subset of units in the previous layer. We derive a bound on the generalization performance of both Dropout and DropConnect. We then evaluate DropConnect on a range of datasets, comparing to Dropout, and show state-of-the-art results on several image recognition benchmarks by aggregating multiple DropConnect-trained models.

UR - http://www.scopus.com/inward/record.url?scp=84897550107&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84897550107&partnerID=8YFLogxK

M3 - Conference contribution

AN - SCOPUS:84897550107

SP - 2095

EP - 2103

BT - 30th International Conference on Machine Learning, ICML 2013

PB - International Machine Learning Society (IMLS)

ER -