Gaussian margin machines

Koby Crammer, Mehryar Mohri, Fernando Pereira

Research output: Contribution to journal › Article

Abstract

We introduce Gaussian Margin Machines (GMMs), which maintain a Gaussian distribution over weight vectors for binary classification. The learning algorithm for these machines seeks the least informative distribution that will classify the training data correctly with high probability. One formulation can be expressed as a convex constrained optimization problem whose solution can be represented linearly in terms of training instances and their inner and outer products, supporting kernelization. The algorithm admits a natural PAC-Bayesian justification and is shown to minimize a quantity directly related to a PAC-Bayesian generalization bound. A preliminary evaluation on handwriting recognition data shows that our algorithm improves on SVMs for the same task, achieving lower test error and lower test error variance.
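
A note on the abstract's "classify the training data correctly with high probability" constraint: under a Gaussian posterior over weight vectors it reduces to a deterministic margin condition, consistent with the abstract's description of a constrained optimization over distributions. The following is a minimal Python sketch of that reduction, not the authors' code; the names mu, Sigma, and eta are our own.

import numpy as np
from scipy.stats import norm

def correct_classification_prob(mu, Sigma, x, y):
    # Under w ~ N(mu, Sigma), the signed margin y * (w . x) is Gaussian
    # with mean y * (mu . x) and variance x^T Sigma x, so
    # Pr[y * (w . x) >= 0] = Phi(y * (mu . x) / sqrt(x^T Sigma x)).
    mean = y * mu.dot(x)
    std = np.sqrt(x.dot(Sigma).dot(x))
    return norm.cdf(mean / std)

def constraint_holds(mu, Sigma, x, y, eta=0.9):
    # Pr[correct] >= eta  <=>  y * (mu . x) >= Phi^{-1}(eta) * sqrt(x^T Sigma x).
    return y * mu.dot(x) >= norm.ppf(eta) * np.sqrt(x.dot(Sigma).dot(x))

# Toy check that the probabilistic and deterministic forms agree.
rng = np.random.default_rng(0)
mu, x = rng.normal(size=3), rng.normal(size=3)
Sigma = 0.1 * np.eye(3)
assert (correct_classification_prob(mu, Sigma, x, 1.0) >= 0.9) == constraint_holds(mu, Sigma, x, 1.0)

For eta > 1/2 the factor Phi^{-1}(eta) is positive, so the condition is a margin requirement scaled by the posterior's uncertainty along x.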

Original language: English (US)
Pages (from-to): 105-112
Number of pages: 8
Journal: Journal of Machine Learning Research
Volume: 5
State: Published - 2009

Fingerprint

  • Margin
  • Handwriting Recognition
  • Kernelization
  • Binary Classification
  • Convex Optimization
  • Constrained Optimization
  • Gaussian Distribution
  • Learning Algorithms
  • Generalization

ASJC Scopus subject areas

  • Artificial Intelligence
  • Software
  • Control and Systems Engineering
  • Statistics and Probability

Cite this

Crammer, K., Mohri, M., & Pereira, F. (2009). Gaussian margin machines. Journal of Machine Learning Research, 5, 105-112.

@article{aab242e4666c44f595d22f2d37aa96fd,
  title     = "Gaussian margin machines",
  abstract  = "We introduce Gaussian Margin Machines (GMMs), which maintain a Gaussian distribution over weight vectors for binary classification. The learning algorithm for these machines seeks the least informative distribution that will classify the training data correctly with high probability. One formulation can be expressed as a convex constrained optimization problem whose solution can be represented linearly in terms of training instances and their inner and outer products, supporting kernelization. The algorithm admits a natural PAC-Bayesian justification and is shown to minimize a quantity directly related to a PAC-Bayesian generalization bound. A preliminary evaluation on handwriting recognition data shows that our algorithm improves on SVMs for the same task, achieving lower test error and lower test error variance.",
  author    = "Koby Crammer and Mehryar Mohri and Fernando Pereira",
  year      = "2009",
  language  = "English (US)",
  volume    = "5",
  pages     = "105--112",
  journal   = "Journal of Machine Learning Research",
  issn      = "1532-4435",
  publisher = "Microtome Publishing",
}

TY  - JOUR
T1  - Gaussian margin machines
AU  - Crammer, Koby
AU  - Mohri, Mehryar
AU  - Pereira, Fernando
PY  - 2009
Y1  - 2009
N2  - We introduce Gaussian Margin Machines (GMMs), which maintain a Gaussian distribution over weight vectors for binary classification. The learning algorithm for these machines seeks the least informative distribution that will classify the training data correctly with high probability. One formulation can be expressed as a convex constrained optimization problem whose solution can be represented linearly in terms of training instances and their inner and outer products, supporting kernelization. The algorithm admits a natural PAC-Bayesian justification and is shown to minimize a quantity directly related to a PAC-Bayesian generalization bound. A preliminary evaluation on handwriting recognition data shows that our algorithm improves on SVMs for the same task, achieving lower test error and lower test error variance.
AB  - We introduce Gaussian Margin Machines (GMMs), which maintain a Gaussian distribution over weight vectors for binary classification. The learning algorithm for these machines seeks the least informative distribution that will classify the training data correctly with high probability. One formulation can be expressed as a convex constrained optimization problem whose solution can be represented linearly in terms of training instances and their inner and outer products, supporting kernelization. The algorithm admits a natural PAC-Bayesian justification and is shown to minimize a quantity directly related to a PAC-Bayesian generalization bound. A preliminary evaluation on handwriting recognition data shows that our algorithm improves on SVMs for the same task, achieving lower test error and lower test error variance.
UR  - http://www.scopus.com/inward/record.url?scp=84862273709&partnerID=8YFLogxK
UR  - http://www.scopus.com/inward/citedby.url?scp=84862273709&partnerID=8YFLogxK
M3  - Article
VL  - 5
SP  - 105
EP  - 112
JO  - Journal of Machine Learning Research
JF  - Journal of Machine Learning Research
SN  - 1532-4435
ER  -