### Abstract

We introduce Gaussian Margin Machines (GMMs), which maintain a Gaussian distribution over weight vectors for binary classification. The learning algorithm for these machines seeks the least informative distribution that will classify the training data correctly with high probability. One formulation can be expressed as a convex constrained optimization problem whose solution can be represented linearly in terms of training instances and their inner and outer products, supporting kernelization. The algorithm admits a natural PAC-Bayesian justification and is shown to minimize a quantity directly related to a PAC-Bayesian generalization bound. A preliminary evaluation on handwriting recognition data shows that our algorithm improves on SVMs for the same task, achieving lower test error and lower test error variance.
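As a rough illustration of the probabilistic constraint described in the abstract, the sketch below computes, for a Gaussian distribution N(μ, Σ) over weight vectors, the probability that a randomly drawn linear classifier labels an example correctly. The function name and the specific closed form (mean margin divided by an input-dependent standard deviation) are our own illustrative assumptions based only on the abstract, not the paper's exact formulation.

```python
import math
import numpy as np

def correct_label_prob(mu, Sigma, x, y):
    """Pr[sign(w . x) == y] for w ~ N(mu, Sigma) -- illustrative sketch.

    For Gaussian w, the signed margin y * (w . x) is itself Gaussian
    with mean y * (mu . x) and variance x^T Sigma x, so the probability
    of a correct classification is the standard normal CDF of their ratio.
    """
    mean_margin = y * float(mu @ x)
    std_margin = math.sqrt(float(x @ Sigma @ x))
    z = mean_margin / std_margin
    # Standard normal CDF via the error function (no SciPy needed).
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# A confident mean weight vector with small covariance classifies
# this example correctly with probability close to 1.
mu = np.array([2.0, 0.0])
Sigma = 0.01 * np.eye(2)
x = np.array([1.0, 0.0])
print(correct_label_prob(mu, Sigma, x, y=1))
```

Requiring this probability to be at least 1 − δ on each training example is one plausible reading of "classify the training data correctly with high probability"; for a Gaussian posterior this kind of chance constraint reduces to a deterministic margin condition, which is consistent with the convex constrained formulation the abstract mentions.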

| Field | Value |
|---|---|
| Original language | English (US) |
| Pages (from-to) | 105-112 |
| Number of pages | 8 |
| Journal | Journal of Machine Learning Research |
| Volume | 5 |
| State | Published - 2009 |

### ASJC Scopus subject areas

- Artificial Intelligence
- Software
- Control and Systems Engineering
- Statistics and Probability

### Cite this

Crammer, K., Mohri, M., & Pereira, F. (2009). Gaussian margin machines. *Journal of Machine Learning Research*, *5*, 105-112.

Research output: Contribution to journal › Article

```
TY  - JOUR
T1  - Gaussian margin machines
AU  - Crammer, Koby
AU  - Mohri, Mehryar
AU  - Pereira, Fernando
PY  - 2009
Y1  - 2009
N2  - We introduce Gaussian Margin Machines (GMMs), which maintain a Gaussian distribution over weight vectors for binary classification. The learning algorithm for these machines seeks the least informative distribution that will classify the training data correctly with high probability. One formulation can be expressed as a convex constrained optimization problem whose solution can be represented linearly in terms of training instances and their inner and outer products, supporting kernelization. The algorithm admits a natural PAC-Bayesian justification and is shown to minimize a quantity directly related to a PAC-Bayesian generalization bound. A preliminary evaluation on handwriting recognition data shows that our algorithm improves on SVMs for the same task, achieving lower test error and lower test error variance.
AB  - We introduce Gaussian Margin Machines (GMMs), which maintain a Gaussian distribution over weight vectors for binary classification. The learning algorithm for these machines seeks the least informative distribution that will classify the training data correctly with high probability. One formulation can be expressed as a convex constrained optimization problem whose solution can be represented linearly in terms of training instances and their inner and outer products, supporting kernelization. The algorithm admits a natural PAC-Bayesian justification and is shown to minimize a quantity directly related to a PAC-Bayesian generalization bound. A preliminary evaluation on handwriting recognition data shows that our algorithm improves on SVMs for the same task, achieving lower test error and lower test error variance.
UR  - http://www.scopus.com/inward/record.url?scp=84862273709&partnerID=8YFLogxK
UR  - http://www.scopus.com/inward/citedby.url?scp=84862273709&partnerID=8YFLogxK
M3  - Article
VL  - 5
SP  - 105
EP  - 112
JO  - Journal of Machine Learning Research
JF  - Journal of Machine Learning Research
SN  - 1532-4435
ER  -
```