Learning kernels using local Rademacher complexity

Corinna Cortes, Marius Kloft, Mehryar Mohri

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

We use the notion of local Rademacher complexity to design new algorithms for learning kernels. Our algorithms thereby benefit from the sharper learning bounds based on that notion which, under certain general conditions, guarantee a faster convergence rate. We devise two new learning kernel algorithms: one based on a convex optimization problem for which we give an efficient solution using existing learning kernel techniques, and another one that can be formulated as a DC-programming problem for which we describe a solution in detail. We also report the results of experiments with both algorithms in both binary and multi-class classification tasks.
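
The abstract describes algorithms that learn a kernel as a combination of base kernels. As a rough, self-contained illustration of that setting (not the authors' actual optimization procedures), the Python sketch below forms a nonnegative combination of two Gaussian base kernels with hand-picked weights and trains an SVM on the precomputed combined kernel; the data, bandwidths, and weights are hypothetical, and in a learning-kernel method the weights would be optimized rather than fixed.

```python
import numpy as np
from sklearn.svm import SVC

def combine_kernels(kernel_mats, weights):
    """Weighted sum of precomputed base kernel matrices (weights >= 0)."""
    weights = np.asarray(weights, dtype=float)
    if np.any(weights < 0):
        raise ValueError("kernel combination weights must be nonnegative")
    return sum(w * K for w, K in zip(weights, kernel_mats))

def gaussian_kernel(X, gamma):
    """Gaussian (RBF) kernel matrix for the rows of X."""
    sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq_dists)

# Toy binary classification data (hypothetical).
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 5))
y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)

# Two base kernels at different bandwidths; a learning-kernel algorithm
# would choose the combination weights below, here they are fixed by hand.
base_kernels = [gaussian_kernel(X, g) for g in (0.1, 1.0)]
K = combine_kernels(base_kernels, weights=[0.7, 0.3])

# Train an SVM on the combined, precomputed kernel matrix.
clf = SVC(kernel="precomputed").fit(K, y)
print("training accuracy:", clf.score(K, y))
```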

Original language: English (US)
Title of host publication: Advances in Neural Information Processing Systems
Publisher: Neural Information Processing Systems Foundation
State: Published - 2013
Event: 27th Annual Conference on Neural Information Processing Systems, NIPS 2013 - Lake Tahoe, NV, United States
Duration: Dec 5, 2013 - Dec 10, 2013

Other

Other: 27th Annual Conference on Neural Information Processing Systems, NIPS 2013
Country: United States
City: Lake Tahoe, NV
Period: 12/5/13 - 12/10/13

Fingerprint

  • Convex optimization
  • Learning algorithms
  • Experiments

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing

Cite this

Cortes, C., Kloft, M., & Mohri, M. (2013). Learning kernels using local Rademacher complexity. In Advances in Neural Information Processing Systems. Neural Information Processing Systems Foundation.
