### Abstract

This paper presents several novel generalization bounds for the problem of learning kernels based on a combinatorial analysis of the Rademacher complexity of the corresponding hypothesis sets. Our bound for learning kernels with a convex combination of p base kernels using L_{1} regularization admits only a √(log p) dependency on the number of kernels, which is tight and considerably more favorable than the previous best bound given for the same problem. We also give a novel bound for learning with a non-negative combination of p base kernels with an L_{2} regularization whose dependency on p is also tight and only in p^{1/4}. We present similar results for L_{q} regularization with other values of q, and outline the relevance of our proof techniques to the analysis of the complexity of the class of linear functions. Experiments with a large number of kernels further validate the behavior of the generalization error as a function of p predicted by our bounds.
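The √(log p) claim above concerns the Rademacher complexity of L_{1}-constrained multiple-kernel hypothesis sets. The following is a hedged sketch of the standard form such sets and bounds take in this literature; the symbols Λ, R, m, Δ_p and the exact shape of the bound are illustrative assumptions, not quoted from the paper itself:

```latex
% Hypothesis set for a convex (L1-constrained) combination of p base kernels
% K_1, ..., K_p, with combined kernel K_mu and feature map \Phi_\mu:
H_p = \bigl\{\, x \mapsto \langle w, \Phi_\mu(x) \rangle \;:\;
      \|w\| \le \Lambda,\ \mu \in \Delta_p \,\bigr\},
\qquad
K_\mu = \sum_{k=1}^{p} \mu_k K_k,\quad \mu_k \ge 0,\ \sum_{k=1}^{p} \mu_k = 1.

% Assuming \sup_x K_k(x, x) \le R^2 for every base kernel and a sample of
% size m, bounds of the type described in the abstract take the form
\widehat{\mathfrak{R}}_S(H_p) \;=\; O\!\left( \Lambda R \sqrt{\frac{\log p}{m}} \right),
% i.e., the dependency on the number of kernels p is only \sqrt{\log p}.
```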

Original language | English (US) |
---|---|

Title of host publication | ICML 2010 - Proceedings, 27th International Conference on Machine Learning |

Pages | 247-254 |

Number of pages | 8 |

State | Published - 2010 |

Event | 27th International Conference on Machine Learning, ICML 2010 - Haifa, Israel Duration: Jun 21 2010 → Jun 25 2010 |

### Other

Other | 27th International Conference on Machine Learning, ICML 2010 |
---|---|

Country | Israel |

City | Haifa |

Period | 6/21/10 → 6/25/10 |


### ASJC Scopus subject areas

- Artificial Intelligence
- Education

### Cite this

Cortes, C., Mohri, M., & Rostamizadeh, A. (2010). Generalization bounds for learning kernels. In *ICML 2010 - Proceedings, 27th International Conference on Machine Learning* (pp. 247-254). 27th International Conference on Machine Learning, ICML 2010, Haifa, Israel, June 21-25, 2010.

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

TY - GEN

T1 - Generalization bounds for learning kernels

AU - Cortes, Corinna

AU - Mohri, Mehryar

AU - Rostamizadeh, Afshin

PY - 2010

Y1 - 2010

N2 - This paper presents several novel generalization bounds for the problem of learning kernels based on a combinatorial analysis of the Rademacher complexity of the corresponding hypothesis sets. Our bound for learning kernels with a convex combination of p base kernels using L1 regularization admits only a √(log p) dependency on the number of kernels, which is tight and considerably more favorable than the previous best bound given for the same problem. We also give a novel bound for learning with a non-negative combination of p base kernels with an L2 regularization whose dependency on p is also tight and only in p^(1/4). We present similar results for Lq regularization with other values of q, and outline the relevance of our proof techniques to the analysis of the complexity of the class of linear functions. Experiments with a large number of kernels further validate the behavior of the generalization error as a function of p predicted by our bounds.

AB - This paper presents several novel generalization bounds for the problem of learning kernels based on a combinatorial analysis of the Rademacher complexity of the corresponding hypothesis sets. Our bound for learning kernels with a convex combination of p base kernels using L1 regularization admits only a √(log p) dependency on the number of kernels, which is tight and considerably more favorable than the previous best bound given for the same problem. We also give a novel bound for learning with a non-negative combination of p base kernels with an L2 regularization whose dependency on p is also tight and only in p^(1/4). We present similar results for Lq regularization with other values of q, and outline the relevance of our proof techniques to the analysis of the complexity of the class of linear functions. Experiments with a large number of kernels further validate the behavior of the generalization error as a function of p predicted by our bounds.

UR - http://www.scopus.com/inward/record.url?scp=77956550918&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=77956550918&partnerID=8YFLogxK

M3 - Conference contribution

SN - 9781605589077

SP - 247

EP - 254

BT - ICML 2010 - Proceedings, 27th International Conference on Machine Learning

ER -