### Abstract

This paper studies the general problem of learning kernels based on a polynomial combination of base kernels. We analyze this problem in the case of regression and the kernel ridge regression algorithm. We examine the corresponding learning kernel optimization problem, show how that minimax problem can be reduced to a simpler minimization problem, and prove that the global solution of this problem always lies on the boundary. We give a projection-based gradient descent algorithm for solving the optimization problem, which is shown empirically to converge in a few iterations. Finally, we report the results of extensive experiments with this algorithm on several publicly available datasets, demonstrating the effectiveness of our technique.
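The procedure the abstract describes can be sketched in a few lines. This is an illustrative reading only: it assumes a quadratic (degree-2) combination given by the Hadamard square of a linear mix, `K_mu = (sum_k mu_k K_k) ∘ (sum_k mu_k K_k)`, one member of the polynomial family studied, and it projects onto the probability simplex as one possible feasible set (the paper's actual constraint region differs). After the minimax reduction, the objective to minimize over the combination weights is `y^T (K_mu + lam*I)^{-1} y`, which a projection-based gradient descent can attack directly. All function names below are hypothetical.

```python
import numpy as np

def kernel_combination(mus, base_kernels):
    # Quadratic combination: Hadamard (elementwise) square of the linear mix.
    lin = sum(m * K for m, K in zip(mus, base_kernels))
    return lin * lin

def krr_objective(mus, base_kernels, y, lam):
    # After the minimax reduction, the objective is y^T (K_mu + lam I)^{-1} y.
    K = kernel_combination(mus, base_kernels)
    alpha = np.linalg.solve(K + lam * np.eye(len(y)), y)
    return y @ alpha, alpha

def project_simplex(v):
    # Euclidean projection onto the probability simplex (an illustrative
    # choice of feasible set, not the paper's exact constraint region).
    u = np.sort(v)[::-1]
    css = np.cumsum(u)
    rho = np.nonzero(u + (1.0 - css) / (np.arange(len(v)) + 1) > 0)[0][-1]
    theta = (1.0 - css[rho]) / (rho + 1)
    return np.maximum(v + theta, 0.0)

def learn_kernel(base_kernels, y, lam=1.0, eta=0.1, iters=100):
    # Projection-based gradient descent on the reduced minimization problem.
    p = len(base_kernels)
    mus = np.full(p, 1.0 / p)
    for _ in range(iters):
        _, alpha = krr_objective(mus, base_kernels, y, lam)
        lin = sum(m * K for m, K in zip(mus, base_kernels))
        # d(K_mu)/d(mu_k) = 2 * K_k ∘ lin, so the objective's gradient
        # component is -alpha^T (2 * K_k ∘ lin) alpha.
        grad = np.array([-alpha @ ((2.0 * Kk * lin) @ alpha)
                         for Kk in base_kernels])
        mus = project_simplex(mus - eta * grad)
    return mus
```

Each iteration solves one regularized linear system and takes one projected gradient step, which is consistent with the abstract's observation that few iterations suffice in practice.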

| Original language | English (US) |
| --- | --- |
| Title of host publication | Advances in Neural Information Processing Systems 22 - Proceedings of the 2009 Conference |
| Pages | 396-404 |
| Number of pages | 9 |
| State | Published - 2009 |
| Event | 23rd Annual Conference on Neural Information Processing Systems, NIPS 2009 - Vancouver, BC, Canada. Duration: Dec 7 2009 → Dec 10 2009 |

### Other

| Other | 23rd Annual Conference on Neural Information Processing Systems, NIPS 2009 |
| --- | --- |
| Country | Canada |
| City | Vancouver, BC |
| Period | 12/7/09 → 12/10/09 |


### ASJC Scopus subject areas

- Information Systems

### Cite this

**Learning non-linear combinations of kernels.** / Cortes, Corinna; Mohri, Mehryar; Rostamizadeh, Afshin.

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

*Advances in Neural Information Processing Systems 22 - Proceedings of the 2009 Conference.* pp. 396-404, 23rd Annual Conference on Neural Information Processing Systems, NIPS 2009, Vancouver, BC, Canada, 12/7/09.


TY - GEN

T1 - Learning non-linear combinations of kernels

AU - Cortes, Corinna

AU - Mohri, Mehryar

AU - Rostamizadeh, Afshin

PY - 2009

Y1 - 2009

N2 - This paper studies the general problem of learning kernels based on a polynomial combination of base kernels. We analyze this problem in the case of regression and the kernel ridge regression algorithm. We examine the corresponding learning kernel optimization problem, show how that minimax problem can be reduced to a simpler minimization problem, and prove that the global solution of this problem always lies on the boundary. We give a projection-based gradient descent algorithm for solving the optimization problem, shown empirically to converge in few iterations. Finally, we report the results of extensive experiments with this algorithm using several publicly available datasets demonstrating the effectiveness of our technique.

AB - This paper studies the general problem of learning kernels based on a polynomial combination of base kernels. We analyze this problem in the case of regression and the kernel ridge regression algorithm. We examine the corresponding learning kernel optimization problem, show how that minimax problem can be reduced to a simpler minimization problem, and prove that the global solution of this problem always lies on the boundary. We give a projection-based gradient descent algorithm for solving the optimization problem, shown empirically to converge in few iterations. Finally, we report the results of extensive experiments with this algorithm using several publicly available datasets demonstrating the effectiveness of our technique.

UR - http://www.scopus.com/inward/record.url?scp=84858743760&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84858743760&partnerID=8YFLogxK

M3 - Conference contribution

AN - SCOPUS:84858743760

SN - 9781615679119

SP - 396

EP - 404

BT - Advances in Neural Information Processing Systems 22 - Proceedings of the 2009 Conference

ER -