On the impact of kernel approximation on learning accuracy

Corinna Cortes, Mehryar Mohri, Ameet Talwalkar

Research output: Contribution to journal › Article

Abstract

Kernel approximation is commonly used to scale kernel-based algorithms to applications containing as many as several million instances. This paper analyzes the effect of such approximations in the kernel matrix on the hypothesis generated by several widely used learning algorithms. We give stability bounds based on the norm of the kernel approximation for these algorithms, including support vector machines (SVMs), kernel ridge regression (KRR), and graph Laplacian-based regularization algorithms. These bounds help determine the degree of approximation that can be tolerated in the estimation of the kernel matrix. Our analysis is general and applies to arbitrary approximations of the kernel matrix. However, we also give a specific analysis of the Nyström low-rank approximation in this context and report the results of experiments evaluating the quality of the Nyström low-rank kernel approximation when used with ridge regression.
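To make the Nyström-plus-KRR setting described in the abstract concrete, the following is a minimal NumPy sketch, not the authors' code: it builds a rank-m Nyström approximation of an RBF kernel matrix, fits kernel ridge regression on both the exact and approximate matrices, and prints the spectral norm of the kernel perturbation, the quantity the paper's stability bounds are stated in terms of. The RBF kernel, uniform landmark sampling, regularization constant, and all function names are illustrative assumptions.

import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian RBF kernel matrix between the rows of X and Y (illustrative choice).
    sq = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * sq)

def nystrom_kernel(X, num_landmarks, gamma=1.0, seed=0):
    # Rank-m Nystrom approximation K~ = C W^+ C^T, where C holds the kernel
    # columns for m uniformly sampled landmarks and W is the kernel matrix
    # restricted to those landmarks.
    rng = np.random.default_rng(seed)
    idx = rng.choice(X.shape[0], size=num_landmarks, replace=False)
    C = rbf_kernel(X, X[idx], gamma)        # n x m
    W = C[idx, :]                           # m x m
    return C @ np.linalg.pinv(W) @ C.T      # n x n, rank <= m

def krr_fit_predict(K, y, lam=1e-2):
    # Kernel ridge regression on a (possibly approximate) kernel matrix:
    # alpha = (K + lam*I)^{-1} y, with in-sample predictions K @ alpha.
    alpha = np.linalg.solve(K + lam * np.eye(K.shape[0]), y)
    return K @ alpha

# Compare exact and Nystrom-approximated KRR on synthetic data.
rng = np.random.default_rng(0)
X = rng.standard_normal((500, 5))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(500)

K = rbf_kernel(X, X)
K_tilde = nystrom_kernel(X, num_landmarks=50)

h = krr_fit_predict(K, y)
h_tilde = krr_fit_predict(K_tilde, y)

# The paper's stability bounds control the hypothesis change via ||K~ - K||_2.
print("spectral norm of kernel perturbation:", np.linalg.norm(K_tilde - K, 2))
print("max hypothesis change:", np.max(np.abs(h_tilde - h)))

Increasing num_landmarks shrinks the spectral-norm perturbation and, consistent with the bounds discussed in the paper, the gap between the two hypotheses.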

Original language: English (US)
Pages (from-to): 113-120
Number of pages: 8
Journal: Journal of Machine Learning Research
Volume: 9
State: Published - 2010

Fingerprint

  • Kernel approximation
  • Learning algorithms
  • Graph Laplacian
  • Low-rank approximation
  • Degree of approximation
  • Ridge regression
  • Regularization
  • Norm
  • Experiments

ASJC Scopus subject areas

  • Artificial Intelligence
  • Software
  • Control and Systems Engineering
  • Statistics and Probability

Cite this

On the impact of kernel approximation on learning accuracy. / Cortes, Corinna; Mohri, Mehryar; Talwalkar, Ameet.

In: Journal of Machine Learning Research, Vol. 9, 2010, p. 113-120.

Research output: Contribution to journal › Article

@article{0b33e227a6964b1e88f0cda83a7197b0,
title = "On the impact of kernel approximation on learning accuracy",
abstract = "Kernel approximation is commonly used to scale kernel-based algorithms to applications containing as many as several million instances. This paper analyzes the effect of such approximations in the kernel matrix on the hypothesis generated by several widely used learning algorithms. We give stability bounds based on the norm of the kernel approximation for these algorithms, including SVMs, KRR, and graph Laplacian-based regularization algorithms. These bounds help determine the degree of approximation that can be tolerated in the estimation of the kernel matrix. Our analysis is general and applies to arbitrary approximations of the kernel matrix. However, we also give a specific analysis of the Nystr{\"o}m low-rank approximation in this context and report the results of experiments evaluating the quality of the Nystr{\"o}m low-rank kernel approximation when used with ridge regression.",
author = "Corinna Cortes and Mehryar Mohri and Ameet Talwalkar",
year = "2010",
language = "English (US)",
volume = "9",
pages = "113--120",
journal = "Journal of Machine Learning Research",
issn = "1532-4435",
publisher = "Microtome Publishing",
}

TY - JOUR

T1 - On the impact of kernel approximation on learning accuracy

AU - Cortes, Corinna

AU - Mohri, Mehryar

AU - Talwalkar, Ameet

PY - 2010

Y1 - 2010

N2 - Kernel approximation is commonly used to scale kernel-based algorithms to applications containing as many as several million instances. This paper analyzes the effect of such approximations in the kernel matrix on the hypothesis generated by several widely used learning algorithms. We give stability bounds based on the norm of the kernel approximation for these algorithms, including SVMs, KRR, and graph Laplacian-based regularization algorithms. These bounds help determine the degree of approximation that can be tolerated in the estimation of the kernel matrix. Our analysis is general and applies to arbitrary approximations of the kernel matrix. However, we also give a specific analysis of the Nyström low-rank approximation in this context and report the results of experiments evaluating the quality of the Nyström low-rank kernel approximation when used with ridge regression.

AB - Kernel approximation is commonly used to scale kernel-based algorithms to applications containing as many as several million instances. This paper analyzes the effect of such approximations in the kernel matrix on the hypothesis generated by several widely used learning algorithms. We give stability bounds based on the norm of the kernel approximation for these algorithms, including SVMs, KRR, and graph Laplacian-based regularization algorithms. These bounds help determine the degree of approximation that can be tolerated in the estimation of the kernel matrix. Our analysis is general and applies to arbitrary approximations of the kernel matrix. However, we also give a specific analysis of the Nyström low-rank approximation in this context and report the results of experiments evaluating the quality of the Nyström low-rank kernel approximation when used with ridge regression.

UR - http://www.scopus.com/inward/record.url?scp=84862278427&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84862278427&partnerID=8YFLogxK

M3 - Article

VL - 9

SP - 113

EP - 120

JO - Journal of Machine Learning Research

JF - Journal of Machine Learning Research

SN - 1532-4435

ER -