Learning bounds for importance weighting

Corinna Cortes, Yishay Mansour, Mehryar Mohri

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

This paper presents an analysis of importance weighting for learning from finite samples and gives a series of theoretical and algorithmic results. We point out simple cases where importance weighting can fail, which suggests the need for an analysis of the properties of this technique. We then give both upper and lower bounds for generalization with bounded importance weights and, more significantly, give learning guarantees for the more common case of unbounded importance weights under the weak assumption that the second moment is bounded, a condition related to the Rényi divergence of the training and test distributions. These results are based on a series of novel and general bounds we derive for unbounded loss functions, which are of independent interest. We use these bounds to guide the definition of an alternative reweighting algorithm and report the results of experiments demonstrating its benefits. Finally, we analyze the properties of normalized importance weights which are also commonly used.
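To make the abstract's second-moment condition concrete: writing Q for the training distribution, P for the test distribution, and w(x) = P(x)/Q(x) for the importance weight, the second moment of the weights under Q satisfies E_{x~Q}[w^2(x)] = 2^{D_2(P||Q)}, where D_2 denotes the Rényi divergence of order 2; bounding this moment is therefore equivalent to bounding that divergence. The sketch below is purely illustrative, not the paper's reweighting algorithm: the Gaussian train/test densities and all names are assumptions chosen so that the densities, and hence the exact weights, are known. It shows importance weighting correcting a loss estimate under a shift between training and test distributions.

import numpy as np

# Minimal illustration (assumed toy setup, not the paper's algorithm):
# train on samples from Q = N(0, 1), estimate the test-time mean of a loss
# under P = N(1, 1) using importance weights w(x) = P(x)/Q(x).
rng = np.random.default_rng(0)

def gaussian_density(x, mu, sigma=1.0):
    return np.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))

x_train = rng.normal(0.0, 1.0, size=100_000)   # samples from Q
loss = (x_train - 1.0) ** 2                    # a loss evaluated on the training sample

w = gaussian_density(x_train, 1.0) / gaussian_density(x_train, 0.0)

print("unweighted estimate:", loss.mean())        # estimates E_Q[loss] ~ 2, biased for P
print("weighted estimate:  ", np.mean(w * loss))  # unbiased estimate of E_P[loss] ~ 1
print("empirical E_Q[w^2]: ", np.mean(w ** 2))    # finite here: 2^{D_2(P||Q)} = e

On this toy pair of Gaussians the weighted estimate recovers the test-time expectation, and the empirical second moment of the weights (here about e ≈ 2.72) is the kind of divergence-controlled quantity that the abstract's guarantees for unbounded weights depend on.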

Original language: English (US)
Title of host publication: Advances in Neural Information Processing Systems 23: 24th Annual Conference on Neural Information Processing Systems 2010, NIPS 2010
State: Published - 2010
Event: 24th Annual Conference on Neural Information Processing Systems 2010, NIPS 2010 - Vancouver, BC, Canada
Duration: Dec 6 2010 - Dec 9 2010

Other

Other: 24th Annual Conference on Neural Information Processing Systems 2010, NIPS 2010
Country: Canada
City: Vancouver, BC
Period: 12/6/10 - 12/9/10

ASJC Scopus subject areas

  • Information Systems

Cite this

Cortes, C., Mansour, Y., & Mohri, M. (2010). Learning bounds for importance weighting. In Advances in Neural Information Processing Systems 23: 24th Annual Conference on Neural Information Processing Systems 2010, NIPS 2010.

Learning bounds for importance weighting. / Cortes, Corinna; Mansour, Yishay; Mohri, Mehryar.

Advances in Neural Information Processing Systems 23: 24th Annual Conference on Neural Information Processing Systems 2010, NIPS 2010. 2010.

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Cortes, C, Mansour, Y & Mohri, M 2010, 'Learning bounds for importance weighting', in Advances in Neural Information Processing Systems 23: 24th Annual Conference on Neural Information Processing Systems 2010, NIPS 2010. 24th Annual Conference on Neural Information Processing Systems 2010, NIPS 2010, Vancouver, BC, Canada, 12/6/10.
Cortes C, Mansour Y, Mohri M. Learning bounds for importance weighting. In Advances in Neural Information Processing Systems 23: 24th Annual Conference on Neural Information Processing Systems 2010, NIPS 2010. 2010.
Cortes, Corinna; Mansour, Yishay; Mohri, Mehryar. / Learning bounds for importance weighting. Advances in Neural Information Processing Systems 23: 24th Annual Conference on Neural Information Processing Systems 2010, NIPS 2010. 2010.
@inproceedings{5ea798ac52ca44e989842517c4a89ac2,
title = "Learning bounds for importance weighting",
abstract = "This paper presents an analysis of importance weighting for learning from finite samples and gives a series of theoretical and algorithmic results. We point out simple cases where importance weighting can fail, which suggests the need for an analysis of the properties of this technique. We then give both upper and lower bounds for generalization with bounded importance weights and, more significantly, give learning guarantees for the more common case of unbounded importance weights under the weak assumption that the second moment is bounded, a condition related to the R{\'e}nyi divergence of the training and test distributions. These results are based on a series of novel and general bounds we derive for unbounded loss functions, which are of independent interest. We use these bounds to guide the definition of an alternative reweighting algorithm and report the results of experiments demonstrating its benefits. Finally, we analyze the properties of normalized importance weights which are also commonly used.",
author = "Corinna Cortes and Yishay Mansour and Mehryar Mohri",
year = "2010",
language = "English (US)",
isbn = "9781617823800",
booktitle = "Advances in Neural Information Processing Systems 23: 24th Annual Conference on Neural Information Processing Systems 2010, NIPS 2010",

}

TY - GEN

T1 - Learning bounds for importance weighting

AU - Cortes, Corinna

AU - Mansour, Yishay

AU - Mohri, Mehryar

PY - 2010

Y1 - 2010

N2 - This paper presents an analysis of importance weighting for learning from finite samples and gives a series of theoretical and algorithmic results. We point out simple cases where importance weighting can fail, which suggests the need for an analysis of the properties of this technique. We then give both upper and lower bounds for generalization with bounded importance weights and, more significantly, give learning guarantees for the more common case of unbounded importance weights under the weak assumption that the second moment is bounded, a condition related to the Rényi divergence of the training and test distributions. These results are based on a series of novel and general bounds we derive for unbounded loss functions, which are of independent interest. We use these bounds to guide the definition of an alternative reweighting algorithm and report the results of experiments demonstrating its benefits. Finally, we analyze the properties of normalized importance weights which are also commonly used.

AB - This paper presents an analysis of importance weighting for learning from finite samples and gives a series of theoretical and algorithmic results. We point out simple cases where importance weighting can fail, which suggests the need for an analysis of the properties of this technique. We then give both upper and lower bounds for generalization with bounded importance weights and, more significantly, give learning guarantees for the more common case of unbounded importance weights under the weak assumption that the second moment is bounded, a condition related to the Rényi divergence of the training and test distributions. These results are based on a series of novel and general bounds we derive for unbounded loss functions, which are of independent interest. We use these bounds to guide the definition of an alternative reweighting algorithm and report the results of experiments demonstrating its benefits. Finally, we analyze the properties of normalized importance weights which are also commonly used.

UR - http://www.scopus.com/inward/record.url?scp=84860631607&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84860631607&partnerID=8YFLogxK

M3 - Conference contribution

SN - 9781617823800

BT - Advances in Neural Information Processing Systems 23: 24th Annual Conference on Neural Information Processing Systems 2010, NIPS 2010

ER -