Algorithms and theory for multiple-source adaptation

Judy Hoffman, Mehryar Mohri, Ningshan Zhang

Research output: Contribution to journal › Conference article

Abstract

We present a number of novel contributions to the multiple-source adaptation problem. We derive new normalized solutions with strong theoretical guarantees for the cross-entropy loss and other similar losses. We also provide new guarantees that hold in the case where the conditional probabilities for the source domains are distinct. Moreover, we give new algorithms for determining the distribution-weighted combination solution for the cross-entropy loss and other losses. We report the results of a series of experiments with real-world datasets. We find that our algorithm outperforms competing approaches by producing a single robust model that performs well on any target mixture distribution. Altogether, our theory, algorithms, and empirical results provide a full solution for the multiple-source adaptation problem with very practical benefits.
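The abstract refers to a distribution-weighted combination of source models. As a minimal illustrative sketch (not the authors' implementation), the standard form of such a combination weights each source predictor h_k by its mixture-weighted domain density z_k · D_k(x); all function and parameter names below are hypothetical:

```python
import numpy as np

def distribution_weighted_combination(x, densities, predictors, z, eps=1e-12):
    """Sketch of a distribution-weighted combination of source predictors:
        h_z(x) = sum_k z_k * D_k(x) * h_k(x) / sum_k z_k * D_k(x)
    densities:  list of callables D_k estimating each source-domain density
    predictors: list of callables h_k, the per-source predictors
    z:          mixture weights over the sources (nonnegative, sum to 1)
    """
    # Unnormalized per-source weights at this input point x.
    d = np.array([z_k * D_k(x) for z_k, D_k in zip(z, densities)])
    preds = np.array([h_k(x) for h_k in predictors])
    # Normalize so the weights sum to one; eps guards against division by zero.
    w = d / (d.sum() + eps)
    return float(w @ preds)

# Toy usage with two constant-density sources (purely illustrative):
densities = [lambda x: 0.8, lambda x: 0.2]
predictors = [lambda x: 1.0, lambda x: 0.0]
out = distribution_weighted_combination(0.0, densities, predictors, z=[0.5, 0.5])
# With these toy inputs, the first source gets weight 0.8, so out ≈ 0.8.
```

The key point of this form is that the weighting is input-dependent: a source predictor gains influence exactly where its domain density is high, which is what lets a single model remain robust to any target mixture of the source distributions.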

Original language: English (US)
Pages (from-to): 8246-8256
Number of pages: 11
Journal: Advances in Neural Information Processing Systems
Volume: 2018-December
State: Published - Jan 1 2018
Event: 32nd Conference on Neural Information Processing Systems, NeurIPS 2018 - Montreal, Canada
Duration: Dec 2 2018 - Dec 8 2018


ASJC Scopus subject areas

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing

Cite this

@article{1b773bea955f433bb88b4d20614a519a,
  title    = "Algorithms and theory for multiple-source adaptation",
  abstract = "We present a number of novel contributions to the multiple-source adaptation problem. We derive new normalized solutions with strong theoretical guarantees for the cross-entropy loss and other similar losses. We also provide new guarantees that hold in the case where the conditional probabilities for the source domains are distinct. Moreover, we give new algorithms for determining the distribution-weighted combination solution for the cross-entropy loss and other losses. We report the results of a series of experiments with real-world datasets. We find that our algorithm outperforms competing approaches by producing a single robust model that performs well on any target mixture distribution. Altogether, our theory, algorithms, and empirical results provide a full solution for the multiple-source adaptation problem with very practical benefits.",
  author   = "Judy Hoffman and Mehryar Mohri and Ningshan Zhang",
  year     = "2018",
  month    = "1",
  day      = "1",
  language = "English (US)",
  volume   = "2018-December",
  pages    = "8246--8256",
  journal  = "Advances in Neural Information Processing Systems",
  issn     = "1049-5258",
}

TY  - JOUR
T1  - Algorithms and theory for multiple-source adaptation
AU  - Hoffman, Judy
AU  - Mohri, Mehryar
AU  - Zhang, Ningshan
PY  - 2018/1/1
Y1  - 2018/1/1
N2  - We present a number of novel contributions to the multiple-source adaptation problem. We derive new normalized solutions with strong theoretical guarantees for the cross-entropy loss and other similar losses. We also provide new guarantees that hold in the case where the conditional probabilities for the source domains are distinct. Moreover, we give new algorithms for determining the distribution-weighted combination solution for the cross-entropy loss and other losses. We report the results of a series of experiments with real-world datasets. We find that our algorithm outperforms competing approaches by producing a single robust model that performs well on any target mixture distribution. Altogether, our theory, algorithms, and empirical results provide a full solution for the multiple-source adaptation problem with very practical benefits.
AB  - We present a number of novel contributions to the multiple-source adaptation problem. We derive new normalized solutions with strong theoretical guarantees for the cross-entropy loss and other similar losses. We also provide new guarantees that hold in the case where the conditional probabilities for the source domains are distinct. Moreover, we give new algorithms for determining the distribution-weighted combination solution for the cross-entropy loss and other losses. We report the results of a series of experiments with real-world datasets. We find that our algorithm outperforms competing approaches by producing a single robust model that performs well on any target mixture distribution. Altogether, our theory, algorithms, and empirical results provide a full solution for the multiple-source adaptation problem with very practical benefits.
UR  - http://www.scopus.com/inward/record.url?scp=85064817980&partnerID=8YFLogxK
UR  - http://www.scopus.com/inward/citedby.url?scp=85064817980&partnerID=8YFLogxK
M3  - Conference article
VL  - 2018-December
SP  - 8246
EP  - 8256
JO  - Advances in Neural Information Processing Systems
JF  - Advances in Neural Information Processing Systems
SN  - 1049-5258
ER  -