Generalization bounds for time series prediction with non-stationary processes

Vitaly Kuznetsov, Mehryar Mohri

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

This paper presents the first generalization bounds for time series prediction with a non-stationary mixing stochastic process. We prove Rademacher complexity learning bounds for both average-path generalization with non-stationary β-mixing processes and path-dependent generalization with non-stationary φ-mixing processes. Our guarantees are expressed in terms of β- or φ-mixing coefficients and a natural measure of discrepancy between training and target distributions. They admit as special cases previous Rademacher complexity bounds for non-i.i.d. stationary distributions, for independent but not identically distributed random variables, or for the i.i.d. case. We show that, using a new sub-sample selection technique we introduce, our bounds can be tightened under the natural assumption of convergent stochastic processes. We also prove that fast learning rates can be achieved by extending existing local Rademacher complexity analysis to the non-i.i.d. setting.
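As an illustrative aside (not taken from the paper): the empirical Rademacher complexity that drives bounds of this kind measures how well a hypothesis class can correlate with random signs on the sample. For a finite class represented by a hypothetical per-sample loss matrix, it can be estimated by straightforward Monte Carlo over random sign vectors:

```python
import numpy as np

def empirical_rademacher_complexity(loss_matrix, n_draws=1000, seed=0):
    """Monte Carlo estimate of the empirical Rademacher complexity
    R_hat = E_sigma[ sup_h (1/m) * sum_i sigma_i * loss(h, z_i) ]
    for a finite hypothesis class, given a (num_hypotheses x m)
    matrix of per-sample losses (hypothetical inputs for illustration)."""
    rng = np.random.default_rng(seed)
    _, m = loss_matrix.shape
    total = 0.0
    for _ in range(n_draws):
        sigma = rng.choice([-1.0, 1.0], size=m)   # i.i.d. Rademacher signs
        total += np.max(loss_matrix @ sigma) / m  # sup over the finite class
    return total / n_draws

# Example: 5 hypothetical predictors evaluated on 200 sample points,
# with per-sample losses in [0, 1].
rng = np.random.default_rng(1)
losses = rng.random((5, 200))
r_hat = empirical_rademacher_complexity(losses)
```

This sketch only covers the i.i.d.-style quantity; the paper's contribution is precisely to control analogous quantities under non-stationary β- and φ-mixing, where the sample points are dependent and a discrepancy term between training and target distributions enters the bound.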

Original language: English (US)
Title of host publication: Algorithmic Learning Theory - 25th International Conference, ALT 2014, Proceedings
Publisher: Springer Verlag
Pages: 260-274
Number of pages: 15
Volume: 8776
ISBN (Print): 9783319116617
State: Published - 2014
Event: 25th International Conference on Algorithmic Learning Theory, ALT 2014 - Bled, Slovenia
Duration: Oct 8 2014 – Oct 10 2014

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 8776
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Other

Other: 25th International Conference on Algorithmic Learning Theory, ALT 2014
Country: Slovenia
City: Bled
Period: 10/8/14 – 10/10/14

Keywords

  • Fast rates
  • Generalization bounds
  • Local Rademacher complexity
  • Mixing
  • Stationary processes
  • Time series

ASJC Scopus subject areas

  • Computer Science (all)
  • Theoretical Computer Science

Cite this

Kuznetsov, V., & Mohri, M. (2014). Generalization bounds for time series prediction with non-stationary processes. In Algorithmic Learning Theory - 25th International Conference, ALT 2014, Proceedings (Vol. 8776, pp. 260-274). (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 8776). Springer Verlag.

@inproceedings{aceea7bb6312431794d6fcd05507b2ea,
title = "Generalization bounds for time series prediction with non-stationary processes",
abstract = "This paper presents the first generalization bounds for time series prediction with a non-stationary mixing stochastic process. We prove Rademacher complexity learning bounds for both average-path generalization with non-stationary β-mixing processes and path-dependent generalization with non-stationary φ-mixing processes. Our guarantees are expressed in terms of β- or φ-mixing coefficients and a natural measure of discrepancy between training and target distributions. They admit as special cases previous Rademacher complexity bounds for non-i.i.d. stationary distributions, for independent but not identically distributed random variables, or for the i.i.d. case. We show that, using a new sub-sample selection technique we introduce, our bounds can be tightened under the natural assumption of convergent stochastic processes. We also prove that fast learning rates can be achieved by extending existing local Rademacher complexity analysis to the non-i.i.d. setting.",
keywords = "Fast rates, Generalization bounds, Local Rademacher complexity, Mixing, Stationary processes, Time series",
author = "Vitaly Kuznetsov and Mehryar Mohri",
year = "2014",
language = "English (US)",
isbn = "9783319116617",
volume = "8776",
series = "Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)",
publisher = "Springer Verlag",
pages = "260--274",
booktitle = "Algorithmic Learning Theory - 25th International Conference, ALT 2014, Proceedings",

}

TY - GEN

T1 - Generalization bounds for time series prediction with non-stationary processes

AU - Kuznetsov, Vitaly

AU - Mohri, Mehryar

PY - 2014

Y1 - 2014

N2 - This paper presents the first generalization bounds for time series prediction with a non-stationary mixing stochastic process. We prove Rademacher complexity learning bounds for both average-path generalization with non-stationary β-mixing processes and path-dependent generalization with non-stationary φ-mixing processes. Our guarantees are expressed in terms of β- or φ-mixing coefficients and a natural measure of discrepancy between training and target distributions. They admit as special cases previous Rademacher complexity bounds for non-i.i.d. stationary distributions, for independent but not identically distributed random variables, or for the i.i.d. case. We show that, using a new sub-sample selection technique we introduce, our bounds can be tightened under the natural assumption of convergent stochastic processes. We also prove that fast learning rates can be achieved by extending existing local Rademacher complexity analysis to the non-i.i.d. setting.

AB - This paper presents the first generalization bounds for time series prediction with a non-stationary mixing stochastic process. We prove Rademacher complexity learning bounds for both average-path generalization with non-stationary β-mixing processes and path-dependent generalization with non-stationary φ-mixing processes. Our guarantees are expressed in terms of β- or φ-mixing coefficients and a natural measure of discrepancy between training and target distributions. They admit as special cases previous Rademacher complexity bounds for non-i.i.d. stationary distributions, for independent but not identically distributed random variables, or for the i.i.d. case. We show that, using a new sub-sample selection technique we introduce, our bounds can be tightened under the natural assumption of convergent stochastic processes. We also prove that fast learning rates can be achieved by extending existing local Rademacher complexity analysis to the non-i.i.d. setting.

KW - Fast rates

KW - Generalization bounds

KW - Local Rademacher complexity

KW - Mixing

KW - Stationary processes

KW - Time series

UR - http://www.scopus.com/inward/record.url?scp=84910026013&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84910026013&partnerID=8YFLogxK

M3 - Conference contribution

SN - 9783319116617

VL - 8776

T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

SP - 260

EP - 274

BT - Algorithmic Learning Theory - 25th International Conference, ALT 2014, Proceedings

PB - Springer Verlag

ER -