Semistochastic quadratic bound methods

Aleksandr Aravkin, Anna Choromanska, Tony Jebara, Dimitri Kanevsky

Research output: Contribution to conference › Paper

Abstract

Partition functions arise in a variety of settings, including conditional random fields, logistic regression, and latent Gaussian models. In this paper, we consider semistochastic quadratic bound (SQB) methods for maximum likelihood estimation based on partition function optimization. Batch methods based on the quadratic bound were recently proposed for this class of problems, and performed favorably in comparison to state-of-the-art techniques. Semistochastic methods fall in between batch algorithms, which use all the data, and stochastic gradient type methods, which use small random selections at each iteration. We build semistochastic quadratic bound-based methods, and prove both global convergence (to a stationary point) under very weak assumptions, and linear convergence rate under stronger assumptions on the objective. To make the proposed methods faster and more stable, we consider inexact subproblem minimization and batch-size selection schemes. The efficacy of SQB methods is demonstrated via comparison with several state-of-the-art techniques on commonly used datasets.
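The semistochastic idea in the abstract can be illustrated with a minimal sketch: take quadratic-bound steps on mini-batches whose size grows over the run, so early iterations behave like a stochastic method and later ones approach the batch method. The sketch below applies this to l2-regularized logistic regression using Böhning's classical curvature bound (the logistic Hessian is dominated by (1/4)·XᵀX). This is an illustrative reconstruction, not the authors' implementation; the function name, parameters, and batch-growth schedule are all hypothetical choices.

```python
import numpy as np

def semistochastic_qb_logistic(X, y, lam=0.1, n_iter=50, b0=8, growth=1.5, seed=0):
    """Semistochastic quadratic-bound sketch for l2-regularized logistic
    regression (illustrative; not the paper's SQB algorithm verbatim).

    Each step minimizes a quadratic upper bound on the mini-batch objective,
    built from Boehning's curvature bound (Hessian <= 1/4 X^T X), and the
    batch size grows geometrically toward the full dataset.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    batch = float(b0)
    for _ in range(n_iter):
        # Draw a mini-batch; later iterations use larger (eventually full) batches.
        idx = rng.choice(n, size=min(int(batch), n), replace=False)
        Xs, ys = X[idx], y[idx]
        p = 1.0 / (1.0 + np.exp(-Xs @ w))                   # sigmoid predictions
        g = Xs.T @ (p - ys) / len(idx) + lam * w            # stochastic gradient
        S = 0.25 * Xs.T @ Xs / len(idx) + lam * np.eye(d)   # bound curvature
        w -= np.linalg.solve(S, g)                          # minimize the bound
        batch *= growth                                     # semistochastic schedule
    return w
```

The fixed curvature matrix is what distinguishes bound majorization from Newton's method: the step is guaranteed to decrease the (mini-batch) bound regardless of the current iterate, and the growing batch size trades per-iteration cost for gradient accuracy as the iterate nears a stationary point.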

Original language: English (US)
State: Published - Jan 1 2014
Event: 2nd International Conference on Learning Representations, ICLR 2014 - Banff, Canada
Duration: Apr 14 2014 - Apr 16 2014

Conference

Conference: 2nd International Conference on Learning Representations, ICLR 2014
Country: Canada
City: Banff
Period: 4/14/14 - 4/16/14


ASJC Scopus subject areas

  • Computer Science Applications
  • Linguistics and Language
  • Language and Linguistics
  • Education

Cite this

Aravkin, A., Choromanska, A., Jebara, T., & Kanevsky, D. (2014). Semistochastic quadratic bound methods. Paper presented at 2nd International Conference on Learning Representations, ICLR 2014, Banff, Canada.

