Computation of sparse low degree interpolating polynomials and their application to derivative-free optimization

Afonso Bandeira, K. Scheinberg, L. N. Vicente

Research output: Contribution to journal › Article

Abstract

Interpolation-based trust-region methods are an important class of algorithms for Derivative-Free Optimization which rely on locally approximating an objective function by quadratic polynomial interpolation models, frequently built from fewer points than there are basis components. Often, in practical applications, the contribution of the problem variables to the objective function is such that many pairwise correlations between variables are negligible, implying, in the smooth case, a sparse structure in the Hessian matrix. To be able to exploit Hessian sparsity, existing optimization approaches require knowledge of the sparsity structure. The goal of this paper is to develop and analyze a method where the sparse models are constructed automatically. The sparse recovery theory developed recently in the field of compressed sensing characterizes conditions under which a sparse vector can be accurately recovered from few random measurements. Such a recovery is achieved by minimizing the l1-norm of a vector subject to the measurement constraints. We suggest an approach for building sparse quadratic polynomial interpolation models by minimizing the l1-norm of the entries of the model Hessian subject to the interpolation conditions. We show that this procedure recovers accurate models when the function Hessian is sparse, using relatively few randomly selected sample points. Motivated by this result, we developed a practical interpolation-based trust-region method using deterministic sample sets and minimum l1-norm quadratic models. Our computational results show that the new approach exhibits promising numerical performance both in the general case and in the sparse one.
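
To make the model-building idea concrete, below is a minimal sketch (not the authors' implementation) of the l1-minimization step described in the abstract: an underdetermined quadratic interpolation model is fitted to a handful of sample points, and the remaining degrees of freedom are resolved by minimizing the l1-norm of the entries of the model Hessian. The use of CVXPY, the test function, and the sample sizes are illustrative assumptions, not part of the paper.

```python
# Sketch of sparse quadratic model recovery via l1-minimization of the Hessian
# entries subject to interpolation conditions (illustrative, not the authors' code).
import numpy as np
import cvxpy as cp

def sparse_quadratic_model(points, fvals):
    """Build m(y) = c + g'y + 0.5 y'Hy interpolating (points, fvals),
    choosing, among all interpolants, one whose Hessian has minimum l1-norm."""
    n = points.shape[1]
    c = cp.Variable()
    g = cp.Variable(n)
    H = cp.Variable((n, n), symmetric=True)

    # Interpolation conditions m(y_i) = f(y_i). Since y'Hy = sum(H * y y'),
    # each condition is affine in the unknowns (c, g, H).
    constraints = [
        c + g @ y + 0.5 * cp.sum(cp.multiply(H, np.outer(y, y))) == f
        for y, f in zip(points, fvals)
    ]

    # Minimum l1-norm Hessian among all interpolants (promotes sparsity).
    prob = cp.Problem(cp.Minimize(cp.sum(cp.abs(H))), constraints)
    prob.solve()
    return c.value, g.value, H.value

# Illustrative use: a function with a sparse Hessian (only an x0*x1 interaction),
# sampled at fewer points than the number of quadratic basis functions.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, p = 6, 15                       # (n+1)(n+2)/2 = 28 basis functions > p samples
    f = lambda x: x[0] * x[1] + np.sum(x)
    Y = rng.uniform(-1.0, 1.0, size=(p, n))
    c, g, H = sparse_quadratic_model(Y, np.array([f(y) for y in Y]))
    print(np.round(H, 3))              # ideally close to the sparse true Hessian
```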

Original language: English (US)
Pages (from-to): 223-257
Number of pages: 35
Journal: Mathematical Programming
Volume: 134
Issue number: 1
DOI: 10.1007/s10107-012-0578-z
State: Published - Aug 2012

Keywords

  • Compressed sensing
  • Derivative-free optimization
  • Interpolation-based trust-region methods
  • L1-Minimization
  • Random sampling
  • Sparse recovery

ASJC Scopus subject areas

  • Mathematics (all)
  • Software

Cite this

Bandeira, A., Scheinberg, K., & Vicente, L. N. (2012). Computation of sparse low degree interpolating polynomials and their application to derivative-free optimization. Mathematical Programming, 134(1), 223-257. https://doi.org/10.1007/s10107-012-0578-z