A quadratically convergent method for minimizing a sum of Euclidean norms

Research output: Contribution to journal › Article

Abstract

We consider the problem of minimizing a sum of Euclidean norms, F(x) = Σ_{i=1}^m ‖r_i(x)‖, where the residuals {r_i(x)} are affine functions from R^n to R^l (n ≥ 1, l ≥ 2, m ≥ 2). This arises in a number of applications, including single- and multi-facility location problems. The function F is, in general, not differentiable at x if at least one r_i(x) is zero. Computational methods described in the literature converge quite slowly if the solution is at such a point. We present a new method which, at each iteration, computes a direction of search by solving the Newton system of equations, projected, if necessary, into a linear manifold along which F is locally differentiable. A special line search is used to obtain the next iterate. The algorithm is closely related to a method recently described by Calamai and Conn. The new method has quadratic convergence to a solution x* under given conditions. The reason for this property depends on the nature of the solution. If none of the residuals is zero at x*, then F is differentiable at x* and the quadratic convergence follows from standard properties of Newton's method. If one of the residuals, say r_i(x*), is zero, then, as the iteration proceeds, the Hessian of F becomes extremely ill-conditioned. It is proved that this ill-conditioning, instead of creating difficulties, actually causes quadratic convergence to the manifold {x | r_i(x) = 0}. If this is a single point, the solution is thus identified. Otherwise it is necessary to continue the iteration restricted to this manifold, where the usual quadratic convergence for Newton's method applies. If several residuals are zero at x*, several stages of quadratic convergence take place as the correct index set is constructed. Thus the ill-conditioning property accelerates the identification of the residuals which are zero at the solution. Numerical experiments are presented, illustrating these results.
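To make the problem class concrete: the single-facility (Fermat–Weber) location problem is the special case with residuals r_i(x) = x − b_i, so F(x) = Σ_i ‖x − b_i‖ is the sum of distances from x to fixed anchor points b_i. The minimal sketch below uses the classical Weiszfeld fixed-point iteration — one of the older, linearly convergent methods for this special case, not the paper's projected-Newton algorithm; the anchor coordinates are illustrative:

```python
import math

# Single-facility (Fermat-Weber) instance: minimize F(x) = sum_i ||x - b_i||,
# the special case of a sum of Euclidean norms with residuals r_i(x) = x - b_i.
# Anchor points chosen for illustration; by symmetry the minimizer is (1, 1).
anchors = [(0.0, 0.0), (2.0, 0.0), (0.0, 2.0), (2.0, 2.0)]

def F(x):
    """Sum of Euclidean distances from x to the anchors."""
    return sum(math.hypot(x[0] - bx, x[1] - by) for bx, by in anchors)

def weiszfeld(x, iters=500):
    """Classical Weiszfeld iteration: x <- weighted mean of anchors with
    weights 1/||x - b_i||. Linearly convergent when the minimizer is not
    an anchor; it is NOT the quadratically convergent method of the paper."""
    for _ in range(iters):
        wsum = 0.0
        num = [0.0, 0.0]
        for bx, by in anchors:
            d = math.hypot(x[0] - bx, x[1] - by)
            if d == 0.0:  # iterate landed on an anchor, where F may be nonsmooth
                return (x[0], x[1])
            w = 1.0 / d
            wsum += w
            num[0] += w * bx
            num[1] += w * by
        x = (num[0] / wsum, num[1] / wsum)
    return x

x_star = weiszfeld((0.5, 0.3))
```

For this symmetric instance the iteration drives x to the center (1, 1), where F(x) = 4√2. When the minimizer lies at a point where some residual vanishes (e.g. on top of an anchor), first-order methods of this kind slow down markedly, which is the behavior the paper's projected Newton method is designed to avoid.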

Original language: English (US)
Pages (from-to): 34-63
Number of pages: 30
Journal: Mathematical Programming
Volume: 27
Issue number: 1
DOI: 10.1007/BF02591963
State: Published - Sep 1983


Keywords

  • Fermat Problem
  • Location Theory
  • Multifacility Location Problem
  • Nonsmooth Optimization
  • Sum of Norms
  • Sum of Distances
  • Weber Problem

ASJC Scopus subject areas

  • Mathematics(all)
  • Software

Cite this

A quadratically convergent method for minimizing a sum of Euclidean norms. / Overton, Michael L.

In: Mathematical Programming, Vol. 27, No. 1, 09.1983, p. 34-63.

Research output: Contribution to journal › Article

@article{1b48e32cdb9b440dac33a7b2f5216826,
title = "A quadratically convergent method for minimizing a sum of Euclidean norms",
abstract = "We consider the problem of minimizing a sum of Euclidean norms, F(x) = Σ_{i=1}^m ‖r_i(x)‖, where the residuals {r_i(x)} are affine functions from R^n to R^l (n ≥ 1, l ≥ 2, m ≥ 2). This arises in a number of applications, including single- and multi-facility location problems. The function F is, in general, not differentiable at x if at least one r_i(x) is zero. Computational methods described in the literature converge quite slowly if the solution is at such a point. We present a new method which, at each iteration, computes a direction of search by solving the Newton system of equations, projected, if necessary, into a linear manifold along which F is locally differentiable. A special line search is used to obtain the next iterate. The algorithm is closely related to a method recently described by Calamai and Conn. The new method has quadratic convergence to a solution x* under given conditions. The reason for this property depends on the nature of the solution. If none of the residuals is zero at x*, then F is differentiable at x* and the quadratic convergence follows from standard properties of Newton's method. If one of the residuals, say r_i(x*), is zero, then, as the iteration proceeds, the Hessian of F becomes extremely ill-conditioned. It is proved that this ill-conditioning, instead of creating difficulties, actually causes quadratic convergence to the manifold {x | r_i(x) = 0}. If this is a single point, the solution is thus identified. Otherwise it is necessary to continue the iteration restricted to this manifold, where the usual quadratic convergence for Newton's method applies. If several residuals are zero at x*, several stages of quadratic convergence take place as the correct index set is constructed. Thus the ill-conditioning property accelerates the identification of the residuals which are zero at the solution. Numerical experiments are presented, illustrating these results.",
keywords = "Fermat Problem, Location Theory, Multifacility Location Problem, Nonsmooth Optimization, Sum of Norms, Sum of Distances, Weber Problem",
author = "Overton, {Michael L.}",
year = "1983",
month = "9",
doi = "10.1007/BF02591963",
language = "English (US)",
volume = "27",
pages = "34--63",
journal = "Mathematical Programming",
issn = "0025-5610",
publisher = "Springer-Verlag GmbH and Co. KG",
number = "1",

}

TY - JOUR

T1 - A quadratically convergent method for minimizing a sum of Euclidean norms

AU - Overton, Michael L.

PY - 1983/9

Y1 - 1983/9

N2 - We consider the problem of minimizing a sum of Euclidean norms, F(x) = Σ_{i=1}^m ‖r_i(x)‖, where the residuals {r_i(x)} are affine functions from R^n to R^l (n ≥ 1, l ≥ 2, m ≥ 2). This arises in a number of applications, including single- and multi-facility location problems. The function F is, in general, not differentiable at x if at least one r_i(x) is zero. Computational methods described in the literature converge quite slowly if the solution is at such a point. We present a new method which, at each iteration, computes a direction of search by solving the Newton system of equations, projected, if necessary, into a linear manifold along which F is locally differentiable. A special line search is used to obtain the next iterate. The algorithm is closely related to a method recently described by Calamai and Conn. The new method has quadratic convergence to a solution x* under given conditions. The reason for this property depends on the nature of the solution. If none of the residuals is zero at x*, then F is differentiable at x* and the quadratic convergence follows from standard properties of Newton's method. If one of the residuals, say r_i(x*), is zero, then, as the iteration proceeds, the Hessian of F becomes extremely ill-conditioned. It is proved that this ill-conditioning, instead of creating difficulties, actually causes quadratic convergence to the manifold {x | r_i(x) = 0}. If this is a single point, the solution is thus identified. Otherwise it is necessary to continue the iteration restricted to this manifold, where the usual quadratic convergence for Newton's method applies. If several residuals are zero at x*, several stages of quadratic convergence take place as the correct index set is constructed. Thus the ill-conditioning property accelerates the identification of the residuals which are zero at the solution. Numerical experiments are presented, illustrating these results.

AB - We consider the problem of minimizing a sum of Euclidean norms, F(x) = Σ_{i=1}^m ‖r_i(x)‖, where the residuals {r_i(x)} are affine functions from R^n to R^l (n ≥ 1, l ≥ 2, m ≥ 2). This arises in a number of applications, including single- and multi-facility location problems. The function F is, in general, not differentiable at x if at least one r_i(x) is zero. Computational methods described in the literature converge quite slowly if the solution is at such a point. We present a new method which, at each iteration, computes a direction of search by solving the Newton system of equations, projected, if necessary, into a linear manifold along which F is locally differentiable. A special line search is used to obtain the next iterate. The algorithm is closely related to a method recently described by Calamai and Conn. The new method has quadratic convergence to a solution x* under given conditions. The reason for this property depends on the nature of the solution. If none of the residuals is zero at x*, then F is differentiable at x* and the quadratic convergence follows from standard properties of Newton's method. If one of the residuals, say r_i(x*), is zero, then, as the iteration proceeds, the Hessian of F becomes extremely ill-conditioned. It is proved that this ill-conditioning, instead of creating difficulties, actually causes quadratic convergence to the manifold {x | r_i(x) = 0}. If this is a single point, the solution is thus identified. Otherwise it is necessary to continue the iteration restricted to this manifold, where the usual quadratic convergence for Newton's method applies. If several residuals are zero at x*, several stages of quadratic convergence take place as the correct index set is constructed. Thus the ill-conditioning property accelerates the identification of the residuals which are zero at the solution. Numerical experiments are presented, illustrating these results.

KW - Fermat Problem

KW - Location Theory

KW - Multifacility Location Problem

KW - Nonsmooth Optimization

KW - Sum of Norms

KW - Sum of Distances

KW - Weber Problem

UR - http://www.scopus.com/inward/record.url?scp=0003150632&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0003150632&partnerID=8YFLogxK

U2 - 10.1007/BF02591963

DO - 10.1007/BF02591963

M3 - Article

VL - 27

SP - 34

EP - 63

JO - Mathematical Programming

JF - Mathematical Programming

SN - 0025-5610

IS - 1

ER -