Sparse Regularization via Convex Analysis

Research output: Contribution to journal › Article

Abstract

Sparse approximate solutions to linear equations are classically obtained via L1 norm regularized least squares, but this method often underestimates the true solution. As an alternative to the L1 norm, this paper proposes a class of nonconvex penalty functions that maintain the convexity of the least squares cost function to be minimized, and avoid the systematic underestimation characteristic of L1 norm regularization. The proposed penalty function is a multivariate generalization of the minimax-concave penalty. It is defined in terms of a new multivariate generalization of the Huber function, which in turn is defined via infimal convolution. The proposed sparse-regularized least squares cost function can be minimized by proximal algorithms comprising simple computations.
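The ingredients named in the abstract are easy to state in the scalar case. The following is a minimal illustrative sketch, not the paper's multivariate construction: the scalar Huber function, the minimax-concave (MC) penalty obtained by subtracting it from the absolute value, and the firm-threshold function that minimizes the scalar denoising cost (1/2)(y - x)^2 + lam * phi(x; a) when a * lam < 1. The function names and the parameterization by `a` are this sketch's own conventions.

```python
import math

def huber(x, a):
    """Scalar Huber function: quadratic near zero, linear in the tails.
    The transition occurs at |x| = 1/a."""
    if abs(x) <= 1.0 / a:
        return 0.5 * a * x * x
    return abs(x) - 0.5 / a

def mc_penalty(x, a):
    """Minimax-concave (MC) penalty: |x| minus the Huber function.
    Behaves like |x| near zero and saturates at the constant 1/(2a)."""
    return abs(x) - huber(x, a)

def firm_threshold(y, lam, a):
    """Minimizer of (1/2)(y - x)^2 + lam * mc_penalty(x, a), assuming a*lam < 1
    (the condition keeping the scalar cost convex).
    Zero for |y| <= lam, identity for |y| >= 1/a, linear ramp in between."""
    if abs(y) <= lam:
        return 0.0
    if abs(y) >= 1.0 / a:
        return y
    return (y - math.copysign(lam, y)) / (1.0 - a * lam)
```

Unlike the soft threshold associated with the L1 norm, which shrinks every input by `lam`, the firm threshold returns large inputs unchanged; this is the mechanism behind avoiding the systematic underestimation mentioned in the abstract.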

Original language: English (US)
Article number: 7938377
Pages (from-to): 4481-4494
Number of pages: 14
Journal: IEEE Transactions on Signal Processing
Volume: 65
Issue number: 17
DOIs: 10.1109/TSP.2017.2711501
State: Published - Sep 1 2017

Keywords

  • convex function
  • denoising
  • optimization
  • sparse approximation
  • Sparse regularization

ASJC Scopus subject areas

  • Signal Processing
  • Electrical and Electronic Engineering

Cite this

Sparse Regularization via Convex Analysis. / Selesnick, Ivan.

In: IEEE Transactions on Signal Processing, Vol. 65, No. 17, 7938377, 01.09.2017, p. 4481-4494.

@article{7f7842904a3a42e8834637b5e655976b,
title = "Sparse Regularization via Convex Analysis",
abstract = "Sparse approximate solutions to linear equations are classically obtained via L1 norm regularized least squares, but this method often underestimates the true solution. As an alternative to the L1 norm, this paper proposes a class of nonconvex penalty functions that maintain the convexity of the least squares cost function to be minimized, and avoids the systematic underestimation characteristic of L1 norm regularization. The proposed penalty function is a multivariate generalization of the minimax-concave penalty. It is defined in terms of a new multivariate generalization of the Huber function, which in turn is defined via infimal convolution. The proposed sparse-regularized least squares cost function can be minimized by proximal algorithms comprising simple computations.",
keywords = "convex function, denoising, optimization, sparse approximation, Sparse regularization",
author = "Ivan Selesnick",
year = "2017",
month = "9",
day = "1",
doi = "10.1109/TSP.2017.2711501",
language = "English (US)",
volume = "65",
pages = "4481--4494",
journal = "IEEE Transactions on Signal Processing",
issn = "1053-587X",
publisher = "Institute of Electrical and Electronics Engineers Inc.",
number = "17",
}

TY - JOUR
T1 - Sparse Regularization via Convex Analysis
AU - Selesnick, Ivan
PY - 2017/9/1
Y1 - 2017/9/1
N2 - Sparse approximate solutions to linear equations are classically obtained via L1 norm regularized least squares, but this method often underestimates the true solution. As an alternative to the L1 norm, this paper proposes a class of nonconvex penalty functions that maintain the convexity of the least squares cost function to be minimized, and avoids the systematic underestimation characteristic of L1 norm regularization. The proposed penalty function is a multivariate generalization of the minimax-concave penalty. It is defined in terms of a new multivariate generalization of the Huber function, which in turn is defined via infimal convolution. The proposed sparse-regularized least squares cost function can be minimized by proximal algorithms comprising simple computations.
AB - Sparse approximate solutions to linear equations are classically obtained via L1 norm regularized least squares, but this method often underestimates the true solution. As an alternative to the L1 norm, this paper proposes a class of nonconvex penalty functions that maintain the convexity of the least squares cost function to be minimized, and avoids the systematic underestimation characteristic of L1 norm regularization. The proposed penalty function is a multivariate generalization of the minimax-concave penalty. It is defined in terms of a new multivariate generalization of the Huber function, which in turn is defined via infimal convolution. The proposed sparse-regularized least squares cost function can be minimized by proximal algorithms comprising simple computations.
KW - convex function
KW - denoising
KW - optimization
KW - sparse approximation
KW - Sparse regularization
UR - http://www.scopus.com/inward/record.url?scp=85023744122&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85023744122&partnerID=8YFLogxK
U2 - 10.1109/TSP.2017.2711501
DO - 10.1109/TSP.2017.2711501
M3 - Article
AN - SCOPUS:85023744122
VL - 65
SP - 4481
EP - 4494
JO - IEEE Transactions on Signal Processing
JF - IEEE Transactions on Signal Processing
SN - 1053-587X
IS - 17
M1 - 7938377
ER -