
Iteratively re-weighted least squares for Sums of Convex Functions

Formal Metadata

Title
Iteratively re-weighted least squares for Sums of Convex Functions
Alternative Title
Iteratively re-weighted least squares and ADMM methods for solving affine inclusions
Title of Series
Number of Parts
30
Author
License
CC Attribution - NonCommercial - NoDerivatives 4.0 International:
You are free to use, copy, distribute and transmit the work or content in unchanged form for any legal and non-commercial purpose as long as the work is attributed to the author in the manner specified by the author or licensor.
Identifiers
Publisher
Release Date
Language

Content Metadata

Subject Area
Genre
Abstract
We describe two matrix-free methods for solving large-scale affine inclusion problems on the product (or intersection) of convex sets. The first approach is a novel iterative re-weighting algorithm (IRWA) that iteratively minimizes quadratic models of relaxed subproblems while automatically updating a relaxation vector. The second approach is based on alternating direction augmented Lagrangian (ADAL) technology. The main computational cost of each algorithm is the repeated minimization of convex quadratic functions, which can be performed matrix-free. Both algorithms are globally convergent under loose assumptions, and each requires at most O(1/ε²) iterations to reach ε-optimality of the objective function. Numerical experiments show that both algorithms efficiently find inexact solutions. However, in certain cases, these experiments indicate that IRWA can be significantly more efficient than ADAL.
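To make the re-weighting idea concrete, here is a minimal sketch of a generic iteratively re-weighted least-squares loop for minimizing a sum of absolute residuals, one simple instance of a sum of convex functions. It illustrates only the basic mechanism of solving a weighted least-squares subproblem with weights derived from relaxed residuals; the function name, the fixed relaxation eps, and the stopping rule are illustrative assumptions, not the IRWA algorithm of the talk with its automatic relaxation-vector update.

    import numpy as np

    def irls_l1(A, b, num_iters=50, eps=1e-6):
        """Sketch: minimize sum_i |a_i^T x - b_i| by re-weighted least squares.

        Each iteration solves a weighted least-squares problem whose weights
        come from the previous residuals, relaxed by eps to keep them bounded.
        """
        m, n = A.shape
        x = np.zeros(n)
        for _ in range(num_iters):
            r = A @ x - b                        # current residuals
            w = 1.0 / np.sqrt(r**2 + eps**2)     # weights from relaxed |r|
            AtW = A.T * w                        # A^T W (W diagonal)
            x_new = np.linalg.solve(AtW @ A, AtW @ b)  # weighted normal equations
            if np.linalg.norm(x_new - x) <= 1e-10:
                break
            x = x_new
        return x

    # Example usage on a small random instance
    rng = np.random.default_rng(0)
    A = rng.standard_normal((100, 5))
    b = A @ rng.standard_normal(5) + 0.01 * rng.standard_normal(100)
    x_hat = irls_l1(A, b)

Each subproblem here is a convex quadratic, so, as in the abstract, the dominant cost is a sequence of quadratic minimizations that can in principle be carried out matrix-free (e.g., with conjugate-gradient solves instead of the dense solve above).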