Typical applications in signal and image processing often require the numerical solution of large–scale linear least squares problems with simple constraints, related to an m × n nonnegative matrix A, m ≪ n. When the size of A is such that the matrix is not available in memory and only the operators of the matrix–vector products involving A and Aᵀ can be computed, forward–backward methods combined with suitable accelerating techniques are very effective; in particular, gradient projection methods can be improved by suitable step–length rules or by an extrapolation/inertial step. In this work, we propose a further acceleration technique for both schemes, based on the use of variable metrics tailored to the considered problems. The numerical effectiveness of the proposed approach is evaluated on randomly generated test problems and on real data arising from a problem of fibre orientation estimation in diffusion MRI.
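As a rough illustration of the class of methods the abstract describes, the sketch below applies a diagonally scaled gradient projection iteration to a small nonnegatively constrained least squares problem. This is not the authors' algorithm: the fixed unit step length and the particular split-gradient scaling D_ii = x_i / (AᵀAx)_i are assumptions chosen for simplicity (with this choice the iteration reduces to the classical ISRA multiplicative update, which is monotone for nonnegative A and b); the paper's variable metrics and step-length rules are more sophisticated.

```python
import numpy as np

def scaled_gradient_projection(A, b, iters=500):
    """Minimize 0.5*||A x - b||^2 subject to x >= 0 by a scaled
    gradient projection iteration (illustrative sketch only).

    Assumes A and b are entrywise nonnegative, as in the abstract."""
    n = A.shape[1]
    x = np.ones(n)                       # feasible starting point
    for _ in range(iters):
        Ax = A @ x
        grad = A.T @ (Ax - b)            # gradient of the least squares term
        # Diagonal scaling D_ii = x_i / (A^T A x)_i (split-gradient choice);
        # with unit step length this recovers the classical ISRA update.
        d = x / np.maximum(A.T @ Ax, 1e-12)
        # Scaled gradient step followed by projection onto the
        # nonnegative orthant (the "simple constraints" of the problem).
        x = np.maximum(x - d * grad, 0.0)
    return x
```

In a matrix-free setting, as the abstract notes, `A @ x` and `A.T @ y` would be replaced by calls to the two operator routines rather than explicit matrix products.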
Research Article | October 20, 2016
Scaled first–order methods for a class of large–scale constrained least square problems
Vanna Lisa Coli; Valeria Ruggiero; Luca Zanni
AIP Conf. Proc. 1776, 040002 (2016)
Vanna Lisa Coli, Valeria Ruggiero, Luca Zanni; Scaled first–order methods for a class of large–scale constrained least square problems. AIP Conf. Proc. 20 October 2016; 1776 (1): 040002. https://doi.org/10.1063/1.4965314