By Loris, Ignace
Reference: SIAM Conference on Optimization (20-23/07/2021: Online (was: Spokane, Washington, USA))
Publication: Unpublished, 2021-07-20
Conference presentation
Abstract: Numerical optimization algorithms with nested (double-loop) iterations are proposed for the minimization of a convex cost function consisting of a sum of three parts: a differentiable part, a proximable part, and the composition of a linear map with a proximable function. Although the number of inner iterations is fixed in advance in these primal-dual algorithms, convergence is guaranteed by virtue of an inner-loop warm-start strategy, showing that inner-loop "starting rules" can be just as effective as "stopping rules" for guaranteeing convergence. The algorithm is applicable to the numerical solution of convex optimization problems encountered in inverse problems, imaging, and statistics. It reduces to the classical proximal gradient algorithm in certain special cases and generalizes several other existing algorithms.
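The abstract does not give the algorithm's exact form. As an illustrative sketch only (not the authors' method), the following Python code shows the general idea of such a nested scheme on a cost of the form f(x) + h(Dx), with f differentiable and h proximable composed with a linear map D: an outer gradient step on f, followed by a fixed number of inner dual iterations that approximate the proximal step for h∘D, with the dual variable warm-started across outer iterations. The problem (1D total-variation denoising), all function names, and all parameter choices are assumptions made for this example.

```python
import numpy as np

def make_diff_matrix(n):
    """First-difference operator D: (Dx)_i = x_{i+1} - x_i (illustrative linear map)."""
    D = np.zeros((n - 1, n))
    for i in range(n - 1):
        D[i, i], D[i, i + 1] = -1.0, 1.0
    return D

def nested_prox_grad(b, D, lam, tau, n_outer=200, n_inner=5):
    """Hedged sketch of a nested (double-loop) proximal scheme for
    min_x 0.5*||x - b||^2 + lam*||D x||_1, i.e. f(x) + h(Dx) with
    f(x) = 0.5*||x - b||^2 (differentiable) and h = lam*||.||_1 (proximable).
    The inner loop runs a FIXED number of dual iterations and is
    warm-started from the previous outer iteration's dual variable."""
    x = b.copy()
    u = np.zeros(D.shape[0])                   # dual variable, kept across outer loops
    sigma = 1.0 / np.linalg.norm(D, 2) ** 2    # inner dual step size
    for _ in range(n_outer):
        z = x - tau * (x - b)                  # gradient step on the smooth part f
        for _ in range(n_inner):               # fixed number of inner iterations,
            # projected dual ascent for prox_{tau*h o D}(z); warm start of u
            # replaces any inner-loop stopping rule
            u = np.clip(u + sigma * (D @ (z - D.T @ u)), -tau * lam, tau * lam)
        x = z - D.T @ u                        # approximate proximal step at z
    return x
```

With a piecewise-constant signal plus noise, the iterates reduce both the composite objective and the total variation of the signal; the fixed inner-iteration count plays the role that a data-dependent stopping rule would otherwise play.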