By Fest, Jean-Baptiste; Heikkilä, Tommi; Loris, Ignace; Martin, Ségolène; Ratti, Lucca; Rebegoldi, Simone; Sarnighausen, Gesa
Reference: Advanced Techniques in Optimization for Machine Learning and Imaging (ATOMI 2022), Springer, Singapore, Vol. 61, pp. 15-30
Published: 2024-10-03
Book chapter
Abstract: We consider a variation of the classical proximal-gradient algorithm for the iterative minimization of a cost function consisting of the sum of two terms, one smooth and the other prox-simple, whose relative weight is determined by a penalty parameter. This so-called fixed-point continuation method allows one to approximate the problem's trade-off curve, i.e., to compute the minimizers of the cost function for a whole range of values of the penalty parameter at once. The algorithm is shown to converge, and a rate of convergence of the cost function values is derived. Furthermore, it is shown that this method is related to iterative algorithms constructed from the $\epsilon$-subdifferential of the prox-simple term. Some numerical examples are provided.
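The abstract describes warm-started proximal-gradient iterations over a range of penalty parameters. A minimal sketch of this idea is shown below for the standard LASSO instance, where the smooth term is $\tfrac{1}{2}\|Ax-b\|^2$ and the prox-simple term is $\lambda\|x\|_1$ with soft-thresholding as its proximal operator. This is a generic illustration of penalty continuation with proximal gradient (ISTA), not the specific fixed-point continuation scheme of the chapter; the function names and the choice of sweeping the penalties in decreasing order are illustrative assumptions.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of t * ||.||_1 (the "prox-simple" term).
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def proximal_gradient_continuation(A, b, lambdas, n_iter=200):
    """Warm-started proximal gradient (ISTA) for
        min_x 0.5 * ||A x - b||^2 + lam * ||x||_1
    solved for each lam in `lambdas`, reusing the previous minimizer
    as the starting point. Returns (lam, x) pairs along the
    trade-off curve. Illustrative sketch, not the chapter's method."""
    L = np.linalg.norm(A, 2) ** 2      # Lipschitz constant of the smooth gradient
    step = 1.0 / L
    x = np.zeros(A.shape[1])
    path = []
    for lam in sorted(lambdas, reverse=True):   # large penalty -> small
        for _ in range(n_iter):
            grad = A.T @ (A @ x - b)            # gradient of the smooth term
            x = soft_threshold(x - step * grad, step * lam)
        path.append((lam, x.copy()))
    return path
```

Warm starting is what makes computing the whole trade-off curve cheap: for nearby penalty values the minimizers are close, so each inner solve needs only a few iterations once the first one has converged.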