On the convergence of a linesearch based proximal-gradient method for nonconvex optimization / Bonettini S.; Loris I.; Porta F.; Prato M.; Rebegoldi S. - In: INVERSE PROBLEMS. - ISSN 0266-5611. - Electronic. - 33 (2017), art. 055005. [DOI: 10.1088/1361-6420/aa5bfd]
On the convergence of a linesearch based proximal-gradient method for nonconvex optimization
Rebegoldi S. (member of the Collaboration Group)
2017
Abstract
We consider a variable metric linesearch based proximal gradient method for the minimization of the sum of a smooth, possibly nonconvex function plus a convex, possibly nonsmooth term. We prove convergence of this iterative algorithm to a critical point if the objective function satisfies the Kurdyka-Łojasiewicz property at each point of its domain, under the assumption that a limit point exists. The proposed method is applied to a wide collection of image processing problems, and our numerical tests show that the algorithm proves flexible, robust, and competitive when compared with recently proposed approaches for the optimization problems arising in the considered applications.
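The abstract describes a linesearch-based proximal gradient scheme for minimizing F(x) = f(x) + g(x), with f smooth (possibly nonconvex) and g convex (possibly nonsmooth). A minimal sketch of this general idea is given below, assuming g = λ‖·‖₁ (whose proximal operator is soft-thresholding), an Armijo-type backtracking along the proximal direction, and a toy least-squares instance; the variable metric, the specific linesearch parameters, and the imaging applications of the paper are omitted, so this is an illustration of the algorithmic template rather than the authors' exact method.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_grad_linesearch(f, grad_f, lam, x0, alpha=1.0,
                         sigma=1e-4, beta=0.5, max_iter=500, tol=1e-8):
    """Proximal gradient iteration with Armijo-type backtracking along
    the proximal direction. This is a common variant of the template the
    paper studies; the paper's variable metric is not implemented here."""
    g = lambda x: lam * np.abs(x).sum()   # convex nonsmooth term
    x = x0.astype(float).copy()
    for _ in range(max_iter):
        gx = grad_f(x)
        # Proximal gradient trial point for the fixed step size alpha.
        y = soft_threshold(x - alpha * gx, alpha * lam)
        d = y - x                         # descent direction
        if np.linalg.norm(d) < tol:
            break
        # Sufficient-decrease bound: directional derivative term of f
        # plus the change in g; this quantity is always negative.
        delta = gx @ d + g(y) - g(x)
        t, Fx = 1.0, f(x) + g(x)
        # Backtrack the step length t until F decreases enough.
        while f(x + t * d) + g(x + t * d) > Fx + sigma * t * delta and t > 1e-12:
            t *= beta
        x = x + t * d
    return x

# Toy sparse least-squares instance (illustrative only; the paper
# targets image processing problems).
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 10))
x_true = np.zeros(10)
x_true[:3] = [1.0, -2.0, 0.5]
b = A @ x_true
lam = 0.1
f = lambda x: 0.5 * np.sum((A @ x - b) ** 2)
grad_f = lambda x: A.T @ (A @ x - b)
x0 = np.zeros(10)
x_sol = prox_grad_linesearch(f, grad_f, lam, x0)
```

Because each accepted step satisfies the sufficient-decrease condition with a strictly negative bound `delta`, the objective F is monotonically decreasing along the iterates, which is the starting point for the Kurdyka-Łojasiewicz convergence analysis mentioned in the abstract.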