Regularized methods via cubic model subspace minimization for nonconvex optimization / Bellavia S.; Palitta D.; Porcelli M.; Simoncini V. - In: COMPUTATIONAL OPTIMIZATION AND APPLICATIONS. - ISSN 0926-6003. - Electronic. - 90:(2025), pp. 801-837. [10.1007/s10589-025-00655-2]
Regularized methods via cubic model subspace minimization for nonconvex optimization
Bellavia S.; Porcelli M.
2025
Abstract
Adaptive cubic regularization methods for solving nonconvex problems need the efficient computation of the trial step, involving the minimization of a cubic model. We propose a new approach in which this model is minimized in a low-dimensional subspace that, in contrast to classic approaches, is reused for a number of iterations. Whenever the trial step produced by the low-dimensional minimization process is unsatisfactory, we employ a regularized Newton step whose regularization parameter is a by-product of the model minimization over the low-dimensional subspace. We show that the worst-case complexity of classic cubic regularized methods is preserved, despite the possible regularized Newton steps. We focus on the large class of problems for which (sparse) direct linear system solvers are available, and provide several experimental results showing the very large gains of our new approach when compared to standard implementations of adaptive cubic regularization methods based on direct linear solvers. Our first choice as projection space for the low-dimensional model minimization is the polynomial Krylov subspace; nonetheless, we also explore the use of rational Krylov subspaces in cases where the polynomial ones lead to less competitive numerical results.
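The core step described in the abstract can be illustrated with a minimal sketch (not the authors' implementation, which reuses the subspace across iterations and extracts the regularization parameter as a by-product): build an orthonormal basis of the polynomial Krylov subspace K_k(H, g), project the cubic model onto it, and minimize the small reduced model. The function names and the use of `scipy.optimize.minimize` (BFGS) are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize


def arnoldi(H, g, k):
    # Orthonormal basis V of the polynomial Krylov subspace K_k(H, g).
    n = g.size
    V = np.zeros((n, k))
    V[:, 0] = g / np.linalg.norm(g)
    for j in range(1, k):
        w = H @ V[:, j - 1]
        w -= V[:, :j] @ (V[:, :j].T @ w)  # Gram-Schmidt orthogonalization
        nw = np.linalg.norm(w)
        if nw < 1e-12:  # breakdown: the subspace is invariant
            return V[:, :j]
        V[:, j] = w / nw
    return V


def cubic_subspace_step(H, g, sigma, k=10):
    # Approximately minimize the cubic model
    #   m(s) = g^T s + 0.5 s^T H s + (sigma/3) ||s||^3
    # over K_k(H, g). Since V has orthonormal columns, ||V y|| = ||y||,
    # so the reduced model keeps the same cubic form in y.
    V = arnoldi(H, g, k)
    Hk = V.T @ H @ V  # small projected Hessian
    gk = V.T @ g      # projected gradient

    def reduced_model(y):
        return gk @ y + 0.5 * y @ Hk @ y + sigma / 3 * np.linalg.norm(y) ** 3

    # Starting from y = 0, a descent method ends with a negative model value.
    y = minimize(reduced_model, np.zeros(gk.size), method="BFGS").x
    return V @ y  # trial step in the full space
```

The sketch only covers a single subspace minimization; the paper's method additionally reuses this subspace over several iterations and falls back to a regularized Newton step when the projected step is rejected.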
| File | Description | Type | License | Size | Format |
|---|---|---|---|---|---|
| bpps_COAP_2025_def.pdf | Regularized methods via cubic model subspace minimization for nonconvex optimization | Publisher's PDF (Version of record) | Open Access | 761.39 kB | Adobe PDF |
| bpps_correction_COAP_2025_def.pdf | Correction: Regularized methods via cubic model subspace minimization for nonconvex optimization | Publisher's PDF (Version of record) | Open Access | 460.66 kB | Adobe PDF |
Documents in FLORE are protected by copyright and all rights are reserved, unless otherwise indicated.