Stochastic analysis of an adaptive cubic regularization method under inexact gradient evaluations and dynamic Hessian accuracy / STEFANIA BELLAVIA; GIANMARCO GURIOLI. - In: OPTIMIZATION. - ISSN 0233-1934. - PRINT. - 71:(2022), pp. 0-0. [10.1080/02331934.2021.1892104]

Stochastic analysis of an adaptive cubic regularization method under inexact gradient evaluations and dynamic Hessian accuracy

STEFANIA BELLAVIA; GIANMARCO GURIOLI
2022

Abstract

Here we adapt an extended version of the adaptive cubic regularization method with dynamic inexact Hessian information for nonconvex optimization, introduced in Bellavia et al. [Adaptive cubic regularization methods with dynamic inexact Hessian information and applications to finite-sum minimization. IMA Journal of Numerical Analysis. 2021;41(1):764–799], to the stochastic optimization setting. While exact function evaluations are still considered, this novel variant inherits the adaptive accuracy requirements for Hessian approximations introduced in that paper and additionally employs inexact computations of the gradient. Without imposing restrictions on the variance of the errors, we assume that these approximations are available with a sufficiently high, but fixed, probability, and we extend, in the spirit of Cartis and Scheinberg [Global convergence rate analysis of unconstrained optimization methods based on probabilistic models. Math Program Ser A. 2018;159(2):337–375], the deterministic analysis of the framework to its stochastic counterpart, showing that the expected number of iterations needed to reach a first-order stationary point matches the well-known worst-case optimal complexity bound of O(ϵ^(-3/2)) with respect to the first-order tolerance ϵ. Finally, numerical tests on nonconvex finite-sum minimization confirm that using inexact first- and second-order derivatives can be beneficial in terms of computational savings.
Goal 9: Industry, Innovation, and Infrastructure
Files in this product:
There are no files associated with this product.

Documents in FLORE are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this resource: https://hdl.handle.net/2158/1229172
Citations
  • PMC: ND
  • Scopus: 11
  • Web of Science (ISI): 9