
Linesearch Newton-CG methods for convex optimization with noise / Stefania Bellavia; Eugenio Fabrizi; Benedetta Morini. - In: ANNALI DELL'UNIVERSITÀ DI FERRARA. SCIENZE MATEMATICHE. - ISSN 1827-1510. - ELECTRONIC. - 68:(2022), pp. 0-0. [10.1007/s11565-022-00435-4]

Linesearch Newton-CG methods for convex optimization with noise

Stefania Bellavia; Eugenio Fabrizi; Benedetta Morini
2022

Abstract

This paper studies the numerical solution of strictly convex unconstrained optimization problems by linesearch Newton-CG methods. We focus on methods employing inexact evaluations of the objective function and inexact, possibly random, gradient and Hessian estimates. The derivative estimates are not required to satisfy the prescribed accuracy requirements at every iteration, but only with sufficiently high probability. Concerning the evaluation of the objective function, we first assume that the noise in the function values is bounded in absolute value; we then analyze the case where the error satisfies prescribed dynamic accuracy requirements. For both cases we provide a complexity analysis and derive expected iteration complexity bounds. Finally, we focus on the specific case of finite-sum minimization, which is typical of machine learning applications.
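To make the setting concrete, below is a minimal Python sketch of a subsampled linesearch Newton-CG iteration for the finite-sum case min_x (1/n) sum_i f_i(x), the setting mentioned at the end of the abstract. It is an illustration under simple assumptions, not the method analyzed in the paper: the callables f_i, grad_i and hess_vec_i (per-sample objective, gradient and Hessian-vector product), the batch sizes and the forcing term cg_tol are hypothetical placeholders. The gradient and Hessian are estimated on random subsamples, the Newton system is solved approximately by truncated conjugate gradients, and an Armijo backtracking linesearch accepts the step.

import numpy as np

def subsampled_newton_cg(x0, f_i, grad_i, hess_vec_i, n, max_iter=50,
                         grad_batch=256, hess_batch=64, cg_tol=0.5,
                         c1=1e-4, rng=None):
    # Illustrative subsampled linesearch Newton-CG for min_x (1/n) sum_i f_i(x).
    # f_i(i, x), grad_i(i, x), hess_vec_i(i, x, v) are assumed per-sample callables.
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)

    def full_f(x):
        # Average objective over all samples (in the noisy setting this
        # would itself only be available approximately).
        return np.mean([f_i(i, x) for i in range(n)])

    for _ in range(max_iter):
        # Subsampled gradient estimate.
        Sg = rng.choice(n, size=min(grad_batch, n), replace=False)
        g = np.mean([grad_i(i, x) for i in Sg], axis=0)
        if np.linalg.norm(g) < 1e-8:
            break

        # Subsampled Hessian-vector product operator.
        Sh = rng.choice(n, size=min(hess_batch, n), replace=False)
        def Hv(v):
            return np.mean([hess_vec_i(i, x, v) for i in Sh], axis=0)

        # Truncated CG on H p = -g with a relative residual stopping rule.
        p = np.zeros_like(x)
        r = -g.copy()
        d = r.copy()
        for _ in range(x.size):
            Hd = Hv(d)
            dHd = d @ Hd
            if dHd <= 0.0:
                # Safeguard: the sampled Hessian may not be positive definite.
                if np.allclose(p, 0.0):
                    p = -g
                break
            alpha = (r @ r) / dHd
            p = p + alpha * d
            r_new = r - alpha * Hd
            if np.linalg.norm(r_new) <= cg_tol * np.linalg.norm(g):
                break
            d = r_new + ((r_new @ r_new) / (r @ r)) * d
            r = r_new

        # Armijo backtracking linesearch on the (possibly noisy) objective.
        t, fx = 1.0, full_f(x)
        while full_f(x + t * p) > fx + c1 * t * (g @ p) and t > 1e-10:
            t *= 0.5
        x = x + t * p
    return x

In a scheme of this kind, the batch sizes control how likely the sampled derivatives are to be sufficiently accurate, while cg_tol governs the inexactness of the Newton step; both trade per-iteration cost against the number of iterations.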
Files in this record:
File: Bellavia_et_al-2022-ANNALI_DELL'UNIVERSITA'_DI_FERRARA.pdf
Access: Open access
Type: Final refereed version (Postprint, Accepted manuscript)
License: Open Access
Size: 428.53 kB
Format: Adobe PDF

Documents in FLORE are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this resource: https://hdl.handle.net/2158/1279928
Citations:
  • PMC: n/a
  • Scopus: 0
  • Web of Science (ISI): n/a