Spectral pruning of fully connected layers / Buffoni, Lorenzo; Civitelli, Enrico; Giambagli, Lorenzo; Chicchi, Lorenzo; Fanelli, Duccio. In: SCIENTIFIC REPORTS, ISSN 2045-2322 (electronic), 12 (2022). DOI: 10.1038/s41598-022-14805-7
Spectral pruning of fully connected layers
Buffoni, Lorenzo; Civitelli, Enrico; Giambagli, Lorenzo; Chicchi, Lorenzo; Fanelli, Duccio
2022
Abstract
Training of neural networks can be reformulated in spectral space by allowing the eigenvalues and eigenvectors of the network to act as the target of the optimization, instead of the individual weights. Working in this setting, we show that the eigenvalues can be used to rank the importance of the nodes within the ensemble. Indeed, we prove that sorting the nodes according to their associated eigenvalues enables effective pre- and post-processing pruning strategies, yielding massively compacted networks (in terms of the number of constituent neurons) with virtually unchanged performance. The proposed methods are tested on different architectures, with a single or multiple hidden layers, and against distinct classification tasks of general interest.
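To make the pruning idea concrete, below is a minimal sketch (not the authors' code) of eigenvalue-based node ranking for a single-hidden-layer network. It assumes a simplified spectral parametrization in which each hidden node j carries a trainable eigenvalue lambda_j that scales its activation; all names (`eigenvalues`, `W_in`, `W_out`, `keep_frac`) and the random stand-in parameters are illustrative, not taken from the paper.

```python
# Sketch: rank hidden nodes by |eigenvalue| and keep only the top fraction.
# Assumption: each hidden node's output is scaled by its own eigenvalue, a
# simplified stand-in for the full spectral parametrization of the paper.
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hidden, n_out = 784, 512, 10

# Random stand-ins for trained spectral parameters: one eigenvalue per hidden
# node, plus the input/output weight blocks of the network.
eigenvalues = rng.normal(size=n_hidden)
W_in = rng.normal(size=(n_hidden, n_in)) * 0.01
W_out = rng.normal(size=(n_out, n_hidden)) * 0.01

# Sort nodes by eigenvalue magnitude; nodes with small |lambda_j| contribute
# least to the layer's transfer and are the ones pruned.
keep_frac = 0.25
n_keep = int(keep_frac * n_hidden)
keep = np.argsort(-np.abs(eigenvalues))[:n_keep]

W_in_pruned = W_in[keep, :]    # drop rows feeding the removed nodes
W_out_pruned = W_out[:, keep]  # drop columns reading the removed nodes
lam_pruned = eigenvalues[keep]

def forward(x, W_in, lam, W_out):
    """Single-hidden-layer pass with eigenvalue-scaled hidden activations."""
    h = np.tanh(W_in @ x)
    return W_out @ (lam * h)

x = rng.normal(size=n_in)
print(forward(x, W_in_pruned, lam_pruned, W_out_pruned).shape)  # (10,)
```

In this sketch the pruning is purely structural: the rows and columns attached to low-ranked nodes are deleted outright, shrinking the layer from 512 to 128 neurons, which mirrors the paper's claim of compacting the network while leaving the forward pass otherwise unchanged.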