
Machine learning in spectral domain / Giambagli L.; Buffoni L.; Carletti T.; Nocentini W.; Fanelli D. - In: NATURE COMMUNICATIONS. - ISSN 2041-1723. - PRINT. - 12:(2021), pp. 1-9. [10.1038/s41467-021-21481-0]

Machine learning in spectral domain

Giambagli L.; Buffoni L.; Carletti T.; Fanelli D.
2021

Abstract

Deep neural networks are usually trained in the space of the nodes, by adjusting the weights of existing links via suitable optimization protocols. We here propose a radically new approach which anchors the learning process to reciprocal space. Specifically, the training acts on the spectral domain and seeks to modify the eigenvalues and eigenvectors of transfer operators in direct space. The proposed method is ductile and can be tailored to return either linear or non-linear classifiers. Adjusting the eigenvalues, while freezing the eigenvector entries, yields performance superior to that attained with standard methods restricted to an identical number of free parameters. To recover a feed-forward architecture in direct space, we have postulated a nested indentation of the eigenvectors. Different non-orthogonal bases could be employed to export the spectral learning to other frameworks, e.g. reservoir computing.
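The core idea of the abstract — parameterize a direct-space transfer operator as A = Φ diag(λ) Φ⁻¹, freeze the eigenvector matrix Φ (with a nested block structure coupling input to output nodes), and train only the eigenvalues λ — can be illustrated with a minimal NumPy sketch. This is not the paper's actual implementation: the layer sizes, the toy regression task, and the finite-difference training loop are all illustrative assumptions; any autodiff framework would replace the gradient estimate in practice.

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_out = 8, 4            # nodes in the input and output layer (illustrative)
N = n_in + n_out              # total nodes spanned by the transfer operator

# Eigenvector matrix with a nested-indentation structure: identity on the
# diagonal, one off-diagonal block coupling input nodes to output nodes.
# The eigenvectors are frozen at random values; only eigenvalues are trained.
Phi = np.eye(N)
Phi[n_in:, :n_in] = rng.normal(size=(n_out, n_in))
Phi_inv = np.linalg.inv(Phi)  # for this nilpotent block, Phi_inv = 2*I - Phi

def transfer_matrix(lam):
    """Direct-space operator A = Phi diag(lam) Phi^{-1}."""
    return Phi @ np.diag(lam) @ Phi_inv

def forward(x, lam):
    """Embed x in the first n_in nodes, apply A, read out the last n_out."""
    state = np.concatenate([x, np.zeros(n_out)])
    return (transfer_matrix(lam) @ state)[n_in:]

# Toy regression task: fit the N eigenvalues to a random linear target.
X = rng.normal(size=(32, n_in))
Y = X @ rng.normal(size=(n_in, n_out))

def loss(lam):
    A = transfer_matrix(lam)
    states = np.hstack([X, np.zeros((len(X), n_out))])
    preds = (states @ A.T)[:, n_in:]
    return np.mean((preds - Y) ** 2)

lam0 = rng.normal(size=N)     # initial eigenvalues
lam = lam0.copy()
eps, lr = 1e-5, 0.005
for _ in range(200):          # gradient descent on the eigenvalues only,
    grad = np.zeros(N)        # with a central finite-difference gradient
    for i in range(N):
        d = np.zeros(N)
        d[i] = eps
        grad[i] = (loss(lam + d) - loss(lam - d)) / (2 * eps)
    lam -= lr * grad
```

Note that setting all eigenvalues to 1 gives A = I, so the read-out block vanishes: a quick sanity check that the decomposition is wired correctly. Training N = 12 spectral parameters cannot match the 32 free weights of the target map, which is precisely the paper's point about learning with fewer free parameters.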
Year: 2021
Volume: 12
Pages: 1-9
Giambagli L.; Buffoni L.; Carletti T.; Nocentini W.; Fanelli D.
Files in this item:

File: s41467-021-21481-0.pdf
Access: Closed access
Type: Publisher's PDF (Version of record)
License: Open Access
Size: 1.03 MB
Format: Adobe PDF (request a copy)

Documents in FLORE are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this resource: https://hdl.handle.net/2158/1261259
Citations
  • PMC: n/a
  • Scopus: 23
  • Web of Science: 21