The Wilson-Cowan model for metapopulation, a Neural Mass Network Model, treats different subcortical regions of the brain as connected nodes, with connections representing various types of structural, functional, or effective neuronal connectivity between these regions. Each region comprises interacting populations of excitatory and inhibitory cells, consistent with the standard Wilson-Cowan model. In this paper, we show how to incorporate stable attractors into such a metapopulation model’s dynamics. By doing so, we transform the Neural Mass Network Model into a biologically inspired learning algorithm capable of solving different classification tasks. We test it on MNIST and Fashion MNIST in combination with convolutional neural networks, as well as on CIFAR-10 and TF-FLOWERS, and in combination with a transformer architecture (BERT) on IMDB, consistently achieving high classification accuracy.
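The metapopulation model described above builds on the standard two-population Wilson-Cowan equations, in which excitatory (E) and inhibitory (I) activities drive each other through a saturating response function. As a point of reference, the sketch below integrates the classic single-node system with forward Euler; the coupling weights, sigmoid parameters, and inputs are illustrative textbook values, not the ones used in the paper.

```python
import numpy as np

def sigmoid(x, a=1.3, theta=4.0):
    # Saturating (logistic) response function of the Wilson-Cowan model;
    # a and theta are illustrative gain/threshold values.
    return 1.0 / (1.0 + np.exp(-a * (x - theta)))

def wilson_cowan_step(E, I, dt=0.01, P=1.5, Q=0.0,
                      w_EE=16.0, w_EI=12.0, w_IE=15.0, w_II=3.0,
                      tau_E=1.0, tau_I=1.0):
    # One forward-Euler step of the classic two-population system:
    #   tau_E dE/dt = -E + S(w_EE*E - w_EI*I + P)
    #   tau_I dI/dt = -I + S(w_IE*E - w_II*I + Q)
    dE = (-E + sigmoid(w_EE * E - w_EI * I + P)) / tau_E
    dI = (-I + sigmoid(w_IE * E - w_II * I + Q)) / tau_I
    return E + dt * dE, I + dt * dI

# Integrate from a small initial activity; depending on parameters the
# trajectory settles onto a fixed point or a limit cycle.
E, I = 0.1, 0.05
for _ in range(20000):
    E, I = wilson_cowan_step(E, I)
print(E, I)
```

In the metapopulation version, many such E/I nodes are coupled through a connectivity matrix, and the paper's contribution is to shape the coupled dynamics so that class labels correspond to stable attractors of the network.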
Learning in Wilson-Cowan Model for Metapopulation / Raffaele Marino, Lorenzo Buffoni, Lorenzo Chicchi, Francesca Di Patti, Diego Febbe, Lorenzo Giambagli, Duccio Fanelli. In: Neural Computation, ISSN 0899-7667, vol. 37 (2025), pp. 1-41. DOI: 10.1162/neco_a_01744
Learning in Wilson-Cowan Model for Metapopulation
Raffaele Marino; Lorenzo Buffoni; Lorenzo Chicchi; Francesca Di Patti; Diego Febbe; Lorenzo Giambagli; Duccio Fanelli
2025
| File | Access | Type | License | Size | Format |
|---|---|---|---|---|---|
| WilsonCowan.pdf | Open access | Publisher's PDF (Version of record) | Creative Commons | 5.64 MB | Adobe PDF |
Documents in FLORE are protected by copyright and all rights are reserved, unless otherwise indicated.