On the problem of local minima in backpropagation / M. Gori; A. Tesi. - In: IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE. - ISSN 0162-8828. - PRINT. - 14:(1992), pp. 76-86. [10.1109/34.107014]
On the problem of local minima in backpropagation
TESI, ALBERTO
1992
Abstract
Supervised learning in multilayered neural networks (MLN's) has been recently proposed through the well-known backpropagation (BP) algorithm. This is a gradient method that can get stuck in local minima, as simple examples can show. In this paper, some conditions on the network architecture and the learning environment, which ensure the convergence of the BP algorithm, are proposed. It is proven in particular that the convergence holds if the classes are linearly separable. In this case, the experience gained in several experiments shows that MLN's exceed perceptrons in generalization to new examples.
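As the abstract notes, BP is a gradient method on a non-convex error surface and can therefore stall in a local minimum. A minimal illustrative sketch of this behavior (a toy one-dimensional cost function, not taken from the paper): gradient descent converges to whichever basin the starting point lies in, so only one of the two runs below reaches the global minimum.

```python
# Toy illustration (not from the paper): gradient descent on a
# non-convex 1-D "error surface" stalls in whichever basin it starts in,
# just as backpropagation can stall in a local minimum of the network error.

def f(w):
    # Non-convex cost with a global minimum near w ~ -1.47
    # and a shallower local minimum near w ~ 1.35.
    return w**4 - 4 * w**2 + w

def grad_f(w):
    # Analytic derivative of f.
    return 4 * w**3 - 8 * w + 1

def descend(w, lr=0.01, steps=5000):
    # Plain gradient descent, as in the BP weight-update rule.
    for _ in range(steps):
        w -= lr * grad_f(w)
    return w

w_local = descend(2.0)    # starts in the right-hand basin
w_global = descend(-2.0)  # starts in the left-hand basin

print(w_local, w_global)          # two different stationary points
print(f(w_local) > f(w_global))   # True: the first run is only locally optimal
```

Both runs satisfy the first-order optimality condition, yet the run started at w = 2.0 ends with a strictly higher cost: a stationary point of the gradient dynamics need not be a global minimum, which is precisely the difficulty the paper analyzes for BP.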
| File | Access | Type | License | Size | Format |
|---|---|---|---|---|---|
| Gori-Tesi-1992.pdf | Closed access | Other | All rights reserved | 926.29 kB | Adobe PDF |
Documents in FLORE are protected by copyright and all rights are reserved, unless otherwise indicated.