A general problem in model selection is to obtain the right parameters that make a model fit observed data. If the model selected is a Multilayer Perceptron (MLP) trained with Backpropagation (BP), it is necessary to find appropriate initial weights and learning parameters. This paper proposes a method that combines Simulated Annealing (SimAnn) and BP to train MLPs with a single hidden layer, termed SA-Prop. SimAnn selects the initial weights and the learning rate of the network. SA-Prop combines the advantages of the stochastic search performed by Simulated Annealing over the MLP parameter space and the local search of the BP algorithm.
The application of the proposed method to several real-world benchmark problems shows that MLPs evolved using SA-Prop achieve a higher level of generalization than other perceptron training algorithms, such as QuickPropagation (QP) or RPROP, and other evolutionary algorithms, such as G-LVQ.
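The hybrid scheme described above can be sketched as follows: simulated annealing proposes candidate pairs of initial weights and learning rate, each candidate is scored by the error reached after a short burst of backpropagation, and a Metropolis rule with geometric cooling decides acceptance. This is a minimal illustration under our own assumptions (XOR toy data, a sigmoid MLP without biases, arbitrary cooling and perturbation constants), not the paper's exact SA-Prop implementation.

```python
import math

import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: XOR, a classic non-linearly-separable problem.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)


def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))


def mse_after_bp(W1, W2, lr, epochs=200):
    """Run a short burst of plain backpropagation from the given initial
    weights and learning rate; return the final MSE (the SA fitness)."""
    W1, W2 = W1.copy(), W2.copy()
    for _ in range(epochs):
        h = sigmoid(X @ W1)        # single hidden layer
        out = sigmoid(h @ W2)      # output layer
        err = out - y
        # Sigmoid-MLP gradients (biases omitted for brevity).
        d_out = err * out * (1 - out)
        d_hid = (d_out @ W2.T) * h * (1 - h)
        W2 -= lr * h.T @ d_out
        W1 -= lr * X.T @ d_hid
    return float(np.mean(err ** 2))


def sa_prop(n_hidden=4, steps=60, T0=1.0, alpha=0.95):
    """Simulated annealing over (initial weights, learning rate)."""
    W1 = rng.normal(scale=0.5, size=(2, n_hidden))
    W2 = rng.normal(scale=0.5, size=(n_hidden, 1))
    lr = 0.5
    cur = best = mse_after_bp(W1, W2, lr)
    best_state = (W1, W2, lr)
    T = T0
    for _ in range(steps):
        # Perturb the candidate: jitter the weights and the learning rate.
        nW1 = W1 + rng.normal(scale=0.1, size=W1.shape)
        nW2 = W2 + rng.normal(scale=0.1, size=W2.shape)
        nlr = float(np.clip(lr + rng.normal(scale=0.05), 0.01, 2.0))
        cand = mse_after_bp(nW1, nW2, nlr)
        # Metropolis rule: always accept improvements; occasionally
        # accept worse candidates while the temperature is high.
        if cand < cur or rng.random() < math.exp((cur - cand) / T):
            W1, W2, lr, cur = nW1, nW2, nlr, cand
            if cur < best:
                best, best_state = cur, (W1, W2, lr)
        T *= alpha  # geometric cooling schedule
    return best, best_state


best_err, (W1, W2, lr) = sa_prop()
print(f"best post-BP MSE: {best_err:.4f}, selected learning rate: {lr:.3f}")
```

The key design point, as in the abstract, is the division of labor: the annealer performs a global stochastic search over the space of starting configurations, while BP does the cheap local refinement that actually scores each configuration.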