Classification is one of several major applications in the neural network (NN) world. The multilayer perceptron (MLP) is the most common NN architecture used for classification tasks. It is best known for its error backpropagation (EBP) algorithm, which opened a new way of solving classification problems from a set of empirical data. In this paper, we perform experiments with three different NN structures in order to find the best-performing MLP neural network for the nonlinear classification of multiclass data sets. The three MLP structures for solving classification problems with K classes are: one model/K output layer neurons, K separate models/one output layer neuron, and K joint models/one output layer neuron. The learning algorithm developed here is the batch EBP algorithm, which uses all the data as a single batch when updating the NN weights. Batch EBP speeds up training significantly; the use of a pseudo-inverse in calculating the output layer weights also contributes to faster training. The extensive series of experiments performed within this research showed that the best structure for solving multiclass classification problems is the K joint models/one output layer neuron structure.
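The two speed-related ideas above (a single batch update over all data, and a pseudo-inverse solve for the output layer weights) can be sketched as follows. This is a minimal illustrative NumPy implementation under stated assumptions, not the paper's exact procedure: the network is a one-hidden-layer MLP in the one model/K output layer neurons configuration, the hidden weights are updated by full-batch gradient descent, and the linear output layer weights are recomputed each epoch with `np.linalg.pinv`. The function name `train_batch_ebp` and all hyperparameter values are hypothetical.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_batch_ebp(X, Y, n_hidden=8, lr=0.5, epochs=200, seed=0):
    """Sketch of full-batch EBP for a one-hidden-layer MLP.

    Hidden weights V are updated by gradient descent computed over the
    whole data set at once (batch EBP); output weights W are obtained
    each epoch via the pseudo-inverse of the hidden-layer outputs
    (an illustrative reading of the abstract, not a verified detail).
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    Xb = np.hstack([X, np.ones((n, 1))])        # augment inputs with a bias term
    V = rng.normal(scale=0.5, size=(d + 1, n_hidden))
    for _ in range(epochs):
        H = sigmoid(Xb @ V)                     # hidden outputs for all samples at once
        Hb = np.hstack([H, np.ones((n, 1))])
        W = np.linalg.pinv(Hb) @ Y              # output-layer weights by pseudo-inverse
        E = Hb @ W - Y                          # error over the single batch
        # backpropagate through the sigmoid hidden layer (one batch update)
        dH = (E @ W[:-1].T) * H * (1.0 - H)
        V -= lr * Xb.T @ dH / n
    return V, W

# usage: a toy 3-class problem, one-hot targets for K = 3 output neurons
X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
Y = np.eye(3)[[0, 1, 1, 2]]
V, W = train_batch_ebp(X, Y)
H = sigmoid(np.hstack([X, np.ones((4, 1))]) @ V)
pred = (np.hstack([H, np.ones((4, 1))]) @ W).argmax(axis=1)
```

Because every sample contributes to one weight update per epoch, the inner loop runs `epochs` times rather than `epochs * n` times, which is the source of the speed-up the abstract attributes to batch EBP.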