The popularity of artificial neural networks (ANNs) has recently increased owing to their capacity to model highly complex problems in machine learning, data mining, and pattern recognition. Improving the training efficiency of ANN-based algorithms is an active area of research, and several approaches have been reviewed in the literature. The performance of multilayer perceptrons (MLPs) trained with the back-propagation artificial neural network (BP-ANN) method is strongly influenced by the size of the dataset and the data-preprocessing techniques applied. This work analyses the benefits of preprocessing datasets with different techniques in order to improve ANN convergence. Specifically, the Min-Max, Z-Score, and Decimal Scaling normalization techniques were evaluated. The simulation results show that the computational efficiency of the ANN training process is substantially enhanced when coupled with these preprocessing techniques.
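The three normalization techniques named above can be sketched as follows. This is a minimal NumPy illustration under standard textbook definitions, not the paper's own implementation; the sample array is purely hypothetical.

```python
import numpy as np

def min_max(x):
    # Min-Max normalization: rescale to [0, 1] via (x - min) / (max - min)
    return (x - x.min()) / (x.max() - x.min())

def z_score(x):
    # Z-Score normalization: centre to zero mean and unit standard deviation
    return (x - x.mean()) / x.std()

def decimal_scaling(x):
    # Decimal Scaling: divide by 10^j, with j the smallest integer
    # such that max(|x|) / 10^j < 1
    j = int(np.floor(np.log10(np.abs(x).max()))) + 1
    return x / (10 ** j)

if __name__ == "__main__":
    x = np.array([120.0, 345.0, 982.0, 57.0])  # hypothetical feature column
    print(min_max(x))          # values lie in [0, 1]
    print(z_score(x))          # zero mean, unit standard deviation
    print(decimal_scaling(x))  # absolute values lie below 1
```

Each technique maps raw feature values into a small, comparable range, which is what keeps the back-propagation weight updates well-conditioned and speeds up convergence.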