The field of artificial neural networks has a history spanning several decades, during which theoretical contributions have progressed alongside advances in the power and memory of modern computers. Some older methods are now being rebranded or re-presented to take advantage of this computational power. In particular, we consider the current trend of Random Vector Functional Link (RVFL) networks, which suggests that the architecture of a system and its learning algorithm should be properly decoupled. In this paper, we evaluate the performance of multi-layer Random Vector Functional Link (RVFL) networks / Extreme Learning Machines (ELM) on four databases of handwritten characters. In particular, we evaluate the impact of the architecture (number of neurons per hidden layer) and the robustness of the distribution of the results across different runs. By combining the classifier outputs from different runs with a maximum combination rule, we obtain an accuracy of 95.97% for Arabic digits, 98.03% for Bangla, 98.64% for Devnagari, and 96.30% for Oriya digits. The results confirm that increasing the size of the hidden layers has a significant impact on the accuracy and allows the networks to reach state-of-the-art performance; however, the performance reaches a plateau beyond a certain hidden-layer size.
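The maximum combination rule mentioned above can be sketched as follows. This is a minimal illustration, assuming each run produces a matrix of per-class scores for the test set; the function name, the toy score values, and the use of NumPy are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def max_rule_combine(score_matrices):
    """Combine per-run classifier scores with the maximum rule.

    score_matrices: list of (n_samples, n_classes) arrays, one per run,
    each holding per-class scores (e.g. normalized outputs) for the test set.
    Returns the predicted class index for each sample.
    """
    # Element-wise maximum across runs, then argmax over classes.
    combined = np.maximum.reduce(score_matrices)
    return combined.argmax(axis=1)

# Toy example: two runs, three samples, two classes (illustrative data only).
run_a = np.array([[0.9, 0.1], [0.4, 0.6], [0.2, 0.8]])
run_b = np.array([[0.7, 0.3], [0.8, 0.2], [0.1, 0.9]])
print(max_rule_combine([run_a, run_b]))  # [0 0 1]
```

For each test sample, the rule keeps the highest score any run assigned to each class and then predicts the class with the largest combined score, so a class strongly supported by even one run can win the vote.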