In this paper, an adequate set of input features is genetically selected for functional expansion in order to solve the classification problem in data mining using a functional link neural network. The proposed method, named HFLNN, chooses an optimal subset of input features by eliminating features with little or no predictive information, thereby yielding a more compact classifier. With an adequate set of basis functions, HFLNN overcomes the inability to model non-linear problems that is common to single-layer neural networks. Properties such as the simplicity of the architecture (i.e., no hidden layer) and the low computational complexity of the network (i.e., fewer weights to be learned) make it attractive for the classification task in data mining. We present a mathematical analysis of the stability and convergence of the proposed method. Furthermore, the issue of statistical tests for comparing algorithms over multiple datasets, which is especially essential in data mining studies, has been all but ignored. In this paper, we recommend a set of simple yet safe, robust, non-parametric tests for statistical comparison of the HFLNN with the functional link neural network (FLNN) and radial basis function network (RBFN) classifiers over multiple datasets through an extensive set of simulation studies.
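To make the architecture concrete, the sketch below illustrates the functional-expansion idea behind an FLNN-style classifier: each input feature is mapped through a set of basis functions, and a single linear layer (no hidden layer) acts on the expanded features. This is a minimal illustration assuming the trigonometric basis commonly used in the FLNN literature, not the paper's exact HFLNN method; all names are illustrative.

```python
import numpy as np

def functional_expansion(x, order=2):
    # Trigonometric functional expansion (a common FLNN choice):
    # each feature x_i is mapped to [x_i, sin(k*pi*x_i), cos(k*pi*x_i)]
    # for k = 1..order, enlarging the input space so that a single
    # linear layer can separate non-linear patterns.
    feats = [x]
    for k in range(1, order + 1):
        feats.append(np.sin(k * np.pi * x))
        feats.append(np.cos(k * np.pi * x))
    return np.concatenate(feats, axis=-1)

def flnn_output(x, w, b):
    # Single-layer classifier: the weights w are applied directly to the
    # expanded features -- there is no hidden layer to train, so only
    # len(w) + 1 parameters need to be learned.
    z = functional_expansion(x) @ w + b
    return 1.0 / (1.0 + np.exp(-z))  # sigmoid for binary classification
```

With `order=2`, a sample with d features expands to 5d expanded features, so the classifier learns 5d + 1 parameters in total; feature selection (as HFLNN does genetically) shrinks d and hence the whole network.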