We propose a new class of neurofuzzy construction algorithms that aim to maximize generalization capability, specifically for imbalanced data classification problems, based on leave-one-out (LOO) cross-validation. The algorithms operate in two stages: first, an initial rule base is constructed by estimating a Gaussian mixture model, combined with an analysis-of-variance decomposition, from the input data; second, joint weighted least squares parameter estimation and rule selection are carried out using an orthogonal forward subspace selection (OFSS) procedure. We show how different LOO-based rule selection criteria can be incorporated into OFSS, and advocate maximizing either the LOO area under the receiver operating characteristic (ROC) curve or the LOO F-measure when the data set exhibits an imbalanced class distribution. Extensive comparative simulations demonstrate the effectiveness of the proposed algorithms.
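For linear-in-the-parameters models such as the rule base above, LOO predictions can be obtained in closed form from a single least squares fit (the PRESS trick), after which LOO AUC or LOO F-measure can serve as a selection criterion. The following is a minimal illustrative sketch only, not the paper's OFSS procedure; all function names and the toy data are ours.

```python
import numpy as np

def loo_predictions(X, y):
    """Closed-form LOO predictions for least squares (PRESS trick):
    e_i^(-i) = (y_i - yhat_i) / (1 - h_ii), with hat matrix H = X (X'X)^+ X'."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    yhat = X @ beta
    h = np.diag(X @ np.linalg.pinv(X.T @ X) @ X.T)   # leverages h_ii
    e_loo = (y - yhat) / (1.0 - h)                   # LOO residuals
    return y - e_loo                                 # LOO prediction per sample

def auc(scores, labels):
    """Rank-based AUC: probability a positive sample outranks a negative one."""
    pos = scores[labels == 1]
    neg = scores[labels == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def f_measure(scores, labels, thresh=0.5):
    """F-measure (harmonic mean of precision and recall) at a fixed threshold."""
    pred = (scores >= thresh).astype(int)
    tp = np.sum((pred == 1) & (labels == 1))
    fp = np.sum((pred == 1) & (labels == 0))
    fn = np.sum((pred == 0) & (labels == 1))
    return 2.0 * tp / (2 * tp + fp + fn) if tp else 0.0

# Toy binary problem: scores from LOO predictions, then either criterion.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(40), rng.normal(size=40)])
y = (X[:, 1] + 0.3 * rng.normal(size=40) > 0).astype(float)
p = loo_predictions(X, y)
print(auc(p, y.astype(int)), f_measure(p, y.astype(int)))
```

In a forward selection loop, each candidate rule (regressor column) would be scored by the LOO criterion above and the best candidate retained, which is the role the LOO AUC and LOO F-measure play in the rule selection stage.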