We propose a bootstrap-based iterative method for generating classifier ensembles called Iterative Classifier Selection Bagging (ICS-Bagging). Each iteration of ICS-Bagging has two phases: i) bootstrap sampling to generate a pool of classifiers; and ii) selection of the best classifier in the pool using a fitness function based on ensemble accuracy and diversity. The selected classifier is added to the final ensemble. At each iteration, the bootstrap sampling updates the per-class sampling probabilities based on the per-class accuracy. This process is repeated until the final ensemble reaches its target size. For the specific case of imbalanced datasets, we also propose SMOTE-ICS-Bagging, a variant of ICS-Bagging that runs SMOTE at the beginning of each iteration to reduce the class imbalance before data sampling. We compared the proposed techniques with Bagging, Random Subspace, and SMOTEBagging on 15 imbalanced datasets from the KEEL repository. The results show that the proposed techniques achieve the highest accuracy among all compared methods; ranking diagrams confirm that they outperform SMOTEBagging, a well-established ensemble generation method for imbalanced datasets.
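The iterative loop described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the pool size, the fitness weighting `alpha`, the nearest-centroid base classifier, and the inverse-accuracy class-sampling rule are assumptions made for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

class Centroid:
    """Toy nearest-centroid base classifier (an assumption for this sketch)."""
    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.mu = np.array([X[y == c].mean(axis=0) for c in self.classes_])
        return self
    def predict(self, X):
        d = ((X[:, None, :] - self.mu[None]) ** 2).sum(-1)
        return self.classes_[d.argmin(1)]

def bootstrap(X, y, p_class, classes):
    # Sample class labels according to p_class, then draw one instance
    # of each sampled class uniformly at random (with replacement).
    cls = rng.choice(classes, size=len(y), p=p_class)
    idx = np.array([rng.choice(np.flatnonzero(y == c)) for c in cls])
    return X[idx], y[idx]

def ensemble_predict(ensemble, X):
    votes = np.stack([m.predict(X) for m in ensemble])
    return np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)

def ics_bagging(X, y, ensemble_size=5, pool_size=7, alpha=0.7):
    classes = np.unique(y)
    class_acc = np.full(len(classes), 0.5)   # neutral start
    ensemble = []
    while len(ensemble) < ensemble_size:
        # Phase i) bootstrap a pool; classes with low accuracy are
        # over-sampled (inverse-accuracy weighting, an assumption here).
        w = 1.0 - class_acc + 1e-3
        p_class = w / w.sum()
        pool = [Centroid().fit(*bootstrap(X, y, p_class, classes))
                for _ in range(pool_size)]
        # Phase ii) select the pool member maximizing a fitness that
        # blends ensemble accuracy with disagreement-based diversity.
        best, best_fit = None, -np.inf
        for m in pool:
            acc = (ensemble_predict(ensemble + [m], X) == y).mean()
            div = (np.mean([(m.predict(X) != e.predict(X)).mean()
                            for e in ensemble]) if ensemble else 0.0)
            fit = alpha * acc + (1 - alpha) * div
            if fit > best_fit:
                best, best_fit = m, fit
        ensemble.append(best)
        # Update per-class accuracy for the next iteration's sampling.
        pred = ensemble_predict(ensemble, X)
        class_acc = np.array([(pred[y == c] == c).mean() for c in classes])
    return ensemble

# Synthetic two-class demo data.
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(int)
ens = ics_bagging(X, y)
acc = (ensemble_predict(ens, X) == y).mean()
```

A SMOTE-ICS-Bagging variant would simply oversample the minority class (e.g. via SMOTE) at the top of each `while` iteration, before `bootstrap` is called.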