Mining data streams for predictive analysis is one of the most interesting topics in machine learning. Because data distributions drift over time, it is important to build adaptive systems that remain both dynamic and accurate. Although ensembles are powerful for improving the accuracy of incremental learning, it is crucial to maintain the most suitable set of learners in the ensemble while accounting for the diversity among them. By adding diversity-based pruning to traditional accuracy-based pruning, this paper proposes a novel concept-drift handling approach named Two-Level Pruning based Ensemble with Abstained Learners (TLP-EnAbLe). In this approach, deferred similarity-based pruning delays the removal of under-performing, similar learners until it is certain that they are no longer fit for prediction. The proposed scheme retains diverse learners that are well suited to the current concept. Two-level abstaining monitors the performance of learners and selects the best set of competent learners to participate in decision making. This enhances the traditional majority-voting system by dynamically choosing high-performing learners and abstaining those that are unsuitable for prediction. Our experiments demonstrate that TLP-EnAbLe handles concept drift more effectively than other state-of-the-art algorithms on nineteen artificially drifting and ten real-world datasets. Further, statistical tests conducted on various drift patterns, including gradual, abrupt, recurring, and their combinations, confirm the effectiveness of the proposed approach.
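To make the abstaining idea concrete, the following is a minimal sketch (not the paper's TLP-EnAbLe implementation) of majority voting in which learners whose recent sliding-window accuracy falls below a threshold abstain from the vote. The class name, the `threshold` and `window` parameters, and the fallback behavior are illustrative assumptions, not details taken from the paper.

```python
from collections import Counter, deque

class AbstainingEnsemble:
    """Hypothetical illustration: each learner tracks its accuracy over a
    sliding window of recent examples; learners below `threshold` abstain
    from voting so only competent learners take part in the decision."""

    def __init__(self, learners, threshold=0.6, window=100):
        self.learners = learners          # objects exposing .predict(x) -> label
        self.threshold = threshold        # minimum recent accuracy to vote
        self.scores = [deque(maxlen=window) for _ in learners]

    def _recent_accuracy(self, i):
        # Learners with no history yet are allowed to vote (assumed policy).
        return sum(self.scores[i]) / len(self.scores[i]) if self.scores[i] else 1.0

    def predict(self, x):
        votes = [learner.predict(x)
                 for i, learner in enumerate(self.learners)
                 if self._recent_accuracy(i) >= self.threshold]
        if not votes:                     # if all abstain, fall back to everyone
            votes = [learner.predict(x) for learner in self.learners]
        return Counter(votes).most_common(1)[0][0]

    def update(self, x, y):
        # Record per-learner correctness to maintain sliding-window accuracy.
        for i, learner in enumerate(self.learners):
            self.scores[i].append(int(learner.predict(x) == y))
```

Under this sketch, a learner whose windowed accuracy drops below the threshold is silenced rather than deleted, which mirrors the abstract's distinction between abstaining (temporary exclusion from voting) and pruning (eventual removal).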