In ensemble methods, pooling the decisions of multiple unstable classifiers often leads to substantial improvements in generalization performance across many applications. We propose a new ensemble method, Double SVMSBagging, a variant of double bagging in which subsampling is used to enlarge the out-of-bag samples and a support vector machine is trained on them as the additional classifier. The underlying base classifier is a decision tree. We use a radial basis function kernel, expecting the combined classifier to perform efficiently in both linear and non-linear feature spaces. We study the performance of the proposed method on several benchmark datasets under different subsampling rates (SSR). We apply the proposed method to partial discharge classification of gas-insulated switchgear (GIS) and compare its performance with other well-known classifier ensemble methods for condition diagnosis; Double SVMSBagging outperforms the other ensemble methods in this case. We also apply Double SVMSBagging to 15 UCI benchmark datasets and compare its accuracy with that of other ensemble methods, e.g., Bagging, AdaBoost, Random Forest, and Rotation Forest. With the optimum SSR, the method yields significantly lower prediction error than Rotation Forest and AdaBoost on most of the datasets.
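To make the described scheme concrete, the following is a minimal sketch of the double-bagging idea with an SVM as the additional classifier: each round draws a subsample (at rate SSR, without replacement), fits an RBF-kernel SVM on the out-of-bag portion, and grows a decision tree on the subsample augmented with the SVM's decision values. This is an illustrative approximation under assumed conventions (function names such as `double_svm_bagging_fit`, the majority-vote combination, and the `ssr` parameter are hypothetical), not the authors' implementation.

```python
# Hypothetical sketch of double bagging with an SVM as the extra
# classifier; illustrative only, not the paper's actual code.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

def double_svm_bagging_fit(X, y, n_estimators=25, ssr=0.5, seed=0):
    """Each round: subsample at rate `ssr` without replacement, fit an
    RBF-kernel SVM on the out-of-bag samples, then grow a tree on the
    subsample's features augmented with the SVM decision values."""
    rng = np.random.default_rng(seed)
    n = len(X)
    ensemble = []
    for _ in range(n_estimators):
        idx = rng.choice(n, size=int(ssr * n), replace=False)
        oob = np.setdiff1d(np.arange(n), idx)          # out-of-bag indices
        svm = SVC(kernel="rbf").fit(X[oob], y[oob])    # additional classifier
        X_aug = np.column_stack([X[idx], svm.decision_function(X[idx])])
        tree = DecisionTreeClassifier(random_state=0).fit(X_aug, y[idx])
        ensemble.append((svm, tree))
    return ensemble

def double_svm_bagging_predict(ensemble, X):
    """Majority vote over the trees (binary labels 0/1 assumed)."""
    votes = np.array([
        tree.predict(np.column_stack([X, svm.decision_function(X)]))
        for svm, tree in ensemble
    ])
    return np.rint(votes.mean(axis=0)).astype(int)

# Usage example on a public benchmark dataset.
X, y = load_breast_cancer(return_X_y=True)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
model = double_svm_bagging_fit(Xtr, ytr)
acc = (double_svm_bagging_predict(model, Xte) == yte).mean()
```

In the spirit of double bagging, the SVM trained on the out-of-bag samples is not merely averaged with the tree; its decision values are appended as an extra feature so the tree can exploit the SVM's (possibly non-linear) discriminant, while varying `ssr` controls how much data the additional classifier sees.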