In ensemble methods, the aggregation of multiple
unstable classifiers often substantially reduces misclassification
rates in many applications and benchmark classification
problems. We propose here a new ensemble, "Double SVMBagging",
which is a variant of double bagging. In this ensemble method
we use support vector machines, built on the out-of-bag samples,
as the additional classifiers; the underlying base classifier
is a decision tree. We use four kernel types: linear, polynomial,
radial basis, and sigmoid, so that the new classifier
can perform in both linear and non-linear feature spaces. The major
advantages of the proposed method are that (1) it can handle
messy data structures, and (2) the generation of support vectors
in the first phase enables the decision tree to classify
the objects with higher confidence (accuracy), resulting in a
significant error reduction in the second phase. We have applied
the proposed method to a real case, condition diagnosis
of electric power apparatus; the feature variables are the
maximum-likelihood parameters of the generalized normal distribution
and the Weibull distribution. These variables are derived from the
partial discharge patterns of electromagnetic signals emitted by
the apparatus. We compare the performance of Double SVMBagging
with other well-known classifier ensemble methods in condition
diagnosis; Double SVMBagging with the radial basis kernel
performed better than the other ensemble methods and the other kernels.
We also applied Double SVMBagging with the radial basis kernel to
15 UCI benchmark datasets and compared its accuracy with other
ensemble methods, e.g., Bagging, AdaBoost, Random Forest, and
Rotation Forest. The results demonstrate that this method
generates significantly lower prediction error than Rotation Forest
and AdaBoost more often than the reverse, and it performed
much better than Bagging and Random Forest.
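The two-phase procedure described above can be sketched in a few lines. This is a minimal illustration of one plausible reading of Double SVMBagging, assuming scikit-learn: in each bootstrap round an SVM is fit on the out-of-bag sample, its decision values are appended to the in-bag sample as an extra feature, and a decision tree is grown on the augmented data; prediction is by majority vote. Function names, the dataset, and all hyperparameters here are illustrative, not the authors' exact configuration.

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

def fit_double_svm_bagging(X, y, n_estimators=25, kernel="rbf"):
    """Phase 1: SVM on the out-of-bag sample. Phase 2: decision tree
    on the in-bag sample augmented with the SVM decision values."""
    ensemble, n = [], len(X)
    for _ in range(n_estimators):
        idx = rng.integers(0, n, n)            # bootstrap (in-bag) indices
        oob = np.setdiff1d(np.arange(n), idx)  # out-of-bag indices
        if len(np.unique(y[oob])) < 2:
            continue                           # SVM needs both classes
        svm = SVC(kernel=kernel).fit(X[oob], y[oob])
        extra = svm.decision_function(X[idx]).reshape(n, -1)
        tree = DecisionTreeClassifier().fit(
            np.hstack([X[idx], extra]), y[idx])
        ensemble.append((svm, tree))
    return ensemble

def predict(ensemble, X):
    votes = []
    for svm, tree in ensemble:
        extra = svm.decision_function(X).reshape(len(X), -1)
        votes.append(tree.predict(np.hstack([X, extra])))
    # majority vote over the trees, one column per test point
    return np.array([np.bincount(col).argmax()
                     for col in np.asarray(votes).T])

X, y = load_breast_cancer(return_X_y=True)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
model = fit_double_svm_bagging(Xtr, ytr)
acc = (predict(model, Xte) == yte).mean()
```

The extra SVM-derived feature is what distinguishes this from plain bagging: the tree in each round can split on the SVM margin as well as on the raw features.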