Title
Double SVMBagging: A New Double Bagging with Support Vector Machine

Authors
Faisal M. Zaman and H. Hirose

Source

Engineering Letters, Vol. 17, Issue 2, pp. 128-140, June 2009. Advance online version available: 22 May 2009.


Abstract

In ensemble methods, the aggregation of multiple unstable classifiers often substantially reduces the misclassification rate in many applications and benchmark classification problems. We propose here a new ensemble, "Double SVMBagging", which is a variant of double bagging. In this ensemble method we use support vector machines as the additional classifiers, built on the out-of-bag samples. The underlying base classifier is the decision tree. We used four kernel types (linear, polynomial, radial basis, and sigmoid), expecting the new classifier to perform well in both linear and non-linear feature spaces. The major advantages of the proposed method are that 1) it is compatible with messy data structures, and 2) the generation of support vectors in the first phase facilitates the decision tree's classification of objects with higher confidence (accuracy), resulting in a significant error reduction in the second phase. We have applied the proposed method to a real case, condition diagnosis for electric power apparatus; the feature variables are the maximum likelihood parameters of the generalized normal distribution and the Weibull distribution. These variables are derived from the partial discharge patterns of electromagnetic signals emitted by the apparatus. We compare the performance of Double SVMBagging with other well-known classifier ensemble methods in condition diagnosis; Double SVMBagging with the radial basis kernel performed better than the other ensemble methods and the other kernels. We then applied Double SVMBagging with the radial basis kernel to 15 UCI benchmark datasets and compared its accuracy with other ensemble methods, e.g., Bagging, AdaBoost, Random Forest, and Rotation Forest. The results demonstrate that this method generates significantly lower prediction error than Rotation Forest and AdaBoost more often than the reverse, and it performed much better than Bagging and Random Forest.
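
The abstract describes the core procedure: for each bootstrap sample, an SVM is fitted on the out-of-bag observations, its outputs are appended to the bootstrap sample as extra features, and a decision tree is grown on the augmented data; predictions are aggregated over the ensemble. The following is a minimal sketch of that idea using scikit-learn; the class name, parameters, and use of SVM decision values as the appended features are illustrative assumptions, not the authors' implementation.

import numpy as np
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

class DoubleSVMBagging:
    """Illustrative sketch: double bagging with an SVM as the additional classifier."""

    def __init__(self, n_estimators=50, kernel="rbf", random_state=0):
        self.n_estimators = n_estimators
        self.kernel = kernel
        self.rng = np.random.default_rng(random_state)

    def _augment(self, svm, X):
        # Append the SVM decision values to the original features for the tree.
        d = svm.decision_function(X)
        d = d.reshape(-1, 1) if d.ndim == 1 else d
        return np.hstack([X, d])

    def fit(self, X, y):
        # Assumes integer class labels 0..K-1.
        n = X.shape[0]
        self.members_ = []
        for _ in range(self.n_estimators):
            boot = self.rng.integers(0, n, size=n)       # bootstrap indices
            oob = np.setdiff1d(np.arange(n), boot)       # out-of-bag indices
            if oob.size == 0 or np.unique(y[oob]).size < 2:
                continue                                 # SVM needs at least two classes
            # Phase 1: additional classifier (SVM) trained on the out-of-bag sample.
            svm = SVC(kernel=self.kernel).fit(X[oob], y[oob])
            # Phase 2: decision tree trained on the augmented bootstrap sample.
            tree = DecisionTreeClassifier().fit(self._augment(svm, X[boot]), y[boot])
            self.members_.append((svm, tree))
        return self

    def predict(self, X):
        votes = np.array([tree.predict(self._augment(svm, X))
                          for svm, tree in self.members_])
        # Majority vote across ensemble members.
        return np.apply_along_axis(
            lambda col: np.bincount(col.astype(int)).argmax(), 0, votes)

Swapping the kernel argument ("linear", "poly", "rbf", "sigmoid") reproduces the four kernel variants compared in the paper; the radial basis kernel is the one the abstract reports as performing best.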


Key Words
Support vector machine, double bagging, CART, condition diagnosis, electric power apparatus

Citation

@

Times Cited in Web of Science: 3

Times Cited in Google Scholar: 4

Cited in Books:

WoS:

Others: