An Ensemble of Classifiers using Weighted Instance Selection

  • Unique Paper ID: 142553
  • Volume: 2
  • Issue: 3
  • PageNo: 96-102
  • Abstract:
  • The aim of the proposed work is first to analyze instance selection algorithms. Weighted instance selection algorithms such as wDROP3 (weighted Decremental Reduction Optimization Procedure 3) and wRNN (weighted Reduced Nearest Neighbor) are available, which reduce the sample set they are applied to. Multiclass instance selection is a useful technique for reducing space and time complexity: it removes irrelevant, noisy, and superfluous instances from the training set. The multiclass problem is then solved by decomposing it into a number of two-class problems, designing multiple two-class classifiers whose combined output produces the final result. Boosting is used to assign a weight to each instance of the training set. An ensemble of classifiers is designed by combining all classifiers and training them on the reduced training set. Different techniques are available for designing an ensemble, such as Bagging (Bootstrap Aggregating), Boosting (AdaBoost), and Error-Correcting Output Codes (ECOC). The output of the ensemble is better than that of the individual classifiers. The approach is tested on a few benchmark data sets. Classification accuracy is found to lie between 70% and 87% for the wDROP3 algorithm and between 61% and 89% for the wRNN algorithm, while generalization accuracy lies between 79% and 96% for wDROP3 and between 75% and 94% for wRNN. Another observation is that increasing the number of classifiers per ensemble improves accuracy by 0.5% to 1.5%.
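The weighted instance selection idea described in the abstract can be illustrated with a minimal sketch. This is not the paper's wRNN implementation; it is a simplified Reduced-Nearest-Neighbor-style reduction in which each instance carries a boosting-style weight, and low-weight instances are tried for removal first (the function name `weighted_rnn`, the triple layout, and the removal order are all assumptions for illustration). An instance is dropped only if every original training point is still classified correctly by 1-NN over the reduced set.

```python
import math

def nn_label(x, instances):
    """Label of the 1-nearest neighbor of point x among (point, label, weight) triples."""
    best = min(instances, key=lambda inst: math.dist(x, inst[0]))
    return best[1]

def weighted_rnn(training_set):
    """Simplified weighted RNN-style reduction (illustrative assumption, not the
    paper's exact wRNN). training_set is a list of (point, label, weight) triples.
    Returns a reduced subset that still classifies every training point
    correctly with 1-NN."""
    kept = list(training_set)
    # Assumption: boosting weights guide the order of removal attempts --
    # low-weight (least important) instances are tried first.
    for inst in sorted(training_set, key=lambda t: t[2]):
        candidate = [k for k in kept if k is not inst]
        if not candidate:
            continue
        # Commit the removal only if the reduced set still classifies
        # every original training point correctly.
        if all(nn_label(p, candidate) == lab for p, lab, _ in training_set):
            kept = candidate
    return kept

# Usage: two well-separated clusters; the reduction discards redundant
# interior points while preserving 1-NN consistency on the training set.
data = [((0.0, 0.0), "a", 0.1), ((0.1, 0.0), "a", 0.2), ((0.0, 0.1), "a", 0.3),
        ((5.0, 5.0), "b", 0.1), ((5.1, 5.0), "b", 0.2), ((5.0, 5.1), "b", 0.3)]
reduced = weighted_rnn(data)
```

At least one instance per class must survive, since deleting an entire class would misclassify that class's points; this is the space/time saving the abstract attributes to instance selection.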

Cite This Article

  • ISSN: 2349-6002

