Decision Committee Learning with Dynamic Integration of Classifiers
Alexey Tsymbal
Subjects: Majority rule; Boosting (machine learning); Computer science; Feature vector; Machine learning; Random subspace method; Pattern recognition; Voting; Artificial intelligence; AdaBoost; Classifier; Information integration
Decision committee learning has demonstrated spectacular success in reducing the classification error of learned classifiers. These techniques develop a classifier in the form of a committee of subsidiary classifiers, whose outputs are usually combined by majority vote. Voting, however, has a shortcoming: it cannot take local expertise into account. When a new instance is difficult to classify, the average committee member is likely to misclassify it, and the majority vote is then also likely to be wrong. Instead of voting, dynamic integration of classifiers can be used, based on the assumption that each committee member is best within certain subareas of the whole feature space. In this paper, the proposed dynamic integration technique is evaluated with AdaBoost and bagging, the decision committee approaches that have received extensive attention recently. The comparison shows that boosting and bagging often achieve significantly better accuracy with dynamic integration of classifiers than with simple voting.
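The idea behind dynamic integration can be sketched roughly as follows: for each new instance, estimate every committee member's error in that instance's neighborhood of the training data and let the locally most accurate member decide. The snippet below is a minimal sketch of such a scheme in Python, assuming a scikit-learn environment; the synthetic dataset, the bagged decision trees, the neighborhood size of 7, and the helper `predict_dynamic` are illustrative choices, not the exact procedure evaluated in the paper (which relies on cross-validated error estimates for the committee members).

```python
# Minimal sketch: dynamic selection over a bagged committee vs. majority voting.
# Illustrative only; dataset, committee size, and k=7 are arbitrary assumptions.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import NearestNeighbors
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=600, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Build a committee of classifiers on bootstrap samples (bagging).
rng = np.random.RandomState(0)
committee = []
for _ in range(10):
    idx = rng.randint(0, len(X_train), len(X_train))
    committee.append(
        DecisionTreeClassifier(random_state=0).fit(X_train[idx], y_train[idx]))

# Index the training data so each test instance gets a local neighborhood.
knn = NearestNeighbors(n_neighbors=7).fit(X_train)

def predict_dynamic(x):
    """Let the committee member with the lowest local error classify x."""
    _, nbr = knn.kneighbors(x.reshape(1, -1))
    local_X, local_y = X_train[nbr[0]], y_train[nbr[0]]
    local_errors = [np.mean(m.predict(local_X) != local_y) for m in committee]
    best = committee[int(np.argmin(local_errors))]
    return best.predict(x.reshape(1, -1))[0]

# Compare dynamic selection with plain majority voting (binary labels 0/1 here).
votes = np.array([m.predict(X_test) for m in committee])
vote_pred = (votes.mean(axis=0) > 0.5).astype(int)
dyn_pred = np.array([predict_dynamic(x) for x in X_test])
print("majority vote accuracy:    ", np.mean(vote_pred == y_test))
print("dynamic selection accuracy:", np.mean(dyn_pred == y_test))
```

Instead of selecting a single locally best member, the same local error estimates could also be turned into weights for a locally weighted vote, which is closer in spirit to the dynamic voting variants discussed in this line of work.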
year | journal | country | edition | language |
---|---|---|---|---|
2000-01-01 | | | | |