Ensemble Selection based on Classifier Prediction Confidence

Author(s)
Nguyen, Tien Thanh
Luong, Anh Vu
Dang, Manh Truong
Liew, Alan Wee-Chung
McCall, John
Date
2020
Abstract

Ensemble selection is one of the most studied topics in ensemble learning because a selected subset of base classifiers may perform better than the whole ensemble system. In recent years, a great many ensemble selection methods have been introduced. However, many of these lack flexibility: either a fixed subset of classifiers is pre-selected for all test samples (static approach), or the selection of classifiers depends on the performance of the techniques used to define the region of competence (dynamic approach). In this paper, we propose an ensemble selection method that takes into account each base classifier's confidence during classification and its overall credibility within the ensemble. In other words, a base classifier is selected to predict for a test sample if the confidence in its prediction is higher than its credibility threshold. The credibility thresholds of the base classifiers are found by minimizing the empirical 0–1 loss over all training observations. In this way, our approach integrates both the static and dynamic aspects of ensemble selection. Experiments on 62 datasets demonstrate that the proposed method achieves considerably better performance than several benchmark ensemble methods.
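
The selection rule summarised in the abstract can be sketched in a few lines of Python. The sketch below is illustrative only and is not the authors' implementation: each classifier's confidence for a sample is taken as its maximum predicted class probability, a classifier joins the combination only when that confidence reaches its credibility threshold, and thresholds are fitted here by a simple greedy per-classifier grid search over the empirical 0–1 loss on training data. The function names, the sum-rule combiner, the fallback to the full ensemble, and the grid search are all assumptions made for illustration; the paper's own optimisation procedure may differ.

import numpy as np

def select_and_predict(probas, thresholds):
    """Combine base-classifier probabilities for one sample.

    probas     : (n_classifiers, n_classes) predicted class probabilities
    thresholds : (n_classifiers,) credibility threshold per classifier
    A classifier participates only if its top-class probability (its
    confidence) is at least its threshold; if none qualify, all are used
    (fallback assumed here for illustration).
    """
    confidence = probas.max(axis=1)
    selected = confidence >= thresholds
    if not selected.any():                        # fall back to the full ensemble
        selected = np.ones_like(selected, dtype=bool)
    return probas[selected].sum(axis=0).argmax()  # sum rule over selected members

def fit_thresholds(train_probas, y_train, grid=np.linspace(0.0, 1.0, 21)):
    """Greedy per-classifier grid search for credibility thresholds.

    train_probas : (n_classifiers, n_samples, n_classes) training probabilities
    y_train      : (n_samples,) true labels
    Each classifier's threshold is set, one at a time, to the grid value that
    minimises the ensemble's empirical 0-1 loss on the training observations.
    """
    n_clf = train_probas.shape[0]
    thresholds = np.zeros(n_clf)
    for i in range(n_clf):
        best_t, best_err = 0.0, np.inf
        for t in grid:
            trial = thresholds.copy()
            trial[i] = t
            preds = [select_and_predict(train_probas[:, s], trial)
                     for s in range(len(y_train))]
            err = np.mean(np.asarray(preds) != y_train)   # empirical 0-1 loss
            if err < best_err:
                best_t, best_err = t, err
        thresholds[i] = best_t
    return thresholds

# Toy usage with random "probabilities" standing in for real base classifiers.
rng = np.random.default_rng(0)
train_probas = rng.dirichlet(np.ones(3), size=(5, 200))   # 5 classifiers, 200 samples, 3 classes
y_train = rng.integers(0, 3, size=200)
thresholds = fit_thresholds(train_probas, y_train)
test_probas = rng.dirichlet(np.ones(3), size=5)            # one test sample
print(thresholds, select_and_predict(test_probas, thresholds))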

Journal Title

Pattern Recognition

Volume

100

Subject

Artificial intelligence

Citation

Nguyen, TT; Luong, AV; Dang, MT; Liew, AW-C; McCall, J, Ensemble Selection based on Classifier Prediction Confidence, Pattern Recognition, 2020, 100, pp. 107104:1-107104:15
