TY - CHAP
T1 - Combining classifiers with informational confidence
AU - Jaeger, Stefan
AU - Ma, Huanfeng
AU - Doermann, David
PY - 2008
Y1 - 2008
N2 - We propose a new statistical method for learning normalized confidence values in multiple classifier systems. Our main idea is to adjust confidence values so that their nominal values equal the information they actually convey. To do so, we assume that this information depends on the actual performance of each confidence value on an evaluation set. As an information measure, we use Shannon's well-known logarithmic notion of information. With the confidence values matching their informational content, the classifier combination scheme reduces to the simple sum-rule, theoretically justifying this elementary combination scheme. In experimental evaluations for script identification, and for both handwritten and printed character recognition, we achieve a consistent improvement over the best single recognition rate. We hope that our information-theoretical framework helps fill the theoretical gap that still exists in classifier combination, putting the excellent practical performance of multiple classifier systems on a more solid basis.
UR - https://www.scopus.com/pages/publications/38049057950
DO - 10.1007/978-3-540-76280-5_7
M3 - Chapter
SN - 9783540762799
T3 - Studies in Computational Intelligence
SP - 163
EP - 191
BT - Machine Learning in Document Analysis and Recognition
A2 - Marinai, Simone
A2 - Fujisawa, Hiromichi
ER -