LEARNING TO ASSEMBLE CLASSIFIERS VIA GENETIC PROGRAMMING

https://doi.org/10.1142/S0218001414600052
Cited by: 9 (Source: Crossref)

This paper introduces a novel approach for building heterogeneous ensembles based on genetic programming (GP). Ensemble learning is a paradigm that aims to combine individual classifiers' outputs to improve predictive performance. Commonly, classifier outputs are combined by a weighted sum or a voting strategy; however, such linear fusion functions may not effectively exploit the redundancy and diversity of the individual models. In this research, a GP-based approach is proposed to learn the fusion functions that combine classifier outputs. The study targets heterogeneous ensembles, i.e. models whose individual classifiers are based on different principles (e.g. decision trees and similarity-based techniques). A detailed empirical assessment is carried out to validate the effectiveness of the proposed approach. Results show that the proposed method builds highly effective classification models, outperforming alternative ensemble methodologies. The proposed technique is also applied to fuse the outputs of homogeneous models, again with effective results. An in-depth analysis of the proposed strategy for building ensembles is thus presented from different perspectives, with strong experimental support.
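
The core of the approach is a GP-evolved fusion function that replaces the usual weighted sum or vote over classifier outputs. As a rough sketch of the idea only (not the paper's implementation: the choice of DEAP, scikit-learn, the breast-cancer dataset, the primitive set, the fitness function and all parameter values below are illustrative assumptions), the following Python snippet evolves a nonlinear fusion function over the positive-class scores of three heterogeneous base classifiers:

```python
# Minimal sketch (not the paper's implementation): evolve a GP fusion function
# that combines the probabilistic outputs of heterogeneous base classifiers.
# Assumes DEAP and scikit-learn; dataset, primitives and parameters are illustrative.
import random
import numpy as np
from deap import algorithms, base, creator, gp, tools
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

# Heterogeneous base classifiers: models based on different principles.
X, y = load_breast_cancer(return_X_y=True)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.5, random_state=0)
models = [DecisionTreeClassifier(random_state=0), KNeighborsClassifier(), GaussianNB()]
scores = np.column_stack(
    [m.fit(X_tr, y_tr).predict_proba(X_val)[:, 1] for m in models]
)  # one column of positive-class scores per base classifier

# GP primitives act element-wise on the per-classifier score vectors.
pset = gp.PrimitiveSet("FUSION", scores.shape[1])  # ARG0..ARG2 = classifier outputs
for prim in (np.add, np.subtract, np.multiply, np.maximum, np.minimum):
    pset.addPrimitive(prim, 2)
pset.addEphemeralConstant("const", lambda: random.uniform(-1.0, 1.0))

creator.create("FitnessMax", base.Fitness, weights=(1.0,))
creator.create("Individual", gp.PrimitiveTree, fitness=creator.FitnessMax)

toolbox = base.Toolbox()
toolbox.register("expr", gp.genHalfAndHalf, pset=pset, min_=1, max_=3)
toolbox.register("individual", tools.initIterate, creator.Individual, toolbox.expr)
toolbox.register("population", tools.initRepeat, list, toolbox.individual)
toolbox.register("compile", gp.compile, pset=pset)

def fitness(individual):
    """Validation accuracy of the fused score, thresholded at 0.5 (a simplification)."""
    fuse = toolbox.compile(expr=individual)
    fused = np.asarray(fuse(*[scores[:, j] for j in range(scores.shape[1])]))
    return (np.mean((fused > 0.5).astype(int) == y_val),)

toolbox.register("evaluate", fitness)
toolbox.register("select", tools.selTournament, tournsize=3)
toolbox.register("mate", gp.cxOnePoint)
toolbox.register("expr_mut", gp.genFull, min_=0, max_=2)
toolbox.register("mutate", gp.mutUniform, expr=toolbox.expr_mut, pset=pset)

pop, _ = algorithms.eaSimple(
    toolbox.population(n=50), toolbox, cxpb=0.8, mutpb=0.2, ngen=20, verbose=False
)
best = tools.selBest(pop, k=1)[0]
print("Evolved fusion function:", best)
print("Validation accuracy:", fitness(best)[0])
```

In this sketch the evolved expression plays the role of the fusion function: it is evaluated element-wise on the validation-set scores of the base classifiers, and its thresholded accuracy serves as the GP fitness, standing in for whatever fitness measure the paper actually uses.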
