参考文献/References:
[1] LEBANON G,LAFFERTY J.Boosting and maximum likelihood for exponential models[C]//Advances in Neural Information Processing Systems.Vancouver:Neural Information Processing Systems,2002:447-454.
[2] ZHOU Zhihua,WU Jianxin,TANG Wei.Ensembling neural networks: Many could be better than all[J].Artificial Intelligence,2002,137(1/2):239-263.
[3] 倪志伟,张琛,倪丽萍.基于萤火虫群优化算法的选择性集成雾霾天气预测方法[J].模式识别与人工智能,2016,29(2):143-153.DOI:10.16451/j.cnki.issn1003-6059.201602006.
[4] TRAN V T,TEMPEL S,ZERATH B,et al.miRBoost: Boosting support vector machines for microRNA precursor classification[J].RNA,2015,21(5):775-785.DOI:10.1261/rna.043612.113.
[5] XU Jian,TANG Liang,LI Tao.System situation ticket identification using SVMs ensemble[J].Expert Systems with Applications,2016,60:130-140.DOI:10.1016/j.eswa.2016.04.017.
[6] 张春霞,张讲社.选择性集成学习算法综述[J].计算机学报,2011,34(8):1399-1410.
[7] MAO Shasha,JIAO Licheng,XIONG Lin,et al.Greedy optimization classifiers ensemble based on diversity[J].Pattern Recognition,2011,44(6):1245-1261.DOI:10.1016/j.patcog.2010.11.007.
[8] LAZAREVIC A,OBRADOVIC Z.Effective pruning of neural network classifier ensembles[C]//Proceedings of the International Joint Conference on Neural Networks.Washington DC:IEEE Press,2001:796-801.DOI:10.1109/IJCNN.2001.939461.
[9] MARTINEZ-MUNOZ G,SUAREZ A.Using boosting to prune bagging ensembles[J].Pattern Recognition Letters,2007,28(1):156-165.
[10] FREUND Y,SCHAPIRE R E.Experiments with a new boosting algorithm[C]//Proceedings of the Thirteenth International Conference on Machine Learning.San Francisco:Morgan Kaufmann Publishers Inc,1996:148-156.
[11] 曹莹,苗启广,刘家辰,等.AdaBoost算法研究进展与展望[J].自动化学报,2013,39(6):745-758.
[12] 姚旭,王晓丹,张玉玺,等.基于AdaBoost和匹配追踪的选择性集成算法[J].控制与决策,2014,29(2):208-214.DOI:10.13195/j.kzyjc.2012.1472.
[13] CHANG Tiantian,LIU Hongwei,ZHOU Shuisheng.Large scale classification with local diversity AdaBoost SVM algorithm[J].Journal of Systems Engineering and Electronics,2009,20(6):1344-1350.
[14] LI Xuchun,WANG Lei,SUNG E.AdaBoost with SVM-based component classifiers[J].Engineering Applications of Artificial Intelligence,2008,21(5):785-795.DOI:10.1016/j.engappai.2007.07.001.
[15] LI Leijun,ZOU Bo,HU Qinghua,et al.Dynamic classifier ensemble using classification confidence[J].Neurocomputing,2013,99:581-591.
[16] MARGINEANTU D D,DIETTERICH T G.Pruning adaptive boosting[C]//Proceedings of the Fourteenth International Conference on Machine Learning.San Francisco:Morgan Kaufmann Publishers Inc,1997:211-218.
[17] VALENTINI G,DIETTERICH T G.Bias-variance analysis of support vector machines for the development of SVM-based ensemble methods[J].Journal of Machine Learning Research,2004,5(3):725-775.
[18] UEDA N,NAKANO R.Generalization error of ensemble estimators[C]//Proceedings of the IEEE International Conference on Neural Networks.Washington DC:IEEE Press,1996:90-95.DOI:10.1109/ICNN.1996.548872.
[19] ASUNCION A,NEWMAN D.UCI machine learning repository[EB/OL].[2016-11-03].http://archive.ics.uci.edu/ml/index.php.
[20] CHAN Z S H,KASABOV N.A preliminary study on negative correlation learning via correlation-corrected data(NCCD)[J].Neural Processing Letters,2005,21(3):207-214.DOI:10.1007/s11063-005-1084-6.
[21] KUNCHEVA L I,WHITAKER C J.Measures of diversity in classifier ensembles and their relationship with the ensemble accuracy[J].Machine Learning,2003,51(2):181-207.DOI:10.1023/A:1022859003006.