 HONG Ming,WANG Hongxiang,LIU Xiaofang,et al.SVM Ensembles Algorithm Using Negative Correlation Learning[J].Journal of Huaqiao University(Natural Science),2018,39(6):942-946.[doi:10.11830/ISSN.1000-5013.201611103]

SVM Ensembles Algorithm Using Negative Correlation Learning

Journal of Huaqiao University (Natural Science) [ISSN: 1000-5013 / CN: 35-1079/N]

Volume: 39
Issue: No. 6, 2018
Pages: 942-946
Publication date: 2018-11-20

Article Information

Title: SVM Ensembles Algorithm Using Negative Correlation Learning
Article number: 1000-5013(2018)06-0942-05
Author(s): HONG Ming (洪铭), WANG Hongxiang (汪鸿翔), LIU Xiaofang (刘晓芳), LIU Peizhong (柳培忠)
Affiliation: College of Engineering, Huaqiao University, Quanzhou 362021, China
Keywords: negative correlation learning; error-ambiguity decomposition; AdaBoost-SVM; ensemble learning; classifier
CLC number: TP391
DOI: 10.11830/ISSN.1000-5013.201611103
Document code: A
Abstract:
In order to balance the relationship between diversity and accuracy in ensemble learning and to improve the generalization ability of the decision classifier, a support vector machine (SVM) ensemble learning method based on negative correlation learning and the AdaBoost algorithm is proposed. Negative correlation learning is integrated into the training process of AdaBoost-SVM: the correlation between the base classifiers is computed from negative correlation learning theory, and the weight of each base classifier is then adjusted adaptively according to this correlation value, yielding the weighted decision classifier. Simulation results on UCI datasets show that, compared with the traditional negative correlation ensemble learning algorithm and the AdaBoost-SVM algorithm, the proposed method achieves higher classification accuracy and better generalization ability.
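The abstract describes the method only at a high level, so the Python sketch below merely illustrates the general idea: an AdaBoost ensemble of SVM base classifiers whose combination weights are afterwards rescaled by a negative-correlation-style penalty, so that base classifiers highly correlated with the rest of the ensemble are relatively down-weighted. The penalty form p_i = mean_x[(f_i - F) * sum_{j != i}(f_j - F)], the exponential adjustment rule, the lambda value, the breast-cancer dataset (as a stand-in for a UCI benchmark) and the scikit-learn API are all assumptions made for illustration; they are not taken from the paper.

import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def adaboost_svm_ncl(X, y, T=10, C=1.0, gamma=0.1, lam=0.5):
    """AdaBoost with RBF-SVM base learners; labels must be in {-1, +1}.
    After boosting, the combination weights are rescaled with a
    negative-correlation-style penalty (illustrative, not the paper's rule)."""
    n = len(y)
    D = np.full(n, 1.0 / n)                      # AdaBoost sample weights
    clfs, alphas = [], []
    for _ in range(T):
        clf = SVC(C=C, kernel="rbf", gamma=gamma)
        clf.fit(X, y, sample_weight=D)
        pred = clf.predict(X)
        err = max(np.sum(D[pred != y]), 1e-12)   # weighted training error
        if err >= 0.5:                           # base learner too weak: stop
            break
        alpha = 0.5 * np.log((1.0 - err) / err)
        D *= np.exp(-alpha * y * pred)           # standard AdaBoost re-weighting
        D /= D.sum()
        clfs.append(clf)
        alphas.append(alpha)

    # Negative-correlation-style adjustment of the combination weights:
    # p_i = mean_x[(f_i - F) * sum_{j != i}(f_j - F)], with F the ensemble mean.
    # p_i is near 0 for classifiers that track the ensemble (highly correlated)
    # and strongly negative for diverse ones, so exp(-lam * p_i) gives the
    # diverse classifiers relatively larger weights after normalization.
    outputs = np.array([c.predict(X) for c in clfs], dtype=float)  # T x n
    F = outputs.mean(axis=0)
    alphas = np.array(alphas)
    for i in range(len(clfs)):
        others = outputs.sum(axis=0) - outputs[i] - (len(clfs) - 1) * F
        p_i = np.mean((outputs[i] - F) * others)
        alphas[i] *= np.exp(-lam * p_i)
    alphas /= alphas.sum()
    return clfs, alphas

def predict(clfs, alphas, X):
    votes = sum(a * c.predict(X) for c, a in zip(clfs, alphas))
    return np.sign(votes)

if __name__ == "__main__":
    data = load_breast_cancer()                  # stand-in for a UCI dataset
    y = np.where(data.target == 0, -1, 1)
    Xtr, Xte, ytr, yte = train_test_split(data.data, y, random_state=0)
    scaler = StandardScaler().fit(Xtr)
    clfs, alphas = adaboost_svm_ncl(scaler.transform(Xtr), ytr)
    acc = np.mean(predict(clfs, alphas, scaler.transform(Xte)) == yte)
    print(f"test accuracy: {acc:.3f}")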

References:

[1] LEBANON G,LAFFERTY J.Boosting and maximum likelihood for exponential models[C]//Advances in Neural Information Processing Systems.Vancouver:Neural Information Processing Systems,2002:447-454.
[2] ZHOU Zhihua,WU Jianxin,TANG Wei.Ensembling neural networks: Many could be better than all[J].Artificial Intelligence,2002,137(1/2):239-263.
[3] NI Zhiwei,ZHANG Chen,NI Liping.Selective ensemble haze weather prediction method based on glowworm swarm optimization algorithm[J].Pattern Recognition and Artificial Intelligence,2016,29(2):143-153.DOI:10.16451/j.cnki.issn1003-6059.201602006.
[4] TRAN V T,TEMPEL S,ZERATH B,et al.MiRBoost: Boosting support vector machines for microRNA precursor classification[J].RNA,2015,21(5):775-785.DOI:10.1261/rna.043612.113.
[5] XU Jian,TANG Liang,LI Tao.System situation ticket identification using SVMs ensemble[J].Expert Systems with Applications,2016,60:130-140.DOI:10.1016/j.eswa.2016.04.017.
[6] ZHANG Chunxia,ZHANG Jiangshe.A survey on selective ensemble learning algorithms[J].Chinese Journal of Computers,2011,34(8):1399-1410.
[7] MAO Shasha,JIAO Licheng,XIONG Lin,et al.Greedy optimization classifiers ensemble based diversity[J].Pattern Recognition,2011,44(6):1245-1261.DOI:10.1016/j.patcog.2010.11.007.
[8] LAZAREVIC A,OBRADOVIC Z.Effective pruning of neural network classifier ensembles[C]//International Joint Conference on Neural Networks.Washington DC:IEEE Press,2001:796-801.DOI:10.1109/IJCNN.2001.939461.
[9] MARTINEZ-MUNOZ G,SUAREZ A.Using boosting to prune bagging ensembles[J].Pattern Recognition Letters,2007,28(1):156-165.
[10] FREUND Y,SCHAPIRE R E.Experiments with a new boosting algorithm[C]//Proceedings of the Thirteenth International Conference on International Conference on Machine Learning.San Francisco:Morgan Kaufmann Publishers Inc,1996:148-156.
[11] CAO Ying,MIAO Qiguang,LIU Jiachen,et al.Advance and prospects of AdaBoost algorithm[J].Acta Automatica Sinica,2013,39(6):745-758.
[12] YAO Xu,WANG Xiaodan,ZHANG Yuxi,et al.Selective ensemble algorithm based on AdaBoost and matching pursuit[J].Control and Decision,2014(2):208-214.DOI:10.13195/j.kzyjc.2012.1472.
[13] CHANG Tiantian,LIU Hongwei,ZHOU Shuisheng.Large scale classification with local diversity AdaBoost SVM algorithm[J].Journal of Systems Engineering and Electronics,2009,20(6):1344-1350.
[14] LI Xuchun,WANG Lei,SUNG E.AdaBoost with SVM-based component classifiers[J].Engineering Applications of Artificial Intelligence,2008,21(5):785-795.DOI:10.1016/j.engappai.2007.07.001.
[15] LI Leijun,ZOU Bo,HU Qinghua,et al.Dynamic classifier ensemble using classification confidence[J].Neurocomputing,2013,99:581-591.
[16] MARGINEANTU D D,DIETTERICH T G.Pruning adaptive boosting[C]//Fourteenth International Conference on Machine Learning.San Francisco:Morgan Kaufmann Publishers Inc,1997:211-218.
[17] VALENTINI G,DIETTERICH T G.Bias-variance analysis of support vector machines for the development of SVM-based ensemble methods[J].Journal of Machine Learning Research,2004,5(3):725-775.
[18] UEDA N,NAKANO R.Generalization error of ensemble estimators[C]//IEEE International Conference on Neural Networks.Washington DC:IEEE Press,1996:90-95.DOI:10.1109/ICNN.1996.548872.
[19] ASUNCION A,NEWMAN D.UCI machine learning repository[EB/OL].[2016-11-03].http://archive.ics.uci.edu/ml/index.php.
[20] CHAN Z S H,KASABOV N.A preliminary study on negative correlation learning via correlation-corrected data(NCCD)[J].Neural Processing Letters,2005,21(3):207-214.DOI:10.1007/s11063-005-1084-6.
[21] KUNCHEVA L I,WHITAKER C J.Measures of diversity in classifier ensembles and their relationship with the ensemble accuracy[J].Machine Learning,2003,51(2):181-207.DOI:10.1023/A:1022859003006.

Memo:
Received: 2016-11-03
Corresponding author: LIU Peizhong (b. 1976), male, lecturer, Ph.D.; research interests include computer vision, machine learning, and embedded systems. E-mail: pzliu@hqu.edu.cn.
Foundation items: National Natural Science Foundation of China (61203242); Quanzhou Science and Technology Program, Fujian Province (2014Z113, 2014Z103); Postgraduate Research Innovation Ability Cultivation Program of Huaqiao University (1400422003)
Last Update: 2018-11-20