[1]王子玥,谢维波,李斌.采用口袋算法构造的多类别决策树模型[J].华侨大学学报(自然科学版),2019,40(1):121-127.[doi:10.11830/ISSN.1000-5013.201710002]
 WANG Ziyue,XIE Weibo,LI Bin.Model Construction of Multi Class Decision Tree Using Pocket Algorithm[J].Journal of Huaqiao University(Natural Science),2019,40(1):121-127.[doi:10.11830/ISSN.1000-5013.201710002]

采用口袋算法构造的多类别决策树模型

《华侨大学学报(自然科学版)》[ISSN:1000-5013/CN:35-1079/N]

Volume:
Vol. 40
Issue:
2019, No. 1
Pages:
121-127
Publication Date:
2019-01-20

文章信息/Info

Title:
Model Construction of Multi Class Decision Tree Using Pocket Algorithm
Article Number:
1000-5013(2019)01-0121-07
作者:
王子玥 谢维波 李斌
华侨大学 计算机科学与技术学院, 福建 厦门 361021
Author(s):
WANG Ziyue XIE Weibo LI Bin
College of Computer Science and Technology, Huaqiao University, Xiamen 361021, China
关键词:
感知机多分类; 开放应用架构; 口袋算法; Gini指数; 决策树
Keywords:
perceptron multi-classification; open application architecture; pocket algorithm; Gini index; decision tree
CLC Number:
TP311
DOI:
10.11830/ISSN.1000-5013.201710002
Document Code:
A
摘要:
采用开放应用架构(OAA)准则训练多个二分类感知机,以Gini指数筛选最优的方法构建二叉决策树.推算说明感知机多分类准则在每个树节点上对空间划分的局限性,将基于口袋算法的二叉树与多叉树在8个UCI数据集上进行比较,并与单变量决策树CART和C4.5的结果进行对照.结果表明:采用口袋算法基于OAA方法构建的二叉树,在准确率和空间划分的可解释性上优于基于经典多分类准则构建的多叉树.
Abstract:
A binary decision tree is constructed by training several binary perceptrons under the open application architecture (OAA) criterion and selecting the best of them at each node with the Gini index. A derivation illustrates the limitation of the classical perceptron multi-classification criterion in partitioning the space at each tree node. Pocket-algorithm-based binary trees and multiway trees are compared on 8 UCI datasets, and the results are also contrasted with those of the univariate decision trees CART and C4.5. The results show that the binary tree built with the pocket algorithm under the OAA scheme outperforms the multiway tree built with the classical multi-classification criterion in both accuracy and the interpretability of the spatial partition.
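
To make the node-splitting procedure described in the abstract concrete, the following minimal Python sketch trains one pocket-algorithm perceptron per class and keeps the separating hyperplane whose two half-spaces give the lowest weighted Gini impurity. It assumes the OAA criterion amounts to a class-versus-rest relabeling at each node; the helper names pocket_perceptron, gini, and best_oaa_split are illustrative and are not taken from the paper.

import numpy as np

def pocket_perceptron(X, y, n_iter=1000, rng=None):
    """Pocket algorithm: run perceptron updates, but keep ("pocket") the
    weight vector with the fewest training errors seen so far.
    X: (n_samples, n_features) float array; y: labels in {-1, +1}."""
    rng = np.random.default_rng(rng)
    Xb = np.hstack([X, np.ones((len(X), 1))])   # append a bias column
    w = np.zeros(Xb.shape[1])
    pocket_w, pocket_err = w.copy(), np.inf
    for _ in range(n_iter):
        wrong = np.flatnonzero(np.sign(Xb @ w) != y)
        if wrong.size < pocket_err:             # fewer errors than the pocketed weights: keep these
            pocket_w, pocket_err = w.copy(), wrong.size
        if wrong.size == 0:                     # training set separated: stop early
            break
        i = rng.choice(wrong)                   # standard perceptron update on one misclassified sample
        w = w + y[i] * Xb[i]
    return pocket_w

def gini(labels):
    """Gini impurity of a label vector."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def best_oaa_split(X, y):
    """Train one pocket perceptron per class (class c vs. the rest) and
    return the hyperplane whose induced binary split has the lowest
    weighted Gini impurity."""
    Xb = np.hstack([X, np.ones((len(X), 1))])
    best_w, best_score = None, np.inf
    for c in np.unique(y):
        yc = np.where(y == c, 1, -1)            # one-against-all relabeling
        w = pocket_perceptron(X, yc)
        left = Xb @ w >= 0
        if left.all() or not left.any():        # degenerate split: skip
            continue
        score = (left.sum() * gini(y[left]) + (~left).sum() * gini(y[~left])) / len(y)
        if score < best_score:
            best_w, best_score = w, score
    return best_w, best_score

A binary tree of the kind evaluated in the paper would call best_oaa_split on the training set at the root, route samples to the two sides of the chosen hyperplane, and recurse on each side until a node is pure or no non-degenerate split remains.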

备注/Memo

Received: 2017-10-10
Corresponding author: XIE Weibo (1964-), male, professor, Ph.D., whose research interests include signal processing and video image analysis. E-mail: xwblxf@hqu.edu.cn.
Foundation items: National Natural Science Foundation of China (61271383); Postgraduate Research Innovation Ability Cultivation Program of Huaqiao University (16113 14016)
更新日期/Last Update: 2019-01-20