LI Zhan-shan, YANG Yun-kai, ZHANG Jia-chen. Filtering Feature Selection Algorithm Based on Entropy Weight Method[J]. Journal of Northeastern University (Natural Science), 2022, 43(7): 921-929.