Review Papers | Computer Science & Engineering | India | Volume 5 Issue 2, February 2016
Feature Selection for Global Redundancy Minimization Using Regularized Trees
Shweta Satish Shringarputale | P. R. Rathod
Abstract: Feature selection has long been a critical research topic in data mining, because real-world data sets often have high-dimensional features, for example in bioinformatics and text-mining applications. Many existing filter feature selection methods rank features by optimizing certain feature-ranking criteria, so that correlated features often receive similar rankings. Such correlated features are redundant and do not provide much additional mutual information to aid data mining. Therefore, when we select a limited number of features, we aim to choose the top non-redundant features so that the useful mutual information is maximized. In earlier work, Ding et al. recognized this important issue and proposed the minimum Redundancy Maximum Relevance (mRMR) feature selection model to minimize the redundancy between sequentially selected features. However, that method used a greedy search, so the global feature redundancy was not considered and the results are not optimal. In this paper, we propose a new feature selection framework that globally minimizes feature redundancy while maximizing the given feature-ranking scores, which can come from any supervised or unsupervised method. The new model has no parameters, so it is especially suitable for practical data mining applications. We also propose a tree regularization framework that enables many tree models to perform feature selection efficiently. The key idea of the regularization framework is to penalize selecting a new feature for splitting when its gain (e.g. information gain) is similar to that of the features used in previous splits. The regularization framework is applied here to random forests and boosted trees, and can easily be applied to other tree models. Experimental studies demonstrate that the regularized trees can select high-quality feature subsets with respect to both strong and weak classifiers.
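The greedy mRMR scheme discussed above (select, at each step, the feature with maximal relevance to the labels minus mean redundancy with the already-selected features) can be sketched as follows. This is a minimal illustration over discrete features using empirical mutual information, not the implementation of Ding et al.; the helper names `mutual_info` and `mrmr_select` are hypothetical:

```python
from collections import Counter
import math

def mutual_info(xs, ys):
    """Empirical mutual information (in nats) between two discrete sequences."""
    n = len(xs)
    px, py = Counter(xs), Counter(ys)
    pxy = Counter(zip(xs, ys))
    mi = 0.0
    for (x, y), c in pxy.items():
        pj = c / n
        mi += pj * math.log(pj / ((px[x] / n) * (py[y] / n)))
    return mi

def mrmr_select(features, labels, k):
    """Greedy mRMR: repeatedly pick the feature maximizing
    relevance(feature, labels) - mean redundancy with selected features."""
    selected = []
    remaining = list(features)
    while remaining and len(selected) < k:
        def score(name):
            rel = mutual_info(features[name], labels)
            if not selected:
                return rel
            red = sum(mutual_info(features[name], features[s])
                      for s in selected) / len(selected)
            return rel - red
        best = max(remaining, key=score)
        selected.append(best)
        remaining.remove(best)
    return selected
```

Because the search is greedy, each step only accounts for redundancy against features already chosen — exactly the locality that the global redundancy minimization framework proposed in the paper is meant to overcome.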
Because tree models can naturally handle categorical and numerical variables, missing values, different scales between variables, interactions, nonlinearities, and so on, the tree regularization framework provides an effective and efficient feature selection solution for many practical problems.
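The core of the tree regularization idea — penalize the gain of a candidate split feature unless it has already been used in earlier splits — can be sketched as below. This is a simplified illustration with information gain on discrete features, assuming a single penalty coefficient `lam` in (0, 1]; the function names and the `used` bookkeeping are hypothetical, not the authors' code:

```python
from collections import Counter
import math

def entropy(ys):
    """Empirical entropy (in nats) of a discrete label sequence."""
    n = len(ys)
    return -sum((c / n) * math.log(c / n) for c in Counter(ys).values())

def info_gain(xs, ys):
    """Information gain of splitting labels ys on discrete feature xs."""
    n = len(ys)
    groups = {}
    for x, y in zip(xs, ys):
        groups.setdefault(x, []).append(y)
    cond = sum(len(g) / n * entropy(g) for g in groups.values())
    return entropy(ys) - cond

def regularized_pick(features, ys, used, lam=0.5):
    """Choose a split feature with regularized gain:
    gain_R(f) = gain(f)        if f was used in an earlier split,
              = lam * gain(f)  otherwise (new features are penalized)."""
    def gain_r(name):
        g = info_gain(features[name], ys)
        return g if name in used else lam * g
    best = max(features, key=gain_r)
    used.add(best)
    return best
```

Calling `regularized_pick` at every node across the ensemble keeps `used` small: a new feature enters only when its gain clearly exceeds that of the already-used ones, and `used` at the end is the selected feature subset.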
Keywords: Feature selection, Feature Ranking, Redundancy Minimization, Regularized Trees
Edition: Volume 5 Issue 2, February 2016
Pages: 998 - 1002
How to Cite this Article?
Shweta Satish Shringarputale, P. R. Rathod, "Feature Selection for Global Redundancy Minimization Using Regularized Trees", International Journal of Science and Research (IJSR), Volume 5 Issue 2, February 2016, pp. 998-1002, https://www.ijsr.net/get_abstract.php?paper_id=NOV161230