
JMIM: A Feature Selection Technique using Joint Mutual Information along with the Maximum of Minimum Approach

Author(s):

Rajlakshmi Sanjay Saner, K. K. Wagh Institute of Engineering Education and Research

Keywords:

Mutual Information, feature selection, classification, joint mutual information

Abstract

Feature selection based on information theory improves computational efficiency and scales with dataset dimensionality independently of the classifier. However, researchers have observed drawbacks, one of which is redundancy in the selected feature set; this redundancy grows as relevance increases. To address this limitation, the JMI (Joint Mutual Information) technique was proposed, which accounts for both relevance and redundancy with respect to the class label when computing mutual information. This work presents the technique named Joint Mutual Information Maximization (JMIM), which combines JMI with the 'maximum of the minimum' approach to overcome the redundancy limitation. The experimental analysis compares the feature set produced by JMIM with that of the 'Attribute Selection' technique from WEKA, using three classifiers to test the accuracy of each feature set. Experimental results show that JMIM gives better performance than Attribute Selection, but the number of features required to achieve these results is much higher than for Attribute Selection.
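The greedy selection rule the abstract describes — pick the unselected feature whose minimum joint mutual information with any already-selected feature and the class label is largest — can be sketched as below. This is an illustrative reconstruction for discrete data, not the paper's implementation; the function and variable names (`jmim`, `joint_mi`, etc.) are assumptions.

```python
from collections import Counter
from math import log2

def mutual_information(xs, cs):
    # I(X; C) for discrete sequences, in bits
    n = len(xs)
    px, pc, pxc = Counter(xs), Counter(cs), Counter(zip(xs, cs))
    return sum((nxc / n) * log2((nxc / n) / ((px[x] / n) * (pc[c] / n)))
               for (x, c), nxc in pxc.items())

def joint_mi(xs, ys, cs):
    # I(X, Y; C): treat the pair (X, Y) as a single joint variable
    return mutual_information(list(zip(xs, ys)), cs)

def jmim(features, labels, k):
    # features: dict mapping feature name -> list of discrete values.
    # Greedy JMIM: argmax over candidates of min over selected features
    # of the joint mutual information with the class label.
    remaining = dict(features)
    # Seed with the single most relevant feature
    first = max(remaining, key=lambda f: mutual_information(remaining[f], labels))
    selected = [first]
    del remaining[first]
    while remaining and len(selected) < k:
        best = max(remaining,
                   key=lambda f: min(joint_mi(remaining[f], features[s], labels)
                                     for s in selected))
        selected.append(best)
        del remaining[best]
    return selected
```

For example, with a toy dataset where one feature exactly reproduces the class label, `jmim` picks that feature first, then ranks the rest by the maximum-of-minimum criterion. The 'minimum' inner step is what penalizes a candidate that is redundant with even one already-selected feature.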

Other Details

Paper ID: IJSRDV4I100158
Published in: Volume 4, Issue 10
Publication Date: 01/01/2017
Page(s): 77-80
