Title
Learning to Find Correlated Features by Maximizing Information Flow in Convolutional Neural Networks
Abstract
Training convolutional neural networks for image classification usually causes information loss. Although the lost information is often redundant with respect to the target task, there are cases where discriminative information is also discarded. For example, if images that belong to the same category share multiple correlated features, the model may learn only a subset of those features and ignore the rest. This is not a problem unless classification on the test set relies heavily on the ignored features. We argue that the discarding of correlated discriminative information is partially caused by the fact that minimizing the classification loss does not guarantee learning all discriminative information, but only the most discriminative information given the training set. To address this problem, we propose an information flow maximization (IFM) loss as a regularization term that encourages the network to find correlated discriminative features. With less information loss, the classifier can make predictions based on more informative features. We validate our method on the shiftedMNIST dataset and show the effectiveness of the IFM loss in learning representative and discriminative features.
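The abstract does not give the exact form of the IFM loss, only the general recipe: a standard classification loss plus a regularizer that penalizes information loss in the learned representation. The following is a minimal PyTorch sketch of that recipe under stated assumptions; the auxiliary reconstruction term used here is purely an illustrative stand-in for information retention, not the paper's actual IFM formulation, and the network shape (MNIST-sized inputs) and the name `total_loss` are hypothetical.

```python
# Hedged sketch: task loss + an information-retention regularizer.
# The reconstruction proxy below is an assumption for illustration,
# NOT the paper's actual IFM loss.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SmallCNN(nn.Module):
    def __init__(self, num_classes=10):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                      # 28x28 -> 14x14
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                      # 14x14 -> 7x7
        )
        self.classifier = nn.Linear(32 * 7 * 7, num_classes)
        # Auxiliary decoder: if the input can be reconstructed from the
        # features, little information has been discarded by the encoder.
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(32, 16, 2, stride=2), nn.ReLU(),  # 7 -> 14
            nn.ConvTranspose2d(16, 1, 2, stride=2),              # 14 -> 28
        )

    def forward(self, x):
        z = self.encoder(x)
        logits = self.classifier(z.flatten(1))
        recon = self.decoder(z)
        return logits, recon

def total_loss(model, x, y, lam=0.1):
    """Classification loss plus a weighted information-retention term."""
    logits, recon = model(x)
    ce = F.cross_entropy(logits, y)     # task loss
    info = F.mse_loss(recon, x)         # proxy for information loss
    return ce + lam * info

# Usage on a dummy MNIST-shaped batch
model = SmallCNN()
x = torch.randn(8, 1, 28, 28)
y = torch.randint(0, 10, (8,))
loss = total_loss(model, x, y)
loss.backward()
```

The weight `lam` trades off task accuracy against information retention; with `lam = 0` the model is free to discard any feature the classification loss does not need, which is exactly the failure mode the abstract describes.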
Year
2019
DOI
10.1109/ICCVW.2019.00094
Venue
2019 IEEE/CVF International Conference on Computer Vision Workshop (ICCVW)
Keywords
Information bottleneck, Information maximization, Correlated features, Information flow
Field
Information flow (information theory), Pattern recognition, Computer science, Convolutional neural network, Artificial intelligence
DocType
Conference
Volume
2019
Issue
1
ISSN
2473-9936
ISBN
978-1-7281-5024-6
Citations
0
PageRank
0.34
References
2
Authors
3
Name       Order  Citations  PageRank
Wei Shen   1      10         4.79
Fei Li     2      973        9.93
Rujie Liu  3      147        15.49