Title: A HMAX with LLC for visual recognition
Abstract
Today's high-performance deep artificial neural networks (ANNs) rely heavily on parameter optimization, which is sequential in nature; even with a powerful GPU, training them for challenging tasks can take weeks [22]. HMAX [17] demonstrated that a simple, high-performing network can be obtained without heavy optimization. In this paper, we improve on the best existing HMAX network [12] in both structural simplicity and performance. Our design replaces L1-minimization sparse coding (SC) with locality-constrained linear coding (LLC) [20], which has a lower computational demand. We also restore the simple orientation filter bank to the front layer of the network, replacing PCA. Our system outperforms the existing architecture and reaches 79.0% on the challenging Caltech-101 dataset [7], a state-of-the-art result for ANNs without transfer learning. Our empirical data indicate that the main contributors to this performance are the introduction of partial signal whitening, a spot detector, and a spatial pyramid matching (SPM) layer [14].
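For readers unfamiliar with the coding step the abstract refers to, the following is a minimal sketch of approximated locality-constrained linear coding in the style of Wang et al. [20]: each descriptor is reconstructed from its k nearest codebook atoms by a small regularized least-squares solve with a sum-to-one constraint. The function name, parameter choices (`k`, `beta`), and the random codebook in the usage note are illustrative assumptions, not details taken from this paper.

```python
import numpy as np

def llc_code(x, codebook, k=5, beta=1e-4):
    """Approximated LLC: encode descriptor x (D,) over codebook (M, D)."""
    # Select the k nearest codebook atoms by Euclidean distance.
    d = np.linalg.norm(codebook - x, axis=1)
    idx = np.argsort(d)[:k]
    B = codebook[idx]                        # (k, D) local basis

    # Solve the local constrained least-squares system:
    # shift atoms to the descriptor, form the local covariance,
    # regularize it, and normalize weights to sum to one.
    z = B - x                                # (k, D)
    C = z @ z.T                              # (k, k) local covariance
    C += beta * np.trace(C) * np.eye(k)      # regularization for stability
    w = np.linalg.solve(C, np.ones(k))
    w /= w.sum()                             # enforce sum-to-one constraint

    # Scatter the k local weights back into a sparse M-dimensional code.
    code = np.zeros(codebook.shape[0])
    code[idx] = w
    return code
```

Because only a small k-by-k system is solved per descriptor, this is far cheaper than the L1-minimization sparse coding it replaces, which requires an iterative optimization over the full codebook.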
Year: 2015
Venue: arXiv: Computer Vision and Pattern Recognition
Fields: L1 minimization, Pattern recognition, Computer science, Neural coding, Filter bank, Transfer of learning, Visual recognition, Pyramid, Artificial intelligence, Artificial neural network, Detector, Machine learning
DocType: Journal
Volume: abs/1502.02772
Citations: 0
PageRank: 0.34
References: 12
Authors: 3
Authors (Name | Order | Citations | PageRank):
Kean Hong Lau | 1 | 0 | 0.34
Yong Haur Tay | 2 | 225 | 20.14
Fook Loong Lo | 3 | 0 | 0.34