Title
Adaptive weighted learning for linear regression problems via Kullback-Leibler divergence
Abstract
In this paper, we propose adaptive weighted learning for linear regression problems via the Kullback-Leibler (KL) divergence. An alternating optimization method is used to solve the proposed model, and we theoretically show that the iterates of the optimization algorithm converge to a stationary point of the model. In addition, we fuse global linear regression and class-oriented linear regression and discuss the problem of parameter selection. Experimental results on face and handwritten numerical character databases show that the proposed method is effective for image classification, particularly when the training and test samples have different characteristics.
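The abstract describes per-sample weighted least squares in which the weights are regularized by a KL divergence and learned jointly with the regression coefficients by alternating optimization. The paper's exact objective is not reproduced here; the sketch below assumes a common formulation of this idea: minimize sum_i w_i r_i^2 + gamma * KL(w || uniform) subject to the weights forming a distribution, which gives a closed-form softmax update for w and a weighted least-squares update for beta. Function name, the ridge term, and the parameter gamma are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def adaptive_weighted_lr(X, y, gamma=1.0, n_iter=20):
    """Alternating optimization for KL-regularized weighted least squares.

    Assumed objective (a sketch, not the paper's exact model):
        min_{beta, w}  sum_i w_i * r_i^2 + gamma * KL(w || uniform),
        with sum_i w_i = 1, w_i >= 0, and r_i = y_i - x_i @ beta.
    The w-step has the closed form w_i ∝ exp(-r_i^2 / gamma); the
    beta-step is an ordinary weighted least-squares solve.
    """
    n, d = X.shape
    w = np.full(n, 1.0 / n)            # start from uniform weights
    beta = np.zeros(d)
    for _ in range(n_iter):
        # beta-step: solve the weighted normal equations
        Xw = X * w[:, None]
        beta = np.linalg.solve(X.T @ Xw + 1e-8 * np.eye(d), Xw.T @ y)
        # w-step: softmax over negative squared residuals
        r2 = (y - X @ beta) ** 2
        w = np.exp(-(r2 - r2.min()) / gamma)   # shift for stability
        w /= w.sum()
    return beta, w
```

Because each step minimizes the same objective in one block of variables, the objective is monotonically non-increasing, which is the usual route to the stationary-point convergence claimed in the abstract. A side effect of the weight update is robustness: samples with large residuals (e.g. outliers, or test-like samples with different characteristics) receive exponentially small weights.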
Year
2013
DOI
10.1016/j.patcog.2012.10.017
Venue
Pattern Recognition
Keywords
linear regression problem, kullback-leibler divergence, alternative optimization method, class-oriented linear regression, handwritten numerical character databases, different characteristic, optimization algorithm converges, fuse global linear regression, image classification, linear regression, kl divergence, src
Field
Principal component regression, Pattern recognition, Linear model, Polynomial regression, Proper linear model, Bayesian multivariate linear regression, Artificial intelligence, Log-linear model, Linear predictor function, Machine learning, Mathematics, Linear regression
DocType
Journal
Volume
46
Issue
4
ISSN
0031-3203
Citations
7
PageRank
0.43
References
28
Authors
3
Name            Order  Citations  PageRank
Zhizheng Liang  1      162        17.49
Y. F. Li        2      1128       105.83
Shixiong Xia    3      102        13.28