Abstract |
---|
This work proposes an approach for solving the linear regression problem by maximizing the dependence between the prediction values and the response variable. The proposed algorithm uses the Hilbert-Schmidt independence criterion (HSIC) as a generic measure of dependence and can maximize both linear and nonlinear dependencies. The algorithm is important in applications such as continuous analysis of affective speech, where linear dependence, or correlation, is commonly used as the measure of goodness of fit. The applicability of the proposed algorithm is verified using two synthetic datasets, an affective speech dataset, and an affective bodily posture dataset. Experimental results show that the proposed algorithm outperforms support vector regression (SVR) in 84% (264/314) of the studied cases, and is noticeably faster than SVR, by a factor of roughly 25 on average. |
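The paper does not reproduce its optimization procedure here, but the criterion it maximizes, HSIC, has a standard biased empirical estimate, trace(KHLH)/(n-1)^2, where K and L are kernel matrices over the two samples and H is the centering matrix. A minimal sketch of that estimate (kernel choice and bandwidth are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def rbf_kernel(x, sigma=1.0):
    # Pairwise RBF (Gaussian) kernel matrix for a 1-D sample vector.
    d2 = (x[:, None] - x[None, :]) ** 2
    return np.exp(-d2 / (2.0 * sigma ** 2))

def hsic(x, y, sigma=1.0):
    """Biased empirical HSIC estimate: trace(K H L H) / (n - 1)^2.

    Illustrative sketch only; the paper's regression algorithm
    maximizes this quantity between predictions and the response,
    but its optimization steps are not shown here.
    """
    n = len(x)
    K = rbf_kernel(x, sigma)
    L = rbf_kernel(y, sigma)
    H = np.eye(n) - np.ones((n, n)) / n  # centering matrix
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

# Dependent samples (even nonlinearly dependent ones) score higher
# than independent samples, which is what makes HSIC usable as a
# generic goodness-of-fit measure.
rng = np.random.default_rng(0)
x = rng.normal(size=200)
dep = hsic(x, x ** 2)                   # nonlinear dependence
indep = hsic(x, rng.normal(size=200))   # independence, near zero
```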
Year | DOI | Venue
---|---|---
2014 | 10.1109/IJCNN.2014.6889867 | International Joint Conference on Neural Networks (IJCNN)
Keywords | Field | DocType
---|---|---
regression analysis, Hilbert-Schmidt independence criterion, SVR, affective bodily posture dataset, affective speech, generic dependence measure, goodness-of-fit measure, linear dependency, linear regression problem, max-dependence regression, nonlinear dependency, prediction value, response variable, support vector regression | Pattern recognition, Regression, Regression analysis, Polynomial regression, Support vector machine, Proper linear model, Bayesian multivariate linear regression, Artificial intelligence, Goodness of fit, Machine learning, Mathematics, Linear regression | Conference

Citations | PageRank | References
---|---|---
0 | 0.34 | 16
Authors |
---|
4 |
Name | Order | Citations | PageRank
---|---|---|---
Pouria Fewzee | 1 | 0 | 0.34 |
Ali-Akbar Samadani | 2 | 63 | 5.09 |
Dana Kulic | 3 | 810 | 69.21 |
Fakhri Karray | 4 | 0 | 0.34 |