Abstract |
---|
Dimensionality reduction by feature projection is widely used in pattern recognition, information retrieval, and statistics. When outputs are available (e.g., regression values or classification labels), it is often beneficial to consider supervised projection, which is based not only on the inputs but also on the target values. While this applies to the single-output setting, we are more interested in applications with multiple outputs, where several tasks must be learned simultaneously. In this paper, we introduce a novel projection approach called Multi-Output Regularized feature Projection (MORP), which preserves the information of the input features while capturing the correlations between inputs and outputs and, if applicable, among the multiple outputs. This is done by introducing a latent variable model on the joint input-output space and minimizing the reconstruction errors for both inputs and outputs. It turns out that the mappings can be found by solving a generalized eigenvalue problem and readily extend to nonlinear mappings. Since the structure of the outputs is exploited, prediction accuracy can be greatly improved by using the new features. We validate our approach in two applications. In the first, we predict users' preferences for a set of paintings. The second concerns image and text categorization, where each image (or document) may belong to multiple categories. The proposed algorithm produces very encouraging results in both settings. |
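The abstract describes finding supervised projections by balancing input reconstruction against output reconstruction via an eigenproblem on the joint input-output space. The following is a minimal sketch in that spirit, not the paper's exact formulation: it assumes a simplified variant in which the latent factors are the top eigenvectors of a weighted sum of the input and output Gram matrices, with a hypothetical trade-off parameter `beta`; the function name `morp_like_projection` is likewise illustrative.

```python
import numpy as np

def morp_like_projection(X, Y, k, beta=0.5):
    """Sketch of a MORP-style supervised projection (simplified).

    Latent factors are the top-k eigenvectors of a weighted sum of the
    input and output Gram matrices; beta trades off input reconstruction
    against output reconstruction. Returns a linear map W so that X @ W
    gives k-dimensional supervised features for (possibly new) inputs.
    """
    # Joint n x n Gram matrix over the input-output space (symmetric).
    K = (1.0 - beta) * (X @ X.T) + beta * (Y @ Y.T)
    w, V = np.linalg.eigh(K)          # eigenvalues in ascending order
    V_k = V[:, ::-1][:, :k]           # top-k latent factors
    # Least-squares fit of a linear map from inputs to the latent space,
    # so test inputs can be projected without knowing their outputs.
    W, *_ = np.linalg.lstsq(X, V_k, rcond=None)
    return W

# Toy usage: 20 samples, 5 input dims, 3 outputs, project to 2 dims.
rng = np.random.default_rng(0)
X = rng.standard_normal((20, 5))
Y = rng.standard_normal((20, 3))
W = morp_like_projection(X, Y, k=2)
Z = X @ W  # supervised low-dimensional features
```

A kernelized variant would replace the Gram matrices `X @ X.T` and `Y @ Y.T` with kernel matrices, matching the abstract's note that the mappings readily extend to nonlinear projections.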
Year | DOI | Venue |
---|---|---|
2006 | 10.1109/TKDE.2006.194 | IEEE Trans. Knowl. Data Eng. |
Keywords | Field | DocType |
---|---|---|
single-output setting,input feature,information retrieval,multi-output regularized feature,feature projection,multi-output regularized feature projection,new feature,novel projection approach,multiple category,multiple output,supervised projection,latent variable model,input output,learning artificial intelligence,data reduction,pattern recognition,generalized eigenvalue problem,dimensionality reduction | Dimensionality reduction,Nonlinear system,Computer science,Input/output,Artificial intelligence,Regression,Pattern recognition,Latent variable model,Algorithm,Correlation,Eigendecomposition of a matrix,Machine learning,Data reduction | Journal |
Volume | Issue | ISSN |
---|---|---|
18 | 12 | 1041-4347 |
Citations | PageRank | References |
---|---|---|
25 | 1.65 | 14 |
Authors |
---|
4 |
Name | Order | Citations | PageRank |
---|---|---|---|
Shipeng Yu | 1 | 1767 | 118.84 |
Kai Yu | 2 | 4799 | 255.21 |
Volker Tresp | 3 | 2907 | 373.75 |
Hans-Peter Kriegel | 4 | 20742 | 3284.07 |