Title
A Deep Graphical Model for Layered Knowledge Transfer
Abstract
Deep architectures can now be trained effectively on massive labeled datasets. However, in many application scenarios labeled data are sparse or absent. Domain adaptation and multi-task transfer learning offer attractive options when related labeled data or tasks are abundant in other domains. In this paper, a new graphical modeling approach to multi-layer factorization based domain adaptation is explored to address scenarios in which too little labeled data are available for supervised learning. A deep convolutional factorization based transfer learning (DCFTL) algorithm is proposed to facilitate layer-wise transfer learning between domains. Built entirely on a graphical model representation, the proposed framework seamlessly merges inference and learning and offers a clear interpretation in terms of conditional independence. Empirical results on image classification tasks in both supervised and semi-supervised adaptation settings demonstrate the effectiveness and generalizability of the proposed deep layered knowledge transfer framework.
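The abstract describes layer-wise transfer from a richly labeled source domain to a sparsely labeled target domain. As a point of reference only, the Python/PyTorch sketch below shows a generic form of layer-wise transfer (pretrain on the source, reuse and freeze the lower convolutional layers, fine-tune a new head on a few target labels); it is not the paper's DCFTL graphical-model factorization, and all model sizes, data, and hyperparameters are illustrative placeholders.

# Minimal sketch of generic layer-wise transfer between a source and a target
# image-classification task. NOT the paper's DCFTL algorithm; only illustrates
# the layer-wise transfer setting the abstract describes. Data are random
# stand-ins and sizes/hyperparameters are placeholders.
import torch
import torch.nn as nn

def make_cnn(num_classes):
    features = nn.Sequential(                      # lower, transferable layers
        nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    )
    classifier = nn.Sequential(nn.Flatten(), nn.Linear(32 * 8 * 8, num_classes))
    return nn.Sequential(features, classifier)

def train(model, x, y, epochs=5, lr=1e-3):
    # Optimize only the parameters that are not frozen.
    opt = torch.optim.Adam((p for p in model.parameters() if p.requires_grad), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        opt.step()

# Source domain: plenty of labeled data (random tensors as stand-ins).
x_src, y_src = torch.randn(256, 3, 32, 32), torch.randint(0, 10, (256,))
source_model = make_cnn(num_classes=10)
train(source_model, x_src, y_src)

# Target domain: only a handful of labels. Copy the lower layers from the
# source model, freeze them, and fine-tune a fresh classifier head.
x_tgt, y_tgt = torch.randn(32, 3, 32, 32), torch.randint(0, 10, (32,))
target_model = make_cnn(num_classes=10)
target_model[0].load_state_dict(source_model[0].state_dict())  # transfer lower layers
for p in target_model[0].parameters():
    p.requires_grad = False                                     # freeze them
train(target_model, x_tgt, y_tgt)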
Year
2018
DOI
10.1109/ICPR.2018.8545795
Venue
2018 24th International Conference on Pattern Recognition (ICPR)
Keywords
deep layered knowledge transfer framework, deep graphical model, deep architectures, multi-task transfer learning, supervised learning, deep convolutional factorization based transfer learning algorithm, graphical model representation, inference, multi-layer factorization, domain adaptation, DCFTL algorithm
Field
Interpretability, Pattern recognition, Inference, Conditional independence, Computer science, Knowledge transfer, Transfer of learning, Supervised learning, Artificial intelligence, Graphical model, Contextual image classification, Machine learning
DocType
Conference
ISSN
1051-4651
ISBN
978-1-5386-3789-0
Citations
0
PageRank
0.34
References
9
Authors
2
Name            Order    Citations    PageRank
Wei Lu          1        319          62.97
Fu-lai Chung    2        244          34.50