Abstract |
---|
Sum-Product Networks (SPNs) are a new class of deep probabilistic models that allow tractable and exact inference. Recently, SPNs have been successfully employed as an autoencoder framework in representation learning. However, the standard SPN autoencoding mechanism ignores the models' structural duality and trains the models separately and independently. In this paper, we propose the Dual-SPNs autoencoding mechanism, which designs the model structure as a dual closed loop. This approach trains the models simultaneously and explicitly exploits their structural duality to guide the training process. As shown in extensive multi-label classification experiments, the Dual-SPNs autoencoding mechanism proves highly competitive against the standard SPN autoencoding mechanism and other stacked autoencoder architectures. |
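The abstract rests on SPNs' key property of tractable and exact inference. As a minimal illustration (a generic SPN sketch, not the paper's Dual-SPNs construction), the hand-built network below over two binary variables computes exact joint probabilities in a single bottom-up pass; all node types, weights, and leaf parameters here are hypothetical:

```python
# Minimal SPN sketch: Bernoulli leaves, product nodes over disjoint
# scopes, and a sum node forming a mixture. Evaluating the root on an
# assignment yields its exact probability in one bottom-up pass.

def leaf(var, p):
    # Bernoulli leaf: returns P(X_var = x[var]) with parameter p.
    return lambda x: p if x[var] == 1 else 1.0 - p

def product(*children):
    # Product node: multiplies child values (children have disjoint scopes).
    def f(x):
        v = 1.0
        for c in children:
            v *= c(x)
        return v
    return f

def sum_node(weights, *children):
    # Sum node: convex combination of children (a mixture).
    return lambda x: sum(w * c(x) for w, c in zip(weights, children))

# A mixture of two fully factorized distributions over (X1, X2).
spn = sum_node(
    [0.6, 0.4],
    product(leaf(0, 0.9), leaf(1, 0.2)),
    product(leaf(0, 0.1), leaf(1, 0.7)),
)

# Exact inference: the probabilities of all assignments sum to 1,
# because the sum weights are normalized and each leaf is a valid
# distribution.
total = sum(spn((a, b)) for a in (0, 1) for b in (0, 1))
```

Because every marginal or conditional query reduces to similar single-pass evaluations, inference cost is linear in the network size, which is the tractability the abstract refers to.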
Year | Venue | Field |
---|---|---|
2018 | KSEM | Autoencoder, Inference, Computer science, Multi-label classification, Duality (optimization), Artificial intelligence, Statistical model, Machine learning, Feature learning |
DocType | Citations | PageRank
---|---|---|
Conference | 0 | 0.34 |
References | Authors |
---|---|
14 | 4 |
Name | Order | Citations | PageRank |
---|---|---|---|
Shengsheng Wang | 1 | 98 | 17.51 |
hang zhang | 2 | 31 | 16.05 |
Jiayun Liu | 3 | 0 | 0.34 |
Qiangyuan Yu | 4 | 41 | 2.93 |