Title
Multimodal Compatibility Modeling via Exploring the Consistent and Complementary Correlations
Abstract
Existing methods for outfit compatibility modeling seldom explicitly consider multimodal correlations. In this work, we explore the consistent and complementary correlations for better compatibility modeling. This is, however, non-trivial due to the following challenges: 1) how to separate and model these two kinds of correlations; 2) how to leverage the derived complementary cues to strengthen the text- and vision-oriented representations of the given item; and 3) how to reinforce the compatibility modeling with the text- and vision-oriented representations. To address these challenges, we present a comprehensive multimodal outfit compatibility modeling scheme. It first nonlinearly projects each modality into separable consistent and complementary spaces via multi-layer perceptrons, and then models the consistent and complementary correlations between the two modalities with parallel and orthogonal regularization, respectively. Thereafter, we strengthen the visual and textual representations of items with the complementary information, and conduct both text-oriented and vision-oriented outfit compatibility modeling. We ultimately employ a mutual learning strategy to reinforce the final compatibility modeling performance. Extensive experiments demonstrate the superiority of our scheme.
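To make the pipeline described above concrete, here is a minimal PyTorch-style sketch of the two regularizers and the mutual learning objective. All names (ModalitySplitter, parallel_loss, orthogonal_loss, mutual_learning_loss), layer sizes, the cosine-based loss forms, and the symmetric-KL mutual learning term are illustrative assumptions, not the authors' released implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class ModalitySplitter(nn.Module):
    """Project one modality's embedding into separable 'consistent' and
    'complementary' subspaces via two small MLPs (dimensions hypothetical)."""

    def __init__(self, dim_in: int, dim_out: int):
        super().__init__()
        self.to_consistent = nn.Sequential(
            nn.Linear(dim_in, dim_out), nn.ReLU(), nn.Linear(dim_out, dim_out)
        )
        self.to_complementary = nn.Sequential(
            nn.Linear(dim_in, dim_out), nn.ReLU(), nn.Linear(dim_out, dim_out)
        )

    def forward(self, x: torch.Tensor):
        return self.to_consistent(x), self.to_complementary(x)

def parallel_loss(cons_vis: torch.Tensor, cons_txt: torch.Tensor) -> torch.Tensor:
    """Pull the consistent projections of the two modalities toward
    parallel directions (cosine similarity -> 1)."""
    return (1.0 - F.cosine_similarity(cons_vis, cons_txt, dim=-1)).mean()

def orthogonal_loss(cons: torch.Tensor, comp: torch.Tensor) -> torch.Tensor:
    """Push the consistent and complementary projections of the same
    modality toward orthogonal directions (cosine similarity -> 0)."""
    return F.cosine_similarity(cons, comp, dim=-1).abs().mean()

def mutual_learning_loss(logits_vis: torch.Tensor, logits_txt: torch.Tensor) -> torch.Tensor:
    """Symmetric KL between the vision-oriented and text-oriented
    compatibility predictions, so each branch distills the other."""
    log_p_vis = F.log_softmax(logits_vis, dim=-1)
    log_p_txt = F.log_softmax(logits_txt, dim=-1)
    return 0.5 * (
        F.kl_div(log_p_vis, log_p_txt, reduction="batchmean", log_target=True)
        + F.kl_div(log_p_txt, log_p_vis, reduction="batchmean", log_target=True)
    )

# Usage sketch: split each modality, regularize the subspaces, and
# strengthen each modality-oriented representation with the other
# modality's complementary cues before compatibility prediction.
vis, txt = torch.randn(8, 512), torch.randn(8, 512)
split_vis, split_txt = ModalitySplitter(512, 128), ModalitySplitter(512, 128)
cons_v, comp_v = split_vis(vis)
cons_t, comp_t = split_txt(txt)
reg = (parallel_loss(cons_v, cons_t)
       + orthogonal_loss(cons_v, comp_v)
       + orthogonal_loss(cons_t, comp_t))
vis_oriented = torch.cat([vis, comp_t], dim=-1)  # vision branch + textual complement
txt_oriented = torch.cat([txt, comp_v], dim=-1)  # text branch + visual complement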
Year
2021
DOI
10.1145/3474085.3475392
Venue
International Multimedia Conference
DocType
Conference
Citations
1
PageRank
0.35
References
0
Authors
6
Name             Order  Citations  PageRank
Weili Guan       1      43         10.84
Haokun Wen       2      3          2.09
Xuemeng Song     3      348        22.62
Chung-Hsing Yeh  4      641        80.82
Xiaojun Chang    5      3          3.08
Liqiang Nie      6      2975       131.85