Title
Principle-to-program: Neural Fashion Recommendation with Multi-modal Input
Abstract
Outfit recommendation automatically pairs a user-specified reference garment with the most suitable complement from online shops. Aesthetic compatibility is the key criterion for matching such fashion items. Fashion style says a lot about one's personality and emerges from how people assemble outfits, turning seemingly disjoint items into a cohesive concept. Experts share fashion tips by showcasing their compositions publicly, where each item carries both an image and textual metadata. Retrieving products from online shopping catalogs in response to such real-world image queries is likewise essential for outfit recommendation. Our earlier tutorial focused on style and compatibility in fashion recommendation, mostly through metric- and deep-learning approaches. Here, we cover several further aspects of fashion recommendation that use visual signals (e.g., cross-scenario retrieval, attribute classification) and also combine text input (e.g., interpretable embeddings). Each section concludes by walking through programs executed on a Jupyter workstation using real-world datasets.
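To make the abstract's core idea concrete, below is a minimal sketch (not the authors' code) of the kind of program such a tutorial might walk through: a two-branch network that embeds an item's image features and textual metadata into a shared space, where distance between two item embeddings scores outfit compatibility. All module names, feature dimensions, and the fusion-by-sum design are illustrative assumptions, not the tutorial's actual implementation.

```python
# Hypothetical multi-modal compatibility sketch (PyTorch); dimensions
# assume pre-extracted 2048-d CNN image features and 300-d averaged
# word vectors for item metadata -- both are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiModalItemEncoder(nn.Module):
    def __init__(self, img_dim=2048, txt_dim=300, embed_dim=128):
        super().__init__()
        # Image branch: projects CNN features into the shared space.
        self.img_proj = nn.Linear(img_dim, embed_dim)
        # Text branch: projects metadata word vectors into the same space.
        self.txt_proj = nn.Linear(txt_dim, embed_dim)

    def forward(self, img_feat, txt_feat):
        # Fuse modalities by summing projections, then L2-normalize so
        # Euclidean distance between items is well behaved.
        z = self.img_proj(img_feat) + self.txt_proj(txt_feat)
        return F.normalize(z, dim=-1)


def compatibility(enc, ref, cand):
    # Higher score = more compatible (negative embedding distance).
    # A margin-based triplet loss over (reference, positive, negative)
    # items is one standard way to train such an encoder.
    z_ref = enc(*ref)
    z_cand = enc(*cand)
    return -torch.norm(z_ref - z_cand, dim=-1)


if __name__ == "__main__":
    enc = MultiModalItemEncoder()
    ref = (torch.randn(1, 2048), torch.randn(1, 300))   # reference top
    cand = (torch.randn(1, 2048), torch.randn(1, 300))  # candidate bottom
    print(compatibility(enc, ref, cand))
```

The same shared embedding also supports the retrieval use case the abstract mentions: embed a real-world query image once, then rank catalog items by the score above.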
Year
2019
DOI
10.1145/3343031.3350544
Venue
Proceedings of the 27th ACM International Conference on Multimedia
Keywords
attribute, clothing, compatibility, fashion recommendation, interpretation, multi-modal search, style
Field
Computer science, Artificial intelligence, Multimedia, Modal
DocType
Conference
ISBN
978-1-4503-6889-6
Citations
0
PageRank
0.34
References
0
Authors
3
Name                Order  Citations  PageRank
Muthusamy Chelliah  1      7          2.86
Soma Biswas         2      3          3.41
Lucky Dhakad        3      1          0.72