Title
Learning Effective and Efficient Embedding via an Adaptively-Masked Twins-based Layer
Abstract
Embedding learning for categorical features is crucial for deep learning-based recommendation models (DLRMs). Each feature value is mapped to an embedding vector via an embedding learning process. Conventional methods assign a fixed and uniform embedding size to all feature values from the same feature field. However, such a configuration is not only sub-optimal for embedding learning but also memory costly. Existing methods that attempt to resolve these problems, whether rule-based or neural architecture search (NAS)-based, require extensive human design effort or network training. They are also not flexible in embedding size selection or in warm-start-based applications. In this paper, we propose a novel and effective embedding size selection scheme. Specifically, we design an Adaptively-Masked Twins-based Layer (AMTL) behind the standard embedding layer. AMTL generates a mask vector to mask the undesired dimensions of each embedding vector. The mask vector brings flexibility in selecting the dimensions, and the proposed layer can be easily added to either untrained or trained DLRMs. Extensive experimental evaluations show that the proposed scheme outperforms competitive baselines on all the benchmark tasks, and is also memory-efficient, saving 60% of memory usage without compromising any performance metrics.
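The abstract describes AMTL only at a high level. Below is a minimal, hypothetical PyTorch sketch of the general idea: a standard embedding layer followed by a small policy network that predicts, per feature value, a soft "keep the first k dimensions" mask that zeroes out the remaining dimensions. The module name, network shape, and the single-branch (non-twin) policy are illustrative assumptions, not the paper's exact architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AdaptivelyMaskedEmbedding(nn.Module):
    """Sketch of an adaptively-masked embedding layer (illustrative only).

    A full-size embedding is looked up, then a small policy network scores
    candidate embedding sizes 1..max_dim and produces a soft mask that keeps
    only the leading dimensions. The paper's AMTL uses twin branches to
    handle imbalanced feature frequencies; a single branch is used here for
    brevity.
    """

    def __init__(self, num_values, max_dim, hidden_dim=32, temperature=1.0):
        super().__init__()
        self.embedding = nn.Embedding(num_values, max_dim)
        # Policy network: maps the full embedding to logits over candidate sizes.
        self.policy = nn.Sequential(
            nn.Linear(max_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, max_dim),
        )
        self.temperature = temperature
        # Lower-triangular matrix: selecting size index k keeps the first k+1 dims.
        self.register_buffer("keep_first_k", torch.tril(torch.ones(max_dim, max_dim)))

    def forward(self, ids):
        emb = self.embedding(ids)                         # (B, max_dim)
        logits = self.policy(emb)                         # (B, max_dim)
        # Relaxed (differentiable) one-hot over candidate embedding sizes.
        probs = F.softmax(logits / self.temperature, dim=-1)
        # Soft cumulative mask: dim j is kept with the probability that the
        # selected size index is >= j.
        mask = probs @ self.keep_first_k                  # (B, max_dim)
        return emb * mask

# Usage: mask the embeddings of three feature values.
layer = AdaptivelyMaskedEmbedding(num_values=1000, max_dim=16)
out = layer(torch.tensor([3, 42, 999]))
print(out.shape)  # torch.Size([3, 16])
```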
Year
2021
DOI
10.1145/3459637.3482130
Venue
Conference on Information and Knowledge Management
DocType
Conference
Citations
0
PageRank
0.34
References
0
Authors
7
Name              Order  Citations  PageRank
Bencheng Yan      1      1          1.04
Pengjie Wang      2      0          1.01
Kai Zhang         3      58         11.65
Wei Lin           4      41         8.55
Kuang-Chih Lee    5      35         6.44
Jian Xu           6      301        20.18
Bo Zheng          7      12         10.73