Abstract
---
Feature engineering typically requires extracting dense and implicit cross features from multi-field sparse data. Recently, many state-of-the-art models have been proposed to capture low-order and high-order feature interactions. However, most of them ignore the varying importance of cross features and fail to suppress the negative impact of useless features. In this paper, a novel multi-scale feature-crossing attention network (MsFcNET) is proposed to extract dense and implicit cross features and learn their importance at different scales. The model adopts DIA-LSTM units to construct a new attention calibration architecture, which can adaptively adjust the weights of features during feature interactions. In addition, it integrates a multi-scale feature-crossing module to strengthen the representation ability of cross features from multi-field sparse data. Extensive experimental results on three real-world prediction datasets demonstrate that our proposed model yields superior performance compared with other state-of-the-art models.
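As a rough illustration only (not the authors' implementation, which uses DIA-LSTM units), the core idea of attention-calibrated feature crossing can be sketched as follows: each field embedding is rescaled by a softmax attention weight before pairwise cross features (inner products) are formed, so that useless fields are suppressed. All names and values here are hypothetical placeholders.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention_cross(embeddings, scores):
    """Rescale each field embedding by its attention weight, then form
    all pairwise inner products as second-order cross features."""
    weights = softmax(scores)
    scaled = [[w * x for x in emb] for w, emb in zip(weights, embeddings)]
    crosses = []
    for i in range(len(scaled)):
        for j in range(i + 1, len(scaled)):
            crosses.append(sum(a * b for a, b in zip(scaled[i], scaled[j])))
    return crosses

# Three hypothetical fields with 2-dim embeddings; attention scores
# would be produced by a learned calibration module in practice.
embs = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
crosses = attention_cross(embs, [0.1, 0.2, 0.3])
print(len(crosses))  # 3 pairwise cross features for 3 fields
```

In MsFcNET this calibration is applied at multiple scales rather than once, but the weighting-before-crossing pattern above is the essential mechanism the abstract describes.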
Year | DOI | Venue
---|---|---
2020 | 10.1007/978-3-030-47426-3_12 | PAKDD (1)

DocType | Citations | PageRank
---|---|---
Conference | 0 | 0.34

References | Authors
---|---
0 | 4

Name | Order | Citations | PageRank |
---|---|---|---
Zhifeng Xie | 1 | 53 | 10.70 |
Wenling Zhang | 2 | 0 | 0.34 |
Huiming Ding | 3 | 0 | 0.34 |
Lizhuang Ma | 4 | 498 | 100.70 |