TY - GEN
T1 - A Self-Attentive Interest Retrieval Recommender
AU - Wu, Min
AU - Li, Chen
AU - Tian, Lihua
N1 - Publisher Copyright:
© 2022 IEEE.
PY - 2022
Y1 - 2022
N2 - Thanks to the attention mechanism, self-attention networks (SANs) have been widely used in sequential recommendation. However, most existing SAN approaches still follow the convention of generating a single embedding as the final representation, which constrains the model's capacity. To enrich this representation, sequential recommenders use metadata such as item categories to capture users' multiple interests. However, this approach falls short of expectations due to the long-tail property of items: a large number of categories cannot be effectively activated because of the lack of interaction records. Another drawback is that the massive number of categories may lead to over-parameterization. Accordingly, we propose a Self-Attentive Interest Retrieval network (SAIR) to learn a context-aware representation from users' behaviors without falling into over-parameterization. SAIR works in a typical SAN manner, encoding the behavior sequence with self-attention, and we propose an interest retrieval module that adaptively projects the sequence onto an interest relevance distribution. We then leverage interest-to-interest interaction to generate several context-aware interest embeddings and fuse these multi-interest embeddings into the final output. Extensive experiments on three real-world datasets demonstrate that SAIR outperforms other SAN methods and state-of-the-art algorithms on multiple evaluation metrics.
AB - Thanks to the attention mechanism, self-attention networks (SANs) have been widely used in sequential recommendation. However, most existing SAN approaches still follow the convention of generating a single embedding as the final representation, which constrains the model's capacity. To enrich this representation, sequential recommenders use metadata such as item categories to capture users' multiple interests. However, this approach falls short of expectations due to the long-tail property of items: a large number of categories cannot be effectively activated because of the lack of interaction records. Another drawback is that the massive number of categories may lead to over-parameterization. Accordingly, we propose a Self-Attentive Interest Retrieval network (SAIR) to learn a context-aware representation from users' behaviors without falling into over-parameterization. SAIR works in a typical SAN manner, encoding the behavior sequence with self-attention, and we propose an interest retrieval module that adaptively projects the sequence onto an interest relevance distribution. We then leverage interest-to-interest interaction to generate several context-aware interest embeddings and fuse these multi-interest embeddings into the final output. Extensive experiments on three real-world datasets demonstrate that SAIR outperforms other SAN methods and state-of-the-art algorithms on multiple evaluation metrics.
KW - deep learning
KW - information retrieval
KW - recommendation systems
KW - self-attention networks
UR - https://www.scopus.com/pages/publications/85141350658
U2 - 10.1109/CCET55412.2022.9906367
DO - 10.1109/CCET55412.2022.9906367
M3 - Conference contribution
AN - SCOPUS:85141350658
T3 - 5th IEEE International Conference on Computer and Communication Engineering Technology, CCET 2022
SP - 7
EP - 12
BT - 5th IEEE International Conference on Computer and Communication Engineering Technology, CCET 2022
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 5th IEEE International Conference on Computer and Communication Engineering Technology, CCET 2022
Y2 - 19 August 2022 through 21 August 2022
ER -