TY - JOUR
T1 - Personalized Representation With Contrastive Loss for Recommendation Systems
AU - Tang, Hao
AU - Zhao, Guoshuai
AU - Gao, Jing
AU - Qian, Xueming
N1 - Publisher Copyright:
© 1999-2012 IEEE.
PY - 2024
Y1 - 2024
AB - Sequential recommendation mines a user's interaction sequence or time information to produce better recommendations and is thus gaining increasing attention. Existing sequential recommendation work tends to build new models, while the study of the loss function is seriously neglected. Despite the recent surge of interest in contrastive learning, we believe that the key to contrastive learning is the contrastive loss (CL), which also offers a new option for sequential recommendation. However, we find that CL works against the personalized representation of features. First, it imposes only a relative constraint that pushes positive and negative samples apart, without any absolute constraint. Second, recent studies have shown that all embeddings should be uniformly distributed, yet CL only widens the distance between positive and negative samples within the training batch rather than enforcing a uniform distribution over all items. These two shortcomings make the embedding space too compact, which harms personalized representation and recommendation. Therefore, this article proposes Personalized Contrastive Loss (PCL), which combines CL with the absolute constraints of BCE/CE and employs regularization to make the representations uniformly distributed. State-of-the-art results are obtained in experiments on several commonly used datasets. The code and data will be available on GitHub.
KW - contrastive loss
KW - Personalization
KW - sequential recommendation
KW - uniformity
UR - https://www.scopus.com/pages/publications/85164789686
U2 - 10.1109/TMM.2023.3295740
DO - 10.1109/TMM.2023.3295740
M3 - Article
AN - SCOPUS:85164789686
SN - 1520-9210
VL - 26
SP - 2419
EP - 2429
JO - IEEE Transactions on Multimedia
JF - IEEE Transactions on Multimedia
ER -