Paper page - SPAR: Personalized Content-Based Recommendation via Long Engagement Attention

arxiv:2402.10555

SPAR: Personalized Content-Based Recommendation via Long Engagement Attention

Published on Feb 16, 2024
· Submitted by AK on Feb 19, 2024
#3 Paper of the day
Authors: Chiyu Zhang, Yifei Sun, Jun Chen, Jie Lei, Muhammad Abdul-Mageed, Sinong Wang, Rong Jin, Sem Park, Ning Yao, Bo Long

Abstract

The SPAR framework enhances content recommendations by using pretrained language models, poly-attention layers, and large language models to effectively process long user engagement histories and predict user-item interactions.

AI-generated summary

Leveraging users' long engagement histories is essential for personalized content recommendations. The success of pretrained language models (PLMs) in NLP has led to their use in encoding user histories and candidate items, framing content recommendation as a textual semantic matching task. However, existing works still struggle with processing very long user historical text and insufficient user-item interaction. In this paper, we introduce a content-based recommendation framework, SPAR, which effectively tackles the challenge of extracting holistic user interests from long user engagement histories. It does so by leveraging a PLM, poly-attention layers, and attention sparsity mechanisms to encode the user's history in a session-based manner. The user- and item-side features are sufficiently fused for engagement prediction while maintaining standalone representations for both sides, which is efficient for practical model deployment. Moreover, we enhance user profiling by exploiting a large language model (LLM) to extract global interests from the user engagement history. Extensive experiments on two benchmark datasets demonstrate that our framework outperforms existing state-of-the-art (SoTA) methods.
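The pipeline described in the abstract — split the long history into sessions, encode each session with a PLM, pool the session embeddings into several standalone user-interest vectors via poly-attention, then score a separately encoded candidate item against them — can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the dimensions, the number of sessions and attention codes, and the random NumPy vectors standing in for PLM encoder outputs are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

# Toy stand-ins for PLM outputs: one embedding per engagement session.
# Splitting the long history into sessions keeps each PLM input short.
d = 16                                          # embedding dim (assumed)
n_sessions = 8                                  # history split into sessions
session_emb = rng.normal(size=(n_sessions, d))  # "PLM(session_i)" stand-ins

# Poly-attention: K learnable query "codes", each pooling the session
# embeddings into one user-interest vector. The K vectors form a
# standalone user-side representation (trained codes in the real model).
K = 4
codes = rng.normal(size=(K, d))
attn = softmax(codes @ session_emb.T, axis=-1)  # (K, n_sessions)
user_vecs = attn @ session_emb                  # (K, d)

# Item side keeps its own standalone representation ("PLM(item)" stand-in),
# so either side can be precomputed and cached for deployment.
item_vec = rng.normal(size=(d,))

# Late fusion for engagement prediction: score the candidate against each
# interest vector and keep the strongest match.
scores = user_vecs @ item_vec                   # (K,)
engagement_logit = scores.max()
```

The design point this illustrates is why standalone representations matter for serving: user and item embeddings can be computed independently offline, with only the cheap final scoring done at request time.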

Community

This is an automated message from the Librarian Bot. I found the following papers similar to this paper.

The following papers were recommended by the Semantic Scholar API

Please give a thumbs up to this comment if you found it helpful!

If you want recommendations for any paper on Hugging Face, check out this Space

You can directly ask Librarian Bot for paper recommendations by tagging it in a comment: @librarian-bot recommend

Revolutionary Recommendation System: Meet SPAR with Long Engagement Attention!

Links ๐Ÿ”—:

๐Ÿ‘‰ Subscribe: https://www.youtube.com/@Arxflix
๐Ÿ‘‰ Twitter: https://x.com/arxflix
๐Ÿ‘‰ LMNT (Partner): https://lmnt.com/

By Arxflix

Sign up or log in to comment

Models citing this paper 0

No model linking this paper

Cite arxiv.org/abs/2402.10555 in a model README.md to link it from this page.

Datasets citing this paper 0

No dataset linking this paper

Cite arxiv.org/abs/2402.10555 in a dataset README.md to link it from this page.

Spaces citing this paper 0

No Space linking this paper

Cite arxiv.org/abs/2402.10555 in a Space README.md to link it from this page.

Collections including this paper 10