Xiangyang Li
Peking University, Noah's Ark Lab, Huawei
Verified email at huawei.com
Title
Cited by
Year
How can recommender systems benefit from large language models: A survey
J Lin, X Dai, Y Xi, W Liu, B Chen, H Zhang, Y Liu, C Wu, X Li, C Zhu, ...
ACM Transactions on Information Systems, 2023
203* · 2023
CTRL: Connect Collaborative and Language Model for CTR Prediction
X Li, B Chen, L Hou, R Tang
ACM Transactions on Recommender Systems, 2023
55* · 2023
Exploring text-transformers in AAAI 2021 shared task: COVID-19 fake news detection in English
X Li, Y Xia, X Long, Z Li, S Li
CONSTRAINT@AAAI 2021, 106-115, 2021
48 · 2021
FenceMask: A data augmentation approach for pre-extracted image features
P Li, X Li, X Long
arXiv preprint arXiv:2006.07877, 2020
30 · 2020
A unified framework for multi-domain ctr prediction via large language models
Z Fu, X Li, C Wu, Y Wang, K Dong, X Zhao, M Zhao, H Guo, R Tang
ACM Transactions on Information Systems, 2023
22 · 2023
IntTower: The next generation of two-tower model for pre-ranking system
X Li, B Chen, HF Guo, J Li, C Zhu, X Long, S Li, Y Wang, W Guo, L Mao, ...
Proceedings of the 31st ACM International Conference on Information …, 2022
18 · 2022
LLM-enhanced Reranking in Recommender Systems
J Gao, B Chen, X Zhao, W Liu, X Li, Y Wang, Z Zhang, W Wang, Y Ye, ...
arXiv preprint arXiv:2406.12433, 2024
9 · 2024
LLM4MSR: An LLM-Enhanced Paradigm for Multi-Scenario Recommendation
Y Wang, Y Wang, Z Fu, X Li, X Zhao, H Guo, R Tang
CIKM 2024, 2024
9 · 2024
FLIP: Fine-grained Alignment between ID-based Models and Pretrained Language Models for CTR Prediction
H Wang, J Lin, X Li, B Chen, C Zhu, R Tang, W Zhang, Y Yu
Proceedings of the 18th ACM Conference on Recommender Systems, 94-104, 2024
8* · 2024
AutoGen: An automated dynamic model generation framework for recommender system
C Zhu, B Chen, H Guo, H Xu, X Li, X Zhao, W Zhang, Y Yu, R Tang
Proceedings of the Sixteenth ACM International Conference on Web Search and …, 2023
8 · 2023
Low resource style transfer via domain adaptive meta learning
X Li, X Long, Y Xia, S Li
NAACL 2022 (oral), 2022
7 · 2022
CoIR: A comprehensive benchmark for code information retrieval models
X Li, K Dong, YQ Lee, W Xia, Y Yin, H Zhang, Y Liu, Y Wang, R Tang
arXiv preprint arXiv:2407.02883, 2024
5 · 2024
Tired of plugins? Large language models can be end-to-end recommenders
W Zhang, C Wu, X Li, Y Wang, K Dong, Y Wang, X Dai, X Zhao, H Guo, ...
COLING 2025, 2024
5 · 2024
MC-indexing: Effective Long Document Retrieval via Multi-view Content-aware Indexing
K Dong, DGX Deik, Y Lee, H Zhang, X Li, C Zhang, Y Liu
Findings of the Association for Computational Linguistics: EMNLP 2024, 2673-2691, 2024
4* · 2024
CtrlA: Adaptive Retrieval-Augmented Generation via Probe-Guided Control
H Liu, H Zhang, Z Guo, K Dong, X Li, YQ Lee, C Zhang, Y Liu
arXiv preprint arXiv:2405.18727, 2024
3 · 2024
SampleLLM: Optimizing Tabular Data Synthesis in Recommendations
J Gao, Z Du, X Li, X Zhao, Y Wang, X Li, H Guo, R Tang
arXiv preprint arXiv:2501.16125, 2025
1 · 2025
SyNeg: LLM-Driven Synthetic Hard-Negatives for Dense Retrieval
X Li, X Li, H Zhang, Z Du, P Jia, Y Wang, X Zhao, H Guo, R Tang
arXiv preprint arXiv:2412.17250, 2024
1 · 2024
Bridging Relevance and Reasoning: Rationale Distillation in Retrieval-Augmented Generation
P Jia, D Xu, X Li, Z Du, X Li, X Zhao, Y Wang, Y Wang, H Guo, R Tang
arXiv preprint arXiv:2412.08519, 2024
1 · 2024
CELA: Cost-Efficient Language Model Alignment for CTR Prediction
X Wang, W Liu, X Chen, Q Liu, X Huang, D Lian, X Li, Y Wang, Z Dong, ...
arXiv preprint arXiv:2405.10596, 2024
1 · 2024
Articles 1–20