Lei Shu (舒蕾)
Google DeepMind
Verified email at google.com - Homepage
Title
Cited by
Year
BERT post-training for review reading comprehension and aspect-based sentiment analysis
H Xu, B Liu, L Shu, PS Yu
arXiv preprint arXiv:1904.02232, 2019
Cited by 928 · 2019
Double embeddings and CNN-based sequence labeling for aspect extraction
H Xu, B Liu, L Shu, PS Yu
arXiv preprint arXiv:1805.04601, 2018
Cited by 480 · 2018
DOC: Deep Open Classification of Text Documents
L Shu, H Xu, B Liu
Proceedings of the 2017 Conference on Empirical Methods in Natural Language …, 2017
Cited by 429 · 2017
Multi-task pre-training for plug-and-play task-oriented dialogue system
Y Su, L Shu, E Mansimov, A Gupta, D Cai, YA Lai, Y Zhang
arXiv preprint arXiv:2109.14739, 2021
Cited by 189 · 2021
Zero-shot out-of-distribution detection based on the pre-trained model CLIP
S Esmaeilpour, B Liu, E Robertson, L Shu
Proceedings of the AAAI Conference on Artificial Intelligence 36 (6), 6568-6576, 2022
Cited by 166 · 2022
Lifelong learning CRF for supervised aspect extraction
L Shu, H Xu, B Liu
arXiv preprint arXiv:1705.00251, 2017
Cited by 152 · 2017
Open-world learning and application to product classification
H Xu, B Liu, L Shu, P Yu
The World Wide Web Conference, 3413-3419, 2019
Cited by 143 · 2019
Achieving forgetting prevention and knowledge transfer in continual learning
Z Ke, B Liu, N Ma, H Xu, L Shu
Advances in Neural Information Processing Systems 34, 22443-22456, 2021
Cited by 125 · 2021
Unseen class discovery in open-world classification
L Shu, H Xu, B Liu
arXiv preprint arXiv:1801.05609, 2018
Cited by 112 · 2018
CLASSIC: Continual and contrastive learning of aspect sentiment classification tasks
Z Ke, B Liu, H Xu, L Shu
arXiv preprint arXiv:2112.02714, 2021
Cited by 62 · 2021
DomBERT: Domain-oriented language model for aspect-based sentiment analysis
H Xu, B Liu, L Shu, PS Yu
arXiv preprint arXiv:2004.13816, 2020
Cited by 61 · 2020
Improve Mathematical Reasoning in Language Models by Automated Process Supervision
L Luo, Y Liu, R Liu, S Phatale, H Lara, Y Li, L Shu, Y Zhu, L Meng, J Sun, ...
arXiv preprint arXiv:2406.06592, 2024
Cited by 59* · 2024
TaCL: Improving BERT pre-training with token-aware contrastive learning
Y Su, F Liu, Z Meng, T Lan, L Shu, E Shareghi, N Collier
arXiv preprint arXiv:2111.04198, 2021
Cited by 59 · 2021
Lifelong domain word embedding via meta-learning
H Xu, B Liu, L Shu, PS Yu
arXiv preprint arXiv:1805.09991, 2018
Cited by 58 · 2018
Understanding pre-trained BERT for aspect-based sentiment analysis
H Xu, L Shu, PS Yu, B Liu
arXiv preprint arXiv:2011.00169, 2020
Cited by 53 · 2020
RewriteLM: An instruction-tuned large language model for text rewriting
L Shu, L Luo, J Hoskere, Y Zhu, Y Liu, S Tong, J Chen, L Meng
Proceedings of the AAAI Conference on Artificial Intelligence 38 (17), 18970 …, 2024
Cited by 41 · 2024
Lifelong-RL: Lifelong Relaxation Labeling for Separating Entities and Aspects in Opinion Targets
L Shu, B Liu, H Xu, A Kim
Proceedings of the 2016 Conference on Empirical Methods in Natural Language …, 2016
Cited by 38* · 2016
Continual training of language models for few-shot learning
Z Ke, H Lin, Y Shao, H Xu, L Shu, B Liu
arXiv preprint arXiv:2210.05549, 2022
Cited by 36 · 2022
Continual learning with knowledge transfer for sentiment classification
Z Ke, B Liu, H Wang, L Shu
Machine Learning and Knowledge Discovery in Databases: European Conference …, 2021
Cited by 36 · 2021
Flexibly-structured model for task-oriented dialogues
L Shu, P Molino, M Namazifar, H Xu, B Liu, H Zheng, G Tur
arXiv preprint arXiv:1908.02402, 2019
Cited by 29* · 2019
Articles 1–20