Kevin Clark
Verified email address at cs.stanford.edu - Homepage
Title
Cited by
Year
ELECTRA: Pre-training text encoders as discriminators rather than generators
K Clark, MT Luong, QV Le, CD Manning
arXiv preprint arXiv:2003.10555, 2020
4727 · 2020
What does BERT look at? An analysis of BERT's attention
K Clark, U Khandelwal, O Levy, CD Manning
arXiv preprint arXiv:1906.04341, 2019
1967 · 2019
Toward expert-level medical question answering with large language models
K Singhal, T Tu, J Gottweis, R Sayres, E Wulczyn, M Amin, L Hou, K Clark, ...
Nature Medicine, 1-8, 2025
733 · 2025
Deep reinforcement learning for mention-ranking coreference models
K Clark, CD Manning
arXiv preprint arXiv:1609.08667, 2016
517 · 2016
Improving coreference resolution by learning entity-level distributed representations
K Clark, CD Manning
arXiv preprint arXiv:1606.01323, 2016
464 · 2016
Semi-Supervised Sequence Modeling with Cross-View Training
K Clark, MT Luong, CD Manning, QV Le
arXiv preprint arXiv:1809.08370, 2018
462 · 2018
Inducing domain-specific sentiment lexicons from unlabeled corpora
WL Hamilton, K Clark, J Leskovec, D Jurafsky
Proceedings of the conference on empirical methods in natural language …, 2016
457 · 2016
Emergent linguistic structure in artificial neural networks trained by self-supervision
CD Manning, K Clark, J Hewitt, U Khandelwal, O Levy
Proceedings of the National Academy of Sciences 117 (48), 30046-30054, 2020
418 · 2020
Large-scale analysis of counseling conversations: An application of natural language processing to mental health
T Althoff, K Clark, J Leskovec
Transactions of the Association for Computational Linguistics 4, 463-476, 2016
374 · 2016
Entity-centric coreference resolution with model stacking
K Clark, CD Manning
Proceedings of the 53rd Annual Meeting of the Association for Computational …, 2015
269 · 2015
BAM! Born-again multi-task networks for natural language understanding
K Clark, MT Luong, U Khandelwal, CD Manning, QV Le
arXiv preprint arXiv:1907.04829, 2019
236 · 2019
Directly fine-tuning diffusion models on differentiable rewards
K Clark, P Vicol, K Swersky, DJ Fleet
arXiv preprint arXiv:2309.17400, 2023
116 · 2023
Sample efficient text summarization using a single pre-trained transformer
U Khandelwal, K Clark, D Jurafsky, L Kaiser
arXiv preprint arXiv:1905.08836, 2019
111 · 2019
Text-to-image diffusion models are zero-shot classifiers
K Clark, P Jaini
Advances in Neural Information Processing Systems 36, 58921-58937, 2023
101 · 2023
Pre-training transformers as energy-based cloze models
K Clark, MT Luong, QV Le, CD Manning
arXiv preprint arXiv:2012.08561, 2020
88 · 2020
RevMiner: An extractive interface for navigating reviews on a smartphone
J Huang, O Etzioni, L Zettlemoyer, K Clark, C Lee
Proceedings of the 25th annual ACM symposium on User interface software and …, 2012
71 · 2012
Intriguing properties of generative classifiers
P Jaini, K Clark, R Geirhos
arXiv preprint arXiv:2309.16779, 2023
34 · 2023
Meta-learning fast weight language models
K Clark, K Guu, MW Chang, P Pasupat, G Hinton, M Norouzi
arXiv preprint arXiv:2212.02475, 2022
10 · 2022
Contrastive pre-training for language tasks
TM Luong, QV Le, KS Clark
US Patent 11,449,684, 2022
8 · 2022
Stanford at TAC KBP 2017: Building a Trilingual Relational Knowledge Graph.
AT Chaganty, A Paranjape, J Bolton, M Lamm, J Lei, A See, K Clark, ...
TAC, 2017
7 · 2017
Articles 1–20