Jared Lichtarge
Verified email at google.com
Title · Cited by · Year
Corpora generation for grammatical error correction
J Lichtarge, C Alberti, S Kumar, N Shazeer, N Parmar, S Tong
arXiv preprint arXiv:1904.05780, 2019
Cited by 168 · 2019
Data weighted training strategies for grammatical error correction
J Lichtarge, C Alberti, S Kumar
Transactions of the Association for Computational Linguistics 8, 634-646, 2020
Cited by 51 · 2020
Weakly supervised grammatical error correction using iterative decoding
J Lichtarge, C Alberti, S Kumar, N Shazeer, N Parmar
arXiv preprint arXiv:1811.01710, 2018
Cited by 26 · 2018
Simple and effective gradient-based tuning of sequence-to-sequence models
J Lichtarge, C Alberti, S Kumar
arXiv preprint arXiv:2209.04683, 2022
Cited by 3 · 2022
Dynamic Subset Tuning: Expanding the Operational Range of Parameter-Efficient Training for Large Language Models
F Stahlberg, J Lichtarge, S Kumar
arXiv preprint arXiv:2411.08610, 2024
2024
Heterogeneous Federated Learning Via Multi-Directional Knowledge Distillation
JA Lichtarge, R Mathews, R Anil, E Amid, S Kumar
US Patent App. 18/417,947, 2024
2024
Federated Knowledge Distillation on an Encoder of a Global ASR Model and/or an Encoder of a Client ASR Model
E Amid, R Mathews, S Kumar, J Lichtarge, M Chen, T Yang, Y Ding
US Patent App. 18/078,782, 2024
2024
Knowledge Distillation with Domain Mismatch For Speech Recognition
Y Tien-Ju, YC Cheng, S Kumar, J Lichtarge, E Amid, Y Ding, R Mathews, ...
US Patent App. 18/488,578, 2024
2024
Hybrid federated learning of machine learning model(s)
E Amid, R Mathews, R Anil, S Kumar, J Lichtarge
US Patent App. 18/074,729, 2024
2024
Heterogeneous Federated Learning Using Knowledge Codistillation
J Lichtarge, E Amid, S Kumar, TJ Yang, R Anil, R Mathews
arXiv preprint arXiv:2310.02549, 2023
2023