Daniel Y Fu
Assistant Professor, UCSD
Verified email at ucsd.edu · Homepage
Title
Cited by
Year
FlashAttention: Fast and memory-efficient exact attention with IO-awareness
T Dao, D Fu, S Ermon, A Rudra, C Ré
Advances in Neural Information Processing Systems 35, 16344-16359, 2022
1791 · 2022
Hungry Hungry Hippos: Towards language modeling with state space models
DY Fu, T Dao, KK Saab, AW Thomas, A Rudra, C Ré
The Eleventh International Conference on Learning Representations, 2023
443 · 2023
FlexGen: High-throughput generative inference of large language models with a single GPU
Y Sheng, L Zheng, B Yuan, Z Li, M Ryabinin, B Chen, P Liang, C Ré, ...
International Conference on Machine Learning, 31094-31116, 2023
335 · 2023
Hyena Hierarchy: Towards larger convolutional language models
M Poli, S Massaroli, E Nguyen, DY Fu, T Dao, S Baccus, Y Bengio, ...
International Conference on Machine Learning, 28043-28078, 2023
300 · 2023
Fast and three-rious: Speeding up weak supervision with triplet methods
D Fu, M Chen, F Sala, S Hooper, K Fatahalian, C Ré
International Conference on Machine Learning, 3280-3291, 2020
137 · 2020
Rekall: Specifying video events using compositions of spatiotemporal labels
DY Fu, W Crichton, J Hong, X Yao, H Zhang, A Truong, A Narayan, ...
arXiv preprint arXiv:1910.02993, 2019
61 · 2019
Simple hardware-efficient long convolutions for sequence modeling
DY Fu, EL Epstein, E Nguyen, AW Thomas, M Zhang, T Dao, A Rudra, ...
International Conference on Machine Learning, 10373-10391, 2023
57 · 2023
Perfectly balanced: Improving transfer and robustness of supervised contrastive learning
M Chen, DY Fu, A Narayan, M Zhang, Z Song, K Fatahalian, C Ré
International Conference on Machine Learning, 3090-3122, 2022
52 · 2022
Monarch Mixer: A simple sub-quadratic GEMM-based architecture
D Fu, S Arora, J Grogan, I Johnson, ES Eyuboglu, A Thomas, B Spector, ...
Advances in Neural Information Processing Systems 36, 77546-77603, 2023
47 · 2023
Multi-resolution weak supervision for sequential data
P Varma, F Sala, S Sagawa, J Fries, D Fu, S Khattar, A Ramamoorthy, ...
Advances in Neural Information Processing Systems 32, 2019
41 · 2019
Shoring up the foundations: Fusing model embeddings and weak supervision
MF Chen, DY Fu, D Adila, M Zhang, F Sala, K Fatahalian, C Ré
Uncertainty in Artificial Intelligence, 357-367, 2022
31* · 2022
Analysis of faces in a decade of US cable TV news
J Hong, W Crichton, H Zhang, DY Fu, J Ritchie, J Barenholtz, B Hannel, ...
KDD'21: Proceedings of the 27th ACM SIGKDD Conference on Knowledge Discovery …, 2021
27* · 2021
Hydragen: High-Throughput LLM Inference with Shared Prefixes
J Juravsky, B Brown, R Ehrlich, DY Fu, C Ré, A Mirhoseini
arXiv preprint arXiv:2402.05099, 2024
24 · 2024
Laughing Hyena distillery: Extracting compact recurrences from convolutions
S Massaroli, M Poli, D Fu, H Kumbong, R Parnichkun, D Romero, ...
Advances in Neural Information Processing Systems 36, 2024
22 · 2024
FlashFFTConv: Efficient convolutions for long sequences with tensor cores
DY Fu, H Kumbong, E Nguyen, C Ré
arXiv preprint arXiv:2311.05908, 2023
17 · 2023
RedPajama: An open dataset for training large language models
M Weber, D Fu, Q Anthony, Y Oren, S Adams, A Alexandrov, X Lyu, ...
arXiv preprint arXiv:2411.12372, 2024
14 · 2024
Benchmarking and building long-context retrieval models with LoCo and M2-BERT
J Saad-Falcon, DY Fu, S Arora, N Guha, C Ré
arXiv preprint arXiv:2402.07440, 2024
12 · 2024
TABi: Type-aware bi-encoders for open-domain entity retrieval
M Leszczynski, DY Fu, MF Chen, C Ré
arXiv preprint arXiv:2204.08173, 2022
11 · 2022
FlexGen: High-throughput generative inference of large language models with a single GPU
Y Sheng, L Zheng, B Yuan, Z Li, M Ryabinin, DY Fu, Z Xie, B Chen, ..., and Ce Zhang
2023
10 · 2023
Orexinergic neurotransmission in temperature responses to methamphetamine and stress: mathematical modeling as a data assimilation approach
A Behrouzvaziri, D Fu, P Tan, Y Yoo, MV Zaretskaia, DE Rusyniak, ...
PLoS One 10 (5), e0126719, 2015
8 · 2015
Articles 1–20