Dennis Wu
Verified email at u.northwestern.edu
Title · Cited by · Year
On sparse modern Hopfield model
JYC Hu, D Yang, D Wu, C Xu, BY Chen, H Liu
Advances in Neural Information Processing Systems 36, 2024
Cited by 37 · 2024
STanHop: Sparse tandem Hopfield model for memory-enhanced time series prediction
D Wu, JYC Hu, W Li, BY Chen, H Liu
The Twelfth International Conference on Learning Representations, 2023
Cited by 37 · 2023
Uniform memory retrieval with larger capacity for modern Hopfield models
D Wu, JYC Hu, TY Hsiao, H Liu
The 41st International Conference on Machine Learning, 2024
Cited by 27 · 2024
Nonparametric modern Hopfield models
JYC Hu, BY Chen, D Wu, F Ruan, H Liu
arXiv preprint arXiv:2404.03900, 2024
Cited by 23 · 2024
Provably optimal memory capacity for modern Hopfield models: Tight analysis for transformer-compatible dense associative memories
JYC Hu, D Wu, H Liu
Advances in Neural Information Processing Systems (NeurIPS) 37, 2024
Cited by 15* · 2024
Associated learning: an alternative to end-to-end backpropagation that works on CNN, RNN, and Transformer
DYH Wu, D Lin, V Chen, HH Chen
International Conference on Learning Representations, 2021
Cited by 9 · 2021
Detecting inaccurate sensors on a large-scale sensor network using centralized and localized graph neural networks
DY Wu, TH Lin, XR Zhang, CP Chen, JH Chen, HH Chen
IEEE Sensors Journal 23 (15), 16446-16455, 2023
Cited by 4 · 2023
AI-based college course selection recommendation system: performance prediction and curriculum suggestion
YH Wu, EH Wu
2020 International Symposium on Computer, Consumer and Control (IS3C), 79-82, 2020
Cited by 4 · 2020
Learning spectral methods by transformers
Y He, Y Cao, HY Chen, D Wu, J Fan, H Liu
arXiv preprint arXiv:2501.01312, 2025
Cited by 2 · 2025
HonestBait: Forward References for Attractive but Faithful Headline Generation
CY Chen, D Wu, LW Ku
Findings of the Association for Computational Linguistics: ACL 2023, 2023
Cited by 1 · 2023
Transformers and Their Roles as Time Series Foundation Models
D Wu, Y He, Y Cao, J Fan, H Liu
arXiv preprint arXiv:2502.03383, 2025
2025
Transformers Simulate MLE for Sequence Generation in Bayesian Networks
Y Cao, Y He, D Wu, HY Chen, J Fan, H Liu
arXiv preprint arXiv:2501.02547, 2025
2025
Articles 1–12