Niki Parmar
Co-Founder at Essential AI
Verified email at essential.ai
Title
Cited by
Year
Attention is all you need
A Vaswani, N Shazeer, N Parmar, J Uszkoreit, L Jones, AN Gomez, ...
Advances in neural information processing systems 30, 2017
167879 · 2017
Attention is all you need [J]
A Vaswani, N Shazeer, N Parmar, J Uszkoreit, L Jones, AN Gomez
Advances in neural information processing systems 30 (1), 261-272, 2017
7314* · 2017
Conformer: Convolution-augmented transformer for speech recognition
A Gulati, J Qin, CC Chiu, N Parmar, Y Zhang, J Yu, W Han, S Wang, ...
arXiv preprint arXiv:2005.08100, 2020
3587 · 2020
Image transformer
N Parmar, A Vaswani, J Uszkoreit, L Kaiser, N Shazeer, A Ku, D Tran
International conference on machine learning, 4055-4064, 2018
2171 · 2018
Stand-alone self-attention in vision models
P Ramachandran, N Parmar, A Vaswani, I Bello, A Levskaya, J Shlens
Advances in neural information processing systems 32, 2019
1427 · 2019
Bottleneck transformers for visual recognition
A Srinivas, TY Lin, N Parmar, J Shlens, P Abbeel, A Vaswani
Proceedings of the IEEE/CVF conference on computer vision and pattern …, 2021
1388 · 2021
Advances in neural information processing systems 30
A Vaswani, N Shazeer, N Parmar, J Uszkoreit, L Jones, AN Gomez, ...
Curran Associates Inc, 2017
1054 · 2017
Attention is all you need
A Vaswani, N Shazeer, N Parmar, J Uszkoreit, L Jones, AN Gomez, Ł Kaiser, I Polosukhin
Adv. Neural Inf. Process. Syst 30, 1-11, 2017
790 · 2017
Tensor2tensor for neural machine translation
A Vaswani, S Bengio, E Brevdo, F Chollet, AN Gomez, S Gouws, L Jones, ...
arXiv preprint arXiv:1803.07416, 2018
672 · 2018
The best of both worlds: Combining recent advances in neural machine translation
MX Chen, O Firat, A Bapna, M Johnson, W Macherey, G Foster, L Jones, ...
arXiv preprint arXiv:1804.09849, 2018
540 · 2018
Scaling local self-attention for parameter efficient visual backbones
A Vaswani, P Ramachandran, A Srinivas, N Parmar, B Hechtman, ...
Proceedings of the IEEE/CVF conference on computer vision and pattern …, 2021
501 · 2021
Mesh-tensorflow: Deep learning for supercomputers
N Shazeer, Y Cheng, N Parmar, D Tran, A Vaswani, P Koanantakool, ...
Advances in neural information processing systems 31, 2018
431 · 2018
One model to learn them all
L Kaiser, AN Gomez, N Shazeer, A Vaswani, N Parmar, L Jones, ...
arXiv preprint arXiv:1706.05137, 2017
404 · 2017
Attention is all you need. CoRR abs/1706.03762 (2017)
A Vaswani, N Shazeer, N Parmar, J Uszkoreit, L Jones, AN Gomez, ...
345 · 2017
Attention is all you need, 2023
A Vaswani, N Shazeer, N Parmar, J Uszkoreit, L Jones, AN Gomez, ...
arXiv preprint arXiv:1706.03762, 2023
288* · 2023
Purity homophily in social networks.
M Dehghani, K Johnson, J Hoover, E Sagi, J Garten, NJ Parmar, S Vaisey, ...
Journal of Experimental Psychology: General 145 (3), 366, 2016
232 · 2016
Attention is all you need
A Vaswani, N Shazeer, N Parmar, J Uszkoreit, L Jones, A Gomez, ...
NIPS, 2017
187 · 2017
Attention is all you need
A Vaswani, N Shazeer, N Parmar, J Uszkoreit, L Jones, AN Gomez, ...
Advances in neural information processing …, 2017
184 · 2017
Stand-alone self-attention in vision models
N Parmar, P Ramachandran, A Vaswani, I Bello, A Levskaya, J Shlens
168 · 2019
Corpora generation for grammatical error correction
J Lichtarge, C Alberti, S Kumar, N Shazeer, N Parmar, S Tong
arXiv preprint arXiv:1904.05780, 2019
165 · 2019
Articles 1–20