Jakob Uszkoreit
Inceptive
Verified email at uszkoreit.net

Title · Cited by
Attention is all you need
A Vaswani, N Shazeer, N Parmar, J Uszkoreit, L Jones, AN Gomez, ...
Advances in neural information processing systems 30, 2017
Cited by 127292 · 2017
An image is worth 16x16 words: Transformers for image recognition at scale
A Dosovitskiy, L Beyer, A Kolesnikov, D Weissenborn, X Zhai, ...
arXiv preprint arXiv:2010.11929, 2020
Cited by 39136 · 2020
Attention is all you need
A Vaswani, N Shazeer, N Parmar, J Uszkoreit, L Jones, AN Gomez
Advances in neural information processing systems 30 (1), 261-272, 2017
Cited by 3656 · 2017
Self-attention with relative position representations
P Shaw, J Uszkoreit, A Vaswani
arXiv preprint arXiv:1803.02155, 2018
Cited by 2466 · 2018
Natural questions: a benchmark for question answering research
T Kwiatkowski, J Palomaki, O Redfield, M Collins, A Parikh, C Alberti, ...
Transactions of the Association for Computational Linguistics 7, 453-466, 2019
Cited by 2376 · 2019
MLP-Mixer: An all-MLP architecture for vision
IO Tolstikhin, N Houlsby, A Kolesnikov, L Beyer, X Zhai, T Unterthiner, ...
Advances in neural information processing systems 34, 24261-24272, 2021
Cited by 2374 · 2021
Image transformer
N Parmar, A Vaswani, J Uszkoreit, L Kaiser, N Shazeer, A Ku, D Tran
International conference on machine learning, 4055-4064, 2018
Cited by 1899 · 2018
Attention is all you need
A Vaswani, N Shazeer, N Parmar, J Uszkoreit, L Jones, AN Gomez, ...
Advances in neural information processing systems, 2017
Cited by 1835 · 2017
A decomposable attention model for natural language inference
AP Parikh, O Täckström, D Das, J Uszkoreit
arXiv preprint arXiv:1606.01933, 2016
Cited by 1683 · 2016
Attention is all you need
A Vaswani, N Shazeer, N Parmar, J Uszkoreit, L Jones, AN Gomez, ...
arXiv preprint arXiv:1706.03762, 2017
Cited by 1298 · 2017
Universal transformers
M Dehghani, S Gouws, O Vinyals, J Uszkoreit, Ł Kaiser
arXiv preprint arXiv:1807.03819, 2018
Cited by 929 · 2018
Attention is all you need
A Vaswani, N Shazeer, N Parmar, J Uszkoreit, L Jones, AN Gomez, Ł Kaiser, I Polosukhin
In Advances in neural information processing …, 2017
Cited by 868 · 2017
Music transformer
CZA Huang, A Vaswani, J Uszkoreit, N Shazeer, I Simon, C Hawthorne, ...
arXiv preprint arXiv:1809.04281, 2018
Cited by 857 · 2018
Object-centric learning with slot attention
F Locatello, D Weissenborn, T Unterthiner, A Mahendran, G Heigold, ...
Advances in neural information processing systems 33, 11525-11538, 2020
Cited by 697 · 2020
Tensor2Tensor for neural machine translation
A Vaswani, S Bengio, E Brevdo, F Chollet, AN Gomez, S Gouws, L Jones, ...
arXiv preprint arXiv:1803.07416, 2018
Cited by 624 · 2018
How to train your ViT? Data, augmentation, and regularization in vision transformers
A Steiner, A Kolesnikov, X Zhai, R Wightman, J Uszkoreit, L Beyer
arXiv preprint arXiv:2106.10270, 2021
Cited by 547 · 2021
Attention is all you need
A Vaswani, N Shazeer, N Parmar, J Uszkoreit, L Jones, AN Gomez, ...
Advances in neural information processing systems 30, 2017
Cited by 467 · 2017
One model to learn them all
L Kaiser, AN Gomez, N Shazeer, A Vaswani, N Parmar, L Jones, ...
arXiv preprint arXiv:1706.05137, 2017
Cited by 388 · 2017
Transforming machine translation: a deep learning system reaches news translation quality comparable to human professionals
M Popel, M Tomkova, J Tomek, Ł Kaiser, J Uszkoreit, O Bojar, ...
Nature communications 11 (1), 1-15, 2020
Cited by 304 · 2020
Attention is all you need
A Vaswani, N Shazeer, N Parmar, J Uszkoreit, L Jones, AN Gomez, ...
CoRR abs/1706.03762, 2017
Cited by 280 · 2017
Articles 1–20