Goro Kobayashi
Verified email at dc.tohoku.ac.jp - Homepage
Title · Cited by · Year
Attention is Not Only a Weight: Analyzing Transformers with Vector Norms
G Kobayashi, T Kuribayashi, S Yokoi, K Inui
Proceedings of the 2020 Conference on Empirical Methods in Natural Language …, 2020
177 · 2020
Incorporating Residual and Normalization Layers into Analysis of Masked Language Models
G Kobayashi, T Kuribayashi, S Yokoi, K Inui
Proceedings of the 2021 Conference on Empirical Methods in Natural Language …, 2021
21 · 2021
Assessing step-by-step reasoning against lexical negation: A case study on syllogism
M Ye, T Kuribayashi, J Suzuki, G Kobayashi, H Funayama
arXiv preprint arXiv:2310.14868, 2023
2 · 2023
Transformer Language Models Handle Word Frequency in Prediction Head
G Kobayashi, T Kuribayashi, S Yokoi, K Inui
arXiv preprint arXiv:2305.18294, 2023
2 · 2023
Feed-Forward Blocks Control Contextualization in Masked Language Models
G Kobayashi, T Kuribayashi, S Yokoi, K Inui
arXiv preprint arXiv:2302.00456, 2023
2 · 2023
Analyzing Feed-Forward Blocks in Transformers through the Lens of Attention Map
G Kobayashi, T Kuribayashi, S Yokoi, K Inui
The Twelfth International Conference on Learning Representations, 2023
1 · 2023
Contrastive Learning-based Sentence Encoders Implicitly Weight Informative Words
H Kurita, G Kobayashi, S Yokoi, K Inui
arXiv preprint arXiv:2310.15921, 2023
2023
[SRW] Assessing Chain-of-Thought Reasoning against Lexical Negation: A Case Study on Syllogism
M Ye, T Kuribayashi, J Suzuki, H Funayama, G Kobayashi
The 61st Annual Meeting Of The Association For Computational Linguistics, 2023
2023
Articles 1–8