Goro Kobayashi
Verified email at dc.tohoku.ac.jp
Title
Cited by
Year
Attention is Not Only a Weight: Analyzing Transformers with Vector Norms
G Kobayashi, T Kuribayashi, S Yokoi, K Inui
Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing, 2020
Cited by 188, 2020
Incorporating Residual and Normalization Layers into Analysis of Masked Language Models
G Kobayashi, T Kuribayashi, S Yokoi, K Inui
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, 2021
Cited by 24, 2021
Analyzing Feed-Forward Blocks in Transformers through the Lens of Attention Maps
G Kobayashi, T Kuribayashi, S Yokoi, K Inui
arXiv preprint arXiv:2302.00456, 2023
Cited by 4, 2023
Feed-forward blocks control contextualization in masked language models
G Kobayashi, T Kuribayashi, S Yokoi, K Inui
arXiv preprint arXiv:2302.00456, 2023
Cited by 4, 2023
Transformer language models handle word frequency in prediction head
G Kobayashi, T Kuribayashi, S Yokoi, K Inui
arXiv preprint arXiv:2305.18294, 2023
Cited by 3, 2023
Assessing step-by-step reasoning against lexical negation: A case study on syllogism
M Ye, T Kuribayashi, J Suzuki, G Kobayashi, H Funayama
arXiv preprint arXiv:2310.14868, 2023
Cited by 2, 2023
Contrastive Learning-based Sentence Encoders Implicitly Weight Informative Words
H Kurita, G Kobayashi, S Yokoi, K Inui
arXiv preprint arXiv:2310.15921, 2023
2023
[SRW] Assessing Chain-of-Thought Reasoning against Lexical Negation: A Case Study on Syllogism
M Ye, T Kuribayashi, J Suzuki, H Funayama, G Kobayashi
The 61st Annual Meeting of the Association for Computational Linguistics, 2023
2023