Aizhan Imankulova
PhD, CogSmart Co., Ltd., Tokyo Metropolitan University
Verified email at ed.tmu.ac.jp · Homepage
Title
Cited by
Year
Improving low-resource neural machine translation with filtered pseudo-parallel corpus
A Imankulova, T Sato, M Komachi
Proceedings of the 4th Workshop on Asian Translation (WAT2017), 70-78, 2017
Cited by 55 · 2017
Gender bias in masked language models for multiple languages
M Kaneko, A Imankulova, D Bollegala, N Okazaki
arXiv preprint arXiv:2205.00551, 2022
Cited by 47 · 2022
Exploiting out-of-domain parallel data through multilingual transfer learning for low-resource neural machine translation
A Imankulova, R Dabre, A Fujita, K Imamura
arXiv preprint arXiv:1907.03060, 2019
Cited by 45 · 2019
From masked language modeling to translation: Non-English auxiliary tasks improve zero-shot spoken language understanding
R Van Der Goot, I Sharaf, A Imankulova, A Üstün, M Stepanovic, ...
Proceedings of the 2021 Conference of the North American Chapter of the …, 2021
Cited by 34* · 2021
Filtered pseudo-parallel corpus improves low-resource neural machine translation
A Imankulova, T Sato, M Komachi
ACM Transactions on Asian and Low-Resource Language Information Processing …, 2019
Cited by 31 · 2019
Towards multimodal simultaneous neural machine translation
A Imankulova, M Kaneko, T Hirasawa, M Komachi
arXiv preprint arXiv:2004.03180, 2020
Cited by 12 · 2020
Cross-lingual transfer learning for grammatical error correction
I Yamashita, S Katsumata, M Kaneko, A Imankulova, M Komachi
Proceedings of the 28th International Conference on Computational …, 2020
Cited by 11 · 2020
Towards a standardized dataset on Indonesian named entity recognition
SO Khairunnisa, A Imankulova, M Komachi
Proceedings of the 1st Conference of the Asia-Pacific Chapter of the …, 2020
Cited by 9 · 2020
Simultaneous multi-pivot neural machine translation
R Dabre, A Imankulova, M Kaneko, A Chakrabarty
arXiv preprint arXiv:2104.07410, 2021
Cited by 6 · 2021
Pre-trained word embedding and language model improve multimodal machine translation: A case study in Multi30K
T Hirasawa, M Kaneko, A Imankulova, M Komachi
IEEE Access 10, 67653-67668, 2022
Cited by 5 · 2022
Studying the impact of document-level context on simultaneous neural machine translation
R Dabre, A Imankulova, M Kaneko
Proceedings of Machine Translation Summit XVIII: Research Track, 202-214, 2021
Cited by 2 · 2021
Neural combinatory constituency parsing
Z Chen, L Zhang, A Imankulova, M Komachi
arXiv preprint arXiv:2106.06689, 2021
Cited by 2 · 2021
English-to-Japanese diverse translation by combining forward and backward outputs
M Kaneko, A Imankulova, T Hirasawa, M Komachi
Proceedings of the Fourth Workshop on Neural Generation and Translation, 134-138, 2020
Cited by 2 · 2020
Cross-lingual Multi-task Transfer for Zero-shot Task-oriented Dialog
R van der Goot, M Stepanovic, A Ramponi, I Sharaf, A Üstün, ...
RESOURCEFUL-2020: RESOURCEs and representations For Under-resourced …, 2021
2021
A Study on Exploiting Additional Resources for Low-resource Neural Machine Translation
A Imankulova
Tokyo Metropolitan University, 2021
2021
Japanese-Russian TMU Neural Machine Translation System using Multilingual Model for WAT 2019
A Imankulova, M Kaneko, M Komachi
Proceedings of the 6th Workshop on Asian Translation, 165-170, 2019
2019
Preliminary Experiments toward NMT on E-commerce Product Titles
A Imankulova, K Murakami
2018
Articles 1–18