Xing Wang
Tencent AI Lab
Is ChatGPT a good translator? A preliminary study
W Jiao, W Wang, J Huang, X Wang, Z Tu
arXiv preprint arXiv:2301.08745, 2023
Encouraging divergent thinking in large language models through multi-agent debate
T Liang, Z He, W Jiao, X Wang, Y Wang, R Wang, Y Yang, Z Tu, S Shi
arXiv preprint arXiv:2305.19118, 2023
Context-aware self-attention networks
B Yang, J Li, DF Wong, LS Chao, X Wang, Z Tu
AAAI, 2019
Neural machine translation advised by statistical machine translation
X Wang, Z Lu, Z Tu, H Li, D Xiong, M Zhang
Proceedings of the AAAI Conference on Artificial Intelligence 31 (1), 2017
Modeling Recurrence for Transformer
J Hao, X Wang, B Yang, L Wang, J Zhang, Z Tu
NAACL, 2019
Exploiting Deep Representations for Neural Machine Translation
ZY Dou, Z Tu, X Wang, S Shi, T Zhang
EMNLP, 2018
Self-attention with structural position representations
X Wang, Z Tu, L Wang, S Shi
EMNLP, 2019
Translating Phrases in Neural Machine Translation
X Wang, Z Tu, D Xiong, M Zhang
EMNLP, 2017
Multi-granularity self-attention for neural machine translation
J Hao, X Wang, S Shi, J Zhang, Z Tu
EMNLP, 2019
Incorporating statistical machine translation word knowledge into neural machine translation
X Wang, Z Tu, M Zhang
IEEE/ACM Transactions on Audio, Speech, and Language Processing 26 (12 …, 2018
Dynamic Layer Aggregation for Neural Machine Translation with Routing-by-Agreement
ZY Dou, Z Tu, X Wang, L Wang, S Shi, T Zhang
AAAI, 2019
ParroT: Translating during chat using large language models tuned with human translation and feedback
W Jiao, J Huang, W Wang, Z He, T Liang, X Wang, S Shi, Z Tu
arXiv preprint arXiv:2304.02426, 2023
Information Aggregation for Multi-Head Attention with Routing-by-Agreement
J Li, B Yang, ZY Dou, X Wang, MR Lyu, Z Tu
NAACL, 2019
On the diversity of multi-head attention
J Li, X Wang, Z Tu, MR Lyu
Neurocomputing 454, 14-24, 2021
Understanding and Improving Sequence-to-Sequence Pretraining for Neural Machine Translation
W Wang, W Jiao, Y Hao, X Wang, S Shi, Z Tu, M Lyu
ACL, 2022
Towards understanding neural machine translation with word importance
S He, Z Tu, X Wang, L Wang, MR Lyu, S Shi
EMNLP, 2019
Self-training sampling with monolingual data uncertainty for neural machine translation
W Jiao, X Wang, Z Tu, S Shi, MR Lyu, I King
arXiv preprint arXiv:2106.00941, 2021
Exploring Human-Like Translation Strategy with Large Language Models
Z He, T Liang, W Jiao, Z Zhang, Y Yang, R Wang, Z Tu, S Shi, X Wang
Transactions of the Association for Computational Linguistics, 2023
Topic-based coherence modeling for statistical machine translation
D Xiong, M Zhang, X Wang
IEEE/ACM Transactions on Audio, Speech, and Language Processing 23 (3), 483-493, 2015
How does selective mechanism improve self-attention networks?
X Geng, L Wang, X Wang, B Qin, T Liu, Z Tu
ACL, 2020