DOI: 10.1007/978-981-19-7960-6_12

Optimizing Deep Transformers for Chinese-Thai Low-Resource Translation

Wenjie Hao, Hongfei Xu, Lingling Mu, Hongying Zan
Dec 2022
Abstract
In this paper, we study the use of the deep Transformer translation model for the CCMT 2022 Chinese-Thai low-resource machine translation task. We first explore the experiment settings (including the number of BPE merge operations, dropout probability, embedding size, etc.) for the low-resource scenario with the 6-layer Transformer. Considering that increasing the number of layers also increases the regularization on new model parameters (dropout modules are also introduced when using more layers), we adopt the highest performance setting but increase the depth of the Transformer to 24 layers to obtain improved translation quality. Our work obtains the SOTA performance in the Chinese-to-Thai translation in the constrained evaluation.
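To illustrate the kind of configuration the abstract describes (hyperparameters tuned on a 6-layer baseline, then the depth increased to 24 layers), below is a minimal PyTorch sketch. All concrete values (embedding size, dropout probability, number of heads, feed-forward dimension) are assumptions for illustration, not the paper's reported settings, and the pre-norm choice is a common stabilization technique for deep Transformers rather than a detail confirmed by the abstract.

```python
import torch
import torch.nn as nn

# Hypothetical hyperparameters for illustration only; the paper tunes these on
# the 6-layer baseline and then reuses them for the deeper model.
EMBED_DIM = 256    # assumed embedding size for the low-resource setting
FFN_DIM = 1024     # assumed feed-forward dimension
NUM_HEADS = 4      # assumed number of attention heads
DROPOUT = 0.3      # assumed dropout probability
NUM_LAYERS = 24    # depth increased from 6 to 24 layers, as in the paper

# Each added layer brings its own dropout modules, which is the extra
# regularization effect the abstract refers to.
encoder_layer = nn.TransformerEncoderLayer(
    d_model=EMBED_DIM,
    nhead=NUM_HEADS,
    dim_feedforward=FFN_DIM,
    dropout=DROPOUT,
    norm_first=True,   # pre-norm; commonly needed to train very deep stacks
    batch_first=True,
)
encoder = nn.TransformerEncoder(encoder_layer, num_layers=NUM_LAYERS)

# Quick shape check with a dummy batch of 2 sentences, 10 subword tokens each.
dummy = torch.randn(2, 10, EMBED_DIM)
print(encoder(dummy).shape)  # torch.Size([2, 10, 256])
```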