Cross-Lingual Transfer Learning for Statistical Type Inference

Zhiming Li, Xiaofei Xie, Haoliang Li, Zhengzi Xu, Yi Li, Yang Liu
Jul 2021
Abstract
Hitherto, statistical type inference systems have relied entirely on supervised learning approaches, which require laborious manual effort to collect and label large amounts of data. Most Turing-complete imperative languages share similar control- and data-flow structures, which makes it possible to transfer knowledge learned from one language to another. In this paper, we propose a cross-lingual transfer learning framework, PLATO, for statistical type inference, which allows us to leverage prior knowledge learned from the labeled dataset of one language and transfer it to others, e.g., Python to JavaScript, Java to JavaScript, etc. PLATO is powered by a novel kernelized attention mechanism that constrains the attention scope of the backbone Transformer model, such that the model is forced to base its predictions on features commonly shared among languages. In addition, we propose a syntax enhancement that augments learning on the feature overlap among language domains. Furthermore, PLATO can also improve the performance of conventional supervised type inference by introducing cross-language augmentation, which enables the model to learn more general features across multiple languages. We evaluated PLATO under two settings: 1) under the cross-domain scenario, where the target language data is unlabeled or only partially labeled, PLATO outperforms state-of-the-art domain transfer techniques by a large margin, e.g., it improves the Python-to-TypeScript baseline by +14.6%@EM and +18.6%@weighted-F1; and 2) under the conventional monolingual supervised scenario, PLATO improves the Python baseline by +4.10%@EM and +1.90%@weighted-F1 with the introduction of cross-lingual augmentation.
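The abstract does not spell out PLATO's kernel, so the following is only a minimal sketch of the general idea of kernelized attention: a kernel over token positions reweights standard softmax attention so the Transformer is biased toward structure that tends to be shared across languages. The Gaussian locality kernel, the `sigma` bandwidth, and the function name below are illustrative assumptions, not the paper's actual formulation.

```python
# Minimal sketch of kernelized attention (illustrative; not PLATO's exact kernel).
import math
import torch
import torch.nn.functional as F

def kernelized_attention(q, k, v, sigma=2.0):
    """Scaled dot-product attention reweighted by a Gaussian locality kernel.

    q, k, v: (batch, seq_len, d) tensors.
    sigma:   assumed bandwidth hyperparameter; smaller values narrow the
             attention scope to nearby tokens.
    """
    d = q.size(-1)
    scores = q @ k.transpose(-2, -1) / math.sqrt(d)   # (batch, L, L)

    # Gaussian kernel over token-position distance; biases attention toward
    # local control-/data-flow structure rather than long-range idioms.
    L = q.size(1)
    pos = torch.arange(L, dtype=q.dtype, device=q.device)
    kernel = torch.exp(-(pos[None, :] - pos[:, None]) ** 2 / (2 * sigma ** 2))

    weights = F.softmax(scores, dim=-1) * kernel            # constrain scope
    weights = weights / weights.sum(dim=-1, keepdim=True)   # renormalize
    return weights @ v                                      # (batch, L, d)
```

In a cross-lingual setting, constraining the attention scope this way plausibly discourages the model from latching onto language-specific long-range patterns and pushes it toward the shared control- and data-flow structures the paper highlights.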