Differentiable N-gram Objective on Abstractive Summarization

Yunqi Zhu, Wensheng Zhang, Mingjin Zhu
Abstract
ROUGE is a standard automatic evaluation metric based on n-grams for sequence-to-sequence tasks, while cross-entropy loss, the essential training objective of neural language models, optimizes at the unigram level. We present differentiable n-gram objectives that attempt to alleviate the discrepancy between the training criterion and the evaluation criterion. The objective maximizes the probabilistic weight of matched sub-sequences; the novelty of our work is that the objective weights all matched sub-sequences equally and does not cap the number of matched sub-sequences at the ground-truth count of n-grams in the reference sequence. We jointly optimize cross-entropy loss and the proposed objective, obtaining a decent ROUGE score improvement on the abstractive summarization datasets CNN/DM and XSum and outperforming alternative n-gram objectives.