Assessing Knowledge Editing in Language Models via Relation Perspective
Yifan Wei, Xiaoyan Yu, Huanhuan Ma, Fangyu Lei, Yixuan Weng, Ran Song, Kang Liu
Nov 2023
Abstract
Knowledge Editing (KE), the task of modifying factual knowledge in Large Language Models (LLMs), has been receiving increasing attention. However, existing knowledge editing methods are entity-centric, and it is unclear whether they remain effective from a relation-centric perspective. To address this gap, this paper constructs a new benchmark named RaKE, which focuses on Relation-based Knowledge Editing. We establish a suite of new evaluation metrics and conduct comprehensive experiments with various knowledge editing baselines. We observe that existing knowledge editing methods have difficulty editing relations. We therefore further explore the role that relations in factual triplets play within the transformer. Our results confirm that relation-related knowledge is stored not only in the feed-forward network (FFN) layers but also in the attention layers. This provides experimental support for future relation-based knowledge editing methods.
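To make the entity-centric vs. relation-centric distinction concrete, the following minimal Python sketch contrasts the two edit types on a factual triplet. It is illustrative only: the `Triplet` class, function names, and example facts are our assumptions for exposition, not the paper's code or the RaKE benchmark format.

```python
# Illustrative sketch (not the RaKE implementation): a fact is a triplet
# (subject, relation, object); the two edit types target different slots.
from dataclasses import dataclass, replace


@dataclass(frozen=True)
class Triplet:
    subject: str   # e.g. "Miami"
    relation: str  # e.g. "located_in"
    obj: str       # e.g. "United States"


def entity_edit(t: Triplet, new_obj: str) -> Triplet:
    """Entity-centric edit: keep (subject, relation), replace the object."""
    return replace(t, obj=new_obj)


def relation_edit(t: Triplet, new_relation: str) -> Triplet:
    """Relation-centric edit: keep both entities, replace the relation."""
    return replace(t, relation=new_relation)


fact = Triplet("Miami", "located_in", "United States")
print(entity_edit(fact, "Canada"))       # edits the object entity
print(relation_edit(fact, "capital_of")) # edits the relation itself
```

Most existing KE methods and benchmarks evaluate only the first kind of edit; RaKE asks whether the same methods can also perform the second.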