
Model Based Explanations of Concept Drift

Fabian Hinder, Valerie Vaquet, Johannes Brinkrolf, Barbara Hammer
Mar 2023
Abstract
The notion of concept drift refers to the phenomenon that the distribution generating the observed data changes over time. If drift is present, machine learning models can become inaccurate and need adjustment. While there do exist methods to detect concept drift or to adjust models in the presence of observed drift, the question of explaining drift, i.e., describing the potentially complex and high-dimensional change of distribution in a human-understandable fashion, has hardly been considered so far. This problem is of importance since it enables an inspection of the most prominent characteristics of how and where drift manifests itself. Hence, it enables human understanding of the change and it increases acceptance of life-long learning models. In this paper, we present a novel technology characterizing concept drift in terms of the characteristic change of spatial features based on various explanation techniques. To do so, we propose a methodology to reduce the explanation of concept drift to an explanation of models that are trained in a suitable way, extracting relevant information regarding the drift. This way, a large variety of explanation schemes is available, and a suitable method can be selected for the problem of drift explanation at hand. We outline the potential of this approach and demonstrate its usefulness in several examples.
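The reduction described in the abstract can be illustrated with a short sketch: if a classifier can be trained to distinguish samples drawn before a suspected change point from samples drawn after it, then any standard model explanation technique applied to that classifier describes where and how the drift manifests. The sketch below is an illustrative assumption rather than the authors' implementation; the random forest, the permutation-importance explainer, the window construction, and the helper name explain_drift are stand-ins for whichever model and explanation scheme fit the problem at hand.

```python
# Minimal sketch: explain drift by explaining a model trained to separate
# two time windows (assumed setup, not the paper's reference implementation).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

def explain_drift(X_before, X_after, random_state=0):
    """Train a model to distinguish 'before' from 'after' samples and
    report which features the model relies on, i.e. where drift manifests."""
    X = np.vstack([X_before, X_after])
    y = np.concatenate([np.zeros(len(X_before)), np.ones(len(X_after))])
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.3, random_state=random_state, stratify=y)

    clf = RandomForestClassifier(random_state=random_state).fit(X_train, y_train)
    # Test accuracy clearly above chance indicates that drift is present at all.
    acc = clf.score(X_test, y_test)

    # Any model explanation scheme could be plugged in here; permutation
    # importance is used only as a simple, widely available example.
    result = permutation_importance(clf, X_test, y_test, n_repeats=10,
                                    random_state=random_state)
    return acc, result.importances_mean

# Synthetic example: drift affects only the first feature (mean shift).
rng = np.random.default_rng(0)
X_before = rng.normal(0.0, 1.0, size=(500, 5))
X_after = rng.normal(0.0, 1.0, size=(500, 5))
X_after[:, 0] += 2.0
acc, importances = explain_drift(X_before, X_after)
print(f"discrimination accuracy: {acc:.2f}")
print("feature importances:", np.round(importances, 3))
```

In this toy run, the importance assigned to the first feature should dominate, reflecting that the distributional change is concentrated there; swapping in a different explainer (e.g., feature-attribution methods) changes the form of the explanation but not the underlying reduction.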