Counterfactual Explanations of Concept Drift

Published 23 Jun 2020 in cs.LG, stat.ME, and stat.ML | (2006.12822v1)

Abstract: Concept drift refers to the phenomenon that the distribution underlying the observed data changes over time; as a consequence, machine learning models may become inaccurate and require adjustment. While methods exist to detect concept drift or to adjust models in the presence of observed drift, the question of explaining drift has hardly been considered so far. This problem is important because it enables inspection of the most prominent features in which drift manifests itself; hence it supports human understanding of why change is necessary and increases acceptance of life-long learning models. In this paper we present a novel method that characterizes concept drift in terms of the characteristic change of spatial features, represented by typical examples and based on counterfactual explanations. We establish a formal definition of this problem, derive an efficient algorithmic solution based on counterfactual explanations, and demonstrate its usefulness in several examples.
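The core idea of the abstract can be illustrated with a toy sketch: take a typical example from the pre-drift data window and compute a counterfactual, i.e. the smallest change that makes a drift classifier assign it to the post-drift window; the per-feature difference then points to where drift manifests. The snippet below is a hedged illustration only, not the paper's algorithm: the synthetic data, the nearest-centroid drift classifier, and the bisection line-search are all stand-in assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two synthetic data windows: before and after drift (assumed setup).
# Drift is injected only in feature 0.
old = rng.normal([0.0, 0.0], 1.0, size=(500, 2))
new = rng.normal([3.0, 0.0], 1.0, size=(500, 2))

# Nearest-centroid "drift classifier": predicts which window a point
# belongs to (a stand-in for the paper's drift-detection model).
c_old, c_new = old.mean(axis=0), new.mean(axis=0)

def predicts_new(x):
    return np.linalg.norm(x - c_new) < np.linalg.norm(x - c_old)

# Typical example from the old window, and its counterfactual: the
# smallest step along the line toward the new centroid that flips
# the classifier (a simple bisection, not the paper's optimization).
x0 = c_old.copy()
direction = c_new - c_old
lo, hi = 0.0, 1.0
for _ in range(50):  # bisect on the step size
    mid = (lo + hi) / 2
    if predicts_new(x0 + mid * direction):
        hi = mid
    else:
        lo = mid
x_cf = x0 + hi * direction

# The feature-wise change highlights where the drift manifests:
# feature 0 carries (almost) all of it in this toy setup.
delta = x_cf - x0
print("counterfactual change per feature:", np.round(delta, 2))
```

Here the explanation is the vector `delta`: a practitioner reading it learns that only the first feature had to change for the example to look post-drift, which is exactly the kind of feature-level drift characterization the abstract describes.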

Citations (8)


Authors (2)
