"Explanation" is Not a Technical Term: The Problem of Ambiguity in XAI (2207.00007v1)

Published 27 Jun 2022 in cs.HC and cs.AI

Abstract: There is broad agreement that AI systems, particularly those using Machine Learning (ML), should be able to "explain" their behavior. Unfortunately, there is little agreement as to what constitutes an "explanation." This has caused a disconnect between the explanations that systems produce in service of explainable Artificial Intelligence (XAI) and the explanations that users and other audiences actually need, needs that should be defined by the full spectrum of functional roles, audiences, and capabilities for explanation. In this paper, we explore the features of explanations and how to use those features in evaluating their utility. We focus on the requirements for explanations defined by their functional role, the knowledge states of users who are trying to understand them, and the availability of the information needed to generate them. Further, we discuss the risk of XAI enabling trust in systems without establishing their trustworthiness, and we identify a critical next step for the field of XAI: establishing metrics to guide and ground the utility of system-generated explanations.

Authors (5)
  1. Leilani H. Gilpin (9 papers)
  2. Andrew R. Paley (1 paper)
  3. Mohammed A. Alam (1 paper)
  4. Sarah Spurlock (1 paper)
  5. Kristian J. Hammond (2 papers)
Citations (5)