AI for Explaining Decisions in Multi-Agent Environments (1910.04404v2)

Published 10 Oct 2019 in cs.AI

Abstract: Explanation is necessary for humans to understand and accept decisions made by an AI system when the system's goal is known. It is even more important when the AI system makes decisions in multi-agent environments, where the human does not know the system's goals, since they may depend on other agents' preferences. In such situations, explanations should aim to increase user satisfaction, taking into account the system's decision, the user's and the other agents' preferences, the environment settings, and properties such as fairness, envy and privacy. Generating explanations that will increase user satisfaction is very challenging; to this end, we propose a new research direction: xMASE. We then review the state of the art and discuss research directions towards efficient methodologies and algorithms for generating explanations that will increase users' satisfaction with an AI system's decisions in multi-agent environments.
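The abstract appeals to properties such as fairness and envy as ingredients of a satisfying explanation. As a purely illustrative sketch (not code from the paper), the snippet below uses the standard notion of envy in resource allocation: agent i envies agent j if i values j's bundle strictly more than its own, and an allocation with no such pairs is envy-free, a property an explanation could cite to justify a decision. The function names and example data are hypothetical.

```python
def envy_pairs(allocation, valuations):
    """Return (i, j) pairs where agent i envies agent j.

    allocation: dict mapping agent -> set of items assigned to that agent
    valuations: dict mapping agent -> dict of item -> that agent's value for it
    """
    def bundle_value(agent, bundle):
        # Value agent assigns to a bundle of items (additive valuations assumed).
        return sum(valuations[agent].get(item, 0) for item in bundle)

    pairs = []
    for i in allocation:
        own = bundle_value(i, allocation[i])
        for j in allocation:
            # i envies j if i values j's bundle more than its own.
            if i != j and bundle_value(i, allocation[j]) > own:
                pairs.append((i, j))
    return pairs


# Hypothetical example: two agents, three items.
allocation = {"alice": {"a", "b"}, "bob": {"c"}}
valuations = {
    "alice": {"a": 3, "b": 2, "c": 4},
    "bob": {"a": 1, "b": 1, "c": 5},
}
print(envy_pairs(allocation, valuations))  # [] -> this allocation is envy-free
```

An explanation generator in the spirit of xMASE might combine such property checks with the users' stated preferences to select which justification is most likely to increase satisfaction.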

Authors (10)
  1. Sarit Kraus (54 papers)
  2. Amos Azaria (33 papers)
  3. Jelena Fiosina (3 papers)
  4. Maike Greve (1 paper)
  5. Noam Hazon (17 papers)
  6. Lutz Kolbe (1 paper)
  7. Tim-Benjamin Lembcke (1 paper)
  8. Jörg P. Müller (12 papers)
  9. Sören Schleibaum (3 papers)
  10. Mark Vollrath (2 papers)
Citations (39)
