What should be in an XAI explanation? What IFT reveals by studying RTS is

LPSC 125
Mon, 02/19/2018 - 4:00pm

Jon Dodge
Ph.D. Student, School of EECS, Oregon State University

"What should be in an explanation, and what should it look like?" This is
a fundamental question to answer in order for Explainable Artificial
Intelligence (XAI) to gain the trust of human assessors. To this end, we
conducted a pair of studies investigating generation, content, and form of
explanations in the Real-Time Strategy (RTS) domain, specifically StarCraft
II. First, we observed the foraging patterns and speech of expert explainers
(shoutcasters) as they provided explanations in real time. Second, we used a lab
study to examine how participants investigated agent behavior in the same
domain, but without the real-time constraint. By conducting this pair of
studies, we are able to study both (1) explanations supplied by experts and
(2) explanations demanded by assessors. Throughout our studies, we adopted an
Information Foraging Theory (IFT) perspective, which allows us to generalize
our results. In this talk, we present what these results tell us about how to
explain AI systems.

