When, What, and How Should Generative Artificial Intelligence Explain to Users?

  • Soobin Jang
  • Haeyoon Lee
  • Yujin Kim
  • Daeho Lee
  • Jungwoo Shin
  • Jungwoo Nam

Research output: Contribution to journal › Article › peer-review

7 Scopus citations

Abstract

With the commercialization of ChatGPT, generative artificial intelligence (AI) has been applied almost everywhere in our lives. However, even though generative AI has become an everyday technology that anyone can use, most non-expert users do not understand the process behind, or the reasons for, its results, which can lead to misuse stemming from insufficient knowledge and misunderstanding. Therefore, this study investigated users’ preferences for when, what, and how generative AI should explain the process of generating its results and the reasoning behind them, using the conjoint method and mixed logit analysis. The results show that users are most sensitive to the timing of providing eXplainable AI (XAI), and that they want additional information only when they ask for explanations while using generative AI. These findings can inform the XAI design of future generative AI from a user perspective and improve usability.
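The abstract's method, choice-based conjoint analysis estimated with a (mixed) logit model, can be illustrated with a minimal sketch. The following is not the paper's actual analysis: the attributes, design, and "true" preference weights below are hypothetical, and a plain conditional logit stands in for the full mixed logit with random coefficients. It simulates respondents choosing among XAI design alternatives and recovers the preference weights by maximum likelihood.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Hypothetical conjoint design: 500 choice tasks, 3 alternatives each,
# 2 dummy-coded attributes (e.g., "explains only on request",
# "explains the generation process") -- attribute labels are illustrative.
n_tasks, n_alts, n_attrs = 500, 3, 2
X = rng.integers(0, 2, size=(n_tasks, n_alts, n_attrs)).astype(float)

beta_true = np.array([1.2, -0.5])  # assumed "true" preference weights

# Simulate choices from a conditional logit: utility = X @ beta + Gumbel noise
util = X @ beta_true + rng.gumbel(size=(n_tasks, n_alts))
y = util.argmax(axis=1)  # chosen alternative per task

def neg_log_lik(beta):
    v = X @ beta                               # deterministic utilities
    v = v - v.max(axis=1, keepdims=True)       # numerical stability
    p = np.exp(v) / np.exp(v).sum(axis=1, keepdims=True)
    return -np.log(p[np.arange(n_tasks), y]).sum()

res = minimize(neg_log_lik, np.zeros(n_attrs), method="BFGS")
print(res.x)  # estimates should lie near beta_true
```

A mixed logit would extend this by drawing the coefficients from a population distribution (e.g., normal) and maximizing a simulated likelihood, which is what lets it capture preference heterogeneity across users.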

Original language: English
Article number: 102175
Journal: Telematics and Informatics
Volume: 93
DOIs
State: Published - Sep 2024

Keywords

  • Conjoint analysis
  • Conversational user interface
  • Explainable AI
  • Generative AI

