TY - JOUR
T1 - Should Machines Express Sympathy and Empathy? Experiments with a Health Advice Chatbot
AU - Liu, Bingjie
AU - Sundar, S. Shyam
N1 - Publisher Copyright:
© Copyright 2018, Mary Ann Liebert, Inc., publishers.
PY - 2018/10
Y1 - 2018/10
N2 - When we ask a chatbot for advice about a personal problem, should it simply provide informational support and refrain from offering emotional support? Or, should it show sympathy and empathize with our situation? Although expression of caring and understanding is valued in supportive human communications, do we want the same from a chatbot, or do we simply reject it due to its artificiality and uncanniness? To answer this question, we conducted two experiments with a chatbot providing online medical advice about a sensitive personal issue. In Study 1, participants (N = 158) simply read a dialogue between a chatbot and a human user. In Study 2, participants (N = 88) interacted with a real chatbot. We tested the effect of three types of empathic expression - sympathy, cognitive empathy, and affective empathy - on individuals' perceptions of the service and the chatbot. Data reveal that expression of sympathy and empathy is favored over unemotional provision of advice, in support of the Computers are Social Actors (CASA) paradigm. This is particularly true for users who are initially skeptical about machines possessing social cognitive capabilities. Theoretical, methodological, and practical implications are discussed.
AB - When we ask a chatbot for advice about a personal problem, should it simply provide informational support and refrain from offering emotional support? Or, should it show sympathy and empathize with our situation? Although expression of caring and understanding is valued in supportive human communications, do we want the same from a chatbot, or do we simply reject it due to its artificiality and uncanniness? To answer this question, we conducted two experiments with a chatbot providing online medical advice about a sensitive personal issue. In Study 1, participants (N = 158) simply read a dialogue between a chatbot and a human user. In Study 2, participants (N = 88) interacted with a real chatbot. We tested the effect of three types of empathic expression - sympathy, cognitive empathy, and affective empathy - on individuals' perceptions of the service and the chatbot. Data reveal that expression of sympathy and empathy is favored over unemotional provision of advice, in support of the Computers are Social Actors (CASA) paradigm. This is particularly true for users who are initially skeptical about machines possessing social cognitive capabilities. Theoretical, methodological, and practical implications are discussed.
KW - CASA
KW - empathy
KW - human-robot interaction
KW - sympathy
KW - uncanny valley
UR - https://www.scopus.com/pages/publications/85055080779
U2 - 10.1089/cyber.2018.0110
DO - 10.1089/cyber.2018.0110
M3 - Article
C2 - 30334655
AN - SCOPUS:85055080779
SN - 2152-2715
VL - 21
SP - 625
EP - 636
JO - Cyberpsychology, Behavior, and Social Networking
JF - Cyberpsychology, Behavior, and Social Networking
IS - 10
ER -