Title
Causal Perception in Question-Answering Systems
Abstract
Root cause analysis is a common data analysis task. While question-answering systems enable people to easily articulate a why question (e.g., why students in Massachusetts have high ACT Math scores on average) and obtain an answer, these systems often produce questionable causal claims. To investigate how such claims might mislead users, we conducted two crowdsourced experiments to study the impact of showing different information on user perceptions of a question-answering system. We found that in a system that occasionally provided unreasonable responses, showing a scatterplot increased the plausibility of unreasonable causal claims. Also, simply warning participants that correlation is not causation seemed to lead them to accept reasonable causal claims more cautiously. We observed a strong tendency among participants to associate correlation with causation, yet the warning appeared to reduce this tendency. Grounded in these findings, we propose ways to reduce the illusion of causality when using question-answering systems.
Year: 2021
DOI: 10.1145/3411764.3445444
Venue: Conference on Human Factors in Computing Systems
DocType: Conference
Citations: 1
PageRank: 0.35
References: 0
Authors: 5
Name          Order  Citations  PageRank
Po-Ming Law   1      11         3.13
Leo Yu-Ho Lo  2      1          0.35
Alex Endert   3      974        52.18
John Stasko   4      5655       494.01
Huamin Qu     5      2033       115.33