Engage AI and Child in Explanatory Dialogue on Commonsense
Reasoning
Erqian Xu∗
exu6@u.rochester.edu
Warner Graduate School of Education
and Human Development, University
of Rochester
Rochester, New York, USA
Hecong Wang∗
hwang99@ur.rochester.edu
Department of Computer Science,
University of Rochester
Rochester, New York, USA
Zhen Bai
zbai@cs.rochester.edu
Department of Computer Science,
University of Rochester
Rochester, New York, USA
Figure 1: An idealized AI-child explanatory dialogue about the story of Little Red Riding Hood. The AI mistakenly inferred the answer to be warmth because it knew that cloth can be used to keep warm. The child tried to make the AI “realize” its mistake by asking about the consequence of the wolf not wearing the cloth. In the end, the child pointed out that the wolf wore the cloth to disguise itself.
ABSTRACT
Human-level commonsense reasoning capability is vital for human-
AI interaction, enabling AI to understand, anticipate, and respond
to humans’ thoughts, feelings, and behaviors. Despite the recent
advancements in AI commonsense reasoning due to generative
language models, a young child is often more rational than state-
of-the-art AIs in terms of commonsense reasoning. The fields of
cognitive science, child development, and explainable AI have long
recognized the importance of explanations for sharing knowledge
and resolving contradictions. We, therefore, raise the question: can
∗Both authors contributed equally to this research.
Permission to make digital or hard copies of part or all of this work for personal or
classroom use is granted without fee provided that copies are not made or distributed
for profit or commercial advantage and that copies bear this notice and the full citation
on the first page. Copyrights for third-party components of this work must be honored.
For all other uses, contact the owner/author(s).
CHI EA ’23, April 23–28, 2023, Hamburg, Germany
© 2023 Copyright held by the owner/author(s).
ACM ISBN 978-1-4503-9422-2/23/04.
https://doi.org/10.1145/3544549.3585699
AIs leverage the power of explanations to learn human-level commonsense
reasoning? More specifically, can explanatory dialogue with
children help AIs to develop commonsense reasoning capabilities? As
a first step in this line of research, we aim to engage children in
explanatory dialogue with AIs during story reading. We present
our novel explanatory dialogue interface based on a state-of-the-art
multi-step commonsense reasoning engine and discuss our upcom-
ing pilot study.
CCS CONCEPTS
• Human-centered computing → Collaborative interaction.
KEYWORDS
Commonsense Reasoning, Explainable AI, Human-AI Collaboration
ACM Reference Format:
Erqian Xu, Hecong Wang, and Zhen Bai. 2023. Engage AI and Child in
Explanatory Dialogue on Commonsense Reasoning. In Extended Abstracts
of the 2023 CHI Conference on Human Factors in Computing Systems (CHI
EA ’23), April 23–28, 2023, Hamburg, Germany. ACM, New York, NY, USA,
8 pages. https://doi.org/10.1145/3544549.3585699