Start Making Sense: Cognitive and Affective Confidence Measures for Explanation Generation using Epistemic Planning

Ronald P. A. Petrick
Department of Computer Science, Heriot-Watt University, Edinburgh, Scotland, United Kingdom
R.Petrick@hw.ac.uk

Robin L. Hill
School of Informatics, University of Edinburgh, Edinburgh, Scotland, United Kingdom
R.L.Hill@ed.ac.uk

Abstract

This paper presents an overview of the EPSRC-funded project Start Making Sense, which is investigating explainability and trust maintenance in interactive and autonomous systems. The project brings together experimental research in cognitive science involving cooperative joint action with the practical construction of automated planning tools, applying them to the task of explanation generation. The project's challenges are addressed through three concrete objectives: (i) to study cooperative joint action in humans to identify the emotional, affective, or cognitive factors that are essential for successful human communication; (ii) to enhance epistemic planning techniques with measures derived from these studies for improved human-like explanation generation; and (iii) to deploy and evaluate the resulting system with human participants. Given the range of AI systems expected to be deployed in the future, this project aims to provide insights into the tools needed to ensure such systems are effective.

Introduction

A fundamental problem in the design of autonomous systems is that of action selection: based on the current state of the world, what action should the system take in order to achieve its goals? In the presence of humans, this problem typically becomes more complex: the system may also need to reason about the states, actions, and intentions of those agents. In collaborative environments that involve human communication, it is particularly important to identify, interpret, and understand the multimodal affective signals that humans employ, which are often necessary for the effective and successful achievement of communicative goals.

For instance, consider a tourist on a guided walking tour of a city. After reaching a place where they can see they are almost back to the starting point, the tour guide says "Let's go up that hill," pointing to a large hill. "We can get a good view of the city from there." However, on seeing the tired expression on the tourist's face, the guide adds "Or we can stop at that cafe over there and take a break."

This scenario has two important features. First, it demonstrates that people like to be aware of their context and know what is going on. This is especially true in situations where a decision may not have been anticipated or expected. Here, an explanation may be needed not only to justify a decision but also to establish confidence in that choice: in other words, to trust it. Second, being able to read the situation and adapt to the needs of the moment is important when considering the possible actions that could be taken in a given situation. Here, a decision may need to be made dynamically. These two features capture the idea of dynamic trust maintenance, which will be needed for a broad range of AI systems expected to be deployed in the near future, e.g., automated vehicles, service robots, and interactive voice-based assistants.
This paper presents an overview of the EPSRC-funded project Start Making Sense: Cognitive and Affective Confidence Measures for Explanation Generation Using Epistemic Planning (http://start-making-sense.org/), which is investigating the need for explainability and trust maintenance in interactive and autonomous systems. To do so, the project brings together experimental research in cognitive science involving cooperative joint action with the practical construction of automated planning tools, in particular epistemic planning techniques, applying them to the task of explanation generation. This challenge is being addressed by tackling three key objectives: (i) to study cooperative joint action in humans to identify the emotional, affective, or cognitive factors that are essential for successful human communication; (ii) to enhance epistemic planning techniques with measures derived from the cognitive science studies; and (iii) to deploy and evaluate the effectiveness of the resulting system with human participants in situations that require explanation.

Central to this work is the idea of understanding the affective measures that humans use during activities like instruction giving, plan following, and explanation generation, both when communication is successful and when it fails. The goal is to characterise these measures in a form that enables them to be combined with tools based on epistemic planning, an approach that models the changing beliefs of the planner and other agents during the plan generation process. Affective measures will therefore help guide the planner's generation process, for instance as a special type of heuristic state information, enabling the planner to use this information not only for task-based action selection, but also to plan appropriate actions for communicative goals such as explanation generation, possibly as a result of dynamic changes in the interactive context. As a result, this work is also situated in the area of explainable planning, a subarea