Ergonomics & Human Factors 2021, Eds R Charles & D Golightly, CIEHF

Empathy consideration in the design of natural language interfaces for future vehicles

Ben Anyasodo, Gary Burnett
Human Factors Research Group, Faculty of Engineering, University of Nottingham, UK

ABSTRACT

While the future of transportation paints a picture of seamless understanding of passenger goals by the vehicle, it also exposes a gap in understanding what human-machine engagement must become for a more natural in-car experience. Natural language interfaces make a human-like interaction possible. This raises the question: is there a way that machines can be more “empathic”? If so, would this make for a more natural human-machine interaction? And how can we design usable natural language interfaces (specifically speech systems) to achieve this, especially given that humans are emotional beings while machines are not? This paper explores the concept of empathy for speech systems by investigating the human-human empathy model and proposing design considerations. To achieve this, we interviewed professionals who must show empathy as part of their work. Seven themes were generated from the responses, forming a usable framework for human-machine empathy that could be applied to the design of natural language speech systems.

KEYWORDS

Empathy, Driving, Natural language, Speech system, HCI limitations, dialogue management, future vehicles

INTRODUCTION

Natural language input is increasingly applied in vehicle speech systems. However, system shortcomings (e.g., task complexity, naturalness of speech synthesis, system responsiveness/feedback, context awareness given differing user/driving conditions, and dialogue management) have continued to generate human-computer interaction (HCI) limitations (Weng et al., 2016).
Such HCI limitations degrade the user experience and any effect of human-like interaction on the user, regardless of the intelligence shown by natural language speech systems, e.g., giving correct answers, predicting input, machine learning (Jenness et al., 2016). In the in-vehicle context this is compounded further because most interaction with these speech systems happens as a secondary task, while driving is the primary task. Any secondary task must therefore not increase cognitive load, as that compromises driving safety. However, it has been shown that these HCI limitations increase users' cognitive load (Becker et al., 2006). According to Shneiderman (2000), the interaction becomes less empathic as cognitive demand on the user increases. In future vehicles, although users might be engaged in something other than driving as a primary task, reducing safety concerns, enhancing trust and improving user experience remain core priorities (Rödel et al., 2014). This study focuses on understanding elements of human-human interaction that may inform a natural and empathic human-machine interaction in natural language speech interfaces/systems, as a way of improving the overall user experience and meeting system goals. The central question considered here is: “how