Trust and Prior Experience in Human-Robot Interaction

Tracy L. Sanders 1,2, Keith MacArthur 1, William Volante 1, Gabriella Hancock 3, Thomas MacGillivray 1, William Shugars 1, P. A. Hancock 1

1 University of Central Florida, Psychology Department; 2 MITRE Corporation; 3 California State University, Long Beach

This experiment explored the influence of users’ experience (prior interaction) with robots on their attitudes and trust toward robotic agents. Specifically, we hypothesized that prior experience would lead to 1) higher trust scores after viewing a robot complete a task, 2) smaller differences in trust scores when comparing a human and a robot completing the same task, and 3) more positive general attitudes toward robots. These hypotheses were supported, although not all differences reached statistical significance. These findings confirm that prior experience plays an important role in both user trust and general attitude in human-robot interactions.

INTRODUCTION

A robot has been defined as “a machine capable of carrying out a complex series of actions automatically, especially one programmable by a computer” (Robot, 2016), particularly when hosted in a three-dimensional body (Shinozawa, Naya, Yamato, & Kogure, 2005). The criteria set forth in this definition allow a broad range of machines to be considered robots. Robotic agents are extending the range of human capabilities in a wide variety of domains. From industrial settings, where they are designed to alleviate the strain on workers, to simple household gadgets designed to assist users in their pursuit of a hands-off lifestyle, robots can assume many roles and fulfill a myriad of needs. However, despite the far-reaching capabilities of robotic systems, many users may harbor adverse feelings, such as dislike or distrust of these robots, based on their preconceptions. Such attitudes can subsequently lead to disuse of these valuable tools (Parasuraman & Riley, 1997).
Preconceptions can be based on, and shaped by, not only personal experience, but also popular culture, media depictions, and science fiction (Hancock, Billings, & Schaefer, 2011; Schaefer, Billings, & Hancock, 2012). These non-experiential influences can lead to erroneous perceptions about robots’ capabilities and motivations. Here, we examine the influence of personal experience with robotics on trust and related attitudes toward robots.

Previous studies have shown that users’ prior experiences can influence their comfort with a robotic system; specifically, more experienced individuals tend to be more comfortable with robots (Takayama & Pantofaru, 2009). Additionally, one’s comfort level with a robot influences the amount of trust one places in that robot, with higher comfort leading to higher trust (Sanders, Oleson, Billings, Chen, & Hancock, 2011). As research has shown that experience can increase comfort, and that comfort can bolster trust, we seek to clarify, expand, and quantify the relationship between user experience and trust during human-robot interaction (HRI).

To better place the factor of user experience in its wider context, we must first briefly address the additional factors that contribute to HRI. For example, willingness to interact with or use a robot, a hallmark of trust in the system, may depend upon users’ previous use. Such interdependency creates an interesting chicken-or-the-egg paradox between experience and the choice to interact with a robot. We therefore first note the unidirectional and necessary relationship between the two constructs, emphasizing that experience must derive from use. However, their reciprocal relationship has also been evaluated: higher levels of experience with robots lead, in turn, to more use (Coeckelbergh, Pop, Simut, Peca, Pintea, David, & Vanderborght, 2016).
Not only does experience lead to increased use, but higher levels of experience also engender more positive attitudes toward robots (Tsui, Desai, Yanco, Cramer, & Kemper, 2011). This relationship may act to polarize the population: individuals who do not use robotic systems fail to gain experience, establishing and reinforcing negative attitudes toward robots, while those who do engage in HRI gain more experience, producing more positive attitudes and even further use. The catalysts that break this cycle are the numerous other factors that influence whether a user will choose to engage in HRI. Experience in HRI is related to these other factors, including comfort and trust, which also need to be considered when examining HRI use.

Trust in HRI is a multifaceted construct that influences a user’s behavior. While too little trust can lead to underuse, excessive trust can foster overreliance (Parasuraman & Riley, 1997). In a recent meta-analysis examining the antecedents of trust, Hancock and colleagues identified three major factors influencing trust in HRI (Hancock, Billings, Schaefer, Chen, De Visser, & Parasuraman, 2011): 1) those related to the human user (e.g., personality, experience with robots), 2) those related to the robot (e.g., form and behavior), and 3) those related to the environment (e.g., task type and culture). To date, of these considerations, robot-centric factors have proven to be those most closely related to trust. Future research may, however, establish stronger links between trust and individual differences in the human user. As Hancock and colleagues (2011) noted, there exists a dearth of quantitative studies investigating human-related antecedents to trust in HRI. To further develop this area of research, Sanders (2016) recently explored the role that individual differences

Copyright 2017 by Human Factors and Ergonomics Society.
DOI 10.1177/1541931213601934
Proceedings of the Human Factors and Ergonomics Society 2017 Annual Meeting