“Try something else!” - When users change their discursive behavior in human-robot interaction

Manja Lohse, Katharina J. Rohlfing, Britta Wrede, and Gerhard Sagerer, Member, IEEE

Abstract— This paper investigates the influence of feedback provided by an autonomous robot (BIRON) on users’ discursive behavior. A user study is described in which users show objects to the robot. The results of the experiment indicate that the robot’s verbal feedback utterances cause the humans to adapt their own way of speaking. The changes in users’ verbal behavior are due to their beliefs about the robot’s knowledge and abilities. In this paper, these beliefs are identified and grouped. Moreover, the data imply variations in user behavior regarding gestures. Unlike for speech, the robot was not able to give feedback with gestures. Due to this lack of feedback, users did not seem to have a consistent mental representation of the robot’s ability to recognize gestures. As a result, changes between different gestures are interpreted as unconscious variations accompanying speech.

Manuscript received September 14, 2007. K. Rohlfing’s work was supported in part by the Volkswagen Foundation. M. Lohse, K. J. Rohlfing, and B. Wrede are with the Applied Computer Science Group, Bielefeld University, Bielefeld, Germany (phone: +49 521 1062953; fax: +49 521 1062992; e-mail: mlohse; rohlfing; bwrede@techfak.uni-bielefeld.de). G. Sagerer is head of the Applied Computer Science Group, Bielefeld University, Bielefeld, Germany (e-mail: sagerer@techfak.uni-bielefeld.de).

I. INTRODUCTION

Whether interacting with a colleague from another department, a child with distinct cognitive and linguistic skills, or a foreigner with a different cultural background, humans try to adapt to their communication partners. In everyday interaction this process seems to happen automatically. Humans send certain cues and give verbal and nonverbal feedback. With the help of these cues, the interaction partners form mental representations of each other. They build up beliefs about each other’s abilities and knowledge. By doing so, humans are able to adapt to others, which increases the likelihood that the interaction will be successful. The same is true when the interaction partner is a robot. Especially with the development of so-called social robots, it is increasingly important to know more about people’s beliefs about the robot in order to design a successful interaction. Users’ beliefs can be studied by analyzing their behavior in a certain interaction situation. Knowing how users behave, moreover, helps in designing dialogs.

In this paper, a study with the service robot BIRON (BIelefeld RObot companioN; see Fig. 1) is presented which aims at shedding some light on user behavior in the situation of teaching objects to a robot.

Fig. 1. Person interacting with BIRON

II. RELATED WORK

Since human-robot interaction is a rather young research field based on various disciplines, many approaches and scientific findings from other areas have to be taken into account. This is also true for related work on human discursive behavior, which has been a focus of human-computer interaction research. [1] report research on linguistic adaptation during spoken and multimodal human-computer interaction in situations in which errors occur. Their work focuses on modalities and intonation. The researchers conclude that users adapt to the system in three different ways: increasing linguistic contrast (alternation of input mode and lexical content); increasing hyperarticulation; and suppressing linguistic variability (amplitude and frequency) when hyperarticulating. This work focuses on linguistic phenomena. [2, 3] also concentrate on error recovery in spoken dialog systems. In contrast to [1], the ten error recovery strategies these authors propose include non-linguistic phenomena. Their research focuses on the feedback of the system, whereas the work presented here concentrates on users’ utterances and gestures in reaction to the robot. Feedback as such plays a major role in HRI.

According to [4], the term feedback describes “linguistic mechanisms, which enable the participants in a communication process to unobtrusively exchange information about four basic communication functions: contact, perception, understanding and attitudinal reactions”. Thus, it is the basis for grounding. [5] define the common ground on the level of speech and context. To establish a common ground