Rate My Expectations: How online evaluations of professors impact students' perceived control

Neneh Kowai-Bell a, Rosanna E. Guadagno b,*, Tannah Little a, Najean Preiss a, Rachel Hensley a

a Department of Psychology, University of Houston, Clear Lake, United States
b Department of Psychology, University of Alabama, United States

Article history: Available online 13 May 2011

Keywords: Ratemyprofessors.com; Internet; Expectations; Student evaluations of teaching; Perceived control

Abstract

Ratemyprofessors.com (RMP) is a website on which students can post their ratings of professors. The site is widely used; however, little research has examined the effect RMP content has on students' expectations of and approach to the reviewed class. Two studies examined the hypothesis that Ratemyprofessors.com can shape students' impressions of professors and directly affect their perceptions of control over the course outcome and their attitudes toward taking the course. In Study 1, participants recalled an experience of visiting Ratemyprofessors.com and then taking a class from the professor reviewed. Most participants reported a positive impression, an increase in perceived control, and a subsequent positive classroom experience that exceeded expectations. In Study 2, participants read and evaluated either a set of negative or a set of positive comments about a given professor. Results indicated that positive comments had a more positive effect on perceived control, grade expectancy, and attitude toward the class than did negative comments. Thus, these results suggest that content on RMP can impact students' expectations of and approach to a potential class.

© 2011 Elsevier Ltd. All rights reserved.

1. Introduction

A growing literature has established that students are frequenting Internet sites in search of information about potential professors (Davison & Price, 2009; Kindred & Mohammed, 2005).
Specifically, college students are increasingly relying on Internet professor-rating sites such as RateMyProfessors.com (RMP) for information about professors and courses. Such sites allow college students to evaluate instructors anonymously. Once entered on the website, their evaluations are posted publicly, thereby sharing their input with other students. RMP is widely used (Edwards, Edwards, Qingmei, & Wahl, 2007); however, little is known about the consequences of its use. The current study investigates the role of RMP content in shaping students' expectations of and approach to the reviewed class. In two studies, we examine the relationship between RMP reviews and student motivational factors, using a multimethod approach (Campbell & Fiske, 1959) involving both descriptive and experimental methodologies.

1.1. Prior investigations of RateMyProfessors.com

The RMP site allows for open-ended comments and provides 5-point scales on which students rate the professor's helpfulness, clarity, and "easiness," and also report their level of interest in the course material. Overall quality is computed by combining the helpfulness and clarity ratings. Contrary to what some may assume, most of the RMP ratings of overall quality are positive (Kindred & Mohammed, 2005; Silva et al., 2008; Timmerman, 2008). Davison and Price (2009) scrutinized the rating categories provided on RMP: helpfulness, clarity, and easiness. They argued that the RMP site content missed major relevant categories: amount learned, and instructor interest and knowledge. There is some dissension among researchers about the overall utility of content found on the RMP website, but the majority of the research suggests that constructive information can be found in the open-ended comments.
Kindred and Mohammed (2005) examined the open-ended comments for 626 professors from a variety of universities and found that students volunteered relevant information regarding instructor competence in nearly half of the comments (42%). Furthermore, Silva et al. (2008) found that the most frequently mentioned comments fell into a category they labeled "instructor characteristics." This category comprised characteristics such as organization, clarity, enthusiasm, respect, fairness, and rapport. However, in line with Davison and Price's (2009) concern, they did find student development to be

* Corresponding author. Address: Department of Psychology, University of Alabama, P.O. Box 870348, Tuscaloosa, AL 35487-0348, United States. Tel.: +1 205 348 7803. E-mail address: Rosanna@ua.edu (R.E. Guadagno).

Computers in Human Behavior 27 (2011) 1862–1867. doi:10.1016/j.chb.2011.04.009