RateMyProfessors.com

Type of site: Review site
Available in: English
Owner: Viacom
Created by: RateMyProfessors.com, LLC
Website: www.ratemyprofessors.com
Users: about 800,000 visitors per month
Launched: May 1999

RateMyProfessors.com (RMP) is a review site, founded in May 1999 by John Swapceinski, a software engineer from Menlo Park, California, that allows college and university students to assign ratings to professors and campuses of American, Canadian, and United Kingdom institutions.[1] The site was originally launched as TeacherRatings.com and was renamed RateMyProfessors.com in 2001. RMP was acquired in 2005 by Patrick Nagle and William DeSantis.[2] Nagle and DeSantis resold RMP in 2007 to Viacom's mtvU, MTV's college channel.[3] RMP is the largest online destination for professor ratings; the site includes more than 8,000 schools, 1.7 million professors, and over 19 million ratings.[4]

Ratings and reviews

Users who have taken or are currently taking a particular professor's course may post a rating and review of any professor who is already listed on the site. Furthermore, users may create a listing for any individual not already listed. To post a rating, a user must rate the course and/or professor on a 1-5 scale in the following categories: "overall quality" and "level of difficulty". The rater may also share whether they would take the professor again, whether the class was taken for credit, whether attendance is mandatory, whether the textbook is used, and what grade they received in the course, and may include comments of up to 350 characters. Raters may also select up to 3 tags that describe the professor from a list of 20.[5][6]
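As an illustration only (the field names and validation below are hypothetical and do not reflect RMP's actual data model or API), a single rating with the constraints described above could be sketched as follows:

    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class ProfessorRating:
        # Required 1-5 scores
        overall_quality: int          # 1 (poor) to 5 (excellent)
        level_of_difficulty: int      # 1 (easy) to 5 (hard)
        # Optional details the rater may share
        would_take_again: Optional[bool] = None
        taken_for_credit: Optional[bool] = None
        attendance_mandatory: Optional[bool] = None
        textbook_used: Optional[bool] = None
        grade_received: Optional[str] = None
        comment: str = ""             # limited to 350 characters
        tags: List[str] = field(default_factory=list)  # up to 3 of 20 predefined tags

        def __post_init__(self) -> None:
            # Enforce the constraints described in the section above
            if not (1 <= self.overall_quality <= 5) or not (1 <= self.level_of_difficulty <= 5):
                raise ValueError("scores must be on a 1-5 scale")
            if len(self.comment) > 350:
                raise ValueError("comments are limited to 350 characters")
            if len(self.tags) > 3:
                raise ValueError("at most 3 tags may be selected")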

According to the website’s help page, "a professor’s Overall Quality rating should reflect how well a professor teaches the course material, and how helpful he/she is both inside and outside of the classroom".[7] The professor’s Overall Quality rating determines whether his/her name is accompanied by a smiley face (meaning "Good Quality"), a frowny face ("Poor Quality"), or an in-between, expressionless face ("Average Quality").[5]

Validity

RateMyProfessors.com versus formal in-class student evaluations

Using data for 426 instructors at the University of Maine, researchers examined the relationship between RMP indices and formal in-class student evaluations of teaching (SET). The study found that the two primary RMP indices correlated significantly with their respective SET items. First, RMP "overall quality" showed a correlation of r = .68 with the SET item "Overall, how would you rate the instructor?" Second, RMP "ease" showed a correlation of r = .44 with the SET item "How did the work load for this course compare to that of others of equal credit?" Further, RMP "overall quality" (r = .57) and RMP "ease" (r = .51) each correlated with its corresponding SET factor derived from a principal components analysis of all 29 SET items. The researchers concluded: "While these RMP/SET correlations should give pause to those who are inclined to dismiss RMP indices as meaningless, the amount of variance left unexplained in SET criteria limits the utility of RMP."[8]
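As a rough illustration (this calculation is not from the study itself), the share of variance in a SET item explained by the corresponding RMP index is the square of the correlation coefficient: for the strongest reported correlation, r^2 = 0.68^2 ≈ 0.46. In other words, even the best RMP/SET correlation leaves roughly half of the variance in the formal evaluations unexplained, which is the limitation the researchers point to.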

Criticism

Positive correlation between easiness of class and rating of professor

Research on in-class evaluations shows that professor ratings increase when students rate the course as easy.[9] The same relationship has been shown for RMP. In an article in the journal Assessment & Evaluation in Higher Education, Clayson investigated what RMP actually rates and concluded that "students will give higher evaluations to instructors they judge as being easy. There is also a suggestion in these findings that, if students like an instructor (for whatever reason), then the easiness of the class becomes relatively irrelevant."[10] Clayson concluded that "the majority of the evidence indicates that [ratemyprofessors.com] is biassed by a halo effect, and creates what most accurately could be called a 'likeability' scale." Other analyses of RMP class ratings have come to similar conclusions,[11][12][13] and some have concluded that professor attractiveness is also positively correlated with evaluation scores on RMP.[14] Felton et al. evaluated RMP ratings and found that "the hotter and easier professors are, the more likely they'll get rated as a good teacher."[15]

Evaluation bias issues

A frequent criticism of RMP is that there is little reason to think that the ratings accurately reflect the quality of the professors rated.[16][17] Another criticism is that ratings have been shown to reflect gender bias toward the professors evaluated.[18] Furthermore, RMP's rating components ("easiness", "clarity", and "helpfulness") are the only factors taken into consideration, and critics do not regard them as a well-designed evaluation instrument.[19][20] Edward Nuhfer argues that both PickAProf.com and RMP "are transparently obvious in their advocacy that describes a 'good teacher' as an easy grader. Additionally, presenter Phil Abrami ... rated RMP as 'The worst evaluation I've seen' during a panel discussion on student evaluations at the 2005 annual AERA meeting."[21]

Edward Nuhfer has argued, "Pseudo-evaluation damages the credibility of legitimate evaluation and victimizes individuals by irresponsibly publishing comments about them derived from anonymous sources. This is voyeurism passed off as 'evaluation' and examples lie at PickAProf.com and RateMyProfessors.com. Neither site provides evaluation of faculty through criteria that might be valuable to a student seeking a professor who is conducive to their learning, thinking or intellectual growth."[22]

Multiple ratings per person

Single individuals are able to post multiple separate ratings of a single professor on RMP.[23] RMP admits[24] that while it does not allow such multiple ratings from any one IP address, it has no control over raters who use several different computers or who "spoof" IP addresses. There is also no way of knowing whether those who rate a professor's course have actually taken the course in question, making it possible for professors to rate themselves and each other.[25]

Rating relevancy

Critics have stated that a number of the ratings focus on qualities they see as irrelevant to teaching, such as physical appearance.[26] In late June 2018, several academics criticized the website's "hotness" score for contributing to sexism in academia. On June 28, RateMyProfessors responded that while the feature was intended to "reflect a dynamic/exciting teaching style," it was often misused, and the hotness rating was removed immediately.[27]

It is common at universities and colleges for faculty (especially junior faculty) to be called on by their departments to teach courses on topics outside their area(s) of expertise, which can earn them poor ratings on RMP that do not reflect their ability to teach subjects they are better qualified to teach. Although RateMyProfessors lets students identify the course they took with a professor, it combines the ratings for all courses taught by that professor rather than providing separate rating averages for each course.

Permanent vs adjunct faculty

Adjunct faculty are not always readily identifiable or verifiable, as such professors may work at multiple universities, change universities frequently, or maintain employment outside an academic setting.

Data breach

On January 11, 2016, RMP notified its users via email (and with a small notification link on its website) that a decommissioned version of RMP's website had suffered a data breach affecting email addresses, passwords, and registration dates.[28] According to the California Department of Justice website, the security breach occurred six weeks earlier, on or about November 26, 2015.[29]

Website features

Updates to website

RMP regularly modifies its site to reflect student preferences. In late 2011, professors were given the ability to make their Twitter handles available on their profile pages for students to follow. In 2014, RMP debuted a new responsive site design. In 2015, the site introduced custom URLs, allowing professors to create a custom URL for their ratings page.

Professor Notes

After mtvU took over the website, a notes feature was added that allows professors to register with the website (using a ".edu" e-mail address) in order to reply to students' comments. Another option, called "Professors Strike Back", featured videos of professors responding to their ratings on RMP.[30] Additionally, in 2015, the site debuted a new series "Professors Read Their Ratings"[31] in which professors read and react to their RMP ratings. Students may also submit videos to RMP.[32]

School ratings

Students can also comment on and rate their school by visiting their school's RMP school page. School Ratings categories include Academic Reputation, Location, Campus, School Library, Food, Clubs & Activities, Social Events, and Happiness.

Top Lists

RMP annually compiles Top Lists of the Highest Rated Professors and Top Schools in the U.S. based on ratings and comments from students.[33] Along with the release of its 2011-2012 Top Lists, RMP debuted its "Fun Lists" for the first time.

Recognition

In 2008, Time magazine recognized RMP as one of the 50 best websites of the year.

In 2008, student evaluations of professors from RMP accounted for 25% of a school's rating in Forbes' annual "America's Best Colleges" listing; this is no longer the case.[34]

In 2015, the site won two People's Choice Webby Awards after an extensive site overhaul.[35]

References

  1. "About RateMyProfessors.com".
  2. Wired Magazine - 2005
  3. "MTV Networks' mtvU Agrees to Acquire RateMyProfessors.com".
  4. "About RateMyProfessors.com".
  5. http://www.ratemyprofessors.com/AddRating.jsp?tid=1458112
  6. http://www.ratemyprofessors.com/help.jsp#tally
  7. http://www.ratemyprofessors.com/help.jsp
  8. "RateMyProfessors.com versus formal in-class student evaluations of teaching".
  9. Mau, Ronald R., & Opengart, Rose A. (2012). Comparing Ratings: In-Class (Paper) vs. out of Class (Online) Student Evaluations. Higher Education Studies, 2(3), 55-68.
  10. Dennis E. Clayson (2013) What does ratemyprofessors.com actually rate?, Assessment & Evaluation in Higher Education, 39:6, 678-698, DOI: 10.1080/02602938.2013.861384
  11. Legg, Angela & H. Wilson, Janie. (2012). RateMyProfessors.com offers biased evaluations. Assessment & Evaluation in Higher Education. 37. 89-97. 10.1080/02602938.2010.507299.
  12. http://www.apa.org/gradpsych/features/2007/ratings.aspx
  13. Castro, Daniel, and Robert Atkinson. “Why it’s time to disrupt higher education by separating learning from credentialing”. Washington: Information Technology and Innovation Foundation, 2016. Online. Internet. 16 Apr 2018. Available: http://www2.itif.org/2016-disrupting-higher-education.pdf
  14. James Felton, Peter T. Koper, John Mitchell, and Michael Stinson. Attractiveness, easiness and other issues: student evaluations of professors on ratemyprofessors.com. Assessment & Evaluation in Higher Education, 33(1):45–61, 2008
  15. David Epstein, "‘Hotness’ and Quality", Inside Higher Ed, 8 May 2006, accessed 10 May 2008.
  16. Pfeiffer, Sacha (September 20, 2006). "Ratings sites flourish behind a veil of anonymity". Boston Globe Online.
  17. Westhues, Kenneth (December 2006). "Stephen Berman: Scapegoat". UWaterloo.ca.
  18. Huntsberry, William (February 23, 2015). "How We Talk About Our Teachers". WNYC Morning Edition.
  19. Lang, James M. (December 1, 2003). "RateMyBuns.com". Chronicle of Higher Education.
  20. See Fritz Machlup and T. Wilson, cited in Paul Trout, "Deconstructing an Evaluation Form", The Montana Professor, Vol. 8 No. 3, Fall 1998, accessed 7 May 2008.
  21. Edward B. Nuhfer, 2005, "A Fractal Thinker Looks at Student Evaluations", accessed 10 May 2008.
  22. Nuhfer, Edward (2010). "A Fractal Thinker Looks at Student Evaluations" (PDF). Retrieved April 25, 2010.
  23. Gabriela Montell, "The Art of the Bogus Rating", Chronicle of Higher Education, September 27, 2006
  24. Pfeiffer, "Ratings sites flourish behind a veil of anonymity".
  25. Montell, "The Art of the Bogus Rating", Chronicle of Higher Education.
  26. Bates, Laura (13 February 2015). "Female academics face huge sexist bias – no wonder there are so few of them". The Guardian. Retrieved 30 June 2018.
  27. Dalbey, Alex (29 June 2018). "Ratemyprofessors.com Ends Hotness Rating". The Daily Dot. Retrieved 30 June 2018.
  28. "RateMyProfessors.com – Find and rate your professor or campus". www.ratemyprofessors.com. Retrieved 2016-01-12.
  29. https://oag.ca.gov/ecrime/databreach/reports/sb24-59576
  30. Professors Strike Back on mtvU - As Seen on Rate My Professors
  31. http://www.ratemyprofessors.com/blog/video/albion-college-professors-read-their-ratings-part-2/
  32. http://www.ratemyprofessors.com/blog/video/submit-to-ratemyprofessors-com
  33. http://toplists.ratemyprofessors.com/
  34. https://www.forbes.com/sites/cartercoudriet/2017/08/02/top-colleges-2017-the-methodology/
  35. http://www.ratemyprofessors.com/blog/buzzpost/weve-won-two-peoples-voice-webby-awards/

