Les Perelman

Born: Leslie Cooper Perelman, Los Angeles, California, U.S.A.
Education: University of California, Berkeley; UMass Amherst
Occupation: Educator
Years active: 1980–present
Known for: Criticism of standardized testing

Les Perelman is a research affiliate at the Massachusetts Institute of Technology.[1] Perelman taught writing and composition at MIT, where he served as Director of Writing Across the Curriculum and an Associate Dean of Undergraduate Education.[2] He was an executive committee member of the Conference on College Composition and Communication[3] and Co-Chair of its Committee on Assessment.[4]

Perelman is known as a critic of automated essay scoring (AES),[5] and was instrumental in the College Board's decision to scrap the Writing Section of the SAT.[6]

Teaching

Perelman taught in and directed writing programs at Tulane University and the University of Southern California. At MIT, he taught writing and composition and served as the director of Writing Across the Curriculum and an Associate Dean in the Office of Undergraduate Education.[2]

Criticism of the SAT Writing Section

In a 2005 study of sample essays and graded essays provided by the College Board as scoring references for the writing portion of the SAT, Perelman found a strong correlation between an essay's length and the score it received. He also noted that essays were not penalized for factual inaccuracies.[7]

In 2013, Perelman met with David Coleman, the incoming president of the College Board, and an outcome of that conversation was Coleman's decision to abolish the mandatory SAT Writing Section.[8]

Criticism of automated scoring

In 2012, Perelman demonstrated that long, pretentious, incoherent essays could receive higher scores from the ETS scoring engine e-Rater than well-written essays.[9]

In 2014, Perelman collaborated with students at MIT and Harvard to develop BABEL, the "Basic Automatic B.S. Essay Language" Generator. The nonsense essays generated by BABEL reportedly receive high scores from AES systems. Automated graders, Perelman argues, "cannot read meaning, and they cannot check facts. More to the point, they cannot tell gibberish from lucid writing."[10] Perelman's work is cited by the NCTE in its Position Statement on Machine Scoring, which expresses similar concerns about the limitations of AES:

Computer scoring systems can be "gamed" because they are poor at working with human language, further weakening the validity of their assessments and separating students not on the basis of writing ability but on whether they know and can use machine-tricking strategies.[11]

References

  1. "People Directory". Massachusetts Institute of Technology. Retrieved June 11, 2015.
  2. 1 2 "iMOAT". Massachusetts Institute of Technology. Retrieved June 11, 2015.
  3. "2015 CCCC Officers and Executive Committee". National Council of Teachers of English. Retrieved June 11, 2015.
  4. "Committee on Assessment (November 2016)". National Council of Teachers of English. Retrieved July 29, 2016.
  5. "Construct Validity, Length, Score, and Time in Holistically Graded Writing Assessments: The Case against Automated Essay Scoring (AES)" (PDF). WAC Clearinghouse. Retrieved June 14, 2015.
  6. "The man who killed the SAT essay". Boston Globe. Retrieved June 14, 2015.
  7. Winerip, Michael (May 4, 2005). "SAT Essay Test Rewards Length and Ignores Errors". The New York Times. Retrieved June 11, 2015.
  8. Balf, Todd. "The Story Behind the SAT Overhaul". The New York Times Magazine. Retrieved April 5, 2015.
  9. Winerip, Michael (April 22, 2012). "Facing a Robo-Grader? Just Keep Obfuscating Mellifluously". The New York Times. Retrieved April 5, 2013.
  10. Kolowich, Steve (April 28, 2014). "Writing Instructor, Skeptical of Automated Grading, Pits Machine vs. Machine". The Chronicle of Higher Education. Retrieved June 11, 2015.
  11. "NCTE Position Statement on Machine Scoring". National Council of Teachers of English. Retrieved June 11, 2015.