Christopher D. Green wrote:
Turnitin plagiarism detection software seems to have a high
false-positive rate, according to some researchers.
http://www.insidehighered.com/news/2009/03/13/detect
Chris
--
ASU had a trial semester with Turnitin, and my experiences were
similar to those reported in the article. The software had both a
high false-positive rate and a high miss rate.
In addition to the false positives and misattributions noted in
the article, here are some misses: I submitted the manuscripts of
articles of mine that had been published and repeatedly
referenced and Turnitin counted them as legitimate first
submissions. I cut and pasted from classic psychology articles
and they were classified as legitimate. I cut and pasted several
paragraphs from psychology textbooks (in their n-th edition) and
Turnitin catagorized them as legitimate. Next, I challenged my
classes that semester to see what they could sneak by Turnitin.
The assignment was simple: a student got an extra point of
credit for reporting the method they used and whether it
worked. More than half of the students could sneak material by
Turnitin using a wide variety of methods -- text in a foreign
language, text in technical language (chemistry, math), poetry,
textbooks. (The students loved this extra-credit opportunity.)
Finally, I played around with the sequence in which text was
submitted to Turnitin to determine how the program decided which
text was the original. The results were not satisfactory.
I would urge faculty to test this software carefully before
investing money in its services.
---------------------------------------------------------------
Kenneth M. Steele, Ph.D. [email protected]
Professor
Department of Psychology http://www.psych.appstate.edu
Appalachian State University
Boone, NC 28608
USA
---------------------------------------------------------------