March 31, 2009
In a recent study, researchers Kathleen Gillis and Susan Lang examined the effectiveness of two of the most popular anti-plagiarism programs, Turnitin and SafeAssign.
Because a quick mouse click can make plagiarism an easy-out option for students on deadline, companies have created software programs that claim to help instructors spot dishonestly borrowed prose in assignments.
However, a recent study of two programs by Texas Tech researchers found that two electronic eyes don’t have the 20/20 vision they claim for seeing “borrowed” copy.
As part of the Provost’s Integrity Matters initiative, Kathleen Gillis, director of the University Writing Center, and Susan M. Lang, director of first-year composition, headed a group to look into the pros and cons of two plagiarism detection services – Turnitin and Blackboard’s service, SafeAssign.
Their findings led the team to weigh seriously the benefits and liabilities of each program.
“Each program is capable of finding words, phrases and paragraphs that match words, phrases and paragraphs in their respective databases,” Gillis said. “However, simple matching techniques neither prove nor disprove plagiarism. Identifying matching sets of words is not a good way to identify plagiarized texts. Therefore, neither program seems ‘best’ at what it promises to do.”
To begin, Lang said researchers used Turnitin and SafeAssign – called SafeAssignment at the time – to check a randomly selected batch of 200 papers on similar topics from the first-year writing program’s online database of student work. Then, the team performed a more thorough analysis with a smaller sample of these papers to see what each service flagged. Finally, they repeated this process with another 200 papers.
Sometimes one program would detect potentially unoriginal material that the other would overlook. Poor paraphrasing, incorrect citation mechanics or identical wording frequently created false positives, they said.
While Turnitin marked more instances of potentially unoriginal material than SafeAssign, Lang said Turnitin didn’t necessarily find the correct source for the material, or cited only one source when a Google search of a known plagiarized phrase might turn up eight possibilities. That could make it difficult for a professor to prove plagiarism. Turnitin also tended to flag common phrases, such as “a recent study by” or “global warming.”
“Take the phrase ‘while many scholars have examined,’” Gillis said. “If a student paper contained this phrase and was submitted to Turnitin or SafeAssign, both databases would tag the student’s paper as plagiarized because the chances are high that such a phrase already exists in each database. Our research revealed that Turnitin tends to flag more of these sorts of phrases. SafeAssign tends to look for longer strings of phrases.”
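The word-string matching Gillis describes can be illustrated with a short sketch. This is illustrative only; both products use proprietary algorithms, and the function name, n-gram length and sample texts here are assumptions, not details from the study:

```python
# Illustrative sketch of database phrase matching (NOT either product's
# actual algorithm): flag every n-word sequence in a paper that also
# appears in any document in a reference database.
def shared_ngrams(text: str, database: list[str], n: int = 4) -> set[str]:
    """Return each n-word sequence in `text` that also appears in any
    database document -- whether plagiarized or merely commonplace."""
    def ngrams(s: str) -> set[str]:
        words = s.lower().split()
        return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

    text_grams = ngrams(text)
    hits: set[str] = set()
    for doc in database:
        hits |= text_grams & ngrams(doc)
    return hits

# Hypothetical database document and student paper:
db = ["While many scholars have examined this question before, few agree."]
paper = "While many scholars have examined climate data, our approach differs."
print(shared_ngrams(paper, db))  # flags stock phrases like "while many scholars have"
```

As the match shows, a stock academic phrase is flagged even though the student paper is original, which is exactly the false-positive behavior the researchers observed.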
Both programs’ marketing includes claims of checking for originality. But Lang said originality doesn’t necessarily mean good writing. Neither product seemed to take into account the audience, purpose, grammar, vocabulary or tone, which she deemed just as important to the writing process.
Lang also said she was irked by both programs’ color-coded originality ratings. Both programs use green when they flag 25 percent or less of a paper’s content as potentially unoriginal work. If the programs flag 26 to 50 percent of a paper’s content, the paper falls to yellow; orange and red signify papers flagged at 51 to 75 percent and 76 to 100 percent, respectively. SafeAssign even adds a white category for papers in which only up to 10 percent passes as original material.
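The banding described above amounts to a simple threshold function. A minimal sketch, assuming the cutoffs reported in the article (the function name is illustrative and not part of either product):

```python
def originality_band(flagged_percent: float) -> str:
    """Map the share of a paper flagged as potentially unoriginal
    to the color band the article describes."""
    if flagged_percent <= 25:
        return "green"
    elif flagged_percent <= 50:
        return "yellow"
    elif flagged_percent <= 75:
        return "orange"
    else:
        return "red"

# Lang's scenario: 500 borrowed words in a 2,000-word essay is 25 percent.
print(originality_band(500 / 2000 * 100))  # prints "green"
```

This also shows why Lang’s 500-words-in-2,000 example still earns a green rating: 25 percent sits exactly at the top of the green band.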
Lang likened the color-coding to the Homeland Security Terror Threat model, saying it seemed to encourage users to fear plagiarism.
“I could take 500 words from somewhere, drop that into a 2,000-word essay and still get a green pass with this,” Lang said. “What draws institutes of higher education to use these types of programs are the promises they make. But logically, the more papers that are put in the systems, the more chances you have at getting a match. So, you may get nothing back but five or eight different student papers with the same phrases. Thus, you just spent a lot of money to find out that students across the country write using similar words about similar topics. Big surprise there.”
Both Lang and Gillis say neither program substitutes for actually reading the students’ papers. Lang said many instructors can catch suspected plagiarism by hearing the change in tone and voice in the papers and then plugging suspect phrases into Google for free.
“In the end, students are really cheating themselves by choosing to plagiarize,” she said. “Writing is never easy. And by plagiarizing someone else’s work, students aren’t practicing critical thinking skills.”