On Saturday, September 26, 2015, German defense minister Ursula von der Leyen was accused by the VroniPlag Wiki website of plagiarizing over 75% of her 1990 dissertation on “the diagnosis of infections in pregnant women,” shocking many in the scientific community. However, delving deeper into incidents of plagiarism on a global scale reveals that similar acts of fraud aren’t uncommon. In fact, with the rise of more sophisticated anti-plagiarism software, retractions due to scientific plagiarism have increased significantly since 2009 in publications like BioMed Central journals (Grens).
Not only are professional adults plagiarizing often (despite their increased probability of getting caught), but the sheer number of plagiarism incidents indicates that it’s a fairly commonplace practice, particularly within the global scientific community. As author Tina Amirtha explains, Science asked arXiv, an online archive of documents focused on math and physics, for information regarding plagiarism in its submissions. Science revealed that:
Among the 767,000 papers submitted from arXiv’s inception in 1991 until 2012, one in 16 authors were found to have copied long phrases and sentences from their own previously published work, and about one out of every 1,000 authors copied about a paragraph’s worth of text from other people’s papers without citing them.
Something arXiv thought was important to consider when looking at these incidents was the fact that submissions were coming from countries all over the world, and “plagiarism” is not defined the same way in every culture or institution.
Therefore, without a standardized plagiarism policy in place, it makes sense that arXiv would see variation in incidents of scientific plagiarism across countries, especially if the tools (mostly “bots”) it used to define and catch plagiarism followed rules defined by the British science community. Plus, it doesn’t take a rocket scientist (ha!) to understand that someone submitting an article written in a second language is more likely to reuse exact phrases from original sources written in that language than someone writing in their mother tongue.
However, the cultural and language differences between Britain and other countries don’t account for all of the data on which countries had more incidents of plagiarism. The main correlation the data has surfaced is that countries that submitted fewer papers had higher rates of scientific plagiarism: Bulgarian authors submitted a total of 200 papers since August 2011, 20% of which were flagged, compared to only 6% of Japan’s 4,700 articles submitted in the same period.
But what does it really mean?
The only thing that’s certain is that the scientific community, like so many others, has yet to agree on a universal set of guidelines. That makes citing sources from certain countries risky for writers operating under the strict and specific plagiarism rules of America or the UK. If more light is shed on the impact of these discrepancies in archives like arXiv, perhaps it will spur movement toward a global standard in the scientific community.
However, when people like von der Leyen are caught cheating by folks in their own industry in their own country, the problem is more complicated. Although plagiarism detection tools like Unicheck have been around for years, perhaps their existence has not yet had a big enough impact to change the old habits of industry professionals. It may take more people getting caught for fraud to scare off would-be plagiarists.