New York: You should not trust Wikipedia blindly on every topic, as entries on politically controversial scientific subjects can be subjected to “information sabotage”, a study says.
Study co-author Gene E. Likens, a distinguished research professor at the University of Connecticut, has monitored Wikipedia’s acid rain entry since 2003.
Likens had co-discovered acid rain in North America. “In the scientific community, acid rain is not a controversial topic. Its mechanics have been well understood for decades,” Likens said.
“Yet, despite having ‘semi-protected’ status to prevent anonymous changes, Wikipedia’s acid rain entry receives near-daily edits, some of which result in egregious errors and a distortion of consensus science,” he said.
In an effort to see how Wikipedia’s acid rain entry compared to other scientific topics, Likens partnered with Adam M. Wilson, a geographer at the University at Buffalo.
Together, they analysed Wikipedia edit histories for three politically controversial scientific topics (acid rain, evolution, and global warming) and four non-controversial scientific topics (the standard model in physics, heliocentrism, general relativity, and continental drift).
Using nearly a decade of data, Likens and Wilson teased out daily edit rates, the mean size of edits (words added, deleted, or edited), and the mean number of page views per day.
While the edit rate of the acid rain article was lower than those of the evolution and global warming articles, it was significantly higher than those of the non-controversial topics.
Across the board, politically controversial scientific topics were edited more heavily and viewed more often, the researchers said.
“Wikipedia’s global warming entry sees two to three edits a day, with more than 100 words altered, while the standard model in physics has around 10 words changed every few weeks,” Wilson said.
“The high rate of change observed in politically controversial scientific topics makes it difficult for experts to monitor their accuracy and contribute time-consuming corrections,” he said.
“As society turns to Wikipedia for answers, students, educators, and citizens should understand its limitations when researching scientific topics that are politically charged,” Likens said.
Wikipedia does employ algorithms to help identify and correct blatantly malicious edits, such as the insertion of profanity.
However, Likens and Wilson urged users to cast a critical eye on Wikipedia source material, which is found at the bottom of each entry. The study was published in the journal PLOS ONE.