Wikipedia has been every student’s saviour when it comes to finishing school or college projects. The world’s largest crowdsourced website contains information on almost any topic you can imagine.
As you already know, Wikipedia is an online encyclopedia with verifiable information. The idea that anyone with an internet connection could freely edit it sounded bananas. It should never have worked, yet somehow the site still serves its purpose.
Wikipedia launched 19 years ago, in 2001. In its early days, contributors carried out tasks like cleaning up vandalism on the website, since the information was open to everyone and anyone could edit it. This was feasible back then because the number of contributors was small, which meant few enough edits that they could be handled manually. But by 2007, the site was receiving an insane amount of traffic and seeing around 180 edits per minute. It became impossible to manage that volume by hand.
Enter The Bots
Bot, short for “software robot”, is an automated tool developed by contributors to carry out specific duties. Currently, there are a total of 1,601 bots working on Wikipedia, each carrying out different responsibilities.
Dr. Jeff Nickerson, a professor at the Stevens Institute of Technology in Hoboken, N.J., who has studied Wikipedia bots, told Digital Trends that the primary reason the bots were created was protection against vandalism.
He explained that people will often go to a Wikipedia page and deface it. Given the amount of traffic on the website, it becomes really tedious and difficult for those who maintain these pages to keep reverting the damage manually.
“So one logical kind of protection [was] to have a bot that can detect these attacks,” stated Nickerson.
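To make the idea concrete, here is a toy sketch of how a vandalism-detecting bot might flag a suspicious edit. Real protector bots such as ClueBot NG score edits with machine learning; the rules, thresholds, and word list below are hypothetical simplifications, not Wikipedia’s actual logic:

```python
import re

# Hypothetical blocklist for illustration only.
SUSPICIOUS_WORDS = {"stupid", "dumb", "sucks"}

def looks_like_vandalism(old_text: str, new_text: str) -> bool:
    """Flag an edit if it blanks most of the page, shouts in all caps,
    or inserts words from the blocklist. Purely heuristic."""
    # Large removal: the edit deleted more than 90% of the page.
    if old_text and len(new_text) < 0.1 * len(old_text):
        return True
    # Look only at appended text when the edit is a pure addition.
    added = new_text[len(old_text):] if new_text.startswith(old_text) else new_text
    words = re.findall(r"[a-zA-Z']+", added)
    # Several all-caps words suggest shouting.
    caps = sum(1 for w in words if len(w) > 2 and w.isupper())
    if caps >= 3:
        return True
    # Blocklisted words inserted into the article.
    if any(w.lower() in SUSPICIOUS_WORDS for w in words):
        return True
    return False
```

A bot like this would watch the recent-changes feed, score each edit, and revert (or flag for human review) anything that trips the rules, which is roughly the workflow Nickerson describes.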
Dr. Nickerson, along with other researchers at the Stevens Institute of Technology, carried out the first extensive analysis of all 1,601 bots working on the website. The study was published in the Proceedings of the ACM on Human-Computer Interaction. According to the study, 10% of all activity on the website is carried out by bots.
The researchers divided the bots into nine categories according to the roles and responsibilities assigned to them:
- Generator – Responsible for generating redirect pages and pages based on other sources.
- Fixer – Responsible for fixing links, content, files, and parameters in templates, categories, and infoboxes.
- Connector – Responsible for connecting Wikipedia with other wikis and sites.
- Tagger – Responsible for tagging article status, article assessments, WikiProjects, and multimedia status.
- Clerk – Responsible for updating statistics, documenting user data, updating maintenance pages, and delivering article alerts.
- Archiver – Responsible for archiving content and cleaning up the sandbox.
- Protector – Responsible for identifying violations, spam, and vandalism.
- Advisor – Responsible for providing suggestions for WikiProjects and users. It also greets new users.
- Notifier – Responsible for sending notifications to users.
The Wikipedia that we know and trust for all our school and college projects wouldn’t be the same without these little guys working tirelessly to make the platform more refined and trustworthy. In this day and age, bots have a negative reputation. But these bots show that every coin has two sides. The Wikipedia bots are like an immune system protecting the site, and they give us hope that technology can really help us. After all, we created technology; technology did not create us.
from Beebom https://beebom.com/bots-protecting-wikipedia/