An Internet bot, also called a web robot or simply a bot, is a software application that runs automated tasks (scripts) over the Internet. Typically, bots perform tasks that are simple and repetitive, much faster than a person could. The most extensive use of bots is for web crawling, in which an automated script fetches, analyzes and files information from web servers. More than half of all web traffic is generated by bots.
Efforts by web servers to restrict bots vary. Some servers host a robots.txt file that states the rules governing bot behavior on that server. Any bot that ignores these rules could, in theory, be denied access to, or removed from, the affected website. However, robots.txt is a plain text file with no accompanying enforcement software, so adhering to its rules is entirely voluntary: there is no way to enforce them, or even to ensure that a bot's creator or operator reads or acknowledges the file. Some bots are "good" – e.g. search engine spiders – while others are used to launch malicious attacks, for example on political campaigns.
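The voluntary nature of robots.txt compliance can be illustrated with Python's standard-library urllib.robotparser, which is one common way well-behaved crawlers check the rules before fetching a page. This is a minimal sketch; the file contents and the bot names "BadBot" and "PoliteBot" are hypothetical examples, not part of any real protocol.

```python
# Sketch of voluntary robots.txt compliance using the Python standard
# library. Nothing forces a bot to run this check; a malicious bot can
# simply skip it and fetch whatever it likes.
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt: one bot is banned entirely, and all other
# bots are asked to stay out of /private/.
ROBOTS_TXT = """\
User-agent: BadBot
Disallow: /

User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# A polite crawler consults the parser before every request.
print(parser.can_fetch("BadBot", "http://example.com/"))              # False
print(parser.can_fetch("PoliteBot", "http://example.com/private/x"))  # False
print(parser.can_fetch("PoliteBot", "http://example.com/index.html")) # True
```

The parser only reports what the site has asked for; honoring the answer is left entirely to the bot's implementer, which is exactly why robots.txt cannot stop a bot that chooses to ignore it.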