More than half of the traffic on the World Wide Web is created by robots reporting on the activities of humans.
According to security outfit Incapsula, bots roaming the internet accounted for 56 percent of total web traffic in 2014, down from 61.5 percent in 2013.
The majority of these bots are ‘good’ bots, including the ‘crawlers’ that index web pages for search engines, social networking platforms, RSS feeds and translation services.
Incapsula is worried about an increasing number of ‘bad’ bots, which pose a threat to websites. The worst are ‘impersonator’ bots which are malevolent intruders engineered to circumvent common security measures. They have increased by 15 percent in the last two years.
Incapsula’s Igal Zeifman said that more than 90 percent of all cyber attacks are executed by bots, and that the worst-case scenario depends on the attacker’s intentions and the target.
“Bots can spam, scam, spy, execute denial of service attacks and hack – they can do whatever a human hacker ‘teaches’ them to do, only on a much (much) bigger scale – and this arguably delays the internet’s growth, both as a medium and as a place of business.”
The overall decrease in bot traffic is the result of a steady drop in good bot activity: RSS services are dying, and Google’s RSS service, Google Reader, shut down in July 2013.
Zeifman said click-fraud bots, which undermine advertisers’ profitability, are also growing in number.
“I think that today most advertisers have accepted the fact that some of their online budget will be lost on bots. However, I also believe that, as these losses continue to grow, the need for bot filtering solutions will become more and more clear,” he said.
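To illustrate why filtering is harder than it sounds, here is a minimal sketch of the simplest kind of bot filter: a User-Agent string check. The signature list and function names are hypothetical, not from any Incapsula product, and this approach only catches bots that announce themselves honestly.

```python
# A naive bot filter based on the HTTP User-Agent header.
# BOT_SIGNATURES is an illustrative, hypothetical list of substrings
# that self-identifying crawlers commonly include in their User-Agent.
BOT_SIGNATURES = ("googlebot", "bingbot", "crawler", "spider")

def is_declared_bot(user_agent: str) -> bool:
    """Return True if the User-Agent openly identifies itself as a bot."""
    ua = user_agent.lower()
    return any(sig in ua for sig in BOT_SIGNATURES)

# A 'good' crawler declares itself and is easy to classify:
print(is_declared_bot("Mozilla/5.0 (compatible; Googlebot/2.1)"))  # True
# An 'impersonator' bot spoofing a browser slips straight through:
print(is_declared_bot("Mozilla/5.0 (Windows NT 10.0; rv:109.0)"))  # False
```

This is exactly the gap ‘impersonator’ bots exploit: by presenting a browser-like User-Agent they defeat signature checks, which is why commercial bot-filtering solutions rely on additional signals such as behavioural analysis and challenge-response tests rather than headers alone.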