Misconceptions of Automated Web Traffic
Automated web traffic is one of the most destructive forces threatening web application security. It can generate enormous volumes of hidden requests and submissions against many different parts of a website, many of which were never intended to be accessed by humans. The volume of automated web traffic is growing consistently, and as it rises, so too do the sophistication and complexity of the bot operators behind it. Before discussing the activities of online bot traffic in detail, it is worth addressing some of the common misconceptions that website owners may have about automated web traffic.
1. Misconception: Bots Are Just Simple Automated Scripts
The sophistication of online bot traffic has increased dramatically as the technology and platforms available to bot operators have improved, as the defenses they must evade have become more elaborate, and, most importantly, as the gains to be achieved have grown.
Modern bots are advanced persistent bots that distribute their traffic across large-scale environments and through multiple proxies to hide their activity among that of human users. Such advanced bots frequently issue requests from real browsers and execute the JavaScript sent to validate users as human. Detection mechanisms such as CAPTCHA can be bypassed, either with artificial intelligence or brute-force systems or by employing farms of human agents who solve challenges on demand and pass the solutions back to the bot; such bots are built to integrate with these human services seamlessly.
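To make the contrast concrete, the sketch below shows the kind of "simple automated script" this misconception has in mind: a single IP address, a fixed User-Agent header, no JavaScript execution, and a perfectly regular request cadence. The target URL and page range are purely illustrative, and the code assumes the third-party Python requests package is installed; the advanced persistent bots described above exhibit none of these obvious fingerprints.

# Hypothetical sketch of a "simple automated script": single IP, static
# User-Agent, no JavaScript execution, fixed request cadence -- all of which
# are trivially detectable. Modern bots go far beyond this.
import time
import requests  # third-party package; assumed installed

TARGET_URLS = [
    "https://example.com/products?page=%d" % page for page in range(1, 51)
]

session = requests.Session()
session.headers.update({"User-Agent": "python-requests/2.x"})  # static fingerprint

for url in TARGET_URLS:
    response = session.get(url, timeout=10)
    print(url, response.status_code, len(response.text))
    time.sleep(1)  # fixed delay -- a perfectly regular cadence is itself a giveaway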
2. Misconception: Online Bots Are Just a Web Security Problem
The challenge of managing automated web traffic is often handed entirely to the web security department and the information security officer. Some types of automated web traffic, such as credit card fraud, should indeed be handled by the security department, but other types, such as price aggregators, are business considerations and should be managed by the relevant part of the business.
There are various other roles that may be involved in making decisions about the different types of automated web traffic and the challenges each raises. These can include functions such as head of ecommerce, head of platform, head of operations, and head of marketing. The ideal management solution provides enough information for people in these roles to see the details of, and make informed decisions about, the elements of automated traffic relevant to their roles, without being dependent on a black-box security system.
3. Misconception: Online Bot Operators Are Just Individual Hackers
Far from being solely the work of lone individuals, automated web traffic is run by large organizations operating bot networks; below them sit organizations scraping data for legitimate purposes, such as price aggregators; and there is also a distributed set of lone hackers developing software to run harmful scams or to sell to companies wanting to spy on their competitors.
The amount of money that can be made from some types of automated web traffic means that the operators are, in reality, sophisticated, unethical organizations employing technical experts and backed by human effort at the organizational and strategic levels, as well as at a lower level to complete manual tasks that fall outside the scope of bot activity. There is also a growing trend toward third-party services focused on delivering automated traffic activity on demand.
4. Misconception: Only the Big Boys Need to Worry About Online Bots
Sometimes there can be a feeling that there are two types of bots:
Targeted bots – which focus on specific high-profile websites
Generic bots – which are aimed at weaknesses spotted across a large number of sites.
This can give the owners of medium-sized websites a false sense of security: they have some general protection in place, so they assume bot operators will never target their website. In reality the opposite tends to be true; smaller sites usually have fewer defenses and so make easier targets. The frameworks bot operators have built are designed for easy expansion, and the resources available to them are such that a wide range of websites can be targeted. Small and medium-sized commercial online presences have been shown to be targeted by automated traffic activity just as heavily.
5. Misconception: I Have a WAF, I Don’t Need to Worry About Bot Activity
Web application firewalls are valuable tools that form a fundamental part of a security system. They are similar to network firewalls, but instead of operating at the TCP/IP level they operate at the HTTP level, processing every incoming request, matching it against a set of static rules, and blocking requests that fail the checks. They are therefore very effective at screening out vulnerability scanning attempts such as SQL injection attacks.
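As a rough illustration (not a description of any particular WAF product), the following Python sketch captures the static, per-request nature of this kind of rule matching. The Request structure and the rule patterns are simplified assumptions; real rule sets such as the OWASP Core Rule Set are far more extensive.

# Minimal sketch of WAF-style static rule matching, assuming each incoming
# request is reduced to a method, path, and query string.
import re
from dataclasses import dataclass

@dataclass
class Request:
    method: str
    path: str
    query: str

# Illustrative static signatures for common injection attempts.
RULES = [
    re.compile(r"(?i)\bunion\b.+\bselect\b"),  # SQL injection
    re.compile(r"(?i)<script\b"),              # reflected XSS
    re.compile(r"\.\./"),                      # path traversal
]

def waf_allows(request: Request) -> bool:
    """Return False if any static rule matches the request, True otherwise."""
    candidate = f"{request.path}?{request.query}"
    return not any(rule.search(candidate) for rule in RULES)

print(waf_allows(Request("GET", "/search", "q=shoes")))                     # True
print(waf_allows(Request("GET", "/search", "q=1' UNION SELECT password")))  # False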
However, WAFs are not effective at identifying bot traffic, because spotting automated web traffic is a radically different challenge. A WAF scans web traffic for illegitimate requests designed to exploit security weaknesses in a web application, whereas a bot detection system must scan web traffic for legitimate requests that aim to exploit weaknesses in the business logic of the application. Typically, that means making a judgment only after analyzing a series of requests and looking for patterns of behavior that differ from those of legitimate users.
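In contrast to the per-request WAF check sketched above, the following sketch scores an entire session of requests against a few hypothetical behavioral signals: machine-like regularity, unrealistic speed, and breadth of browsing without any purchasing depth. The feature choices and thresholds are illustrative assumptions, not drawn from any real detection product.

# Sketch of behaviour-based bot scoring, assuming each session is summarised
# as a list of (timestamp_seconds, path) request tuples.
from statistics import pstdev

def looks_automated(requests: list[tuple[float, str]]) -> bool:
    if len(requests) < 5:
        return False  # too little data to judge

    timestamps = [t for t, _ in requests]
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]

    # Signal 1: machine-like regularity -- humans rarely click at a fixed rhythm.
    too_regular = pstdev(gaps) < 0.05

    # Signal 2: unrealistic speed -- sustained sub-second page requests.
    too_fast = sum(gaps) / len(gaps) < 0.5

    # Signal 3: breadth without depth -- hundreds of catalogue pages, no checkout.
    paths = [p for _, p in requests]
    scraping_shape = len(paths) > 200 and not any("/checkout" in p for p in paths)

    return too_regular or too_fast or scraping_shape

# Example: 300 product pages fetched exactly 0.2 seconds apart.
session = [(i * 0.2, f"/product/{i}") for i in range(300)]
print(looks_automated(session))  # True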