
Report: Bad Bots sent One in Five Web Requests in 2018

Bad bots aimed at disrupting websites for financial gain are rising in both sophistication and industry scope as attackers learn how to evade and defeat existing defense mechanisms, a new report has found.


Distil Networks’ annual assessment of bad bots, “Bad Bot Report 2019: The Bot Arms Race Continues,” found that bad bots accounted for one in five website requests in 2018, or 20.4 percent of web traffic. Nearly half of the bad bots tracked in the report impersonate Google Chrome, while activity from mobile browsers such as Safari Mobile, Android and Opera increased from 10.4 percent last year to 13.9 percent.

“While bad bot traffic percentages have decreased slightly for the first time since 2015 (from 21.8 percent last year to 20.4 percent now), bot sophistication flourishes,” Edward Roberts, director of product marketing at Distil Networks, told Security Ledger.


He said that the advanced persistent bots (APBs) researchers observed have learned to evade detection by leveraging more “human-like techniques,” such as “mouse movements and clicks that fool even advanced detection methods.”

Researchers found that 73.6 percent of bad bots they observed were APBs, which have more sophisticated capabilities than the average bots. These programs can cycle through random IP addresses, enter through anonymous proxies and change their identities, among other more complex behavior. This is making it more difficult than ever to defend against them, researchers said.

Distil researchers investigated hundreds of billions of bad-bot requests from 2018 over thousands of domains to reach their conclusions. The goal of the research is to offer guidance about the nature and impact of automated threats to those in charge of implementing and maintaining website security, they said.

Bots Zero In on Financial Services

Bad bots are autonomous programs that scrape data–such as pricing and inventory levels–from sites without permission in order to reuse it and gain some kind of competitive edge. Some of the most dangerous bots also engage in outright criminal activities, such as fraud and theft.

The overall goal behind the activity of bad bots is, unsurprisingly, financial gain, researchers said. Organizations use them to collect data to get a leg up on competitors, while criminals use them more overtly to steal funds from user accounts or engage in other nefarious activities.


Though their overall activity decreased, bad bots widened the scope of their targeted industries in 2018, with a boost in bot traffic in industries that previously saw minimal bad-bot activity.

“While bots have impacted nearly every industry historically, we saw some interesting changes from last year’s report,” Roberts said. “This year, the financial services industry topped the charts with 42.2 percent of traffic comprised of bad bots. Last year, that percentage was 24.6 percent.”

Researchers also included bot-traffic percentages for a few key industries in this year’s report, including government (29.9% bad-bot traffic) and education (37.9% bad-bot traffic), Roberts said. “It’s important to note that each industry faces different challenges with bot activity,” he added.

For example, the financial services industry, a longtime target of bad bots, usually sees these automated programs trying to access user accounts for monetary gain, researchers said. Bots targeting the education industry, which was tracked for the first time in this report, also tried to access user accounts, but engaged in other activity as well, including searching for research papers and class availability, according to the report.

Another sector, the gambling and gaming industry, has particular concerns because it handles transactions directly, researchers said. With bad bots accounting for 25.9 percent of its traffic, the industry saw bots trying to take over accounts so that funds or rewards points could be transferred to an attacker. Bad bots also scraped sites relentlessly for changing betting lines as a competitive maneuver, according to the report.

Winning the Bot ‘Arms Race’

Because of the constant evolution of both bad bots and their traffic patterns, as well as their new evasive tactics, Roberts characterized the position of online businesses as being in “an arms race against bots,” the creators of which work tirelessly to attack websites around the globe.

“They use browser automation software, or malware installed within real browsers, to connect to sites,” he said of more advanced tactics bad bots are using. “APBs tend to cycle through random IP addresses, enter through anonymous proxies and peer-to-peer networks, and are able to change their user agents. They leverage a mix of technologies and methods to evade detection while maintaining persistency on target sites.”

This increase in sophistication and nuance means organizations have a bigger task ahead of them to defend against bad bots, Roberts acknowledged. However, while “there is no one-size-fits-all approach for organizations to protect themselves, there are proactive steps that can be taken to address the problem,” he said.

In its report, Distil rated bots as “simple,” “moderate,” or “sophisticated” in terms of their capabilities. In 2018, 26.4 percent of bots observed were simple; 52.5 percent were moderate; and 21.1 percent were sophisticated.

To protect against simple or moderate bots, researchers recommend that online businesses block or CAPTCHA outdated user agents and browsers, and block known hosting providers and proxy services, Roberts told us. Protecting sites against more advanced bots requires a bit more drastic action, however, he said.
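As a rough illustration of those first steps (not something prescribed in the report), the sketch below shows how a site might challenge requests from outdated browser builds or block traffic from known hosting-provider ranges before serving a page. The version cutoffs, user-agent patterns and CIDR ranges are placeholder assumptions.

```python
# Hedged sketch: pre-filter requests by user agent and source network.
# The version cutoffs and CIDR ranges are illustrative placeholders,
# not values taken from the Distil report.
import ipaddress
import re

# Hypothetical minimum major versions; anything older gets challenged.
MIN_BROWSER_VERSION = {"Chrome": 60, "Firefox": 55}

# Hypothetical example ranges for a hosting provider or proxy service.
BLOCKED_NETWORKS = [ipaddress.ip_network("203.0.113.0/24"),
                    ipaddress.ip_network("198.51.100.0/24")]

UA_PATTERN = re.compile(r"(Chrome|Firefox)/(\d+)")

def classify_request(user_agent: str, client_ip: str) -> str:
    """Return 'block', 'captcha', or 'allow' for a single request."""
    ip = ipaddress.ip_address(client_ip)
    if any(ip in net for net in BLOCKED_NETWORKS):
        return "block"  # traffic from a known hosting provider or proxy
    match = UA_PATTERN.search(user_agent)
    if match:
        browser, version = match.group(1), int(match.group(2))
        if version < MIN_BROWSER_VERSION.get(browser, 0):
            return "captcha"  # outdated browser build: challenge it
    return "allow"

if __name__ == "__main__":
    print(classify_request("Mozilla/5.0 Chrome/41.0.2272.96", "192.0.2.10"))  # captcha
    print(classify_request("Mozilla/5.0 Chrome/120.0", "203.0.113.7"))        # block
```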

“For more advanced bot protection, companies should consider blocking all access points, including mobile apps and APIs,” Roberts said. “Protecting a website alone is no longer enough. Companies should also evaluate traffic sources daily to look for anomalies and suspicious behavior, investigate traffic spikes and monitor failed login attempts.”
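A minimal way to act on the failed-login advice might look like the sketch below, which counts recent authentication failures per source address and flags outliers. The log format, 24-hour window and threshold are assumptions chosen for illustration, not values from the report.

```python
# Hedged sketch: flag source IPs with an unusual number of failed logins.
# The log line format, time window and threshold are illustrative assumptions.
from collections import Counter
from datetime import datetime, timedelta

FAILURE_THRESHOLD = 20          # failures per IP within the window before flagging
WINDOW = timedelta(hours=24)    # "evaluate traffic sources daily"

def flag_failed_logins(log_lines, now=None):
    """log_lines: iterable of 'ISO-timestamp ip-address result' entries."""
    now = now or datetime.utcnow()
    failures = Counter()
    for line in log_lines:
        try:
            ts_str, ip, result = line.split()
        except ValueError:
            continue  # skip malformed lines
        if result != "FAIL":
            continue
        if now - datetime.fromisoformat(ts_str) <= WINDOW:
            failures[ip] += 1
    return {ip: n for ip, n in failures.items() if n >= FAILURE_THRESHOLD}

sample = ["2018-11-02T10:00:00 198.51.100.7 FAIL"] * 25 + ["2018-11-02T10:05:00 192.0.2.3 OK"]
print(flag_failed_logins(sample, now=datetime.fromisoformat("2018-11-02T12:00:00")))
```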

Staying up to date on the latest large public data breaches is also a good way to protect a site from bad bots, he added, since it’s directly after these events that bots typically run stolen credentials across websites to try to access user accounts.
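One simple heuristic that captures this post-breach credential-stuffing pattern, sketched under the assumption that login attempts can be exported as (source IP, username) pairs, is to flag any single source that tries an unusually large number of distinct accounts. The threshold here is illustrative, not a recommendation from the report.

```python
# Hedged sketch: a credential-stuffing heuristic, i.e. one source address trying
# many different usernames, the pattern typically seen after a large breach.
from collections import defaultdict

DISTINCT_ACCOUNT_THRESHOLD = 50  # distinct usernames per IP before flagging (assumed)

def find_stuffing_sources(login_attempts):
    """login_attempts: iterable of (source_ip, username) tuples."""
    accounts_per_ip = defaultdict(set)
    for ip, username in login_attempts:
        accounts_per_ip[ip].add(username)
    return [ip for ip, users in accounts_per_ip.items()
            if len(users) >= DISTINCT_ACCOUNT_THRESHOLD]

attempts = [("203.0.113.5", f"user{i}") for i in range(60)] + [("192.0.2.9", "alice")] * 3
print(find_stuffing_sources(attempts))  # ['203.0.113.5']
```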
