How ‘Bad Bots’ Hamper Small Business Success

Bots are wreaking havoc on business websites and online applications, finds Distil Networks’ new 2018 Bad Bot Report (registration required).

According to the cybersecurity company’s research unit, so-called “bad bots” accounted for over 21 percent of all traffic hitting websites in 2017, an annual increase of 9.5 percent. But what exactly constitutes a bad bot?

Automated applications and scripts can be considered bad bots, and they run the gamut in terms of the damage they can do. For example, some can perform price scraping on online stores. This may not seem like a big deal, but it can have a detrimental effect on a small company’s chances of succeeding in a competitive retail environment.

Price-scraping bots enable competitors to get an unearned leg up in the marketplace, according to the report. Victims also stand to lose sales because competitors will undercut them on price-based searches.

Content-scraping bots can steal content, damaging the SEO rankings of a small business because search engines penalize websites with duplicate content. And that’s just the tip of the iceberg.

Also invading the web are account takeover, credit card fraud and denial-of-service bots, among many other varieties that enable fraud and cause damage to a company’s web operations.

The biggest targets are gambling and airline sites. Distil Networks found that bad bots generated 53.1 percent of traffic to gambling sites and 43.9 percent of traffic to airline sites. Sophisticated, hard-to-detect bots are more likely to hit e-commerce, healthcare and ticketing websites.

And there’s evidence that sophisticated bots are an attacker’s weapon of choice.

Nearly three-quarters of bad bot traffic comes from what the report classifies as “moderate or sophisticated” bots: those that are tough to detect because they mimic human mouse movements or rotate through multiple IP addresses to stage an attack.

Most bad bots (82.7 percent) identify themselves as desktop web browsers; their user agents claim to be Chrome, Firefox, Internet Explorer or Safari. Only 10.4 percent present as mobile browsers.

There are steps small businesses can take to prevent falling victim to bad bots.

“Small businesses should look at blocking old user agents/browsers from accessing their website since today most browsers auto-update,” advised Edward Roberts, director of product marketing at Distil Networks. “They should also block any data center traffic. In addition, they should block access from any countries where they don’t do business. Beyond that, they should consider a bot mitigation solution.”
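The first of those recommendations, turning away visitors that claim to run an outdated browser, can be implemented with a simple User-Agent version check. The sketch below is a minimal illustration, not Distil Networks’ product; the browser names and version cutoffs are hypothetical and would need to be tuned to current release schedules.

```python
import re

# Hypothetical minimum acceptable major versions. Since modern browsers
# auto-update, a User-Agent claiming a years-old version is suspect.
MIN_VERSIONS = {"Chrome": 60, "Firefox": 55, "MSIE": 11}

def is_outdated_browser(user_agent: str) -> bool:
    """Return True if the User-Agent claims a browser older than our cutoff."""
    for browser, min_major in MIN_VERSIONS.items():
        match = re.search(rf"{browser}[/ ](\d+)", user_agent)
        if match and int(match.group(1)) < min_major:
            return True
    return False

# A request claiming an ancient Chrome build would be flagged:
print(is_outdated_browser("Mozilla/5.0 ... Chrome/41.0.2228.0 Safari/537.36"))   # True
print(is_outdated_browser("Mozilla/5.0 ... Chrome/65.0.3325.181 Safari/537.36")) # False
```

Note that user agents are trivially spoofed, which is why Roberts frames this as a first step rather than a complete defense.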

To avoid bots, many organizations are blocking traffic from foreign IP addresses.

Last year, one in five companies blocked country-specific IPs, according to the report. Russia took the top spot as the most-blocked country for the first time, while China, the former leader, dropped to sixth place.
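Country-level blocking typically relies on a GeoIP database that maps address blocks to countries. As a minimal sketch of the mechanism, assuming a hand-maintained deny list of CIDR ranges (the example ranges below are reserved documentation addresses, not real geolocation data):

```python
import ipaddress

# Hypothetical deny list; in practice these ranges would come from a
# GeoIP database keyed by the countries a business chooses to block.
BLOCKED_NETWORKS = [
    ipaddress.ip_network("203.0.113.0/24"),
    ipaddress.ip_network("198.51.100.0/24"),
]

def is_blocked(ip: str) -> bool:
    """Return True if the client IP falls inside any denied range."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in BLOCKED_NETWORKS)

print(is_blocked("203.0.113.42"))  # True: inside a denied range
print(is_blocked("192.0.2.10"))    # False
```

In production this check would usually live at the firewall or CDN edge rather than in application code.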

Countering the notion that most bad bot activity pours out of shadowy hacker dens, Distil Research Lab found that nearly 83 percent of bad bot traffic came from data centers in 2017, compared to 60 percent in 2016. The company attributes this to the low cost and easy availability of cloud-computing services that attackers use as a launch pad for bot campaigns.
