How online retailers can win in the battle of the bots

By Josh Shaul, Vice President of Web Security at Akamai Technologies


Bots are tipped to account for more than half of the world’s internet traffic in 2019. In other words, there will be more bots and scrapers visiting your website today than there will be humans.

Bots are a hot trend in the tech world, touted by the likes of Google, Amazon and Facebook. Even academics and journalists use scraping software to gather data. But when the majority of the traffic to your online business comes from bots, there can be a profound ripple effect.

This ripple effect spreads across multiple risks associated with bot traffic, including slow websites, frustrated customers, and increased IT expenses. There are also brand-related risks such as bots that scrape your website for inventory assets, pricing data or content.

If that’s not enough, you’ve also got to deal with the bots responsible for DDoS attacks, ad fraud, SEO spam and credential stuffing – just to name a few. Of the 55 billion credential stuffing attacks Akamai witnessed over the last 17 months, many originated from botnets.

Not only are these nefarious bots constantly evolving, but the people developing them are also actively trialling new evasion techniques, going so far as to hire developers with brand- and vendor-specific expertise. These bot operators are capable of accurately imitating human behaviour online, making it hard for standard security controls to detect and block their activities.

Needless to say, dealing with bots isn’t easy. Because bot traffic can be disguised, relying on a visitor’s internet protocol address is useless. CAPTCHAs can help, but they’re also an inconvenience for legitimate visitors to your site. To regain control over your website, you need to separate known-good bots from the bad bots, as well as ensure the good bots are adhering to established rules and restrictions.
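As a rough illustration of the “rules and restrictions” side of this, a site’s robots.txt already spells out what a well-behaved crawler may and may not fetch. The sketch below uses Python’s standard library to check a claimed crawler’s request against those published rules; the domain and path are placeholders.

from urllib.robotparser import RobotFileParser

# Load the crawl rules the site publishes for well-behaved bots.
parser = RobotFileParser()
parser.set_url("https://www.example-store.com/robots.txt")  # placeholder domain
parser.read()

# A visitor claiming to be "Googlebot" but requesting a disallowed path is
# one signal that it may not be the bot it says it is.
if not parser.can_fetch("Googlebot", "/checkout"):
    print("Request violates published crawl rules - flag for closer inspection")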

Look for the humans, not the bots

Bad bots visiting your website will employ various tricks and tactics to avoid detection. The key here is to look for the humans, identify their common behaviours and let them through. For Akamai, this proved effective in the case of “sneaker bots” – resale bot networks used to buy limited edition items and then resell them at a substantial mark-up.

While many retailers utilise bot detection programs, bot operators are often able to bypass these by disguising themselves as humans. They do this by recording thousands of human interactions on your website, such as mouse movements, clicks and typing patterns. Then, at the time of your limited product release, bots can sneak in, replay these pre-recorded actions, dart to the checkout page, and purchase multiple products.

This can frustrate your “human” customers and cause them to take their business elsewhere. Upon studying these bot behaviours, Akamai narrowed its focus and was able to distinguish the real customers by examining interactions between users and their devices. For example, when a human taps a button on their phone, the phone moves ever so slightly – so-called “minute movements” – which are absent for bots.
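To make the idea concrete, here is a minimal sketch – not Akamai’s actual detection logic – of how replayed sessions could be flagged: touch events that arrive with a perfectly flat device-motion trace are treated as suspect. The field names (touch_events, accelerometer_samples) are hypothetical.

import statistics

def looks_like_replayed_session(session) -> bool:
    # A phone held by a human moves slightly with every tap, so taps paired
    # with near-zero accelerometer variance are a red flag for replayed input.
    if not session["touch_events"]:
        return False  # nothing to judge
    motion = session["accelerometer_samples"]
    if len(motion) < 2:
        return True  # taps with no motion data at all is itself suspicious
    return statistics.pvariance(motion) < 1e-6

suspect = {"touch_events": [0.2, 0.9, 1.4], "accelerometer_samples": [0.0, 0.0, 0.0]}
print(looks_like_replayed_session(suspect))  # True: taps on a perfectly still device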

Block some, but not all, bots

Your website interacts with a wide spectrum of bots on a daily basis; they can have a positive or negative impact on your business and website infrastructure. Search engine bots crawl and index your web pages to make them available on search engines like Google. Social bots are used by social media platforms like Facebook to give visibility to your website and drive engagement on their platforms. Content aggregators use multiple sources on the web to find product updates, price changes or stock quotes.

On the flip side, other bots and scrapers can be used for malicious purposes, including stealing digital content, inventory and other proprietary information, or even participating in DDoS attacks that overwhelm a website by flooding it with traffic. Poorly coded bots slow down site performance and consume origin resources, leaving them starved of capacity to serve legitimate human customers.

The challenge is to manage and allow some – but not all – bots to scrape your website. The traditional approach of blocking every single bot can have unintended side effects on your business, without addressing the problem. For retailers, blocking bots can reduce the visibility of your website on search engines and social media, and of your products on shopping comparison sites like Choice or iSelect.
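One common way to let the genuinely useful crawlers through is to verify them rather than trust the User-Agent header. The sketch below checks a visitor claiming to be Googlebot using the reverse-then-forward DNS lookup that Google documents for this purpose; it is illustrative only and covers just one search engine.

import socket

def is_verified_googlebot(ip_address: str) -> bool:
    try:
        hostname, _, _ = socket.gethostbyaddr(ip_address)  # reverse lookup
        if not hostname.endswith((".googlebot.com", ".google.com")):
            return False
        return socket.gethostbyname(hostname) == ip_address  # forward lookup must round-trip
    except (socket.herror, socket.gaierror):
        return False

# A request whose User-Agent says "Googlebot" but fails this check can be
# treated as a scraper in disguise rather than blocked outright.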

When you find bad bots, confuse them

For airlines, managing bots is critical from a security and competition perspective. Travel sellers and comparison sites can send business to airlines, so airlines want their flights to show up in search results on those sites. However, many airlines use booking systems like Amadeus IT and pay a fee each time a user searches for flight information on that system. If thousands of bots are searching for an airline’s seat and pricing information, that can get quite costly.


To solve this issue, Akamai serves bots cached pricing information rather than the live results shown to real human users, so airlines aren’t paying for a booking system query every time a bot checks prices and availability. This way, bots won’t get the most up-to-date information, but will get some data without costing the airlines much.
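The sketch below illustrates the idea in the simplest possible terms – it is not Akamai’s implementation. Requests classified as bots are answered from a cache of previously fetched fares, and only human traffic triggers a paid query to the booking system; classify_request and booking_system are hypothetical stand-ins.

FARE_CACHE = {}  # route -> last fare fetched for a human visitor

def fare_for(request, route, booking_system, classify_request):
    # Bots get the cached (possibly stale) fare, which costs nothing to serve.
    if classify_request(request) == "bot" and route in FARE_CACHE:
        return FARE_CACHE[route]
    # Humans trigger a live lookup, which incurs the per-query fee.
    fare = booking_system.lookup(route)
    FARE_CACHE[route] = fare
    return fare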

Bots are here to stay. As automation becomes the norm, bots will represent great business opportunities and also increased risk. Battling the bots means having visibility into their operations and actions. For the good ones, identify their legitimate origins and characteristics, as well as impersonation trends.

And for the malicious ones, wide visibility will enable actionable threat intelligence and the ability to cope with the increasingly sophisticated operations running large-scale attack campaigns. 


Josh Shaul is the Vice President of Web Security at Akamai Technologies 


