For retailers competing to provide a seamless — and safe — online experience for consumers, bots pose a big problem. Hackers made a staggering 10 billion attempts to access retail sites between May and December 2018, according to a recent report by Akamai Technologies.
That’s no surprise.
The online retail market, expected to hit $4.9 trillion by 2021, is flush with cash. And because many shoppers keep the same username and password across multiple sites, it’s easy for malicious actors to attempt illicit purchases if an individual’s data is compromised.
The findings, detailed in Akamai’s “2019 State of the Internet / Security: Retail Attacks and API Traffic” report, examine a practice known as credential stuffing. This occurs when sophisticated all-in-one bots, many of which behave like human users, leverage stolen consumer data. A single bot can take that information and try to access as many as 120 sites at once, the report notes.
Retailers are woefully unprepared. Akamai cites a survey in which 71 percent of respondents say preventing attacks is difficult because it may diminish the user experience. Thirty percent are unable to detect or mitigate attacks, and 70 percent believe their strategy is lacking.
Bots Target Retailers for Their Data
The motivation is simple. Hackers want access to data, such as account balances, assets and personal information, that can be sold on the black market — and, to a lesser extent, the ability to purchase items for quick resale elsewhere.
Also of value are special discount codes and presale access often awarded to loyal customers.
Apparel is the most targeted vertical, with 3.7 billion credential stuffing attempts catalogued during the report’s period of study. But other segments have unique appeal: Office supply stores, for instance, can harbor sensitive data on businesses, while online jewelry retailers typically have clientele with high net worth.
APIs and IPv6 Traffic Create Blind Spots
A shift in the origin of internet traffic is causing new headaches, Akamai notes. Eighty-three percent of web hits are driven by application programming interfaces — not browsers. That’s nearly double the share found in a 2014 report.
With HTML now comprising just 17 percent of web traffic, a mounting security risk exists, according to the report: Some security tools that monitor browser-based traffic aren’t equipped to manage API traffic and react to intrusions accordingly.
Furthermore, the report notes, many security systems capable of using IPv6 still default to monitoring IPv4. Although IPv6 comprises little web traffic, the misconfiguration could signal network blind spots that bots might seek to exploit.
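One crude way to surface the IPv6 blind spot described above is to tally the IP versions actually appearing in access logs and confirm that IPv6 traffic shows up in whatever your monitoring tools ingest. This is an illustrative sketch, not a method from either report; the log source and its format are assumptions.

```python
import ipaddress

def count_ip_versions(source_ips):
    """Tally IPv4 vs. IPv6 addresses in an iterable of source IP strings.

    If IPv6 requests reach your servers but never appear in your
    security tooling's view of the logs, monitoring may be defaulting
    to IPv4 only.
    """
    counts = {4: 0, 6: 0}
    for ip in source_ips:
        try:
            counts[ipaddress.ip_address(ip).version] += 1
        except ValueError:
            pass  # skip malformed log entries
    return counts
```

Running the same tally against raw server logs and against the traffic your security platform reports, then comparing the IPv6 counts, can reveal whether that traffic is being silently dropped from view.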
Improving bot detection and mitigation, along with discouraging users from reusing credentials across websites, are key defenses, Akamai notes. Another recent study, from Distil Networks, “2019 Bad Bot Report: The Bot Arms Race Continues,” provides key bot prevention tips for businesses of all types.
First, Distil advises blocking (or requiring a CAPTCHA test for) outdated user agents or browsers. Retailers also should block known hosting providers and proxy services tied to bot activity — and take the same precautions for exposed APIs and mobile apps.
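The first tip above amounts to a version check on the User-Agent header. Here is a minimal sketch of that idea; the browser names, version cutoffs, and function names are illustrative assumptions, not values drawn from the Distil report.

```python
import re

# Hypothetical cutoffs: requests claiming a major version below these
# are treated as outdated and challenged (e.g., with a CAPTCHA) or blocked.
MIN_MAJOR_VERSION = {
    "Chrome": 60,
    "Firefox": 55,
}

def is_outdated(user_agent: str) -> bool:
    """Return True if the User-Agent matches a known browser whose
    major version falls below the illustrative cutoff above."""
    for browser, min_version in MIN_MAJOR_VERSION.items():
        match = re.search(rf"{browser}/(\d+)", user_agent)
        if match and int(match.group(1)) < min_version:
            return True
    return False
```

In practice such a check would sit in middleware at the edge, and the cutoffs would track real browser release schedules, since bots frequently announce long-obsolete browser versions.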
It’s also essential to investigate suspicious traffic and unexplained spikes. Visits from a source with a high bounce rate, or a surprise jump in activity, could be a sign of trouble. Another giveaway: increases in failed attempts to log in or validate gift card numbers. It’s also helpful to follow the news. When a major breach makes headlines, it’s likely your site is at increased risk, the Distil report notes.
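The failed-login giveaway can be reduced to a simple baseline comparison: track failure counts per time window and flag any window well above the historical average. This is a rough sketch under assumed parameters; the window size and spike multiplier are arbitrary choices for illustration, not figures from the report.

```python
from collections import deque

class LoginFailureMonitor:
    """Flag windows whose failed-login count spikes above a rolling baseline."""

    def __init__(self, window_count: int = 24, spike_factor: float = 3.0):
        # Keep the last `window_count` per-window failure totals as the baseline.
        self.history = deque(maxlen=window_count)
        self.spike_factor = spike_factor

    def record_window(self, failures: int) -> bool:
        """Record one window's failed-login count; return True if it
        exceeds spike_factor times the historical average."""
        if self.history:
            baseline = sum(self.history) / len(self.history)
            is_spike = failures > baseline * self.spike_factor
        else:
            is_spike = False  # no baseline yet to compare against
        self.history.append(failures)
        return is_spike
```

The same structure works for gift card validation failures; a surge in either metric right after a publicized breach elsewhere is exactly the credential stuffing pattern the reports describe.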