AI-Powered Scams Cost U.S. Consumers Millions


The number of scams that utilized artificial intelligence doubled in the past year, costing Americans more than $108 million.

According to a report from Authority Hacker, nearly half of AI scams resulted in financial losses, with an average loss of $14,600. That success rate was significantly higher than for other types of fraud; only 28% of all fraud scams last year resulted in a loss.

“Fraudsters are using the sophistication of AI to create convincing communications with unsuspecting consumers,” said Suzanne Sando, Senior Fraud and Security Analyst at Javelin Strategy & Research. “Anecdotally, we’re hearing a lot about the headaches that bank imposter scams are creating for both financial institutions and their customers. Many of these scam attempts can be stopped by the customers themselves, if they have been properly educated by their bank on how to detect these scam communications.”

Urgent Language

The Authority Hacker report found that the costliest AI scams are investment-related. Roughly three-quarters of investment fraud victims lost money, with an average loss of nearly $55,000. Imposter scams, which include business impersonation and romance scams, are the second most costly type of AI scam.

Although those scams are more expensive, the most frequent forms of AI scam are online shopping and negative review scams. Online shopping scams are particularly prevalent because AI makes it easy for cybercriminals to create convincing images of fake products and sell them.

AI also makes criminals’ messaging more effective through deepfakes and voice cloning that mimic aspects of an individual’s identity. Criminals typically couple that technology with manipulation tactics.

“Many times, a criminal relies on urgent language to prompt an immediate knee-jerk response by the consumer to click a link,” Sando said. “For example, the text may indicate that fraud was detected on the customer’s account, and they can verify the transaction by clicking a link included in the text. That link may install malware used to transfer information to the criminal that they can use to perpetrate further fraud-related crimes.”

Recognizing Patterns

Though it might seem like the elderly would be most at risk from AI scams, the report found that consumers between 30 and 39 were the most likely to fall victim to one. One reason could be that adults older than 60 are less engaged with social media and the sites where many AI scams originate. However, older adults are also generally less likely to report fraud.

Because of the threat AI scams pose, financial institutions must educate their customers on how to detect and respond to them. For instance, banks should inform consumers that they shouldn’t respond to text or email messages directly but should instead reach out to the business in question to confirm whether the message is legitimate.

“In addition, financial institutions should employ AI themselves,” Sando said. “It can do the heavy lifting in detecting these kinds of scams before the interaction and transaction goes beyond the point of no return.

“With AI and real-time scam detection, financial institutions can use vital consumer data to recognize patterns and instances where certain behaviors aren’t in line with how their customer normally behaves and transacts. This allows for critical intervention before a transaction is completed, saving the customer from sending money to a criminal and quite possibly never seeing those funds again.”
