In this post, I will discuss two applications of AI in fraud schemes. First, how criminals use emotion to trick us in the grandparent scam. Few things pull on the heartstrings more than family members in trouble, especially when the trouble involves grandchildren. Bad actors have capitalized on this by convincing a grandparent (usually through a phone call) that their grandchild has been kidnapped, injured in a car accident, arrested, or struck by some other crisis that requires an immediate payment of funds to resolve. The FBI’s 2023 Internet Crime Report indicates that victims have lost millions of dollars to this deceptive scheme.

Before Covid, I provided in-person presentations around the country on fraud topics. In more than a few places, I would hear comments from audience members about how neighbors, friends, and relatives had been victimized by the grandparent scam. The losses were mind-boggling – in many cases it was $5,000 or more paid to the criminal to resolve a crisis that, ultimately, did not exist.

Now that I am back doing in-person events, I hear the same stories from people who know victims, but with a twist. Criminals used to convince grandparents that their grandchild was in trouble through simple subterfuge: a scammer might use his own voice during a phone call, claiming to be sick or hurt to explain why it didn’t sound like the grandchild. Now, criminals harvest voice samples of children from social media sites like TikTok or Instagram and use AI to clone the voice, making it say anything the criminal wants it to say. The result can be far more convincing because it is the real voice of the supposed victim.

In a recent case, a 73-year-old grandmother received a phone call from a man who claimed to be a police officer and said her grandson had been arrested on drug charges and needed $10,000 in bail money. The supposed officer then passed the phone to the “grandson,” who urged his grandmother to post bail and said he was in serious trouble. Although the voice sounded like the grandson’s, it was not actually him on the phone.

The scammer managed to duplicate the grandson’s voice, right down to switching between English and Italian, as he does when speaking to his grandmother, even calling her “nonna” — the Italian word for grandmother. It’s believed the caller used AI to mimic the grandson’s voice in both languages. In a panic, the grandmother hung up and immediately withdrew $10,000 to save her grandson. She had been instructed to tell no one about what was happening, and, believing her grandson’s life was in danger, she complied. Police later said the grandmother was the victim of an artificial-intelligence-based phone scam designed to make her think she was speaking to her grandson, who urgently needed money.

It is helpful for people to understand that actual kidnappings are very rare in the United States. Given how unlikely it is to happen to a family member, some sage advice is to verify that a family member is really in trouble (kidnapped or facing another crisis) by calling that person at a number you know to be theirs. Criminals will often demand that you not make such a call; obeying that demand only facilitates the crime. If a call is made, it is most likely that the “victim” will be found to be safe, and not in jail, in a hospital, or tied up by a kidnapper. Another way to verify the authenticity of a crisis is to ask the caller for information that only the grandchild would know and that would not be available on social media. A pre-arranged family code word serves the same purpose.

The use of AI has made the grandparent scam more likely to succeed in extorting thousands of dollars from victims. It is a good idea to keep in mind these FBI guidelines pertaining to this scam:

  • Do not provide personal information or money to anyone you have only communicated with by telephone or online.
  • Be careful about what you post online. Scammers can use details shared on social media platforms and dating sites to legitimize their story.
  • Be suspicious of telephone calls in which the caller requests bail money for a family member in distress and demands that you act immediately. Contact the family member directly for confirmation.
  • Document any identifying information from the caller, such as name, phone number, and/or address. This can help law enforcement apprehend the scammers.

Another example of how AI is being used to augment fraud is a sophisticated type of robocall. With an understanding of human psychology and the ability to mimic human conversation, AI-driven systems can create convincing personas designed to build rapport and trust.

These calls use voice-synthesis technology to emulate real people, engaging potential victims in personalized conversations. The AI on the other end adapts its responses to the person’s speech patterns, producing more natural interactions and making it increasingly difficult to discern whether the caller is human or machine. Senior citizens are frequent targets because of their trust in authority figures and their limited exposure to the digital landscape’s complexities.

The rapid advancement of artificial intelligence (AI) has ushered in an era of unprecedented technological innovation, revolutionizing industries and reshaping the way we interact with the world. From healthcare to finance, education to entertainment, AI’s influence has been undeniable. Yet, with every powerful tool comes the potential for misuse, and AI is no exception. This threat is evolving, and we should all watch closely as AI is adapted for more nefarious uses.

With that in mind, we should also be cognizant of how AI can be used to detect and prevent fraud — for example, by spotting anomalies in transaction data more efficiently to curtail credit card fraud, or by better thwarting fraudulent robocalls and text messages. It may well become an arms race between those who use AI for evil and those who use it to stop evil!
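Real fraud-detection systems use far richer models and many more signals, but the core idea behind anomaly detection — flagging a transaction that deviates sharply from a cardholder’s normal pattern — can be illustrated with a toy sketch. Everything below (the purchase history, the threshold, the helper function) is invented for illustration, not any vendor’s actual system:

```python
# Toy illustration of anomaly-based fraud detection:
# flag a charge that sits far outside a cardholder's
# historical spending pattern, using a simple z-score.
import statistics

# Hypothetical history of a cardholder's recent charges (dollars).
history = [42.50, 38.00, 55.25, 47.10, 60.00, 35.75, 52.30, 49.90]

mean = statistics.mean(history)
stdev = statistics.stdev(history)

def is_anomalous(amount, threshold=3.0):
    """Return True if the charge is more than `threshold`
    standard deviations away from the historical mean."""
    return abs(amount - mean) / stdev > threshold

print(is_anomalous(48.00))    # typical charge -> False
print(is_anomalous(2500.00))  # extreme outlier -> True
```

A production system would score many features at once (merchant, location, time of day, device) with a learned model rather than one threshold, but the principle is the same: the unusual transaction gets flagged for review before the money leaves the account.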