AI BEING USED FOR SCAM CALLS
Unfortunately, scams targeting older adults have become the norm. Fraudsters gain access to their personal information, either by mining social media or purchasing data on the Dark Web, and craft call scenarios designed to frighten grandparents. The scammer often impersonates a grandchild or another close relative in a crisis situation, such as an arrest, and asks for immediate financial assistance, for example purported bail money. These fraudsters can also "spoof" the caller ID to make the incoming call appear to come from law enforcement or a law firm.
The alarming news is that AI, or artificial intelligence, has elevated the grandparent scam to another level. Fraudsters can now "mimic voices, convincing seniors that their loved ones are in distress," according to a recent Washington Post article. The piece reports that scammers can clone a voice from just a short audio sample, then use AI tools to hold a conversation in that voice, which "speaks" whatever the imposter types. Experts warn that the most prominent danger of artificial intelligence is its ability to blur the line between fact and fiction, handing criminals effective and inexpensive tools. Recent phone scams using AI-based voice-cloning tools that are readily available online have US authorities concerned. According to Blackbird, AI-generated voices have become almost impossible to distinguish from a real human voice, allowing ill-intentioned people, such as fraudsters, to extract information and money from victims far more effectively than they otherwise could.