
Scammers use AI to mimic voices of loved ones in distress

2024-12-19

Artificial intelligence is making phone scams more sophisticated — and more believable. Scam artists are now using the technology to clone voices, including those of friends and family.

The disturbing trend is adding to mounting losses from fraud. Americans lost nearly $9 billion to fraud last year alone, an increase of more than 150% in just two years, according to the Federal Trade Commission.

The AI scam, which uses a computer-generated voice, has left a trail of emotional devastation. Jennifer DeStefano, a mother, recounted during a U.S. Senate hearing her terrifying encounter with scammers who used the voice of her 15-year-old daughter and claimed they had her.

"Mom, these bad men have me. Help me, help me, help me," DeStefano said she was told over the phone. 

But her daughter was safe in her bed. 

Kathy Stokes, AARP's director of fraud prevention, said younger people actually experience fraud and financial loss more often than older people, but it is the older generation that often has the most to lose.

Pete Nicoletti, a cybersecurity expert at Check Point Software Technologies, said common software can recreate a person's voice after just 10 minutes of learning it.

To protect against voice cloning scams, Nicoletti recommends families adopt a "code word" system and always call a person back to verify the authenticity of the call. Additionally, he advises setting social media accounts to private, as publicly available information can be easily used against individuals.

