Cybersecurity experts have issued a warning about an AI voice scam in which attackers use AI voice synthesis to mimic the voice of a victim’s loved one and trick the victim into sending money. According to a report, this type of scam is becoming increasingly common: a survey conducted by the Better Business Bureau found that it was the most commonly reported scam in the US in 2022, accounting for 23% of all reports.
The scam begins with the attackers gaining access to the target’s social media profiles. From these they gather material such as voice recordings and details about the target’s family and friends, which they use to create a synthetic voice imitating a loved one. The attackers then call the victim and, using the AI-generated voice, impersonate that loved one and request money. According to the report, these scams are difficult to detect and can be highly convincing, making them an effective tool for cybercriminals. The report advises people to be wary of such scams and to take extra precautions when receiving unexpected phone calls.
Read more at: https://www.helpnetsecurity.com/2023/05/08/ai-voice-scam/