Deepfake scams are becoming increasingly common. According to Bloomberg, in the United States alone, consumers lost almost $8.8 billion last year, up 44 percent from 2021. While scammers have honed their craft, there are red flags to look out for and precautions you can take to avoid falling victim to a sneaky deepfake scam. Here are some important tips to keep in mind.
AI-Generated Voices Can Fool Parents
Bloomberg notes that computer-generated children’s voices, “so realistic they fool their own parents,” are commonly used in deepfake scams. Parents receive calls in which their child’s voice has been cloned with AI to sound indistinguishable from the real thing. Dubbed “social engineering scams,” these schemes have the highest hit rates and generate some of the quickest returns for fraudsters.
“Cloning a person’s voice is increasingly easy. Once a scammer downloads a short sample from an audio clip from someone’s social media or voicemail message—it can be as short as 30 seconds—they can use AI voice-synthesizing tools readily available online to create the content they need,” they explain.
How to Defend Yourself
The Better Business Bureau offers several tips, starting with paying close attention to the videos you are sent. “Poor quality deepfakes are easy to identify. Look for isolated blurry spots in the video, double edges to the face, changes in video quality during the video, unnatural blinking or no blinking, and changes in the background or lighting. If you notice any of these telltale signs, you’re probably looking at a deepfake video,” they say.
Similarly, listen closely to the audio. “Fake audio might include choppy sentences, unnatural or out-of-place inflection, odd phrasing, or background sounds that don’t match the speaker’s location. These are all signs of fake audio,” they say.
Confirm the Identity
And don’t believe everything you see online. “Scammers count on you to take them at their word without verifying their identity. Always use a healthy dose of skepticism when contacted by a person or company if you can’t validate who they really are. Be wary of videos featuring celebrities or politicians that are especially divisive or scandalous,” they say.
Another tip? “Make sure you know who you are talking to,” the BBB says. “As deepfake technology progresses, you’ll need to confirm the identity of who you are speaking with – even if you think you know and trust them.” While you probably wouldn’t send money to a stranger who calls you out of the blue, “if scammers start using deepfakes to impersonate your loved ones, falling victim could be easier,” they point out. “Pay attention if a friend or family member makes an out-of-character request, and confirm their identity before sending money or giving up sensitive personal information.”
Stay Alert
They also stress the importance of being careful about what you post online. “The only way a scammer can make a deepfake video of you is if they have access to a selection of photos and videos featuring your face. Stay alert to the possibility of impersonation. Make sure your family knows about deepfakes, and use caution when posting things publicly,” they say.
Another piece of advice? Don’t make financial decisions based on viral videos. “If a celebrity insists you invest in Bitcoin or donate funds to a specific charity in a viral video, do some research before you send money. Scammers would love to get their hands on your money by impersonating someone you trust,” they say.