Cybersecurity Tips Newsletter
June 10, 2025

Don’t Fall for Deepfakes!

Artificial intelligence (AI) is a powerful tool, but in the wrong hands it can be used for criminal purposes. A “deepfake” is a picture, video or audio recording that has been manipulated using AI. These media are crafted to convincingly impersonate another person’s likeness or voice and can depict an individual doing or saying something they never actually did or said.

Deepfakes are typically created with a form of AI called a generative adversarial network (GAN), in which two models are trained against each other: one generates fake media while the other tries to detect the forgeries, and both improve until the fakes become hard to distinguish from the real thing. Trained on existing pictures, video and audio, these systems can “map” or copy a person’s face and imitate their voice, allowing the user to fabricate convincing pictures and videos.
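
For readers curious about the underlying mechanics, the short Python sketch below illustrates the adversarial training loop at the heart of a GAN. It assumes the PyTorch library, and the model sizes and random placeholder data are purely illustrative, not drawn from any real deepfake system; actual deepfake generators are far larger and train on real images and audio, but the core idea of pitting a generator against a discriminator is the same.

    # A minimal sketch of a GAN training loop, assuming PyTorch is installed.
    # Model sizes and data are illustrative placeholders, not from any real system.
    import torch
    import torch.nn as nn

    latent_dim, data_dim, batch = 16, 64, 32

    # Generator: turns random noise into fake "media" vectors.
    G = nn.Sequential(nn.Linear(latent_dim, 128), nn.ReLU(), nn.Linear(128, data_dim))
    # Discriminator: scores how "real" a sample looks.
    D = nn.Sequential(nn.Linear(data_dim, 128), nn.ReLU(), nn.Linear(128, 1))

    g_opt = torch.optim.Adam(G.parameters(), lr=2e-4)
    d_opt = torch.optim.Adam(D.parameters(), lr=2e-4)
    loss_fn = nn.BCEWithLogitsLoss()

    for step in range(1000):
        real = torch.randn(batch, data_dim)       # stand-in for real training media
        fake = G(torch.randn(batch, latent_dim))  # the generator's current forgeries

        # Train the discriminator: label real samples 1 and generated samples 0.
        d_loss = (loss_fn(D(real), torch.ones(batch, 1))
                  + loss_fn(D(fake.detach()), torch.zeros(batch, 1)))
        d_opt.zero_grad()
        d_loss.backward()
        d_opt.step()

        # Train the generator: try to make the discriminator label fakes as real.
        g_loss = loss_fn(D(fake), torch.ones(batch, 1))
        g_opt.zero_grad()
        g_loss.backward()
        g_opt.step()

Over many iterations the generator gets better at fooling the discriminator, which is part of why finished deepfakes can be so hard to spot by eye.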

Deepfakes can be used to blackmail, harass or extort people. Don’t fall victim to these imposters; learn how to spot a deepfake and protect yourself from being scammed!

How Can You Spot a Deepfake?

  • Pay attention to small inconsistencies in the image or video itself, such as uneven lighting, unnatural movements or facial expressions, odd blinking patterns and other unusual tics.
  • If you need to verify that a piece of media is real, AI-powered software can break down and analyze images, videos or voice recordings to assess their legitimacy. For example, the “DeepFake-O-Meter” was created by the University at Buffalo and is supported by the UB Office of the Vice President for Research and Economic Development, as well as the National Science Foundation.

How Are Deepfakes Used in Scams?

  • Business Email Compromise (BEC) Scams: Cybercriminals use deepfakes to impersonate an organization’s executives in video calls or emails to request fraudulent payments, W-2 information or wire transfers.
  • Financial Scams: Deepfakes are used to promote fake investment opportunities or solicit donations to fake charities, often using the likeness of a public figure. In one recent case, an 82-year-old man lost more than $690,000 of his retirement savings after falling for a deepfake video of a famous tech billionaire.
  • Extortion Scams: Criminals create fake video or audio of a family member who appears to be in trouble and asks for money to get out of the fabricated situation. This is similar to “grandparent scams,” in which grandparents are called, told their grandchildren are in distress and asked to send money. Deepfakes can also be used to create fake pornographic media and extort victims by threatening to release it.
  • Phishing Scams: Deepfakes can be used to make phishing attempts more effective by convincingly impersonating trusted sources, coworkers, family, friends and more.

 

How to Protect Yourself

  • Be wary of requests for immediate action. Cybercriminals create a sense of urgency so that you act before you have time to think.
  • Take the time to educate yourself about current scams and fraud involving deepfakes.
  • Use strong passwords.
  • Enable multifactor authentication (MFA) on devices and accounts.
  • Verify sources of emails and texts.
  • If you suspect a deepfake, report it.

How Can You Report Deepfakes?

  • On social media, report suspicious activity directly to the platform. Review the platform's community guidelines, as they may have specific policies regarding deepfakes or manipulated media.
  • On websites, report suspicious activity directly to the website's administrator. Often, there will be a contact form for this purpose.
  • Report deepfakes directly to law enforcement. Contact your local FBI field office or the FBI’s CyWatch at CyWatch@fbi.gov, especially if you believe the incident involves illegal activity. You can also file a complaint with the Internet Crime Complaint Center (IC3), which allows members of the public to report cybercrimes, including deepfake-related offenses. Finally, you can report the incident to your local police department, particularly if the deepfake involves local criminal activity.

 

Cyber Habit of the Month

Traveling for summer vacation? Think before you post photos on social media so you don’t share more than you intended! Criminals scan social media for signs that a home is empty, which makes it a tempting target for theft. Consider waiting until you’re home to post vacation photos, and turn off any setting that automatically tags your location in the photos you share.

 

Additional Resources