How a Family Password Can Protect Against AI Deepfakes

AI deepfakes have become entrenched in public life, fueling deceptive political campaigns and fabricated celebrity images passed around as entertainment. Now these fake voices and videos threaten to breach the private domain of families, with scammers using them to steal financial data and other sensitive information.
As highlighted in our previous coverage, recognizing the scale of the threat is the first step cybersecurity professionals recommend. It's hard for most people to fathom that the voice on the other end of a call could be a deepfake of a loved one, or that a face could be swapped in real time during a family video call.

However, experts at SocialProof Security, a firm of self-described "ethical hackers" focused on cybersecurity education, advocate a simple yet potent scam deterrent: a family password.

How do you shield your family from these scams? SocialProof Security CEO Rachel Tobac has devised guidelines for navigating the pitfalls of AI deepfakes:

  1. Select a password that's seldom used in your family's everyday language and unlikely to come up in casual conversation.
  2. Keep it simple: there's no need for complex phrases or sentences; a single word will do.
  3. Store this verification word securely, ideally by memorizing it. Don't tell anyone outside the family that a password exists, let alone hint at the word itself.
  4. Refrain from using pet names as the password.
  5. Avoid words linked to your profession or hobbies. 
"If your company is a superhero-themed company and your passcode is ‘Batmobile’, that's probably something I could guess," Tobac notes.
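
For families who want a quick sanity check on a candidate word, the guidelines above can be roughed out in a few lines of code. The sketch below is purely illustrative and is not part of Tobac's guidance; the function name and the example word lists are hypothetical placeholders you would fill in with your own family's context.

```python
# Illustrative only: a rough screen for obviously guessable family passwords,
# loosely following the guidelines above. All word lists are made-up placeholders.

def screen_candidate(word, pet_names, work_terms, hobby_terms):
    """Return a list of reasons a candidate family password looks guessable."""
    w = word.strip().lower()
    reasons = []
    if " " in w:
        reasons.append("use a single word, not a phrase")
    if w in {name.lower() for name in pet_names}:
        reasons.append("matches a pet name")
    if w in {t.lower() for t in work_terms}:
        reasons.append("linked to your profession")
    if w in {t.lower() for t in hobby_terms}:
        reasons.append("linked to a hobby")
    return reasons


if __name__ == "__main__":
    # Example values are invented for illustration.
    problems = screen_candidate(
        "Batmobile",
        pet_names=["Rex", "Whiskers"],
        work_terms=["batmobile", "batcave"],  # e.g. a superhero-themed employer
        hobby_terms=["guitar", "chess"],
    )
    print(problems or ["no obvious red flags (still keep it secret and memorized)"])
```
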
Additionally, make sure your children know the chosen word and understand its significance and purpose. With kids engaging with gadgets from a young age, scammers may well target younger family members, attempting to use an AI-synthesized voice of a parent or a video deepfake to coax a child into divulging financial or other sensitive information.

And with that, the ball is back in the bots' court.