How low-tech safeguards can protect you from hi-tech AI scams


Job offer scams have increased dramatically over the past few years, with the Federal Trade Commission reporting that victims' financial losses grew from $90 million in 2020 to half a billion dollars last year …

Deepfake videos are increasingly common, and while scammers are using hi-tech AI methods, you can protect yourself with some very low-tech safeguards.

Job offer scams are usually geared toward identity theft. You’ll receive an approach, often through a legitimate-looking LinkedIn profile, offering you an interview. At some point during the fake hiring process, you’ll be asked to verify your ID using something like your driving license, and that will then be used to take out credit cards and loans in your name.

There are companies out there fighting AI-powered scams with AI-powered detection software, but a Wired piece says that many people are turning to much simpler forms of verification.

Some corporate professionals are turning instead to old-fashioned social engineering techniques to verify every fishy-seeming interaction they have.

Welcome to the Age of Paranoia, when someone might ask you to send them an email while you’re mid-conversation on the phone, slide into your Instagram DMs to ensure the LinkedIn message you sent was really from you, or request you text a selfie with a time stamp, proving you are who you claim to be. Some colleagues say they even share code words with each other, so they have a way to ensure they’re not being misled if an encounter feels off.

Experts say that simple methods like this can be very effective, and both recruiters and candidates are using them. For example, in a genuine interview you might be asked a series of seemingly random questions, such as naming your favorite local coffee shop – a simple test of whether you really do live in the city shown on your résumé.

Another step either side can take is to ask the other person to use their phone camera to take a live photo of the laptop being used for the call. This will reveal whether the video feed is genuine or generated by deepfake software.

9to5Mac’s Take

Deepfake video tech in particular makes it easier than ever for a scammer to convincingly impersonate a friend, family member, or anyone else. If you receive an unexpected request for help, especially anything involving money, always contact the person through another channel to verify it. A growing number of families are agreeing on code words to be used if a member really is in trouble.

Unsolicited approaches from recruiters may be genuine, but they should definitely put you on full alert. Always contact a company's HR department independently, via the contact number on its website, to verify that an approach or offer is real before supplying any personal data.

Photo by Chris Montgomery on Unsplash
