Scams are getting scarily advanced nowadays - whether it's mimicking the sound of a loved one's voice or spoofing a friend's phone number, all to get their hands on your hard-earned cash.
In fact, artificial intelligence (AI) has completely changed the scamming game - making it a whole lot harder to tell what's real from what's fake online.
Luckily, there are plenty of experts on hand to help you be as vigilant as possible against potential scammers.
Cybersecurity expert Paul Bischoff, Consumer Privacy Advocate at Comparitech, has spoken to the US Sun - picking out three key signs that you could be speaking to an automated artificial intelligence chatbot.
"It's getting increasingly difficult to distinguish between real people and AI chatbots," Bischoff said.
"This is especially true when it comes to customer service-related conversations in which human representatives are often limited to pre-made canned responses."
So, what do you need to look out for, to deduce whether you might be speaking to an AI-generated bot?
Well, firstly, Bischoff told the US Sun: "Look out for repetitive responses that seem to lack humor and empathy." So, if they're saying the same thing over and over - and don't seem to be reacting in a normal, human-like way to what you're saying - alarm bells should start ringing.
Next, Bischoff warned: "Bots usually have impeccable spelling and grammar but clunky, wooden phrasing."
And the third indication that you might be speaking to a robot, rather than a real person?
"Consistently fast response times are another sign," Bischoff said. Most humans need time to think and respond to what you're saying - whereas a bot likely has an armory of replies ready to shoot off at any second.
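For anyone curious how those warning signs might look in practice, here is a minimal sketch that scores a chat transcript against two of the three signs - repetition and consistently fast replies. The function name, thresholds, and scoring approach are all assumptions made for illustration, not a real detection tool, and the second sign (perfect spelling with wooden phrasing) is left out because it is hard to capture in a few lines of code.

```python
from collections import Counter

def bot_warning_signs(messages, response_times_sec):
    """Flag bot-like behavior in a chat transcript (illustrative sketch only).

    messages: the other party's messages, as a list of strings
    response_times_sec: seconds the other party took before each reply
    Returns a list of triggered warning-sign labels.
    """
    signs = []

    # Sign 1: repetitive responses - the exact same message sent more than once.
    counts = Counter(m.strip().lower() for m in messages)
    if any(n > 1 for n in counts.values()):
        signs.append("repetitive responses")

    # Sign 3: consistently fast replies - every response arrives in under
    # 2 seconds (a threshold chosen arbitrarily for this sketch; humans
    # usually need time to think).
    if response_times_sec and all(t < 2.0 for t in response_times_sec):
        signs.append("consistently fast response times")

    return signs

# Example: a suspect transcript that repeats itself and replies instantly.
msgs = ["Please verify your account.", "Please verify your account.", "Thanks."]
times = [0.4, 0.6, 0.5]
print(bot_warning_signs(msgs, times))
# → ['repetitive responses', 'consistently fast response times']
```

A real chatbot detector would need far more nuance than this, but the sketch shows how the expert's checklist translates into measurable behavior.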
Scammers aren't just using chatbots to dupe unsuspecting victims - there's also been a terrifying rise in AI mimicking a loved one's voice over the phone.
These are known as 'voice clones' - and most often sound like a friend or family member calling you in a bind, in desperate need of some cash.
These scams are even harder to detect - but according to the US Sun, there are a few ways to keep yourself safe. You could set up a 'safe word' with close friends and family ahead of time, something they know to use if they're calling you and asking for money.
If you haven't set up a safe word, you can quiz the person on the phone about a personal memory - something a bot is unlikely to be able to answer correctly.
In all instances, be very, very careful when anyone is asking you for money - particularly when they want you to send it via nontraditional routes, like cryptocurrency or gift cards.