Aug 06, 2024
Cyber Kidnapping Poses a Growing Threat
Technology is advancing rapidly, and things that would have seemed like science fiction just a few years ago have now become a common part of everyday life.
- Generative AI is capable of producing text, images, video, or audio.
- Deepfakes make it look like real people are doing and saying things they never did or said.
- Voice cloning can convincingly replicate anyone’s voice.
In the wrong hands, these technologies can be incredibly dangerous, and it’s not just consumers who are at risk. Many of these AI social engineering scams target businesses and their employees.
It’s Now Easier Than Ever to Scam Anyone
According to Security Intelligence, new technology means the barrier to entry for bad actors who want to carry out social engineering scams has never been lower. Criminals can now use deepfake and voice cloning tools in their schemes.
In some instances, they contact the call centers of major financial institutions, convincingly impersonating customers who need to access their accounts. In other scenarios, they contact company employees, posing as company executives or clients who are authorizing financial transactions. Businesses in any industry, not just the financial sector, can be targeted.
According to Regula, 37% of global businesses have already been targeted with deepfake and voice cloning fraud.
- In one example, The Hacker News says employees at software development company Retool were targeted with fraudulent texts from someone claiming to be part of the IT team. The employees were instructed to click on what appeared to be a legitimate link to fix a payroll issue. One employee clicked the link and landed on a fraudulent page that requested credentials. The hackers then used AI to clone the actual IT worker's voice to obtain the multifactor authentication code needed to gain access. In the end, the hackers were able to access 27 customer accounts and steal $15 million worth of cryptocurrency.
- In another case, CNN says employees sent $25 million to fraudsters who used deepfake technology to impersonate the company’s chief financial officer and several other employees during a video conference call.
- According to The Guardian, scammers posed as the CEO of WPP, an advertising group, by creating a WhatsApp account with a publicly available image and using it to set up a Microsoft Teams meeting. The scammers then used voice cloning and footage taken from YouTube to impersonate the CEO. All this was done to try to trick an employee into setting up a new business in order to steal money and information, but thankfully, the ploy was not successful.
In another terrifying example, CNN reports that a mother was tricked into thinking her daughter had been kidnapped because the daughter’s voice had been cloned.
If voice cloning technology is good enough to convince a mother that she’s talking to her own daughter, it’s certainly good enough to convince your employees that they’re talking to their boss or a client.
Is Your Company Prepared for the Onslaught of AI-Powered Social Engineering Scams?
This isn’t a problem that’s likely to go away, so companies need to be vigilant. Here are some steps you can take to reduce the risk:
- Educate employees and contractors. By now, everyone is familiar with phishing scams. However, they may not realize scammers are using voice cloning and deepfake technology to pose as real people. They need to know that they can't necessarily trust a familiar voice over the phone or even a familiar face on a video call. Train everyone to be suspicious of unexpected requests for wire transfers or sensitive information and to verify all such requests through a separate, trusted channel.
- Employ multifactor authentication. Hackers may invent excuses to ask for MFA codes, so train employees never to share them with anyone. Consider implementing additional verification methods, such as biometric verification.
- Review your insurance coverage. Some policies may exclude or place sub-limits on losses from social engineering schemes in which employees willingly (but under false pretenses) hand over money or sensitive information. Review your policies to determine what coverage is in place and whether you need more.
Are you looking for insurance against AI social engineering scams? Need coverage for your clients? Tangram offers cyber insurance with social engineering fraud coverage with a $100,000 sub-limit. Learn more.
Caroline Dougherty
AVP, Sr KRE Underwriter
(P) 707-981-4366