A new kind of cybercrime can use AI to "steal" your voice

It’s easy enough to forge a signature for fraudulent purposes. Until recently, though, some things, like our voices, were distinctive and difficult to mimic. Not so in our brave new world.

A new kind of cybercrime that combines artificial intelligence and voice technology is one of the unfortunate developments of postmodernity. As deepfake videos have shown, you can’t trust what you see; now, it seems, you can’t trust what you hear either. A $243,000 voice fraud case, reported by the Wall Street Journal, proves it.

In March, fraudsters used AI-based software to impersonate the chief executive of the German parent company of an unnamed UK-based energy firm. Calling the energy firm’s CEO on the phone, they tricked him into making a supposedly urgent transfer of funds. The CEO made the requested transfer to a Hungarian supplier and was then contacted again with assurances that the transfer would be reimbursed immediately. That too seemed believable.

However, when the reimbursement had yet to appear in the company’s accounts and a third call came from Austria, the caller again claiming to be the parent company’s chief executive and requesting another urgent transfer, the CEO became suspicious. Despite recognizing what seemed to be his boss’s voice, he realized something was amiss and declined to make the transfer.

Although the CEO had recognized the chief executive’s familiar accent and intonation, it turns out the boss wasn’t making the call. The funds transferred to Hungary were subsequently moved to Mexico and other locations, and authorities have yet to identify any suspects.

Rüdiger Kirsch, a fraud expert at the insurer Euler Hermes, which covered the victim company’s claim, tells the Journal that the insurer has never previously dealt with a claim stemming from losses due to AI-related crime. He says the police investigation into the affair is over and that the hackers used commercial voice-generating software to carry out the attack; he tested one such product himself and found that the reproduced version of his voice sounded real to him.

Certainly, law enforcement authorities and AI experts are aware of voice technology’s burgeoning capabilities, and of the high likelihood that AI is poised to become the new frontier for fraud. Last year, Pindrop, a company that creates security software and protocols for call centres, reported a 350% rise in voice fraud between 2013 and 2017, primarily targeting credit unions, banks, insurers, brokerages, and card issuers.

By pretending to be someone else on the phone, a voice fraudster can access private information that wouldn’t otherwise be available and use it for nefarious purposes. Feigning another person’s identity by voice is easier than ever, thanks to new audio tools and our increased reliance on call centres (as opposed to, say, going to the bank and talking to a teller face-to-face). As the tools for creating fakes improve, the chances of criminals using AI-based voice technology to mimic our voices and use them against us only grow.

Reposted from: https://my.oschina.net/u/3884088/blog/3101290
