The deepfake danger: When it wasn’t you on that Zoom call

Deepfakes pose a real threat to security and risk management, and the problem will only worsen as the technology matures and bad actors gain access to malicious offerings such as deepfakes as a service.

In August, Patrick Hillman, chief communications officer of blockchain ecosystem Binance, knew something was off when, scrolling through his crowded inbox, he found six messages from clients about recent video calls with investors in which he had allegedly participated. “Thanks for the investment opportunity,” one said. “I have some concerns about your investment advice,” another wrote. Others complained that the video quality wasn’t very good, and one even asked outright: “Can you confirm the Zoom call we had on Thursday was you?”

With a sinking feeling in his stomach, Hillman realized that someone had deepfaked his image and voice well enough to hold 20-minute “investment” Zoom calls trying to convince his company’s clients to turn over their Bitcoin for scammy investments. “The clients I was able to connect with shared with me links to faked LinkedIn and Telegram profiles claiming to be me, inviting them to various meetings to talk about different listing opportunities. Then the criminals used a convincing-looking hologram of me in Zoom calls to try and scam several representatives of legitimate cryptocurrency projects,” he says.

As the world’s largest crypto exchange, with $25 billion in volume at the time of this writing, Binance deals with its share of investment scams that try to capitalize on its brand and steal people’s crypto. “This was a first for us,” Hillman says. “I see it as a harbinger of what we think is the future of AI-generated deepfakes used in business scams, but it is already here.”

The scam was so novel that, had astute investors not detected oddities and latency in the videos, Hillman might never have known about the deepfake video calls, despite the company’s heavy investments in security talent and technologies.