Original Claim

Deepfake videos are being used to impersonate financial executives online, promoting fraudulent schemes and misleading statements attributed to real individuals.

2 months ago

Context by Compass

The claim that deepfake videos are being used to impersonate financial executives online, promoting fraudulent schemes and misleading statements attributed to real individuals, is supported by multiple sources. Deepfake technology, which combines AI-generated video with voice cloning, has been used to impersonate executives in financial fraud schemes. In one case in Hong Kong, a finance worker transferred $25 million after a video call with deepfake avatars of company executives (PYMNTS). Similarly, a deepfake video of financial analyst Michael Hewson was used to promote a fraudulent investment scheme (Ironscales). Cybersecurity experts regard the technology as a significant threat, and the European Union's cybersecurity agency has highlighted the rise in deepfake impersonation attacks (Irish Times). These incidents underscore the growing sophistication of deepfake-enabled financial fraud and the need for businesses to adopt robust verification protocols and employee training to mitigate the risk.