Criminals use deepfake audio, video, or voice-cloning technology to impersonate trusted individuals (e.g., executives, account holders) and trick financial institutions or victims into releasing funds. These schemes exploit improvements in generative AI that can replicate voices with up to 95% accuracy. By crafting highly realistic facsimiles of a legitimate person’s voice or videoconference presence, adversaries initiate unauthorized transactions or extract sensitive details without raising immediate suspicion. They often pair these deepfake methods with text-based generative tools to enhance the credibility of urgent requests or high-level instructions, and some actors also misuse deepfake capabilities to bypass voice-based identity checks on money mule accounts. Although a few advanced detection solutions can differentiate genuine from spoofed recordings by identifying unique acoustic signatures, many victims and financial institutions remain vulnerable because of insufficient security measures. This method typically serves as a direct scam that generates illicit proceeds through deception, rather than a technique for placing preexisting criminal funds into the financial system.
Deepfake Impersonation
Impersonation Fraud Using Deepfake Technology
Tactics
Deepfake impersonation is used as a direct scam mechanism to generate new criminal proceeds by deceiving victims or financial personnel into releasing funds under false pretenses.
Risks
This technique exploits phone or digital channels that rely on insufficient or purely voice-based authentication measures, enabling highly realistic deepfake impersonation attacks. By cloning voices or videos of trusted individuals, criminals bypass standard identity checks and directly deceive financial personnel or automated systems. This channel vulnerability is central to successful impersonations, leading to unauthorized fund withdrawals or account access under false pretenses.
Indicators
Multiple unrelated customer accounts receive similarly styled voice or video calls exhibiting the same speech patterns or facial image distortions, suggesting a coordinated impersonation scheme.
High-value wire transfers initiated via a voice call purportedly from an account holder who has no history of similar transactions and provides no secondary verification.
A voice or video call displays digital artifacts, unnatural speech patterns, delayed responses, or mismatched lip movement that deviate from previously recorded calls of the genuine individual.
Requests for urgent fund disbursements originate from newly registered domains or unfamiliar phone numbers not previously associated with the genuine account holder.
A caller claiming to be a senior executive insists on bypassing standard security protocols and provides plausible-sounding reasons for skipping required identity checks.
Voice-based authentication or biometric checks are passed despite discrepancies with the known voice profile or previously recorded samples for that account holder.
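Several of the indicators above lend themselves to rule-based screening. The sketch below counts triggered indicators for a voice-channel transfer request; all field names and the high-value threshold are illustrative assumptions, not a reference to any specific vendor schema or institution's rules.

```python
from dataclasses import dataclass

@dataclass
class VoiceTransferRequest:
    # Illustrative fields; real systems would draw these from case-management data.
    amount: float
    channel: str                  # e.g. "voice", "video", "branch"
    caller_number_known: bool     # number previously associated with the account holder
    secondary_verification: bool  # out-of-band confirmation completed
    prior_similar_transfers: int  # count of comparable historical transactions
    requested_bypass: bool        # caller asked to skip standard identity checks

def indicator_score(req: VoiceTransferRequest, high_value: float = 50_000) -> int:
    """Count how many deepfake-impersonation indicators a request triggers."""
    score = 0
    if req.channel in ("voice", "video"):
        if req.amount >= high_value and req.prior_similar_transfers == 0:
            score += 1  # high-value transfer with no comparable history
        if not req.secondary_verification:
            score += 1  # no out-of-band confirmation
        if not req.caller_number_known:
            score += 1  # unfamiliar contact point
        if req.requested_bypass:
            score += 1  # pressure to skip identity checks
    return score

req = VoiceTransferRequest(amount=250_000, channel="voice",
                           caller_number_known=False,
                           secondary_verification=False,
                           prior_similar_transfers=0,
                           requested_bypass=True)
print(indicator_score(req))  # 4 -> escalate for manual review
```

In practice a score like this would feed an alert queue rather than an automatic block, since several indicators (e.g., urgency) also occur in legitimate requests.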
Data Sources
- Collects device fingerprints, IP addresses, login timestamps, and other cybersecurity signals within digital banking channels.
- Correlating these with suspicious voice-based transactions helps uncover deepfake impersonation by spotting anomalous device or network usage.
- Provides publicly available data on phone numbers, domain registrations, IP addresses, and other digital footprints.
- Enables verification of newly registered domains or unrecognized phone numbers used in deepfake impersonation attempts, helping flag suspicious contact points not associated with legitimate account holders.
- Maintains a detailed record of all financial transactions, including timestamps, amounts, account identifiers, and transaction sources.
- Allows for the correlation of unusual or high-value fund movements with potentially fraudulent voice requests, aiding in the detection of out-of-pattern transactions that may result from deepfake instructions.
- Contains records of authentication events, user logins, and network activity, including timestamps, IP addresses, and session details.
- Assists in detecting anomalous successful voice-based authentication attempts or unusual logins that may reveal bypassed security checks due to deepfake impersonation.
- Consolidates histories of known or suspected fraud incidents, including impersonation scams, associated methods, and impacted accounts.
- Helps investigators identify repeat or correlated deepfake-based scams and cross-reference impersonation techniques across multiple customer accounts or institutions.
- Captures audio/video call logs, transcripts, and messaging activity, including timestamps, caller/recipient details, and any recorded content.
- Enables comparison of suspected deepfake calls with known legitimate recordings to identify discrepancies in voice, speech patterns, or video artifacts, supporting the detection of impersonation attempts.
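The correlation these data sources support can be sketched as a simple join between communication records and transaction records: flag high-value transfers that closely follow a call from a number not on file. The records, time window, and threshold below are invented for illustration.

```python
from datetime import datetime, timedelta

# Illustrative records; in practice these would come from the data sources above.
calls = [
    {"account": "A-100", "ts": datetime(2024, 7, 1, 9, 0),
     "number": "+44700000001", "number_on_file": False},
    {"account": "A-200", "ts": datetime(2024, 7, 1, 10, 0),
     "number": "+44700000002", "number_on_file": True},
]
transfers = [
    {"id": "T-1", "account": "A-100", "ts": datetime(2024, 7, 1, 9, 20), "amount": 180_000},
    {"id": "T-2", "account": "A-200", "ts": datetime(2024, 7, 1, 15, 0), "amount": 500},
]

def correlate(calls, transfers, window=timedelta(hours=1), high_value=50_000):
    """Flag high-value transfers shortly after a call from a number not on file."""
    flagged = []
    for t in transfers:
        for c in calls:
            if (t["account"] == c["account"]
                    and not c["number_on_file"]
                    and timedelta(0) <= t["ts"] - c["ts"] <= window
                    and t["amount"] >= high_value):
                flagged.append(t["id"])
    return flagged

print(correlate(calls, transfers))  # ['T-1']
```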
Mitigations
Configure targeted alerts for sudden large-value wire transfers initiated by phone voice requests from individuals without a prior transaction history of similar scale. Automatically pause these transfers pending further authentication to detect and prevent deepfake impersonation attempts that exploit unusual or urgent transactions.
Implement robust identity verification for high-risk phone or video instructions by requiring multi-factor authentication or advanced voice biometrics with spoof detection that deepfake technology cannot easily replicate. By validating authenticity beyond voice alone, institutions can thwart impersonation attempts seeking unauthorized fund releases.
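At its simplest, the voice-biometric check described above compares an enrolled voiceprint with an embedding of the live sample. The vectors and threshold below are invented for illustration; crucially, a high similarity score alone does not prove liveness, which is why production systems pair this comparison with dedicated spoof (presentation-attack) detection.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def voice_check(enrolled, live, threshold=0.85):
    """Accept only if the live sample closely matches the enrolled voiceprint.
    NOTE: a high-quality deepfake clone can also score well here, so this
    check must run alongside spoof/liveness detection, never alone."""
    return cosine_similarity(enrolled, live) >= threshold

enrolled = [0.9, 0.1, 0.4]     # hypothetical enrolled voiceprint
genuine  = [0.88, 0.12, 0.41]  # live sample close to the enrolled profile
imposter = [0.1, 0.9, 0.2]     # sample far from the enrolled profile
print(voice_check(enrolled, genuine), voice_check(enrolled, imposter))  # True False
```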
Train frontline and supervisory staff to recognize potential deepfake indicators, such as unnatural vocal timbre, suspicious digital artifacts, inconsistent lip-sync on video calls, and urgent requests to bypass normal checks. Instruct employees to pause and verify large or unusual instructions through official internal channels or secondary confirmations.
Inform and remind customers, including corporate account holders, about deepfake scams and the importance of verifying large or urgent transfer requests. Provide specific guidance on confirming identity through known official channels or multiple points of contact before complying with unusual instructions.
Require out-of-band confirmation or dual authorization for high-value or urgent transactions initiated by phone or video. This measure ensures that no single voice or video call can approve a wire transfer without additional scrutiny or authentication through an alternate, secured channel.
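The dual-authorization requirement above can be modeled as a transfer that is released only once approvals arrive over two distinct channels. The channel names and threshold are illustrative assumptions.

```python
class PendingTransfer:
    """A transfer released only after approvals arrive over the required
    number of distinct channels (e.g. the originating call plus a callback
    or app confirmation). Channel names are illustrative."""

    def __init__(self, amount, threshold=50_000):
        self.amount = amount
        self.threshold = threshold
        self.approvals = set()  # distinct channels that have confirmed

    def approve(self, channel):
        self.approvals.add(channel)

    def releasable(self):
        # Low-value transfers need one approval; high-value need two channels.
        required = 2 if self.amount >= self.threshold else 1
        return len(self.approvals) >= required

t = PendingTransfer(amount=200_000)
t.approve("voice_call")       # the instruction itself
print(t.releasable())         # False: a single voice call is not enough
t.approve("mobile_app_push")  # out-of-band confirmation on a second channel
print(t.releasable())         # True: two independent channels confirmed
```

Because approvals are tracked as a set of channels, repeating the same voice call cannot satisfy the second-channel requirement.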
Instruments
- Criminals use deepfake audio or video to impersonate legitimate account holders, such as executives or customers, tricking bank personnel into releasing funds or approving transfers.
- The highly realistic voice or visual imitation bypasses typical verification measures, like voice-based identity checks or personal Q&A, leading staff to believe they are dealing with the genuine account owner.
- This deception enables unauthorized withdrawals or wire transfers, directly generating illicit proceeds under false pretenses.
Services & Products
- Perpetrators posing as genuine owners of money mule or customer accounts use deepfake voice technology to pass call-based identity checks.
- Rapid cross-border transfers are initiated under false pretenses, leveraging realistic-sounding requests to evade suspicion.
- Deepfakes allow them to sidestep typical red flags, as staff believe they are speaking with the legitimate account holder.
- Fraudsters impersonate legitimate customers via deepfake audio or video calls to support teams, requesting password resets or security overrides.
- They exploit reliance on voice-based authentication or minimal remote verification measures.
- Once access is granted, criminals initiate unauthorized transactions or harvest sensitive account details with limited scrutiny.
- Criminals use deepfake voice calls to impersonate legitimate account holders or executives, instructing financial institutions to authorize large wire transfers.
- Voice-based identity checks are bypassed, as staff or automated systems are deceived by convincingly replicated speech patterns.
- Urgent or high-value requests are framed with advanced generative AI, reducing suspicion and prompting immediate release of funds.
Actors
They orchestrate deepfake-based impersonation by cloning the voices or videos of legitimate account holders, executives, or money mule holders to request unauthorized fund transfers or gain account access. This exploitation deceives financial staff or automated systems, enabling the release of illicit proceeds under false pretenses.
Criminals use deepfake calls or video chats to impersonate legitimate bank customers, prompting staff to reset passwords, override security settings, or approve high-value disbursements. This impersonation bypasses standard authentication measures, exploiting trust in familiar account holders.
Perpetrators clone the voices of money mule account holders to pass call-based identity checks, initiating unauthorized transactions under the false pretense of legitimate ownership. This tactic circumvents typical red flags associated with money mule activity by leveraging realistic-sounding verification calls.
Criminals impersonate senior or executive-level officials through deepfake audio or video calls, issuing high-priority fund transfer requests. The realistic imitation of authoritative figures convinces financial institutions to expedite transactions without secondary verification, creating an avenue for fraud.
References
O'Brien, V., & Turner, E. J. (2024, July). SIA - SARS in Action Magazine, Issue 26. https://spoti.fi/3xRfMHC