Business Email Compromise (BEC) has been the dominant corporate fraud threat for years. Deepfake CEO fraud is its AI-era evolution — and it's substantially more convincing. Instead of an email impersonating an executive, attackers now conduct video calls using real-time or pre-recorded AI-generated video of the actual executive, making the deception visually credible in a way that text alone cannot achieve.
How Deepfake CEO Fraud Operates
Documented attack chains follow a consistent pattern:
- Target research: Attackers identify the organization, research the executive (CEO, CFO, board member), and collect source video from public sources — earnings calls, conference presentations, LinkedIn videos, media appearances
- Deepfake generation: Source video is processed through AI synthesis tools to create a realistic face-swap that can be run as a real-time filter or pre-recorded video segment
- Social engineering setup: Finance or accounting employees are contacted via email, phone, or messaging, often with a plausible context: an urgent acquisition, a confidential deal, a regulatory requirement
- The video call: The "executive" appears on video with a credible explanation for the urgency and directly instructs the employee to initiate a wire transfer
- Execution: The transfer is initiated to an attacker-controlled account. By the time the fraud is discovered, the funds have moved through multiple hops and are difficult to trace
The FBI's Internet Crime Complaint Center (IC3) has documented escalating BEC losses, and deepfake-assisted variants now account for a growing share of the highest-dollar cases. The FTC has also issued advisories on AI-assisted business impersonation fraud.
Notable Attack Characteristics
Documented deepfake CEO fraud attempts share these common characteristics:
- Urgency framing: The request is almost always framed as time-sensitive — a deal closing today, a regulatory deadline, a confidential pre-announcement
- Secrecy requirements: Employees are often told not to discuss the transfer with colleagues or normal approval chains
- Targeting of new or junior finance staff: Employees who are less familiar with executive communication patterns and less likely to challenge a directive
- Multi-channel reinforcement: The video call is sometimes preceded or followed by spoofed emails or text messages to reinforce credibility
- International transfer routing: Destination accounts are typically offshore or in jurisdictions with limited recovery cooperation
Detecting Deepfake Video in Real Time
Current deepfake technology is convincing but not perfect. Warning signs include:
- Unnatural blinking rate (too infrequent or mechanically regular)
- Slight halo or edge artifact around the face where it meets the background
- Inconsistent lighting — face illumination doesn't match the room
- Subtle lip-sync delays between audio and video
- Audio quality inconsistency — voice doesn't quite match the visual environment's acoustic properties
- Unusual camera angle or very static positioning
Importantly, these tells are becoming less reliable as the technology improves. Do not rely on visual detection alone: the process controls below are more dependable than human visual inspection.
Organizational Defenses
🔒 Out-of-Band Callback Verification
Any wire transfer request above a defined threshold must be verified through a separate communication channel using a known, pre-established phone number — not any number provided in the original communication. Call the executive's direct line from your organization's contact directory. This single control defeats the attack regardless of how convincing the deepfake is.
✅ Dual-Authorization for Wire Transfers
Require two independent approvals for any significant wire transfer, from two different people via two different channels. No single employee — regardless of seniority — should be able to unilaterally authorize a large transfer based on a single communication.
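The callback-verification and dual-authorization rules above can also be enforced in software rather than left to individual judgment. The sketch below is a hypothetical illustration of such a gate in a payments workflow; the threshold, field names, and channel labels are illustrative assumptions, not a real banking API or a specific vendor's feature.

```python
# Hypothetical policy gate: a high-value wire transfer is released only after
# (a) an out-of-band callback has been completed, and (b) two independent
# approvers have signed off via two different channels. All names/values are
# illustrative assumptions.
from dataclasses import dataclass, field

HIGH_VALUE_THRESHOLD = 25_000  # assumed policy threshold, in dollars


@dataclass
class Approval:
    approver: str  # employee ID of the approver
    channel: str   # e.g. "erp", "phone_callback", "in_person"


@dataclass
class TransferRequest:
    amount: float
    requester: str
    callback_verified: bool = False               # out-of-band callback done?
    approvals: list[Approval] = field(default_factory=list)


def can_release(req: TransferRequest) -> tuple[bool, str]:
    """Return (allowed, reason); block release until policy is satisfied."""
    if req.amount < HIGH_VALUE_THRESHOLD:
        return True, "below high-value threshold"
    if not req.callback_verified:
        return False, "missing out-of-band callback verification"
    # The requester cannot count as one of their own approvers.
    others = [a for a in req.approvals if a.approver != req.requester]
    approvers = {a.approver for a in others}
    channels = {a.channel for a in others}
    if len(approvers) < 2 or len(channels) < 2:
        return False, "need two independent approvers via two channels"
    return True, "policy satisfied"


req = TransferRequest(amount=250_000, requester="emp-104")
print(can_release(req))  # blocked: no callback verification yet

req.callback_verified = True
req.approvals = [Approval("emp-201", "erp"),
                 Approval("emp-305", "phone_callback")]
print(can_release(req))  # now satisfies the policy
```

The key design choice is that the gate is fail-closed: a convincing deepfake call can pressure one employee, but it cannot mark the callback as verified or supply a second approver on a second channel.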
📋 Verbal Code Word Protocols
Establish a verbal code word or phrase that executives use when making legitimate urgent requests — similar to the family code word system. Finance teams should be trained to request this phrase for out-of-band transfers, and executives should know to provide it proactively for legitimate urgent requests.
🎓 Regular Staff Training
Finance and accounting staff should receive regular training on deepfake CEO fraud patterns. Normalize the concept that questioning an urgent directive from the CEO is not insubordination — it's policy. No legitimate executive should object to a brief callback verification.
If your organization has been victimized, file a report with the FBI's IC3 immediately and contact your bank's wire recall team within hours, as the window for freezing funds is short. See the full recovery guide at AIScamRecovery.com.
Prevention resources at PreventAIScams.com — How to Verify AI vs Human.
Related Resources
- Recovery guide if you were targeted: If this scam hit you, here's how to recover.
- How to protect yourself from AI scams: Prevention tactics for the scams making headlines.
Frequently Asked Questions
What is deepfake CEO fraud?
Deepfake CEO fraud uses AI-generated video of a real executive in a video call to instruct finance employees to authorize large wire transfers. The deepfake makes the request appear to come from a trusted senior source.
How do organizations protect against deepfake CEO fraud?
Implement out-of-band callback verification for any wire transfer above a threshold. Require dual authorization. Establish verbal code words for urgent requests. Train finance staff to treat callback verification as mandatory policy, not optional.
What are the signs of a deepfake video call?
Unnatural blinking, facial edge artifacts, inconsistent lighting, lip-sync delays, and static camera positioning. However, process controls (callback verification) are more reliable than visual detection alone as technology improves.