As generative AI makes live voice and video cloning trivial, Google Meet has become a prime attack surface for social engineering — fake CFOs approving wire transfers, imposter candidates cheating on interviews, spoofed executives sitting in on confidential calls. Resemble AI integrates directly with Meet to score each participant's audio and video stream for deepfake likelihood in real time.
Detection runs continuously during the call, flagging suspicious participants to organizers without disrupting the meeting. Enterprise security teams can review flagged sessions, require re-authentication, or terminate the call — all backed by forensic logs that tie back to PerTh watermarks on authorized recordings.
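The organizer-side flow described above — per-frame scores ingested continuously, with suspicious participants flagged silently once confidence stays high — can be sketched as follows. This is a minimal illustration only: the class, method names, window size, and threshold are assumptions for the sketch, not Resemble AI's actual API.

```python
# Hypothetical organizer-side monitor. All names, thresholds, and the
# score feed are illustrative assumptions, not Resemble AI's real API.
from collections import defaultdict, deque

WINDOW = 30           # recent frame scores averaged per participant (assumed)
FLAG_THRESHOLD = 0.8  # rolling deepfake likelihood that triggers a flag (assumed)

class CallMonitor:
    def __init__(self, window=WINDOW, threshold=FLAG_THRESHOLD):
        self.window = window
        self.threshold = threshold
        self.scores = defaultdict(lambda: deque(maxlen=window))
        self.flagged = set()

    def ingest(self, participant: str, score: float) -> bool:
        """Record one per-frame score; return True if participant is flagged."""
        buf = self.scores[participant]
        buf.append(score)
        rolling = sum(buf) / len(buf)
        # Flag quietly once the rolling average crosses the threshold;
        # only organizers see the flag, so the call is never disrupted.
        if rolling > self.threshold:
            self.flagged.add(participant)
        return participant in self.flagged
```

Smoothing over a rolling window rather than flagging on a single frame keeps one noisy score from raising a false alarm mid-call.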
Every speaker in a Meet call gets a live deepfake confidence score, updated frame by frame throughout the session.
Resemble Detect analyzes both audio and video streams, catching voice cloning and face-swap attacks in the same pipeline.
Flag suspicious participants to meeting organizers without tipping off the attacker or disrupting the call.
Harden high-value calls — board meetings, deal negotiations, candidate interviews — against impersonation attempts.
Export per-participant detection results and recording watermark data for incident response and legal review.
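For the export step, a forensic record might bundle each participant's peak score, flag status, and any associated PerTh watermark ID into a JSON artifact for incident response. The field names and schema below are illustrative assumptions, not Resemble AI's actual export format.

```python
# Hypothetical incident-response export; schema and field names are
# illustrative, not Resemble AI's actual format.
import json
from dataclasses import dataclass, asdict
from typing import Optional

@dataclass
class DetectionRecord:
    participant: str
    peak_score: float          # highest deepfake likelihood observed in-call
    flagged: bool
    watermark_id: Optional[str]  # PerTh watermark ID, if recording was authorized

def export_session(session_id: str, records: list) -> str:
    """Serialize per-participant detection results to JSON for legal review."""
    return json.dumps(
        {"session_id": session_id,
         "participants": [asdict(r) for r in records]},
        indent=2,
    )
```

A structured, per-participant artifact like this lets security teams hand one self-contained file to legal or IR without re-pulling call data.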
Detection runs on infrastructure that meets enterprise compliance standards, and is available as a cloud or on-premises deployment.