Audiobooks are a prime target for voice cloning. Resemble AI's security stack protects Kobo's catalog on two fronts: PerTh can embed inaudible watermarks into every audiobook so unauthorized copies and AI-training reuse can be traced, and Resemble Detect flags deepfake narrations that attempt to impersonate signed talent.
Watermarks survive the transformations audiobooks actually go through: compression, re-encoding, excerpting, and clip sharing on social media. Rights holders get a defensible trail of evidence without changing the listener experience.
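PerTh's actual embedding algorithm is proprietary, but the general idea behind inaudible watermarking can be sketched with a classic spread-spectrum scheme: a key-derived pseudo-noise sequence is added to the audio at very low amplitude, and detection correlates the audio against that same sequence. Everything below is an illustrative toy under that assumption, not Resemble's implementation; all function names and parameters are made up for the sketch.

```python
import random

def pn_sequence(key: int, length: int) -> list[float]:
    """Key-derived pseudo-noise (+/-1) sequence; the same key reproduces it."""
    rng = random.Random(key)
    return [rng.choice((-1.0, 1.0)) for _ in range(length)]

def embed_watermark(samples: list[float], key: int, strength: float = 0.01) -> list[float]:
    """Add the low-amplitude PN sequence to the audio samples."""
    pn = pn_sequence(key, len(samples))
    return [s + strength * c for s, c in zip(samples, pn)]

def detect_watermark(samples: list[float], key: int) -> float:
    """Correlate against the key's PN sequence; a high score means 'present'."""
    pn = pn_sequence(key, len(samples))
    return sum(s * c for s, c in zip(samples, pn)) / len(samples)

# One second of stand-in "narration" at 48 kHz (random noise playing the host signal).
host_rng = random.Random(0)
clean = [host_rng.uniform(-0.1, 0.1) for _ in range(48_000)]
marked = embed_watermark(clean, key=1234)

print(detect_watermark(marked, key=1234))  # score near the embed strength
print(detect_watermark(clean, key=1234))   # score near zero
```

Because the mark is spread across tens of thousands of samples at a fraction of the signal amplitude, the per-sample change is tiny; a production system like PerTh additionally shapes the watermark under a psychoacoustic model so it stays below the threshold of hearing.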
Embed PerTh watermarks into narrator audio so every Kobo title carries inaudible, tamper-resistant provenance markers.
Resemble Detect scans submissions for AI-generated or cloned narrator voices before they reach the Kobo store.
Watermarks persist through MP3 compression, chapter splitting, clip sharing, and even model retraining — the trail doesn't break.
Prove ownership of narrator performances and author-read editions. Trace leaks, piracy, and unauthorized AI-training reuse back to the source.
Detection models update as new voice-cloning systems appear. Your defenses stay current without infrastructure changes on Kobo's side.
Watermarks are psychoacoustically masked, so they are undetectable to human listeners and have no impact on narration quality or fidelity.
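The persistence claims above can also be illustrated with a toy spread-spectrum scheme: because detection is a correlation over many samples, heavy per-sample distortion such as coarse re-quantization (a crude stand-in for lossy compression) and cutting the file into excerpts barely move the detection score. This is a self-contained sketch of the principle, not Resemble's algorithm, and it assumes the detector knows the excerpt's offset; real watermarkers must also resynchronize after edits.

```python
import random

def pn_sequence(key: int, length: int) -> list[float]:
    rng = random.Random(key)
    return [rng.choice((-1.0, 1.0)) for _ in range(length)]

def embed(samples: list[float], key: int, strength: float = 0.01) -> list[float]:
    pn = pn_sequence(key, len(samples))
    return [s + strength * c for s, c in zip(samples, pn)]

def correlate(samples: list[float], pn: list[float]) -> float:
    return sum(s * c for s, c in zip(samples, pn)) / len(samples)

def quantize(samples: list[float], bits: int = 8) -> list[float]:
    """Crude stand-in for lossy re-encoding: snap samples to a coarse grid."""
    step = 2.0 / (2 ** bits)
    return [round(s / step) * step for s in samples]

rng = random.Random(0)
n = 48_000
clean = [rng.uniform(-0.1, 0.1) for _ in range(n)]
marked = embed(clean, key=1234)

# Degrade (re-encode), then share only a half-second excerpt from the middle.
degraded = quantize(marked)
excerpt = degraded[12_000:36_000]

pn = pn_sequence(1234, n)
score = correlate(excerpt, pn[12_000:36_000])  # still well above threshold
```

The correlation averages away both the quantization error and the host audio itself, which is why the trail survives chapter splitting and clip sharing in this model.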