YouTube is expanding its AI resemblance detection tool across the entertainment industry, giving celebrities and talent agencies access for the first time.
The expansion comes six months after YouTube first deployed the detection tool in October, when it was limited to a select group of creators with YouTube channels.
On Tuesday, April 21, the company announced that talent agencies, management companies, and the celebrities they represent can now sign up, regardless of whether they have a YouTube channel. The expansion was developed with the support of talent agencies and management companies, including Creative Artists Agency (CAA), United Talent Agency (UTA), WME, and Untitled Management.
YouTube had previously announced, in December 2024, that it had enlisted CAA talent to help create the AI-powered deepfake detection tool. CAA provided feedback to YouTube to build the system and refine its controls. The agency represents artists including Ariana Grande, Beyoncé, and Bob Dylan.
The tool operates similarly to Content ID, used by rights holders to flag unauthorized use of copyrighted material. It analyzes newly uploaded videos on YouTube to identify content that may contain the likeness of registered creators.
To sign up, participants need a government-issued ID and a brief selfie video for identity verification and creating a facial likeness model. Verification takes up to five days. Once registered, participants can authorize agents or managers to review flagged content on their behalf.
YouTube stores resemblance models and identity information for up to three years after a participant’s last login or until they withdraw consent or delete their account.
The AI resemblance detection tool, an extension of YouTube’s privacy tools for deepfakes, aims to detect visual matches of a registered creator’s face. Future plans include expanding the detection to audio.
YouTube noted that the tool is still in an experimental phase and that the company is continually refining the software. It encourages participants to report any discrepancies in the flagged AI-generated content.
The music industry has been cracking down on deepfakes, with Sony Music asking streaming platforms to remove more than 135,000 songs allegedly created by fraudsters using AI to impersonate artists on its roster.
Dennis Kooker, president of global digital business and U.S. sales at Sony Music Entertainment, said that deepfakes pose a direct commercial threat to legitimate artists, with the potential to damage an artist's reputation or disrupt a release campaign.
Meanwhile, Spotify recently tested a new feature allowing artists to review and approve eligible releases before they go live. This move aims to protect artists against deepfakes and incorrect AI attributions.