Photo Credit: Raph_PH / CC by 2.0
The video’s premise is entirely false, prompting Finn and his team to post a sharp rebuttal on the band’s official Facebook page. “We’re not sure where this came from but please don’t be fooled,” the message begins. “Neil’s never had trouble with erections.” The band also notes that while the visuals fooled several viewers, the voice imitation is notably off.
While some online responses took a lighthearted approach, adapting Crowded House lyrics to fit the video’s subject, industry observers see the incident as indicative of a serious and fast-growing concern within the music and entertainment industries. The use of deepfakes to promote questionable medical treatments using celebrity likenesses is especially troubling.
For Finn, the spread of misinformation was not just a personal affront but a clear warning sign about the evolving sophistication of generative artificial intelligence technologies—and their potential to create scams that harm society at large.
The real album, which does not feature a song by that exact name, was largely drowned out by the viral, meme-friendly fake. This clash between authentic artistry and AI mimicry highlights how opportunists can exploit fan anticipation for an upcoming album and completely derail it with minimal effort.
Of course, the problem of AI deepfake celebrities goes well beyond music. Oprah Winfrey and Gayle King, among other high-profile figures, have been similarly targeted by deepfakes spreading misinformation over the last few years. Winfrey appeared in AI-altered advertisements endorsing a ‘manifestation’ course, while King found her likeness used to promote dubious weight-loss gummies.
Even the FBI has weighed in on the issue, with former Director Christopher Wray participating in an Oprah special to warn about the dangers of manipulated “hyper-realistic but fake media.”