It’s Time to Take Down AI Taylor Swift


Ever wonder what Taylor Swift and Harry Styles singing a song from the Disney movie “Tangled” would sound like? What about Rihanna covering a Beyoncé song? I didn’t realize there was such strong demand for these mashups until AI-generated song covers recently overran my TikTok For You Page.

Embedded TikTok from @damnedsink: “Taylor Swift & Harry Styles – I See The Light (AI Cover) – from Tangled” (made with @Kits.AI)

The AI boom has made generating content easier than ever. Platforms like MusicAI and Voicify AI, among others, make the process as simple as selecting an artist and uploading a song file. Suddenly, you have the cast of “Glee” singing a song from “Barbie.”

Online fan communities have run wild with this technology—and the engagement that comes with it. Fans flood the comments with praise: commenters called the Rihanna cover “so natural and beautiful,” while the Disney-fication of Swift and Styles had fans begging for the pair to be cast in a live-action version of the film. The Swift and Styles video has over 4,000 comments, and the TikTok account that first posted the mashup, which is dedicated entirely to AI covers, has over 12,000 followers and videos that rack up thousands of views.

Other fans have also seen how successful this type of content can be. One K-pop AI cover account has 17,000 followers, and a Dua Lipa account that posts both AI content and general fan videos has over 160,000.

Fans are eating it up. But this broad embrace of AI songs within fan communities is at odds with the survival of the actual human creators, be they musicians or influencers.

To be fair, there are some artists, like Pharrell and Grimes, who have supported the rise of AI in the music industry as an inevitable technological development. But many more artists have spoken out against these covers. Selena Gomez said an AI version of her singing “Starboy” by The Weeknd was “scary.” Ed Sheeran has denounced the technology, saying that “if everything is done by robots, everybody’s gonna be out of work.” A concerned Lil Wayne told Billboard that AI couldn’t replicate his voice and style. 

But fans aren’t listening. AI Ed Sheeran has covered songs ranging from “Love Yourself” by Justin Bieber to “Again” by Fetty Wap. People have made AI versions of Selena Gomez singing various Taylor Swift songs. And Swifties have feverishly cranked out AI versions of her singing songs by Ariana Grande, Dua Lipa, and Olivia Rodrigo. There are entire TikTok accounts dedicated to these covers, which garner thousands of views.

Swift herself hasn’t made any public statement about AI. But, as an artist, she has made her feelings about controlling her creative output clear: her quest to re-record her masters is all about owning her own music and having creative authority over her work. It’s particularly ironic for her fans to generate AI previews of what they want 1989 (Taylor’s Version) to sound like when the entire point of the re-recording project is to regain control over her creative catalog.

Most of these covers don’t sound good—any AI version of Swift singing an Olivia Rodrigo song just sounds like a weirdly altered version of Rodrigo’s vocals. But quality isn’t the point. The real harm is that artists see no benefit when their lyrics or vocals are used to generate AI songs. Sure, someone like Swift or Beyoncé is probably too big a cultural phenomenon to be replaced entirely by AI. But smaller acts, like Noah Kahan or PinkPantheress, who built their audiences on TikTok, are far more at risk.

Fans are also hurt by this sort of subterfuge. Just look at Discord communities dedicated to Harry Styles and One Direction, where people are selling ostensibly leaked songs for up to $400. A similar scandal fooled Frank Ocean fans in May when someone made $10,000 from selling AI songs meant to sound like the artist. 


Which is to say, AI covers hurt just about everyone involved. The artists lose creative control over their work. The fans get scammed. And people who are open about using AI to create “new” songs are normalizing the practice within fan culture. Though it might be fun to imagine Beyoncé singing a Dolly Parton song, online fan communities need to be more intentional about what they create. In the long run, AI covers not only dilute the supply of art in the creator economy but also devalue the fan experience.
