Thu. Jan 1st, 2026
Is Your New Favorite Artist Authentic? Here’s How to Tell.

A new song is gaining traction and, as Kylie Minogue famously sang, you just can't get it out of your head.

However, this raises questions: what if the track was composed by a robot, or the artist is itself a creation of artificial intelligence (AI)? Should streaming platforms be required to identify AI-generated music? And, ultimately, does it matter if the music is enjoyable?

A recent survey indicated that 97% of participants were unable to differentiate between AI-generated and human-created music. Yet, subtle indicators exist for those who know where to look.

Here’s a brief guide.

AI music gained prominence last summer when accusations that the band The Velvet Sundown was AI-generated propelled them to viral fame.

Despite lacking a record label and having a minimal social media presence, the band rapidly amassed hundreds of thousands of monthly listeners on Spotify after releasing two albums in quick succession, sparking suspicion within the music community.

Initially denying the claims, the band later described themselves as a synthetic project “guided by human creative direction, and composed, voiced and visualised with the support of artificial intelligence”.

They framed the project as an “artistic provocation” rather than a deception, but many fans felt misled.

Internet users questioned the band’s heavily edited photos, featuring generic backgrounds and a distinctive orange filter.

Furthermore, there was no evidence of live performances, no fan reviews, and a dearth of concert photos or videos. The band members had not granted interviews and appeared to lack individual social media accounts.

Investigating an artist’s real-world and online presence can be a useful method for verification. However, experts suggest that rapidly advancing technology is making it increasingly difficult to discern AI-generated music.

Nevertheless, there remain subtle signs that listeners can be aware of.

LJ Rich, who began creating AI music approximately five years ago, recalls that early AI systems could only generate three seconds of music at a time, requiring around 10 hours to produce one minute of audio.

Today, an entire song can be generated rapidly with a single prompt, leading to what industry experts describe as an “explosion” of AI music, often referred to as “slop,” on streaming platforms.

A formulaic feel – pleasant but lacking depth or emotional resonance – can indicate AI involvement, says the musician and technology speaker, as can vocals that feel strained.

AI songs tend to adhere to conventional verse-chorus structures and often lack satisfying conclusions. Rich notes that AI is more likely to produce grammatically correct lyrics, while some of the most memorable human-written lyrics can sometimes lack coherence.

Consider Alicia Keys’ “concrete jungle where dreams are made of” or The Rolling Stones’ use of double negatives in “(I Can’t Get No) Satisfaction.”

“If it doesn’t feel emotional, that’s a significant indicator,” the former BBC Click presenter continues. “Does it create the tension and release that is fundamental to the music we love? Does it tell a story?”

Another tell-tale sign is unrealistic productivity. Prof Gina Neff, from the Minderoo Centre for Technology and Democracy at the University of Cambridge, points to the case of an artist who was suspected of using AI after releasing multiple similar-sounding albums simultaneously.

Their songs resembled a mashup of 80s rock bands – “classic rock hits that had been put in a blender.”

“This may be suitable as background music for many, but it won’t create the superstars of the future who draw on the past but then create something entirely new,” she continues.

A song that sounds almost too perfect, lacking subtle imperfections and variations, can also be a red flag.

This might manifest as flawless vocals and overly polished production, according to Tony Rigg, music industry adviser and lecturer in music industry management at the University of Lancashire.

He adds that unusual phrasing, unnatural emotional delivery, and generic or repetitive lyrics can also be clues.

“AI hasn’t experienced heartbreak yet… It recognizes patterns,” he explains. “What makes music human is not just sound but the stories behind it.”

It’s also important to listen closely to the vocals. AI “singers” often sound slightly slurred, and consonants and plosives (hard sounds like “p” and “t”) may be imprecise. So-called “ghost” harmonies, where backing vocals appear and disappear randomly, can also be present.

However, Rigg cautions that these signs are “hints not proof,” acknowledging that it is not always easy for casual listeners to detect AI-generated music.

Beyond generating full songs, AI is also being used by established artists as a tool to augment their creativity.

Currently, there is no requirement – or standardized method – for artists to disclose if and how they are using AI.

Some are transparent: The Beatles, for example, used machine learning to isolate John Lennon’s voice from a 1970s cassette recording, enabling them to release “Now and Then,” dubbed their “last song,” in 2023.

Artists such as Imogen Heap and Timbaland have even created AI personas and released singles under their names.

Last month, Heap released the song “Aftercare” featuring her AI model, ai.Mogen, which was trained on her voice.

She initially developed the voice model as a chatbot – a “desperate attempt” to manage a flood of messages and requests, including those from fans – but it has since been featured on several songs, allowing Heap to participate in more collaborations than she would otherwise have time for.

While she admits that “it does sound different if you really know my voice,” she has invested significant effort in making the AI version sound human and believes that most listeners would be unable to distinguish it.

Heap is not attempting to deceive listeners – ai.Mogen is credited as a co-contributor on the track.

Her hope is that, if listeners connect with the song without realizing that part of the vocals are sung by her AI model, they may reconsider any preconceived negative ideas or concerns they have about AI.

“I hope that people listen, don’t realize, and find peace in that,” she tells the BBC.

She clarifies that she is not opposed to using AI to create music, but it is not something she has yet explored extensively.

Heap advocates for greater transparency regarding the elements that contribute to a song and how AI has been utilized.

Drawing a parallel to food labeling, she asserts that “we need that for music, and we need that for AI.”

Currently, streaming platforms are not legally obligated to label AI-generated songs, despite increasing calls for mandatory disclosures.

In January, Deezer launched an AI detection tool, followed by a system this summer that tags AI-generated music.

Deezer claims that its detection system can identify tracks made using the most prevalent AI music creation tools, and it is actively expanding its ability to detect music produced with other tools. The company states that the risk of false positives – incorrectly identifying human-created music as AI-generated – is very low.

This week, the company reported that a third (34%) of content uploaded to its platform was entirely AI-generated – approximately 50,000 tracks per day.

Manuel Moussallam, Deezer’s director of research, states that his team was initially skeptical when the detector flagged so many tracks, “pretty convinced we had an issue.”

The tool quickly flagged The Velvet Sundown – the band that went viral over the summer – as being “100% AI-generated.”

Other platforms have recently announced initiatives to enhance transparency.

In September, Spotify announced that it would roll out a new spam filter later this year to identify “bad actors” and prevent “slop” from being recommended to listeners. Over the past year, the platform has removed more than 75 million spam tracks.

Spotify is also supporting a system, developed by an industry consortium called DDEX, to enable artists to indicate where and how AI was used in a track. This information will be included in the metadata and displayed on its app.

Spotify states that its efforts are driven by a recognition of listeners’ desire for more information, as well as a commitment to “strengthening trust.”

“It’s not about punishing artists who use AI responsibly or down-ranking tracks for disclosing information about how they were made,” the company adds.

If you’ve become a devoted fan of a new artist, does the use of AI matter?

Some argue that the presence of AI is irrelevant – engagement is based on enjoyment, and if music is pleasing, it serves its primary function.

Others maintain that music fans should have the ability to make informed decisions about their listening choices.

Artists have voiced serious concerns about the impact of AI, and hundreds of musicians, including Dua Lipa and Sir Elton John, have protested the use of their songs in the training of AI tools.

For LJ Rich, the use of AI in music raises numerous “weird and beautiful ethical questions” that remain unresolved.

“Like if the music makes the hairs on the back of your neck stand up, does it matter if an AI wrote it or not?”
