AI MUSIC

AI just dropped an album, and no humans were involved

Velvet Sundown shot to online fame, racking up over a million Spotify streams, until it turned out the band wasn’t real.

Everything from the music to their backstory was made by AI.

At first, they claimed to be human-led, but conflicting statements eventually revealed they’d used the generative music tool Suno, calling themselves “not quite human. Not quite machine.”

The whole thing has reignited questions around transparency in music.

Right now, platforms like Spotify don’t have to tell users when a track is made by AI.

That’s raising eyebrows, especially from industry leaders who think listeners should know what they’re tuning into.

Here’s what’s worth knowing:

  • Velvet Sundown went viral before revealing they were AI-generated.

  • Industry groups are pushing for platforms to label AI music clearly.

  • Without updated rules, AI could repeat streaming’s familiar pattern: platforms profit while artists struggle.

Spotify’s not snitching

There’s also growing concern that AI models might be trained on music from real artists, without permission or pay.

Some services like Deezer are already tagging AI-generated tracks, but not all platforms are following suit.

Many are now calling for proper labelling, stronger copyright rules, and a clearer system to protect human creators.

The question is, why does a fake band have better branding than I do?
