SoundCloud’s Sneaky AI Move: Why Musicians Are Pissed
Alright, let’s talk about something that’s been buzzing in the music world—SoundCloud’s super shady update to their terms of service. If you’re an artist who uploads music there, you might wanna sit down for this one.
What Just Happened?
So, back in February 2024, SoundCloud quietly slipped some new language into their terms of service (TOS). And by quietly, I mean they didn’t exactly send out a press release. The gist? If you upload your music to SoundCloud, unless you’ve got a separate agreement saying otherwise, your tracks can now be used to train AI models.
Yep. That song you spent months perfecting in your bedroom studio? Fair game for AI companies to slurp up and use as training data.
Here’s the exact line from their TOS:
“You explicitly agree that your Content may be used to inform, train, develop or serve as input to artificial intelligence or machine intelligence technologies or services.”
Oof.
Why Are Artists Freaking Out?
Well, for starters, nobody likes being volun-told that their work is now AI food. Musicians found out about this months after the fact, and let’s just say the reaction hasn’t been… polite.
Take The Flight, a duo who’ve scored big films and games. They posted on Bluesky like, “Cool, cool… deleted all our songs and closing our account. Bye!” Composer Adam Humphreys chimed in with a “NOPE.” And Ed Newton-Rex, CEO of Fairly Trained (a nonprofit fighting for ethical AI), straight-up called SoundCloud out on X (formerly Twitter), saying they’ve got “major questions to answer.”
Can’t blame ‘em. If I found out my music was being fed to AI without my say-so, I’d be livid too.
SoundCloud’s AI Push
This isn’t coming out of nowhere. SoundCloud’s been cozying up to AI for a while. In early 2024, they rolled out “assistive AI” tools like:
– Tuney – Lets you remix, edit, and generate new tracks.
– AIBeatz – Claims to help you make “pro-level beats” with AI.
– Starmony – Promises “high-quality tracks” at the push of a button.
They even signed some “Principles for Music Creation with AI” pledge, promising “responsible and ethical” AI use. But… sneaking AI training into the TOS without a heads-up? Not a great look.
The Bigger Problem
This isn’t just a SoundCloud thing. Every platform with user-uploaded content is eyeing AI training as a goldmine. Pinterest did it. Google’s doing it. And now SoundCloud—a platform that used to be home base for indie artists—has joined the club.
The worst part? Most artists had no idea. SoundCloud didn’t email users like, “Hey, we’re gonna let AI chew on your music—cool?” Nope. They just… updated the TOS and hoped nobody’d notice.
What Can You Do?
If you’re an artist on SoundCloud, here’s your playbook:
1. Read the TOS – Yeah, it’s boring, but now you gotta.
2. Delete your tracks – If you’re not okay with AI training, pull ‘em down.
3. Look into alternatives – Bandcamp, DistroKid, or even self-hosting might be safer bets now.
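For step 1, reading a TOS cover to cover is rough, so here’s a rough sketch of one way to triage it: save the TOS text to a file and scan it for AI-related clauses. This is a hypothetical helper, not anything SoundCloud provides, and the keyword list is my own guess at what to look for—adjust it to taste.

```python
# Hypothetical sketch: flag sentences in a saved TOS that mention
# AI training. The keyword list below is an assumption, not an
# official or exhaustive taxonomy.
import re

AI_TERMS = [
    "artificial intelligence",
    "machine intelligence",
    "machine learning",
    "train",
]

def flag_ai_clauses(tos_text: str) -> list[str]:
    """Return sentences from the TOS that mention any AI-related term."""
    # Naive sentence split on ., !, ? followed by whitespace.
    sentences = re.split(r"(?<=[.!?])\s+", tos_text)
    pattern = re.compile(
        "|".join(re.escape(t) for t in AI_TERMS), re.IGNORECASE
    )
    return [s for s in sentences if pattern.search(s)]

# Example using the clause quoted earlier in this post.
sample = (
    "You retain ownership of your Content. "
    "You explicitly agree that your Content may be used to inform, train, "
    "develop or serve as input to artificial intelligence or machine "
    "intelligence technologies or services."
)
for clause in flag_ai_clauses(sample):
    print(clause)
```

It won’t catch clever lawyer phrasing that avoids the obvious words, but it’s a five-minute filter before you commit to reading the whole thing.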
Final Thoughts
I get it—AI’s here, and it’s not going away. But the way companies are handling this? Sneaky. Musicians already get squeezed on royalties and exposure, and now their own work is being used to train the tools that could replace them? Not cool.
SoundCloud was the place for underground artists. Now? Feels like another corporate machine cashing in on creators.
What do you think? Would you keep uploading if you knew AI was training on your music?