“SoundCloud Updates Terms to Allow AI Training on User Content”
SoundCloud’s New AI Policy: What It Means for Musicians & Producers
So, SoundCloud quietly updated its terms of service, and surprise: it can now use your uploaded tracks to train AI. Yeah, you read that right. If you’ve been uploading beats, demos, or full tracks to SoundCloud, your music could end up feeding a machine-learning model.
Now, before you panic (or celebrate, if you’re into AI music), let’s break this down.
Wait, What Exactly Changed?
According to tech ethicist Ed Newton-Rex, SoundCloud slipped a new clause into its Terms of Use back in February 2024 that says:
“You explicitly agree that your Content may be used to inform, train, develop or serve as input to artificial intelligence or machine intelligence technologies…”
Translation: Your music could be used to train AI models—unless you’ve got a special deal with them (like major labels do).
But… Can You Opt Out?
Good question. As of now, there’s no clear opt-out button in the settings. SoundCloud hasn’t made a big announcement about this, either. Kinda sketchy, right?
Why Is SoundCloud Doing This?
Well, they’ve been cozying up to AI for a while. Last year, they partnered with AI music tool companies to offer stuff like:
– AI remixing
– Vocal generation
– Custom sample creation
They promised ethical AI use, but now they’ve quietly given themselves permission to train AI on your tracks. Hmm.
Is This Legal?
Technically, yes—because you agreed to their terms when you signed up. But ethically? That’s a whole debate.
What About Other Platforms?
SoundCloud isn’t the only one doing this:
– X (Twitter) updated its privacy policy to allow user posts to be used for AI training, including by third-party partners.
– LinkedIn tweaked its terms to allow AI training on member data.
– YouTube added a setting that lets creators decide whether third-party AI companies can train on their videos (it’s off by default, but buried in the settings).
The big question: Should this be opt-in instead of opt-out? A lot of artists think so.
SoundCloud’s Response
After some backlash, SoundCloud clarified:
– They aren’t currently using music to train AI.
– They added a “no AI” tag to block scraping.
– The terms were meant for internal AI tools (like recommendations, fraud detection, etc.).
But… the wording in their terms is pretty broad. So, who knows what happens next?
What Should Musicians Do?
If you’re worried:
1. Read the fine print before uploading.
2. Check for updates—SoundCloud might add an opt-out later.
3. Consider other platforms if this bothers you (Bandcamp, anyone?).
Final Thoughts
AI in music isn’t going away. Some artists will love it (free stem splitting, AI mastering, etc.), while others will hate it (hello, deepfake Drake).
But platforms should be transparent about how they use your work. Sneaky terms updates? Not cool, SoundCloud.
What do you think—is AI training on your music a dealbreaker?