SoundCloud’s AI Policy Backlash – What Happened & Why It Matters
Okay, so here’s the deal: if you’re an artist uploading music to SoundCloud, you probably care a lot about who gets to use your work—especially when it comes to AI. And recently, SoundCloud found itself in hot water over some sneaky wording in its terms of service that had people freaking out.
Let’s break it down.
What Went Wrong?
Back in February 2024, SoundCloud quietly updated its Terms of Use to say that by uploading your music, you agreed your content could be used to:
“inform, train, develop or serve as input to artificial intelligence or machine intelligence technologies.”
Yikes.
Now, if you’re like me, your first thought is: Wait, does that mean AI companies can just scrape my music to train their models without asking? And yeah, that’s exactly how a bunch of artists and AI ethics folks took it.
One of the loudest critics was Ed Newton-Rex, founder of Fairly Trained (a nonprofit pushing for ethical AI training). He wasn’t having it—and honestly, neither were a lot of musicians.
Why This Was a Big Deal
AI in music is already a messy topic. Big labels are suing AI companies left and right (looking at you, UMG), and tech giants like OpenAI are lobbying governments to weaken copyright laws so they can train AI on whatever they want.
So when SoundCloud slipped this into its terms, it felt like another case of artists getting screwed over. Even though SoundCloud had previously partnered with AI tools like Fadr, Soundful, and Voice-Swap—which are supposed to be artist-friendly—this move made people question the platform's real stance.
SoundCloud’s Response: “Our Bad, We’ll Fix It”
Fast-forward to May 2025, and SoundCloud’s CEO, Eliah Seton, steps in to clean up the mess. In an open letter, he admitted:
“The language in the Terms of Use was too broad and wasn’t clear enough. It created confusion, and that’s on us.”
Here’s what they changed:
- No More Sneaky AI Training – The new terms now say SoundCloud won’t use your music to train AI models that clone your voice or style unless you explicitly opt in.
- Artist Control is Key – Any future AI tools on the platform will require consent and transparency.
- There's a “No AI” Tag – You can tag your tracks to signal that they're off-limits for AI training altogether.
Why This Matters for Musicians
If you’re uploading music online, you should care about where it ends up. AI is here to stay, but that doesn’t mean companies should get a free pass to use your work without permission.
SoundCloud’s update is a step in the right direction, but it’s also a reminder to always read the fine print. Platforms change policies all the time, and sometimes those changes aren’t in your favor.
Key Takeaways
- Check your platform’s terms – Especially if you’re uploading original music.
- Opt-in > Opt-out – SoundCloud’s new policy means AI training is opt-in only, which is way better than the alternative.
- AI isn’t evil, but misuse is – Tools like AI mastering or vocal synthesis can be helpful, but only if artists have control.
Final Thoughts
SoundCloud messed up, but they fixed it (mostly). The bigger lesson? Always know what you’re agreeing to when you upload your music. AI isn’t going anywhere, but how it’s used should be up to the artists—not tech companies.
What do you think? Should all platforms require opt-in consent for AI training, or is this just the start of a bigger fight?
Stay sharp, and keep making noise (literally). 🎵