Benn Jordan's AI poison pill and the weird world of adversarial noise

Summary

Benn Jordan’s latest video proposes a way to fight back when generative AI music services rip off music for their data sets. It’s not ready for prime time yet, but it does offer a window into the wild, wonderful world of adversarial noise poisoning attacks.

Now, if you run in circles like mine, you’ve already been asked, “Hey, have you seen this new Benn Jordan video?” and I suspect you’ve gotten as far as watching it. But here you go:

[embedded video]

Benn’s approaches should have some real legs. There are two reasons to be optimistic. One, this family of techniques works on audio itself, so it covers the so-called “analog loophole”: it functions anywhere sound is heard. Two, there’s the potential to use different methods, thus obfuscating the results. You can also validate the results, meaning these could be updated if services react.

It’s funny: when I spoke to Roland’s Paul McCabe about that company’s AI initiatives, I suggested a speculative design where you could press a button and block a performance from being trained on. Benn actually went directly to the data science researchers to find out how that could be done, even in a live instrumental performance. Of course, you count as a CDM reader if your favorite music in the entire video is the targeted pressure wave attack at 22:00.

The big gotcha (spoiler alert) is that this requires high-end GPUs and a massive amount of electricity and time to pull off. Computation doesn’t magically consume less power on its own, either, least of all with semiconductor trade wars looming. But now that the idea is out there, the challenge is to devise a more efficient method; this at least works as a proof of concept.

In short, I’m for it. And I do expect that a fear of training will stop some people from sending music to streaming services. It’s not hard to envision, as Benn does, a world where distributors license this technology to give producers added peace of mind.
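The video doesn’t spell out the math, but the core idea behind adversarial perturbation attacks is easy to sketch: nudge the audio in the direction that most confuses the model, while keeping the change small enough to be (in principle) inaudible. Below is a deliberately toy illustration using a hypothetical linear “fingerprint” score; real attacks target deep networks with gradient-sign methods, and every weight and number here is made up. This is not Benn’s actual technique, just the shape of the idea:

```python
# Toy adversarial perturbation against a hypothetical linear model.
# Real systems attack neural networks; a linear score keeps the
# gradient trivial (it's just the weight vector) so the idea is visible.

def sign(v: float) -> float:
    """Sign of a scalar, with sign(0) == 0."""
    return 1.0 if v > 0 else (-1.0 if v < 0 else 0.0)

# Hypothetical classifier weights: how strongly each audio feature
# contributes to the model "recognizing" this track as usable data.
w = [0.5, -0.3, 0.8, 0.1]

# A frame of audio features from the track we want to protect (made up).
x = [0.2, 0.4, -0.1, 0.6]

def score(weights, features):
    """Dot product: the model's confidence in the audio."""
    return sum(wi * xi for wi, xi in zip(weights, features))

# For a linear model, the gradient of the score w.r.t. x is just w.
# Step *against* the gradient, bounded by a small epsilon so the
# perturbation stays tiny relative to the signal.
eps = 0.05
x_adv = [xi - eps * sign(wi) for wi, xi in zip(w, x)]

print(score(w, x))      # confidence on the clean audio
print(score(w, x_adv))  # strictly lower confidence after perturbation
```

The inaudibility constraint is the whole game: epsilon bounds each sample’s change, and the expensive part in practice is computing useful gradients through a large model, which is where the high-end GPU requirement comes from.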
Remember in the early 2000s when we worried about protecting music from human...
