AI: Pirate or Robin Hood?

July 30, 2025

One of the most notorious activities on the Internet is pirating creative works. The biggest victim is the movie industry, followed closely by publishing. Creative works that people spend countless hours making are distributed freely, without proper payment to the creators and the staff behind them. It sounds like a Robin Hood scenario, but creative folks are the ones suffering. Best-selling author David Baldacci ripped into Big Tech for training its AI on stolen creative works, and he demanded that the federal government step in to rein the companies in.

LSE says that only a small number of AI developers support using free and pirated data for training models: “Most AI Researchers Reject Free Use Of Public Data To Train AI Models.” Data from UCL shows that AI developers want ethical standards for training data, and many are in favor of asking permission from content creators. The current UK government places the responsibility on content creators to “opt out” of their work being used for AI models. Anyone with a brain knows that the AI developers skirt around those regulations.

When LSE polled people about who should protect content creators and regulate AI, opinions were split among the usual suspects: tech companies, governments, independent people, and international standards bodies.

Let’s see what creative genius Paul McCartney said:

While there are gaps between researchers’ views and the views of authors, it would be a mistake to see these only as gaps in understanding. Songwriter and surviving Beatle Paul McCartney’s comments to the BBC are a case in point: “I think AI is great, and it can do lots of great things,” McCartney told Laura Kuenssberg, but it shouldn’t rip creative people off. It’s clear that McCartney gets the opportunities AI offers. For instance, he used AI to help bring to life the voice of former bandmate John Lennon in a recent single. But like the writers protesting outside of Meta’s office, he has a clear take on what AI is doing wrong and who should be responsible. These views and the views of other members of the public should be taken seriously, rather than viewed as misconceptions that will improve with education or the further development of technologies.

Authors want protection. Publishers want money. AI companies want to do exactly what they want. It is a three-body problem of intellectual property with no easy solution.

Whitney Grace, July 30, 2025
